Browser crashes while binding 25k-100k records - kendo-ui

In one of our applications, I need a grid to load 100k records without pagination, using Kendo Grid virtualization.
Technologies used: SQL Server 2012, ASP.NET web app, Angular.
The grid has 15 columns. It works fine with up to 10K records, but when loading between 20,000 and 100,000 records the browser crashes.
Can I bind 100k records in a Kendo Grid without pagination?
Is there any other way to load the 100k records without pagination?
What is the maximum data size browsers (Chrome, Firefox) support?

According to: http://demos.telerik.com/kendo-ui/grid/virtualization-local-data
That demo loads 500K records with 5 columns, and if you look at the code, it limits the data to a smaller set on certain browsers. It likely comes down to JavaScript memory and how fast JS can process that volume.
My recommendation:
Use the Grid with server paging, but also allow server sorting and filtering. I used OData for this, fetching 100 rows at a time, which makes the server part easy (see the sketch below).
Offer the option to export the data to CSV or Excel.
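Here's a minimal sketch of that setup, assuming Kendo's public Northwind OData demo service stands in for your real endpoint; the serverPaging/serverSorting/serverFiltering flags push those operations to the server, and pageSize: 100 matches the 100-rows-at-a-time approach:

```js
$("#grid").kendoGrid({
    dataSource: {
        type: "odata", // Kendo translates paging/sorting/filtering into OData query options
        transport: {
            read: "https://demos.telerik.com/kendo-ui/service/Northwind.svc/Orders"
        },
        pageSize: 100,         // only 100 rows travel over the wire per request
        serverPaging: true,    // sends $top/$skip instead of paging in the browser
        serverSorting: true,   // sends $orderby
        serverFiltering: true  // sends $filter
    },
    pageable: true,
    sortable: true,
    filterable: true,
    height: 550
});
```

With this in place the browser never holds more than one page of records, no matter how large the table is.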

Related

How does pagination work internally in jqGrid?

Does jqGrid create all records in the DOM, or does it store them locally in a JavaScript object and only show records on the pagination event?
Thanks
We tried jqGrid in our project and experienced slow table rendering for a large number of records (10,000) with a page size of 50.
jqGrid stores the local data in an array and displays in the DOM only the portion set by the rowNum parameter.
Personally, I think that loading 10k records locally is too big for any grid component. It will consume a lot of memory and slow down every operation in the application.
The best approach is to store the data on the server and request only the portion you want (see the sketch below). Look here at how we deal with 1 million records
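A minimal jqGrid sketch of that server-side approach, using a hypothetical /api/orders endpoint; jqGrid sends its default page, rows, sidx and sord parameters, so the server only ever returns one page:

```js
$("#grid").jqGrid({
    url: "/api/orders",   // hypothetical endpoint; receives page, rows, sidx, sord
    datatype: "json",
    colModel: [
        { name: "id", key: true, width: 60 },
        { name: "customer", width: 200 },
        { name: "amount", width: 100, align: "right" }
    ],
    rowNum: 50,           // only 50 rows are requested and rendered per page
    pager: "#pager",
    viewrecords: true     // show the total record count in the pager
});
```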

Laravel pagination in Data Table

I am using the DataTables plugin in Laravel. I have around 3000 records in one of my tables.
But when I load that page, it loads all 3000 records in the browser and then creates the pagination, which slows down the page load.
How do I fix this, or what is the correct way to do it?
Use server-side processing.
Get help from a Laravel package, such as Yajra's: https://yajrabox.com/docs/laravel-datatables/
Generally you can solve pagination either on the front end, the back end (server or database side), or a combination of both.
Server-side processing, without a package, would mean setting up TOP/FETCH (or similar) so that only the requested rows are returned from your server.
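On the client, DataTables has this built in; a minimal sketch, assuming a hypothetical /api/records route that speaks DataTables' draw/start/length protocol:

```js
$("#records").DataTable({
    processing: true,     // show a "processing" indicator during requests
    serverSide: true,     // paging, sorting and searching happen on the server
    pageLength: 25,
    ajax: "/api/records", // hypothetical endpoint; DataTables sends draw, start, length, order, search
    columns: [
        { data: "id" },
        { data: "name" },
        { data: "created_at" }
    ]
});
```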

You could also load a small amount (say 20) and then when the user scrolls to the bottom of the list, load another 20 or so. I mention the inclusion of front end processing as well because I’m not sure what your use cases are, but I imagine it’s pretty rare any given user actually needs to see 3000 rows at a time.
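A rough sketch of that load-on-scroll idea in plain JavaScript, assuming a hypothetical /api/records endpoint that accepts offset and limit parameters:

```js
const PAGE_SIZE = 20;
let offset = 0;
let loading = false;

function appendRowToTable(row) {
    const tr = document.createElement("tr");
    tr.textContent = `${row.id} ${row.name}`; // adjust to your own columns
    document.querySelector("#records tbody").appendChild(tr);
}

async function loadMore() {
    if (loading) return; // avoid overlapping requests
    loading = true;
    const res = await fetch(`/api/records?offset=${offset}&limit=${PAGE_SIZE}`);
    const rows = await res.json();
    rows.forEach(appendRowToTable);
    offset += rows.length;
    loading = false;
}

window.addEventListener("scroll", () => {
    // within 200px of the bottom of the page: fetch the next batch
    if (window.innerHeight + window.scrollY >= document.body.offsetHeight - 200) {
        loadMore();
    }
});

loadMore(); // initial batch
```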

Given that Data Tables seems to have built-in functionality for paginating data, I think that #tersakyan is essentially correct — what you want is some form of back-end filtering or paginating of rows of data to limit what’s being sent to the front end.

I don’t know if that package works for you or what your setup looks like, but pagination can also be achieved directly from a database returning data via SQL (using TOP/FETCH, for example), or it could be implemented in a Controller or Service by tracking pages of data and “loading a page at a time”, both from the server and then into the table. All you would need is a unique key to associate each "set of pages" with a specific request (a sketch of the database-driven variant follows below).
But for performance, you want to avoid both large data requests and operations on large sets of data. So the more you limit how much data is grabbed or processed at any stage of your application, the more performant your application will be in principle.
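For the database-driven variant, a sketch of what the controller could look like in a Node/Express service; the db.query helper and the Records table are placeholders for your own data-access layer, and SQL Server's OFFSET/FETCH does the actual limiting:

```js
const express = require("express");
const app = express();

// Hypothetical paged endpoint: GET /api/records?page=2&pageSize=50
app.get("/api/records", async (req, res) => {
    const page = Math.max(parseInt(req.query.page, 10) || 1, 1);
    const pageSize = Math.min(parseInt(req.query.pageSize, 10) || 50, 200); // cap page size

    // db.query is a placeholder; always use parameters, never string concatenation
    const rows = await db.query(
        "SELECT * FROM Records ORDER BY Id " +
        "OFFSET @offset ROWS FETCH NEXT @pageSize ROWS ONLY",
        { offset: (page - 1) * pageSize, pageSize }
    );
    res.json(rows);
});
```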

Performance issues while exporting kendo grid data to pdf

I have a grid with a big set of records (around 10,000 on average). I have implemented server-side paging on the grid to retrieve 50 records at a time. Everything with the grid works perfectly fine until I try to export to PDF from the grid.
When I do, the export takes around 5-6 minutes on average to complete. I tried to debug on the server side and realised that multiple calls to the server were being made to retrieve the data for the export, which was probably eating up the time. I then tried increasing the set of records retrieved to 1000 at a time so as to reduce the server calls and, eventually, the export time. But now the page crashes while exporting to PDF. I changed the set of records to 500, but the crash still happens when exporting.
On another note, the export to Excel works pretty fast, around 4 seconds. When debugging I found that only a single call to the server was made while exporting to Excel, and it renders the entire set of data as well.
Please note that I am using the Kendo defaults for exporting to PDF and Excel with the grid.
Thanks.
You need to implement server-side export (sketched below).
Kendo says:
Important
When the allPages option is set to true and serverPaging is enabled, the Grid will make a "read" request for all data. If the data items are too many, the browser may become unresponsive. Consider implementing server-side export for such cases.
Look at this page.
Full example: link
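One way to wire that up: replace the built-in PDF command with a custom toolbar button that hands the grid's current filter/sort state to a hypothetical /export/pdf endpoint, so the server builds the file in one pass instead of the browser paging through all the data:

```js
$("#grid").kendoGrid({
    toolbar: [
        { name: "serverPdf", text: "Export to PDF" } // custom command instead of the built-in "pdf" one
    ]
    // ...dataSource, columns, etc.
});

// Custom toolbar commands get a k-grid-<name> class on their button
$("#grid").on("click", ".k-grid-serverPdf", function (e) {
    e.preventDefault();
    const ds = $("#grid").data("kendoGrid").dataSource;
    const state = {
        filter: ds.filter(), // current filter descriptors (may be undefined)
        sort: ds.sort()      // current sort descriptors
    };
    window.location = "/export/pdf?state=" + encodeURIComponent(JSON.stringify(state));
});
```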

KendoUi Grid page size performance issue

I am using a Kendo UI Grid with 20k records. When I change the page size from 20 to 200, the grid takes 40 to 50 seconds to respond, sometimes a minute. Paging is client-side only.
For large datasets, it's better to use server paging, mainly because:
Faster loads
Avoid wasting memory
Avoid unnecessary database and network load; the database returns only the records of the current page
You should consider enabling server paging at the dataSource level, and then reading the pagination values on the backend before performing the query against the database, as sketched below.
http://docs.telerik.com/kendo-ui/api/javascript/data/datasource#configuration-serverPaging
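A minimal sketch of that dataSource-level change; with serverPaging enabled, Kendo sends take, skip, page and pageSize to the read endpoint (a hypothetical /api/orders here), and the backend uses them to limit the query:

```js
const dataSource = new kendo.data.DataSource({
    transport: {
        read: "/api/orders" // hypothetical endpoint; receives take, skip, page, pageSize
    },
    schema: {
        data: "data",   // where the page of records lives in the response
        total: "total"  // the backend must also return the total record count
    },
    pageSize: 20,
    serverPaging: true  // the grid only ever holds one page of records in memory
});
```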
Good question, I also faced this type of issue.
Please use MVVM logic instead of MVC to bind the grid;
for more, please see the link below.
http://demos.telerik.com/kendo-ui/grid/mvvm

Tablesorter VS PagedList

I'm working on a project where I have to fetch data from a database and then present the results in tables with anywhere from 200 up to 3000 rows.
I first tried PagedList and noticed that only the fixed number of records per page is displayed on the client side, and each time the user changes the page there is a new request to the server.
For tablesorter, I noticed that all the results are displayed and paging is only visual (only the presentation of the table), but everything happens on the client side.
Is my understanding correct?
What I want to know is which approach is better (in time execution)?
Right now I'm working on localhost and I'm the only user, so I can't really notice the difference. Tablesorter takes more time to load the first time but is very quick afterwards, while the PagedList method loads each page faster but makes a request to the server on every page change to load the corresponding data.
When finished, the application will be deployed on a server and many users will have access to it to add, search, delete...
Which of these two approaches is a better choice?
Thanks
