Performance issues when exporting Kendo grid data to PDF

I have a grid with a large set of records (around 10,000 on average). I have implemented server-side paging on the grid to retrieve 50 records at a time. Everything with the grid works perfectly fine until I try to export it to PDF.
When I do, the export takes around 5-6 minutes on average to complete. I debugged on the server side and realised that multiple calls to the server were being made to retrieve the data for the export, which was probably eating up the time. I then tried increasing the page size to 1,000 records at a time, so as to reduce the number of server calls and, in turn, the export time. But now the page crashes while exporting to PDF. I reduced the page size to 500, but the crash still happens.
On another note, the export to Excel works quite fast, in around 4 seconds. While debugging I found that only a single call to the server is made when exporting to Excel, and it returns the entire data set as well.
Please note that I am using the Kendo defaults for exporting to PDF and Excel with the grid.
Thanks.

You need to implement server-side export.
The Kendo documentation says:
Important
When the allPages option is set to true and serverPaging is enabled, the Grid will make a "read" request for all data. If the data items are too many, the browser may become unresponsive. Consider implementing server-side export for such cases.
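For reference, the configuration that warning describes looks roughly like this: the built-in PDF export with allPages enabled over a server-paged data source (the read URL and columns are placeholders):

```javascript
// The combination the docs warn about: client-side PDF export over all pages
// of a server-paged grid. The /api/orders URL is a placeholder.
$("#grid").kendoGrid({
    toolbar: ["pdf"],
    pdf: { allPages: true },          // forces a read for the full data set
    dataSource: {
        transport: { read: "/api/orders" },
        serverPaging: true,
        pageSize: 50
    },
    columns: [{ field: "OrderID" }, { field: "ShipName" }]
});
```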
Take a look at that documentation page.
Full example: link
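A minimal sketch of the server-side approach, assuming a hypothetical /api/orders/export-pdf endpoint that renders the PDF on the server; the exportPdf toolbar command is a custom name, not a Kendo built-in:

```javascript
// Grid with a custom toolbar command instead of the built-in "pdf" command.
$("#grid").kendoGrid({
    toolbar: [{ name: "exportPdf", text: "Export to PDF" }],
    dataSource: {
        transport: { read: "/api/orders" },
        serverPaging: true,
        pageSize: 50
    },
    columns: [{ field: "OrderID" }, { field: "ShipName" }]
});

// Instead of letting the browser page through 10,000 records, send the grid's
// current sort/filter state to the server and let it build and stream the PDF.
$("#grid").find(".k-grid-exportPdf").on("click", function (e) {
    e.preventDefault();
    var dataSource = $("#grid").data("kendoGrid").dataSource;
    var query = $.param({
        sort: JSON.stringify(dataSource.sort() || []),
        filter: JSON.stringify(dataSource.filter() || {})
    });
    window.location = "/api/orders/export-pdf?" + query; // hypothetical endpoint
});
```

This way the browser never has to page through or render the full data set; the server, which already has the data, streams the finished file back.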

Related

Kendo Grid export not working with more than 50,000 records

I have a Kendo Grid holding more than 50,000 records. When I try to export them, the browser becomes unresponsive. How can I solve this problem? Is there any workaround?
We already have the JSON built and ready, holding more than 50,000 records.
We do not have any server-side code; everything is done on the client side.

Laravel pagination in Data Table

I am using the DataTables plugin in Laravel. I have around 3,000 records in some of my tables.
But when I load the page, it loads all 3,000 records in the browser and then creates the pagination, which slows down the page load.
How do I fix this, or what is the correct way to do it?
Use server-side processing.
Get help from one of the Laravel packages, such as Yajra's: https://yajrabox.com/docs/laravel-datatables/
Generally you can solve pagination either on the front end, on the back end (server or database side), or with a combination of both.
Server-side processing, without a package, would mean using TOP/FETCH (or similar) to limit the rows being returned from your server.
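For the server-side route, a minimal DataTables sketch might look like this (the /users/data URL is an assumption; Yajra's package can generate the JSON shape DataTables expects):

```javascript
// Server-side processing: the browser only ever receives one page of rows.
$('#users-table').DataTable({
    processing: true,      // show a "processing" indicator during Ajax calls
    serverSide: true,      // delegate paging, sorting, and search to the server
    pageLength: 20,
    ajax: '/users/data',   // hypothetical route returning DataTables' JSON format
    columns: [
        { data: 'id' },
        { data: 'name' },
        { data: 'email' }
    ]
});
```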

You could also load a small amount (say 20) and then, when the user scrolls to the bottom of the list, load another 20 or so. I mention front-end processing as well because I'm not sure what your use cases are, but I imagine it's pretty rare that any given user actually needs to see 3,000 rows at a time.

Given that DataTables seems to have built-in functionality for paginating data, I think that #tersakyan is essentially correct: what you want is some form of back-end filtering or paging of rows to limit what is being sent to the front end.

I don't know whether that package works for you or what your setup looks like, but pagination can also be achieved directly from a database returning data via SQL (using TOP/FETCH, for example), or it can be implemented in a controller or service by tracking pages of data and loading a page at a time, both from the server and then into the table (sketched below). All you would need is a unique key to associate each "set of pages" with a specific request.
But for performance, you want to avoid both large data requests and operations on large sets of data. So the more you limit how much data is being grabbed or processed at any stage of your application, the more performant your application will be in principle.
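Purely as an illustration of that controller idea (using Node/Express and SQL Server's OFFSET/FETCH rather than Laravel; every route, table, and column name here is assumed):

```javascript
const express = require("express");
const sql = require("mssql");

const app = express();
const pageSize = 20;

app.get("/users/data", async (req, res) => {
    const page = Math.max(parseInt(req.query.page, 10) || 1, 1);

    // OFFSET/FETCH ensures only one page of rows ever leaves the database.
    const result = await sql.query`
        SELECT id, name, email
        FROM users
        ORDER BY id
        OFFSET ${(page - 1) * pageSize} ROWS
        FETCH NEXT ${pageSize} ROWS ONLY`;

    res.json({ page: page, rows: result.recordset });
});

// Hypothetical connection string; connect once, then serve requests.
sql.connect("mssql://user:pass@localhost/appdb")
    .then(() => app.listen(3000));
```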

Browser crashes while binding 25k-100k records

In one of my applications, I need a grid that loads 100k records without pagination, using Kendo Grid virtualization.
Technologies used: SQL Server 2012, an ASP.NET web app, and Angular.
I have 15 columns in the grid. Up to 10K records the grid loads fine, but somewhere between 20,000 and 100,000 records the browser crashes.
Can I bind 100k records in a Kendo Grid without pagination?
Is there any other way to load the 100k records without pagination?
What is the maximum data size browsers (Chrome, Firefox) support?
According to http://demos.telerik.com/kendo-ui/grid/virtualization-local-data, they loaded 500K records with 5 columns. If you look at the code, they do limit the data to a smaller set on certain browsers. It likely comes down to JavaScript memory and how fast JS can process that volume of data.
My recommendation:
Use the Grid with server paging, but also allow server-side sorting and filtering. I used OData for this, fetching 100 rows at a time, which makes the server part easy (see the sketch after these recommendations).
Offer the option to export the data to CSV or Excel.
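A minimal sketch of that recommendation, assuming a hypothetical OData endpoint:

```javascript
// Virtual scrolling plus server-side paging, sorting, and filtering over OData.
// The service URL and fields are assumptions.
$("#grid").kendoGrid({
    dataSource: {
        type: "odata",
        transport: { read: "https://services.example.com/odata/Orders" },
        pageSize: 100,          // only 100 rows cross the wire at a time
        serverPaging: true,
        serverSorting: true,
        serverFiltering: true
    },
    height: 550,
    scrollable: { virtual: true },  // rows are created and recycled as the user scrolls
    sortable: true,
    filterable: true,
    columns: [
        { field: "OrderID" },
        { field: "ShipName" },
        { field: "ShipCity" }
    ]
});
```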

How does SSRS cache reports that contain subreports?

I'm trying to figure out how SSRS handles caching of subreports. The report I have has a lot going on: dynamic formatting, dynamic links, dynamic graphs, etc. Because of all this, it takes quite a while to load (about 10 seconds), and every time you click on something in the report, it has to reload everything (another 10 seconds). I looked into caching options, but the problem is I need the data shown in the graphs to be the live data (the rest of the report doesn't display any live data).
I came up with an idea to put the graphs in a subreport, so that I could cache the main report, and only the subreport would have to be reprocessed on every load. My thinking was that this would significantly cut down on the processing time for the whole report, and I could schedule the cache to be preloaded every night.
I know that Reporting Services isn't the ideal way to deliver this type of thing, and creating it as a website would yield much better performance, but my company wants to try and make it work with SSRS first.
Does anyone know if this would work? Does SSRS cache subreports separate from their parent reports, or would caching the parent report also cache the subreport?
Subreports are processed at the same time as the main report you are calling, so you would need to set up caching on the main report to cache the contents of the subreport.
For the components that do not require live data, you can cache shared datasets to improve report processing time. Separate the data sources for the items that need live data feeds from the items you can cache.
https://msdn.microsoft.com/en-us/library/ee636149.aspx
To open the Caching properties page for a shared dataset:
1. Open Report Manager, and locate the report for which you want to configure shared dataset properties.
2. Point to the shared dataset, and click the drop-down arrow.
3. In the drop-down list, click Manage. The General properties page for the report opens.
4. Click the Caching tab.
(Screenshot: http://i.stack.imgur.com/vFHQP.png)

Tablesorter vs. PagedList

I'm working on a project where I have to fetch data from a database and present the results in tables of anywhere from 200 to 3,000 rows.
I tried PagedList first and noticed that only a fixed number of records per page is displayed on the client side, and each time the user changes the page, a new request is made to the server.
With tablesorter, I noticed that all the results are loaded and the paging is only visual (only the presentation of the table changes); everything stays on the client side.
Is my understanding correct?
What I want to know is which approach is better in execution time.
At the moment I'm working on localhost and I'm the only user, so I can't really tell the difference: tablesorter takes longer to load the first time but is very quick afterwards, while the PagedList method loads each page faster but makes a request to the server every time the page changes to fetch the corresponding data.
When finished, the application will be on a server, and many users will have access to it to add, search, delete and so on.
Which of these two approaches is the better choice?
Thanks
