Slow rendering performance for a table with 500 rows and 12 columns

I am using Mojarra 2.1.19. I have a simple Facelet that contains only one table with 500 rows and 12 columns. The cells contain only text property data. It takes about 2 seconds to execute the render response phase.
As a result, the application is not very responsive. All rows must be visible at once; the client doesn't want pagination.
Any suggestions? Is Mojarra really so slow?

I apologize: JSF debug logging was turned on, and it made execution slower. I turned the logging off and the simple table is fast now.

Related

How does pagination work internally in jqGrid?

Does jqGrid create all records in the DOM, or does it store them locally in a JavaScript object and show records on each pagination event?
We tried jqGrid in our project and experienced slow table rendering for a large number of records (10,000) with a page size of 50. Thanks.
jqGrid stores the local data in an array and displays in the DOM only the portion set by the rowNum parameter.
Personally, I think that loading 10k records locally is too big for any grid component. It will consume a lot of memory and slow down every operation in the application.
The best approach is to store the data on the server and request only the portion you need. Look here at how we deal with 1 million records.
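In practice that means pointing the grid at a server endpoint instead of a local array; a rough sketch of such a setup (the URL, pager id, and column model below are placeholders, not from the question):

```javascript
// Sketch of a server-paged jqGrid: the server returns one page per
// request, so only rowNum rows ever reach the DOM.
$("#grid").jqGrid({
    url: "/records",        // placeholder endpoint returning JSON pages
    datatype: "json",       // fetch pages via AJAX instead of local data
    colModel: [
        { name: "id", width: 80 },
        { name: "name", width: 200 }
    ],
    rowNum: 50,             // rows per page: only these are rendered
    pager: "#pager",
    viewrecords: true       // show "1-50 of 1,000,000" in the pager
});
```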

Spring JPA - Update - SET - Huge Columns - Performance

I came across this link while searching: Update single field using spring data jpa.
In my application, one table is displayed in the front-end with 100 columns, of which the user changes approximately 5 to 10 at most.
However, the front-end sends all the values, and the back-end update query has all 100 columns in the SET clause.
Is this best practice? Some say that a SET with all the columns has no impact because JPA, or the database, internally does a delete and insert. Is this true?
What is the best practice, and does having all columns in the SET affect performance in general?
Thanks
If the user has changed just a few columns and only one row is updated, then no, the performance would not be affected much. It would be affected slightly, but in most cases optimizing it is not necessary unless you're handling a huge number of updates. And since you're using JPA, I would guess you don't actually build the update yourself but use an entity on which you update the affected fields? Then JPA chooses how to actually perform the update (most probably sending all fields of the entity in the UPDATE).
If it were 100 rows and the user changed data in 5-10 of them, then it would be better to pass only those 5-10 rows to the database update.
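If the JPA provider is Hibernate, annotating the entity with @DynamicUpdate makes it generate UPDATE statements containing only the columns that actually changed. The underlying idea is just a field-by-field diff against the loaded state; here is a minimal standalone sketch of that idea (the DirtyColumns helper and the column names are invented for illustration, not part of JPA):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Objects;

// Hypothetical helper: compares the row as it was loaded with the row
// the user submitted, keeping only the columns whose values changed.
// Only those columns would need to appear in the UPDATE ... SET clause.
public class DirtyColumns {
    public static Map<String, Object> diff(Map<String, Object> before,
                                           Map<String, Object> after) {
        Map<String, Object> changed = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : after.entrySet()) {
            if (!Objects.equals(before.get(e.getKey()), e.getValue())) {
                changed.put(e.getKey(), e.getValue());
            }
        }
        return changed;
    }

    public static void main(String[] args) {
        Map<String, Object> before = new LinkedHashMap<>();
        before.put("name", "old");
        before.put("price", 10);
        before.put("stock", 5);

        Map<String, Object> after = new LinkedHashMap<>(before);
        after.put("price", 12); // user edited one of the 100 columns

        System.out.println(diff(before, after)); // prints {price=12}
    }
}
```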

What effect does the number of records in a table (in SQL Server) have on LINQ query response time in a C# MVC project?

I did some googling on my subject title but didn't find a useful answer. Most questions were about the effect of the number of a table's columns on query performance, while I need to know the effect of the number of rows on LINQ query response time in a C# MVC project.
I have a web MVC project in which I fetch alarms from the server via an AJAX call and show them in a web grid on the client side. The AJAX call is repeated every 60 seconds in a loop created with setTimeout. The number of rows in the alarm table (in a SQL Server database) grows steadily, and after a week it reaches thousands of rows.
When first launching the project, I can see in the browser's DevTools (Chrome) that each AJAX call takes about 1 second. But this time increases every day, and after a week each successful AJAX call takes more than 2 minutes, which leaves about five AJAX calls permanently in the pending queue. I am sure there is no memory leak in either the client (jQuery) or server (C#) code, so the only culprit I suspect is the response time of the SELECT query on the alarm table. I'd appreciate any advice.
Check the query that is actually executed in the database. There are two options:
Your LINQ query fetches all the data from the database and processes it locally. In this case the number of rows in the table matters a lot. You need to fix your LINQ query so it fetches only the relevant rows from the database; then you may still hit option 2.
Your LINQ query fetches only the relevant rows from the database, but there are no suitable indexes on the table, so each query scans all of its data.
However, with only a few thousand rows in your table, I doubt a scan would take 2 minutes, so option 1 is the more likely reason for the slowdown.
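A quick way to tell the two cases apart is to look at where the filter sits relative to materialization. An illustrative sketch (the context, entity, and property names are invented, not from the question):

```csharp
// Option 1 (the problem): ToList() materializes the whole table, so the
// Where filter runs in memory and every row crosses the network.
var slow = db.Alarms.ToList()
             .Where(a => a.RaisedAt >= since)
             .ToList();

// Fetching only the relevant rows: the Where stays on the IQueryable,
// so it is translated to SQL and can use an index on RaisedAt.
var fast = db.Alarms
             .Where(a => a.RaisedAt >= since)
             .ToList();
```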

Disposing the BindingSource of a ComboBox is extremely slow...

I have a main table with 4 lookup tables. All tables are bound to SQL tables accessed via stored procedures. Each lookup table has between 3 and 27 rows. The main table has about 22,000 rows (pet food & products).
My DataGridView displays rows of the main table with the lookup columns configured as ComboBoxes. I can insert, delete, edit the main table, ... all of these functions work fine. The initial loading of the form and inserting a new row each take about a second but this is not concerning. All other operations are relatively fast.
Here is a screenshot of the DataGridView (not reproduced here).
The problem comes when I close the form, and only after having inserted one or more rows into the main table. Closing the form can then take up to a minute. In the form's closing handler I now dispose of the binding sources for the 5 tables myself so that I can time them. Disposing of the binding source for each of the 4 lookup tables routinely takes 10-15 seconds; disposing of the main table's binding source takes no time at all. Again, this only happens after inserting a new row into the main table. If I edit main-table rows, change lookup column values, or delete rows, closing the form in those cases is instant.
I have tried running the program inside VS, outside of VS with a debug EXE, and outside of VS with a release EXE, all with similar results.
What can I do to prevent this long delay disposing of the ComboBox binding sources? Is this typical and are there alternatives I should be considering?
After three days of pounding my head against the wall trying all kinds of unsuccessful fixes, I rebuilt the application from scratch. That fixed the problem, and I believe I discovered the cause. First I created just the data set with all the table adapters, which was pretty fast, and then I added a single form and grid to mimic the condition described above. Testing confirmed no issues at all, so I continued adding more forms with the same ComboBox look-ups, and it continues to work fine. I am fairly sure there was something screwy in my previous data set definitions. Hopefully this helps someone in the future.

Kendo UI Grid page size performance issue

I am using a Kendo UI Grid with 20k records. When I change the page size from 20 to 200, the grid takes 40 to 50 seconds to respond, and sometimes more than a minute. Paging is client-side only.
For large datasets, it's better to use server paging, mainly because of:
Faster loads
Less wasted memory
Less unnecessary database and network load: the database returns only the records of the current page
You should consider enabling server paging at the datasource level, and then read the pagination values on the backend before querying the database.
http://docs.telerik.com/kendo-ui/api/javascript/data/datasource#configuration-serverPaging
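For reference, the configuration looks roughly like this (the transport URL and the schema field names are placeholders; adapt them to your own endpoint):

```javascript
// Sketch: Kendo DataSource with paging pushed to the server. The server
// receives the paging parameters (page, pageSize, skip, take), returns
// one page plus the total count, so at most pageSize rows reach the browser.
var dataSource = new kendo.data.DataSource({
    transport: {
        read: { url: "/records", dataType: "json" } // placeholder endpoint
    },
    schema: {
        data: "items",   // field in the response holding the page of records
        total: "total"   // field holding the overall record count
    },
    serverPaging: true,
    pageSize: 200
});
```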
Good question; I also faced this type of issue.
Try binding the grid using MVVM logic instead of the MVC wrappers.
For more, please see the link below.
http://demos.telerik.com/kendo-ui/grid/mvvm
