For a few weeks now we have had performance problems in our web applications. At first we thought the problem was our large DOM. A large DOM isn't great, but it turned out not to be the main performance problem.
The problem is the composite components. Over the last few weeks we developed core composite components to reduce code redundancy and to have centralized places for changes. We replaced each individual implementation within our application(s) with the composite component.
First test case:
We created a single page with a commandButton, an outputText and 50 composite components, each containing a p:dialog. Clicking the button just updates the outputText component (Ajax).
The update takes about 1.5 seconds.
Second test case:
The same page with the commandButton and outputText component, but instead of using the composite components we added the p:dialogs directly to the page.
The update takes 0.06 seconds.
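For context, a minimal composite wrapping a p:dialog, roughly the kind used in the first test, would look like this (a sketch only; the file name, library path and `header` attribute are illustrative, not taken from the question):

```xml
<!-- resources/mycomps/dialogBox.xhtml (illustrative name) -->
<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:cc="http://java.sun.com/jsf/composite"
      xmlns:p="http://primefaces.org/ui">
    <cc:interface>
        <cc:attribute name="header" default="Dialog" />
    </cc:interface>
    <cc:implementation>
        <p:dialog header="#{cc.attrs.header}">
            <cc:insertChildren />
        </p:dialog>
    </cc:implementation>
</html>
```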
Is there something wrong with using composite components? I can't find similar threads related to performance problems with composite components.
There were some performance bugs in Mojarra versions up to 2.1.21, so use MyFaces until the Mojarra bugs get fixed. For more info visit http://blog.oio.de/2013/05/06/jsf-performance-tuning/
I am working on a React Native application.
I have a Scene with a ListView with some Components inside.
Each one of the Components in the ListView can be marked as selected (or unmarked).
Once the Scene has gained focus, I fetch the data and display those 500 Components.
I pass the props to each one of the Components in the OnRenderRow method.
Ideally, every time I click on one of the Components I save its id inside a dictionary in the state and then trigger the fetch (now cached) again. Once I fetch the 500 Components again, OnRenderRow checks whether each id exists in the dictionary and sets the isSelected prop to true (React takes care of rendering it as selected).
Why am I fetching the data again?
In order to refresh React Native's ListView the datasource has to be 'marked as dirty'.
If I tried to refresh the list by reusing the same dataSource:
this.setState({dataSource: this.state.dataSource})
it would not trigger any OnRenderRow, since it's the same object. As far as I know, the only way to force the list to update is:
ds.cloneWithRows(Array)
This means it has to go through the whole 500 Components again (actually it parses them all, but it renders them in small windows of 20 or so).
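The selection pattern described above can be sketched framework-free. The key point is object identity: `ds.cloneWithRows(...)` hands the ListView a fresh object, so it treats the data as dirty. The helper names below (`toggleSelection`, `withSelection`) are illustrative, not part of React Native's API:

```javascript
// Framework-free sketch of the "dictionary of selected ids" pattern.
// `selected` plays the role of the dictionary kept in state; the new
// array returned by withSelection mimics what ds.cloneWithRows(rows)
// achieves: a fresh identity, so the ListView sees "dirty" data.
function toggleSelection(selected, id) {
  // return a new dictionary so state updates stay immutable
  const next = Object.assign({}, selected);
  if (next[id]) {
    delete next[id];
  } else {
    next[id] = true;
  }
  return next;
}

function withSelection(rows, selected) {
  // build new row objects carrying the isSelected prop that the
  // row renderer would read; identity changes, content is cheap to rebuild
  return rows.map(function (row) {
    return Object.assign({}, row, { isSelected: !!selected[row.id] });
  });
}

// usage sketch:
const rows = [{ id: 1 }, { id: 2 }];
let selected = {};
selected = toggleSelection(selected, 2);
const rendered = withSelection(rows, selected);
// rendered has a new identity, so feeding it to ds.cloneWithRows(rendered)
// would refresh the ListView
```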
This works pretty fast in the emulator, but on an iPhone 5 it takes 2-3 seconds. I have the same problem with filtering.
Is there any way to force the ListView's OnRenderRow without parsing the whole dataSource again?
Note: the problem could also be related to Realm, but since I cache the query the second time I fetch the elements and the problem persists, it is probably not the cause.
I have a JSF page with PrimeFaces 5.1 and I'm experiencing performance issues in a complex page. The page has a common design: a paginated datatable, a tree for filtering, a panel with the selected item details and a menu with some actions on that item. The majority of these components are provided by PrimeFaces.
Profiling the application, I concluded that the DB queries are not the bottleneck but rather the restore view and render response phases. Moreover, the major factor in the delay seems to be the number of JSF components as opposed to the amount of data stored in the backing bean. Even AJAX requests with partial processing and rendering will take time in those phases.
How can I reduce the processing time, specially in the AJAX requests?
My 'short list':
Use a recent version of your JSF library (especially Mojarra > 2.1.21)
Use partial processing (process="@this") (already mentioned by you)
Don't use validation on ajax requests (immediate=true)
Use partial submission (partialSubmit="true")
Be selective in what you update (not a full @form)
Don't do any amount of work in a getter (and if you do, do it lazy)
Don't use inline editing in the datatable
Use native components in the datatable if you cannot avoid inline editing (see the previous point)
Don't use loooooooooong select lists
Use lazy loading for filtering, sorting and paging in datatables, and use a moderate page size (thanks @Vrushank, I forgot this)
Don't use partial state saving in Mojarra with complex pages
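Several of the points above (partial processing, partial submission, selective updates) boil down to ajax attributes on the command component. A hedged Facelets sketch, with illustrative ids (`detailForm`, `detailPanel`, `msgs`) and bean name:

```xml
<!-- Sketch only: process="@this detailForm" limits decoding/validation
     to the button and one form, partialSubmit="true" (PrimeFaces) posts
     only the processed fields, and update lists specific ids
     instead of a full @form. -->
<p:commandButton value="Save"
                 action="#{itemBean.save}"
                 process="@this detailForm"
                 partialSubmit="true"
                 update="detailPanel msgs" />
```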
We are using the Kendo grid control in our .aspx page. We have a significant amount of data in the grid. Currently the load time of the page is around 5-7 seconds. Are there any smart tricks to reduce the load time? We are trying to reduce it to 3-4 seconds. We populate the grid using a REST service. We have already removed the duplicate REST calls and unnecessary parameters from the REST web service.
We would really appreciate a prompt response on this.
Thanks in advance
Lalatendu
Do you have timings showing where the time is actually being spent?
Things you can try: do not load the full kendo.all.min.js, but only what you need. There is a very good blog post about this on the Kendo UI website: Using UI Libraries Without the Bloat.
Use serverPaging in order to transfer less data.
You might also consider using virtual scrolling.
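Both suggestions map to grid/dataSource options. A minimal configuration sketch (the URL, pageSize and height are illustrative; serverPaging, schema.total and scrollable.virtual are documented Kendo UI options):

```javascript
// Sketch of a Kendo UI Grid configuration using server paging and
// virtual scrolling; it would be passed to $("#grid").kendoGrid(gridOptions)
// once the needed modules (kendo.data, kendo.grid) are loaded.
var gridOptions = {
  dataSource: {
    transport: {
      read: { url: "/api/rows", dataType: "json" } // illustrative endpoint
    },
    serverPaging: true,        // server returns only the requested page
    pageSize: 100,
    schema: { total: "total" } // response field holding the total row count
  },
  scrollable: { virtual: true }, // render only the visible window of rows
  height: 450
};
```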
For this specific project, my team and I are thinking of leveraging the power of AngularJS directives and use them to build custom elements specific to our project, and use them like they were real HTML components.
For instance...
Say we have a table where each row is composed of a bunch of information that we get from a request; each row is repeated with ng-repeat. Inside each row we have multiple custom buttons with the same custom behavior. This button will also be used across different modules/pages of the website, not only in this specific table.
Our first thought was to use two directives: one for the table row and one for the button. The idea is that these are custom components/elements with custom attributes defining their behavior. This would give us a really nice modular application where each component is developed and unit-tested individually. Our HTML would also be easy to read and understand.
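As a sketch of what that button directive could look like (all names here are illustrative; the definition-object fields restrict, scope and template are standard AngularJS 1.x):

```javascript
// Factory for a reusable button directive (AngularJS 1.x sketch).
// It would be registered with:
//   angular.module('app').directive('customButton', customButtonDirective);
function customButtonDirective() {
  return {
    restrict: 'E',      // used as an element: <custom-button>
    scope: {
      label: '@',       // plain string attribute
      onAction: '&'     // callback expression evaluated on the parent scope
    },
    template: '<button type="button" ng-click="onAction()">{{label}}</button>'
  };
}

var def = customButtonDirective();
```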
Does this sound good?
Now, what about performance issues? Could we run into big problems with this approach if, say, we had a table with 100 of those rows and 5 of those buttons per row? Could this be a big problem if those rows/buttons had a couple of bindings updating information every X seconds?
The father of Angular, Misko Hevery, has this to say about performance and data-binding:
How does data binding work in AngularJS?
But in short, as he says, as long as you don't have more than 2000 data-bound items per page, performance won't be an issue...
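If the row contents don't change after load, one-time bindings (the `::` prefix, available since AngularJS 1.3) keep those items out of the watcher count that the 2000-item rule of thumb refers to. A template sketch with illustrative names:

```html
<!-- Sketch: fields bound with :: are evaluated once and then
     dropped from the digest cycle -->
<tr ng-repeat="row in ::ctrl.rows">
  <td>{{::row.name}}</td>
  <td><button ng-click="ctrl.run(row)">{{::row.actionLabel}}</button></td>
</tr>
```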
I have various tables with the following size: 12 columns and up to 1800 rows. It takes 8 seconds to render a table to the user. I currently use h:dataTable. I tried ui:repeat to get the row data from a Java List object managed by JSF. Although this works fine, the 8 seconds to render the table is unacceptable. I'm trying to find other ways to do this, but I need to keep JSF as my controller for the action buttons on the page. In other words, I want to create the table markup to send to the page myself, and then still associate the actions on the `h:commandButton`s with the managed bean methods. Is there a way to do this?
The only way I can think of is to use jQuery or Ajax to create the table markup, although I am new to technologies other than JSF for UI development. Maybe then I could somehow pass that to the client for rendering. The only problem is that I don't know how to generate the markup from my list, and secondly, how I would inject it between the h:commandButtons that are currently in my XHTML file.
Does anyone know how I can solve this without having to completely rip out JSF? One main problem is a business requirement saying we can't page the datatable (i.e. Next/Back buttons displaying 100 rows at a time, for example). So I was thinking I could possibly do this with Ajax calls to the server, getting 100 rows at a time after the page is ready and appending the new rows behind the scenes. This would give the user a "perceived" speed of load, but I don't know how to do this at all.
8 seconds isn't bad for a whopping 1800 rows with 12 columns. 10~100 rows is done in less than a second, right?
Before continuing with this, are you absolutely positive that all those 1800 rows are supposed to be shown at once? Isn't that very user-unfriendly? Wouldn't the user need Ctrl+F to find the information they are looking for? Isn't that annoying? Why don't you introduce filtering (a search field) and pagination, exactly as Google does to present zillions of results in a sane and user-friendly manner?
Anyway, you could consider using the "on-demand data" option of the PrimeFaces <p:dataTable>, wherein the data is loaded by ajax during scrolling via <p:dataTable liveScroll="true">. See also the showcase example. No homegrown code nor manual fiddling with jQuery necessary; PrimeFaces has done it all under the covers.
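A hedged sketch of such a table (the bean, column and row-count values are illustrative; liveScroll, scrollRows, scrollable and scrollHeight are documented <p:dataTable> attributes):

```xml
<!-- Sketch: loads 50 rows per ajax call as the user scrolls,
     instead of rendering all 1800 rows up front -->
<p:dataTable value="#{tableBean.rows}" var="row"
             scrollable="true" scrollHeight="400"
             liveScroll="true" scrollRows="50">
    <p:column headerText="Name">
        <h:outputText value="#{row.name}" />
    </p:column>
</p:dataTable>
```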
If you don't want to use PrimeFaces for some reason, you could consider using the OmniFaces <o:componentIdParam> in combination with some jQuery "live scrolling" plugin. See also the last example on its (snapshot) showcase page for a kickoff example (which should easily be adapted to be triggered by hitting the scroll bottom instead of by clicking).