AngularJS app becoming slow on IE 8 - performance

I have an AngularJS application that uses web services to load content and render the view. I've checked the application's performance in Chrome and it runs fast enough. However, on IE 8 I'm facing huge performance issues.
I looked at Internet Explorer's memory usage and found it was around 200 MB. If I opened multiple instances of the application in separate tabs, the memory usage kept doubling with each instance. This slows down the response time and, in effect, the whole PC's performance. There are no such issues in Chrome.
I suspect it is because of the data held in the application's model: with each IE tab, the model data is duplicated and RAM usage grows accordingly. Again, this problem doesn't occur for me in Chrome.
Please suggest some performance optimization techniques that I can use.

Try to avoid using "ng-repeat" to render large lists, or use infinite scrolling instead.
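A minimal sketch of that idea, assuming Angular 1.x and a hypothetical items array whose entries have an id field: cap what ng-repeat renders with limitTo and track by, and grow the cap on demand instead of rendering everything at once.

```html
<!-- Hypothetical sketch: only the first visibleCount items are rendered;
     track by avoids rebuilding DOM nodes when the list is refreshed. -->
<div ng-controller="ResultsCtrl">
  <div ng-repeat="item in items | limitTo: visibleCount track by item.id">
    {{ item.name }}
  </div>
  <button ng-click="visibleCount = visibleCount + 50">Load more</button>
</div>
```

visibleCount would start small in the controller (e.g. $scope.visibleCount = 50;), so IE 8 only has to build a few hundred watchers up front rather than one set per item in the full list.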

Related

ReactJS heavy load application performance issue

I am looking for some advice on how to track application performance; the application is developed using ReactJS, and I am building it with webpack.
First of all, I will present what I have done and what the application is expected to do:
I need to render a lot of components, let's just call them widgets, that update in real time and present a lot of data. Each widget renders about 50 to 80 values; the updates might arrive from the server all at once, so they should be applied as soon as the data is received. Consider that I might have around 25 to 30 widgets that need to update in real time.
Let me tell you a little bit about the implementation:
I have used the smart/dumb pattern for ReactJS components
The actual data is stored in application state and is distributed by the smart components to dumb components through props
I am using Pure Render Mixin to avoid unnecessary rendering
I am also using Immutable data, so that PureRenderMixin works as intended, that is, it is accurate in determining whether a render is necessary while staying fast, really fast.
There are no odd callback bindings that might trigger re-rendering of components; this has already been double-checked. (A minimal sketch of this setup is included below.)
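A minimal sketch of the setup described above. The component names, the socket wiring and the data shape are hypothetical; it assumes Immutable.js and react-addons-pure-render-mixin (or React.addons.PureRenderMixin, depending on the React version), as mentioned in the question.

```js
var React = require('react');
var Immutable = require('immutable');
var PureRenderMixin = require('react-addons-pure-render-mixin');

// Dumb component: renders only what it receives through props.
var Widget = React.createClass({
  mixins: [PureRenderMixin],
  render: function () {
    // this.props.values is an Immutable.List, so the mixin's shallow
    // comparison is both cheap and accurate.
    return (
      <ul>
        {this.props.values.map(function (value, i) {
          return <li key={i}>{value}</li>;
        }).toArray()}
      </ul>
    );
  }
});

// Smart component: owns the application state and passes it down as props.
var WidgetContainer = React.createClass({
  getInitialState: function () {
    return { values: Immutable.List() };
  },
  componentDidMount: function () {
    // Hypothetical server push: replacing the whole Immutable.List lets
    // PureRenderMixin skip widgets whose data did not change.
    // socket.on('update', function (values) {
    //   this.setState({ values: Immutable.List(values) });
    // }.bind(this));
  },
  render: function () {
    return <Widget values={this.state.values} />;
  }
});
```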
Now the issues I am having:
with about 5-6 widgets, meaning around 400-500 values that need to render each second, it works very well in Chrome and decently in Firefox.
with about 25-30 widgets the application still works decently in Chrome, but it starts to act slow in Firefox; by slow I mean user interactions might even be delayed by around 1 second. That is really unacceptable.
What I have tried:
I used Chrome dev tools to measure the performance; that didn't help much, since what I could see suggested everything was fine, and there is no way I could read all the graphs this tool provides (and I've read a lot of articles about it).
I tried Firebug in Firefox. It's an amazing tool, but not in this case; just opening it under the above-mentioned load (30 widgets) makes Firefox freeze, and the profiler gave me nothing.
As a last resort, I used Firefox's built-in dev tools, which have a performance tab. That gave me some information about which parts of the application put the most load on the browser. It seemed to be some heavy computation determining the min/max of an Immutable.List (a sketch of caching that computation follows below).
Unfortunately the application still has performance issues, and it is of high importance to get it working perfect, and Firefox profiler doesn't give me any other leads.
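Purely as an illustration tied to that last observation (the helper below is hypothetical and assumes Immutable.js, targeting the browsers mentioned in the question): the min/max of a list only needs to be computed once per data update and can be cached per List instance, rather than recomputed on every render.

```js
// Hypothetical sketch: cache min/max per Immutable.List instance so the
// O(n) scan runs once per update instead of once per render.
var minMaxCache = new WeakMap();

function getMinMax(list) {                 // list: Immutable.List of numbers
  var cached = minMaxCache.get(list);
  if (cached) {
    return cached;
  }
  var result = { min: list.min(), max: list.max() };
  minMaxCache.set(list, result);           // the List never mutates, so this stays valid
  return result;
}
```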
So my questions would be:
what would be the next action to take in order to pin down the performance issues? (as precisely as possible: class/method, or at least file)
did you guys use any performance testing tool that gives you insight into what the hell is happening?
is there something else to consider to improve the overall functionality, especially targeting multiple browsers? (Firefox, Chrome, IE11)

Can I find out if there are other browser windows running webgl?

I am creating a three.js powered website, and on some browsers (I am looking at you, Firefox), if other tabs are also running WebGL, my performance stutters.
I would like to know if there is a way to find out (in the browser) whether other tabs are using WebGL, so that I can alert the user.
I appreciate any and all comments!
That would be a security violation, so no, you can't do that.
Update:
I'll just add: you could include stats.js (more likely, you'll need to do something very similar to what stats.js does) to get an idea of the average frame rate and watch for dips in that performance, and if it regularly drops, alert the user. That said, you are likely to get the calculation wrong, and there are always many variables you can't control which can affect performance. Most of those can be resolved with a browser restart; in particular, Firefox doesn't seem to release its GPU memory across page reloads, and when that memory is full the performance drops badly.
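A rough sketch of that kind of frame-rate watching; the threshold and the warning hook are hypothetical and would need tuning per application.

```js
// Hypothetical sketch: keep a rolling average of the frame rate with
// requestAnimationFrame and warn (non-intrusively) if it stays low.
var last = performance.now();
var avgFps = 60;
var warned = false;

function tick(now) {
  var dt = (now - last) || 16;        // guard against a zero first delta
  last = now;
  avgFps = avgFps * 0.95 + (1000 / dt) * 0.05;   // exponential moving average
  if (avgFps < 20 && !warned) {       // threshold is a guess; tune it
    warned = true;
    showPerformanceHint();            // hypothetical, e.g. a small dismissible banner
  }
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```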
In any case, any warnings you give out should probably not be intrusive for the user in any way.
Also note that properly written WebGL programs (using requestAnimationFrame as intended) should, to my knowledge, not consume many compute resources while the tab is in the background, though this may also vary per browser. A background tab will still hold on to memory, though (GPU memory, plus JavaScript code and objects).

CSS responsive performance issues: hundreds of containers are slow

So I am a backend developer dipping my toes into developing a responsive site that, for SEO reasons, needs to be able to show up to 1000 responsive "containers" for search results, i.e...
[1]: http://107.6.139.93/Melbourne.homes
So what seems to be happening is that the browser locks up trying to render all these containers, or something like that. For searches with fewer than 300 results, the delay is tolerable, i.e...
[2]: http://107.6.139.93/Viera.homes
To be honest, I'm somewhat over my head here (I'm a database guy) and I'm trying to learn, but I have no idea if it's even going to be possible to improve performance without using pagination (something my client is very much against).
I'm wondering if anyone here has any insights into my issues.
EDIT - the same "lock-up" delay occurs when you resize the browser and wait for the responsiveness to kick in.
I think you're overloading the browser's memory; that's not something you can solve with CSS. It's the whole package (images and content).
You could solve this by using infinite scroll, and thus only load content when the user scrolls. There are some things you have to look into before throwing yourself into infinite scrolling, especially at the SEO level.
You might want to read this:
http://googlewebmastercentral.blogspot.nl/2014/02/infinite-scroll-search-friendly.html
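A bare-bones sketch of that approach (the #results container, the renderResult helper and the allResults array are hypothetical): render an initial batch and append further batches as the user nears the bottom of the page.

```js
// Hypothetical sketch: append search results in batches instead of
// rendering all 1000 containers up front.
var BATCH_SIZE = 50;
var rendered = 0;

function appendBatch(results, container) {
  var fragment = document.createDocumentFragment();
  var end = Math.min(rendered + BATCH_SIZE, results.length);
  for (var i = rendered; i < end; i++) {
    fragment.appendChild(renderResult(results[i]));  // renderResult: hypothetical DOM builder
  }
  rendered = end;
  container.appendChild(fragment);
}

var container = document.getElementById('results');   // hypothetical container id
appendBatch(allResults, container);                    // allResults: hypothetical result data

window.addEventListener('scroll', function () {
  var nearBottom = window.innerHeight + window.pageYOffset
                   >= document.body.offsetHeight - 500;
  if (nearBottom && rendered < allResults.length) {
    appendBatch(allResults, container);
  }
});
```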

Strange memory leak on Mac (Chrome, Firefox and Safari) with our GWT based web UI

We are experiencing a serious memory problem in our GWT based web application when running on Mac, in Chrome, Firefox and Safari.
For example, with Firefox, the Activity Monitor on Mac shows memory consumption quickly increasing over time, even with frequent refreshes, and it can reach 1 GB after a significant session. A similar phenomenon happens in Chrome and Safari.
However, we cannot find a real cause using various profiling tools, including Java JProfiler (for the GWT side) and the Chrome profiler and timeline (looking at native JS, listeners and DOM elements).
Actually there are 2 related problems here:
The memory increases while using the UI for a long time without a refresh. In this case, we can see some uncollected garbage SVG elements (we are using an SVG based canvas) that are unreachable, but the memory increase in the Activity Monitor is much higher than what we would expect from this garbage.
The memory remains high even after multiple refreshes, and even though the profiler shows that all the above garbage is completely gone.
We have been chasing this leak for a while, with no results, so I would appreciate any help.
Thanks,
Yaron.
Does the problem occur only on Mac?
What GWT version are you using? 2.5?
Did you see this issue: https://code.google.com/p/google-web-toolkit/issues/detail?id=6938 ?
I faced several issues with leaks using GWT before, but in version 2.5 it works fine, even in IE!
I've tracked down some leaks in my GWT application before, and it's certainly not easy to determine where they're coming from thanks to Java's garbage collector hiding what's going on. The most common reason for a leak would be cyclical references, so multiple objects can't be garbage collected because they reference each other. They're tough to spot on your own so I use a library called FindBugs - it also comes with a very convenient Eclipse plugin. FindBugs literally finds anything you could possibly consider and has worked wonders for me. BUT make sure to play with the settings first; cyclical reference checking is not enabled by default.
Bruno_Ferreira makes a good point, too - make sure you're up to date with your GWT version, as they're always fixing memory leaks.

Slow javascript execution in chrome, profiler yields "(program)"

How would I go about determining what the hangups are in my javascript app when the profiler puts (program) at the top with 80%? Is my logic too complex for the hotspot tracking to occur? Is my memory footprint too big? What is generally the cause of this?
More Information:
There are no elements on the form save the one canvas tag
There are no waiting communications (xhr)
http://i.imgur.com/j6mu1.png
Idle cycles ("doing nothing") will also render as "(program)" (you may profile this SO page for a few seconds and get 100% of (program)), so this is not a sign of something bad in itself.
The other thing is when you actually see your application lagging. Then (program) will be contributed by the V8 bindings code (and the WebCore code they invoke, which is essentially anything: DOM/CSS operations, painting, memory allocations and GCs, what not.) If that is the case, you can record a Timeline of your application (switch to the Timeline panel in Developer Tools and press the Record button in the bottom status bar, then run your application for a while.) You will see many internal events with their timings as horizontal bars. You will see reflows, style recalculations, timers fired, GC events, and more (btw, the latest Chromium versions have an improved memory profiler utilization timeline, so you will be able to monitor the memory used by certain internal factors, too.)
To diagnose memory problems (multiple allocations entailing multiple full GC cycles) you may use the Profiles panel. Take a heap snapshot before the intensive part of your code starts, and another one after this code has run for some time. Then compare the two heap snapshots (the right SELECT at the bottom) to see which allocations have taken place, along with their memory impact.
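If you already suspect a particular code path, you can also start a named CPU profile around it programmatically, so its cost shows up under a label in the Profiles panel instead of being folded into "(program)". A minimal sketch; suspectedHotPath is hypothetical.

```js
// Wrap a suspected hot path in a named profile; it appears in the
// Profiles panel as "render-loop".
console.profile('render-loop');
suspectedHotPath();                 // hypothetical: the code you think is slow
console.profileEnd('render-loop');

// Simple timing is often enough to narrow things down, too:
console.time('render-loop');
suspectedHotPath();
console.timeEnd('render-loop');
```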
To check whether it's getting slow due to a memory issue, use: chrome://memory
You can also check chrome://profiler/ for possible hints about what is happening.
Another option is to post your javascript code here.
See this link; it will help you with understanding Firebug profiler output.
I would say you should check which methods are taking the largest percentages; you can then minimize unwanted work in them. In your figure, I saw that some draw method running in the background is consuming around 14%; maybe that is why your JS is loading slowly. You should determine what's taking the time. Both FF and Chrome have a feature that shows network traffic. Have a look at YSlow as well; it's a great addon to Firebug.
I would suggest Chrome's auditing tools, which can tell you a lot about why this is happening. You should probably include more information about:
how long did it take to connect to server?
how long did it take to transfer content?
how much other stuff are you loading on that page simultaneously?
Anyway, even without all that, here's a checklist to improve performance:
make sure your javascript is treated and served as static content, e.g. via nginx/apache/whatever directly or cdn, not hitting your application framework
investigate if you can make use of CDN for serving javascript, sometimes even pointing different domain names to your server makes a positive impact, e.g. instead of http://example.com/blah.js -> http://cdn2.example.com/blah.js
make sure your js is served with proper expiration headers, so it isn't re-downloaded every time the client refreshes a page
turn on gzipping of js content (a rough server-side sketch follows after this list)
minify your js using the different tools available (e.g. the Google Closure Compiler)
combine your scripts (reduces the number of requests)
put your script tags just before the closing </body> tag
investigate and cleanup/optimize your onload and document.ready hooks
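For the static-serving, caching and gzip items above, here is a hedged sketch assuming a Node/Express server (the question doesn't say what the backend is; nginx/Apache have equivalent settings):

```js
// Hypothetical Express setup: gzip responses and serve JS as static
// content with long expiration headers, so clients don't re-download it.
var express = require('express');
var compression = require('compression');

var app = express();
app.use(compression());                          // gzip js/css/html responses
app.use('/static', express.static('public', {
  maxAge: '30d'                                  // Cache-Control max-age for static assets
}));

app.listen(3000);
```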
Have a look at the YSlow plugin and Google PageSpeed, both very useful in improving performance.
