CSS responsive performance issues: hundreds of containers are slow - performance

So I am a backend developer who is dipping my toes into building a responsive site that, for SEO reasons, needs to be able to show up to 1000 responsive "containers" for search results, i.e....
[1]: http://107.6.139.93/Melbourne.homes
What seems to be happening is that the browser is locking up trying to render all these containers, or something? For searches with fewer than 300 results, the delay is tolerable, i.e....
[2]: http://107.6.139.93/Viera.homes
To be honest, I'm somewhat in over my head here (I'm a database guy) and I'm trying to learn, but I have no idea if it's even going to be possible to improve performance without using pagination (something my client is very much against).
I'm wondering if anyone here has any insights into my issues.
EDIT - the same "lock-up" delay occurs when you resize the browser and wait for the responsiveness to kick in.

I think you're overloading the browser's memory, and this you can't solve with CSS; it's the whole package (images and content).
You could solve it by using infinite scroll, and thus only load content when the user scrolls (a rough sketch follows below). There are some things you have to look into before throwing yourself into infinite scrolling, especially on the SEO level.
You might want to read this:
http://googlewebmastercentral.blogspot.nl/2014/02/infinite-scroll-search-friendly.html
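As a rough illustration of the client side (not from the original answer), infinite scroll can be as simple as appending the next page of results when the user nears the bottom. The #results container and the /search?page=N endpoint below are assumptions for illustration:

    // Rough sketch only: assumes a #results container and a hypothetical
    // /search?page=N endpoint that returns an HTML fragment of result containers.
    var page = 1;
    var loading = false;

    window.addEventListener('scroll', function () {
      var nearBottom = window.innerHeight + window.pageYOffset
        >= document.body.offsetHeight - 500; // start loading 500px early
      if (!nearBottom || loading) return;
      loading = true;
      fetch('/search?page=' + (++page))
        .then(function (res) { return res.text(); })
        .then(function (html) {
          document.getElementById('results').insertAdjacentHTML('beforeend', html);
          loading = false;
        });
    });

As the linked article explains, crawlers don't scroll, so for SEO you'd still want real paginated URLs backing each chunk of results.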

Related

ReactJS heavy load application performance issue

I am looking for some advice on how to track application performance; the application is developed using ReactJS, and I am building it with webpack.
First of all I will just present what I have done and what the application is expected to do:
I need to render a lot of, let's just call them widgets, that update in real time, presenting a lot of data. Roughly, I would say each widget renders about 50 to 80 values; these updates might be received from the server side all at once, so they should happen instantly when data is received. Consider that I might have around 25 to 30 widgets that need to update in real time.
Let me tell you a little bit about the implementation:
I have used the smart/dumb pattern for ReactJS components
The actual data is stored in application state and is distributed by the smart components to dumb components through props
I am using Pure Render Mixin to avoid unnecessary rendering
I am also using Immutable data to ensure Pure Render Mixin works as intended, that is, it is accurate in determining whether a render is necessary while at the same time being fast, really fast.
There are no stray callback bindings that might trigger re-rendering of components; this has already been double-checked. (A rough sketch of this setup follows the list.)
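For readers unfamiliar with the pattern, this is roughly the setup being described; the names and data are illustrative, not from the original project:

    // Dumb widget: PureRenderMixin shallow-compares props, which is cheap and
    // accurate when the props are Immutable values. Names are illustrative.
    var React = require('react');
    var PureRenderMixin = require('react-addons-pure-render-mixin');

    var Widget = React.createClass({
      mixins: [PureRenderMixin],

      render: function () {
        // this.props.values is assumed to be an Immutable.List of numbers;
        // a new List instance arrives via props whenever the data changes.
        return React.createElement('ul', null,
          this.props.values.map(function (v, i) {
            return React.createElement('li', { key: i }, v);
          }).toArray());
      }
    });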
Now the issues I am having:
with about 5-6 widgets, meaning around 400-500 values that need to render each second, it works very well in Chrome and decently in Firefox.
with about 25-30 widgets, the application still works decently in Chrome, but it starts to act slow in Firefox; by slow I mean user interaction might see a delay of around 1 second. That is really unacceptable.
What I have tried:
I used Chrome dev tools to measure performance; that didn't help much. What I could see, though, is that everything looks alright, and there is no way I could read all the graphs this tool provides (and I've read a lot of articles about it).
I tried to use Firebug in Firefox. That's an amazing tool, but not in this case: just opening it with the above-mentioned load (30 widgets) gets Firefox to freeze, and the profiler gave me nothing.
As a last resort, I used the default dev tools from Firefox, which have a performance tab. That got me some information about which parts of the application put the most load on the browser. It seemed to be some heavy computation determining the min/max of an Immutable.List (see the sketch below).
Unfortunately the application still has performance issues, it is of high importance to get it working perfectly, and the Firefox profiler doesn't give me any other leads.
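One hedged idea for that specific hotspot (not something the original post tried): since Immutable updates always produce a new List instance, reference equality is a valid cache key, so the min/max scan can be memoized:

    // Sketch: recompute min/max only when the Immutable.List instance changes.
    var lastList = null;
    var lastResult = null;

    function minMax(list) {
      if (list === lastList) return lastResult; // same instance: reuse result
      var min = Infinity;
      var max = -Infinity;
      list.forEach(function (v) {
        if (v < min) min = v;
        if (v > max) max = v;
      });
      lastList = list;
      lastResult = { min: min, max: max };
      return lastResult;
    }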
So my questions would be:
what would be the next action to take in order to pin down the performance issues? (as precisely as possible: class/method, or at least file)
did you guys use any performance testing tool that gives you insight into what the hell is happening?
is there something else to consider to improve the overall functionality, especially targeting multiple browsers? (Firefox, Chrome, IE11)

Can I find out if there are other browser windows running webgl?

I am creating a three.js powered website, and on some browsers (I am looking at you, Firefox), if other tabs are also running WebGL, my performance stutters.
I would like to know if there is a way to find out (in the browser) whether other tabs are consuming WebGL services so that I can alert the user.
I appreciate any and all comments!
That would be a security violation, so no, you can't do that.
Update:
I'll just add: you could include stats.js (more likely, you'll need to do something very similar to what stats.js does, to get an idea of the average frame rate and look for dips in that performance), and if it is regularly dropping, alert the user (a rough sketch follows below). That said, you are likely to get the calculation wrong, and there are always many variables you can't control which can affect performance. Most of those can be resolved with a browser restart; in particular, Firefox doesn't seem to release its GPU memory across page reloads, and when that memory is full the performance drops badly.
In any case, any warnings you give out should probably not be intrusive for the user in any way.
Also note that properly written WebGL programs (using requestAnimationFrame as intended) should, to my knowledge, not consume much in compute resources when the tab is in the background, though this may also vary per browser. But a tab in the background will still consume memory (GPU memory, plus JavaScript code and objects).
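A minimal sketch of that frame-rate check, counting frames with requestAnimationFrame; the 30 fps threshold and the five-second window are arbitrary assumptions:

    // Rough FPS monitor in the spirit of stats.js: measure frames per second
    // and warn (non-intrusively) after a sustained dip. Thresholds are arbitrary.
    var frames = 0;
    var lastCheck = performance.now();
    var slowSeconds = 0;

    function tick(now) {
      frames++;
      if (now - lastCheck >= 1000) {
        var fps = frames * 1000 / (now - lastCheck);
        slowSeconds = (fps < 30) ? slowSeconds + 1 : 0;
        if (slowSeconds >= 5) {
          console.warn('Sustained low frame rate: ' + fps.toFixed(1) + ' fps');
          slowSeconds = 0;
        }
        frames = 0;
        lastCheck = now;
      }
      requestAnimationFrame(tick);
    }
    requestAnimationFrame(tick);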

Why is my website loading so slowly?

This is the page of the website that I'm coding : http://davidcocciante.com/beta4/
The issue is that the page load takes 9-12 seconds. I don't have very heavy stuff. Maybe the scripts are the issue?
Thanks to you!
Your images are of extremely high quality; you need to optimize them for the web. There are at least 4 MB of images, and I didn't check all of them on the site.
This slows down the load time considerably, and if users are on a mobile device it will take even longer to load.
The only solution I can think of is to optimize ALL images for the web using Photoshop or whatever you decide to go with (a hedged, automated alternative is sketched below).
Best of luck; the website looks nice, however graphic weight OVERLOAD :)
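If Photoshop isn't handy, a hypothetical batch script with the Node sharp library could do the same resizing and recompression; the paths, width cap, and quality below are assumptions:

    // Hypothetical batch optimizer (npm install sharp). Paths, the 1600px width
    // cap, and JPEG quality 75 are illustrative assumptions.
    var fs = require('fs');
    var path = require('path');
    var sharp = require('sharp');

    var srcDir = 'images';
    var outDir = 'images-optimized';
    fs.mkdirSync(outDir, { recursive: true });

    fs.readdirSync(srcDir).forEach(function (file) {
      if (!/\.(jpe?g|png)$/i.test(file)) return;
      sharp(path.join(srcDir, file))
        .resize({ width: 1600, withoutEnlargement: true }) // never upscale
        .jpeg({ quality: 75 })
        .toFile(path.join(outDir, file.replace(/\.\w+$/, '.jpg')))
        .catch(console.error);
    });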

Predict consequences of installing LESS, CSS3PIE on a high-load project

I am facing a global site redesign (not of appearance, but of code architecture and underlying technologies). The website has about 135,000 visitors every day, and it's crucial to make the right decision now.
I have no prior experience using LESS and CSS3PIE on such big projects. Maybe some of you can predict trouble I might run into using the technologies mentioned above. I would like to know the advantages and drawbacks.
Isn't it better to use old, tested, and reliable methods like sprites for rounded-corner buttons with shadows and gradients? Look at http://zappos.com: they just degrade gracefully in IE and don't use CSS3PIE.
Nobody answered me, so I'll try to answer myself. Since there are server-side LESS compilers for all major platforms (Ruby, .NET, PHP), I decided to use LESS but compile it server-side instead of using LESS.js, which is not a good option because it prevents the client's browser from caching the CSS.
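As a minimal sketch of that build step, assuming the Node less package and illustrative file names:

    // Compile LESS once at build/deploy time (npm install less), so the browser
    // only ever receives plain, cacheable CSS. File names are illustrative.
    var fs = require('fs');
    var less = require('less');

    var source = fs.readFileSync('styles/main.less', 'utf8');

    less.render(source, { compress: true })
      .then(function (output) {
        fs.writeFileSync('public/css/main.css', output.css);
      })
      .catch(function (err) {
        console.error(err);
      });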
As for CSS3PIE, I don't see any significant drawbacks to using it; a little more load falls on clients using IE, but it's not so bad.
The only issue I can foresee now is backgrounds and decorations disappearing on popups. I have already encountered this problem and posted a question about it here, but no one ever answered.
I would avoid using CSS3Pie for production sites. In my experience, the higher the number of CSS3Pie-rendered elements on the page, the worse IE8/9 will perform.
Specifically, when I was using IE9 with an IE8 document mode, and with at least 2 elements rendered using CSS3Pie (using border-radius and linear-gradient), I observed a noticeable lag when scrolling the browser window. That is, I would try and scroll down the page, and the scroll bar would take a couple of seconds to "catch up" with the mouse pointer.
As soon as I switched CSS3Pie off, no lag when scrolling was observed. The same applies for IE8 in my experience.

How do I know how much traffic my WordPress/BuddyPress-based social media website can handle? What should I do when traffic goes up?

Right now I'm paying godaddy.com 5 dollars a month for hosting. Although there are no users registered yet (it's closed in maintenance mode while I'm testing and building it), it's slower than e.g. Facebook. Does anyone have experience with BuddyPress? What happens if my site blows up and draws a lot of users very fast? I guess I can get more expensive, better quality hosting, but is there a limit for BuddyPress-based sites, especially when I'm using quite a few plugins?
BuddyPress scales quite high, so the code itself won't be a problem, even with tens of thousands of users. Your problems will probably be imposed by your host (limits on database transactions or table sizes) or by specific themes taking a long time to render.
Firebug can be a great tool to use if you want to identify what component is causing a site to be slow. Instructions on using Firebug