I have been looking into the performance and server load of pre-rendering Angular 2, and I have tested it on small applications.
My question is for anyone with experience pre-rendering large applications or online shops: is pre-rendering efficient at that scale?
I'm currently developing my portfolio website using Nuxt3 on the frontend and Netlify for hosting. The site contains a fair number of videos, and although most of the mp4 files are not excessively large (1.2-1.4 MB), requesting them directly from my server has put a strain on the site's loading times.
Aside from lazy-loading and compression, what further steps could I take to optimize the loading speed of my videos? I am aware of CDNs such as Amazon CloudFront and Cloudinary, but I'm uncertain which would be most suitable for a small portfolio project.
Since this is quite a general question, any pointers to other techniques and best practices are much appreciated. Thank you for the help!
Like images, video has a billion things you can optimize and fine-tune.
If it's a small portfolio project, just use Cloudinary. It will be super simple, highly optimized for you, will probably fall under the free tier, and won't require reading a 400-page book on codecs, containers, buffering, etc.
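To make that concrete, here is a minimal sketch of lazy-loading a video delivered through Cloudinary. The cloud name and public ID are placeholders, and q_auto is Cloudinary's automatic-quality transformation:

```html
<!-- preload="none" defers the download until the visitor presses play,
     which keeps the initial page load light. The poster frame is
     delivered by Cloudinary from the same video by requesting .jpg. -->
<video
  controls
  preload="none"
  poster="https://res.cloudinary.com/<your-cloud>/video/upload/q_auto/portfolio-clip.jpg"
  src="https://res.cloudinary.com/<your-cloud>/video/upload/q_auto/portfolio-clip.mp4">
</video>
```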
There is a lot written about Laravel's performance. It is not the very best framework when it comes to fast applications, but it has a lot of options and a brilliant community and documentation. I would like to know if Laravel suits my situation:
I'm currently developing a browser game that will hopefully be played by thousands of visitors worldwide; there could be over 3,000 concurrent users. The application is a bit heavy because it needs a lot of different modules: views, routing, session management, authentication, database connections, cron jobs and so on. It really is a dynamic game, so the application will be loaded many times. And I don't have much money to invest in a lot of dedicated servers (at least not in the very beginning).
I looked at other frameworks too. Because Lumen, Slim and some other micro-frameworks don't support all of the modules my game needs, I think Laravel is a good choice. But I'm really scared by the benchmarks I've seen. Laravel doesn't look good there compared to other frameworks: it's slow, consumes a lot of memory, and can't handle many requests.
So my question: is Laravel a good choice for a heavy browser-game website with the potential for thousands of concurrent users? Caching and Homestead will help for sure, but is Laravel a good choice at all, or is there a genuinely better framework?
It's less about the framework, and more about how you write your code.
Follow good practices, plan for scaling, and Laravel can absolutely be a very high-performance solution for you. I've heard of teams running Laravel with millions of requests per day.
We run a large production app with Laravel, load balanced across multiple web servers, separate redundant database servers, Redis caching, etc. We've had lots of scaling issues, but interestingly none of them were related to the framework. Your main bottlenecks will be elsewhere.
Folks worry way too much about performance before even beginning a project (premature optimization, and all that). Pick the best tool for the job in terms of what it does for you. Then build your app with scaling in mind.
If Laravel provides the functionality you need and you like how it works, use it!
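As one concrete (hypothetical) example of building with scaling in mind: expensive reads that many players share can be cached so each request doesn't hit the database. A minimal sketch using Laravel's cache facade, where the key, TTL, and table are made up:

```php
<?php

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\DB;

// Cache a shared, expensive query for 60 seconds so thousands of
// concurrent players reuse one result instead of each hitting the
// database. Key name, TTL, and schema are all placeholders.
$leaderboard = Cache::remember('leaderboard:top100', 60, function () {
    return DB::table('players')
        ->orderByDesc('score')
        ->limit(100)
        ->get();
});
```

Backed by Redis or Memcached, a pattern like this turns one query per request into one query per minute, which is usually a far bigger win than any framework-level benchmark difference.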
Is cloud hosting the way to go? Or is there something better that delivers fast page loads?
The reason I ask is that I run a BuddyPress site on a Bluehost dedicated server, but it seems to run slow at most times of the day. This scares me because at the moment the site's not live, and I'm afraid that when it gets traffic it'll get worse and my visitors will lose interest. I use Amazon's cloud to handle all my media, JS, and CSS files along with a caching plugin, but it still loads slow at times.
I feel like the problem is Bluehost, because I visit other sites running BuddyPress and they seem to load instantly. I'm not web-hosting savvy, so can someone please point me in the right direction here?
The hosting choice depends on many factors such as technical requirements, growth rates, burst rates, budgets and more.
Bigger Hardware
To scale up a hosting operation, your first option is often just a more powerful server, VPS, or cloud instance. The point is not so much cloud vs. dedicated but that you simply bring more compute power to the problem. Cloud can make scaling up easier, often with a few clicks.
Division of Labor
The next step is often division of labor: you offload the database, static content, caching, or other items to dedicated servers or services. For example, you could offload static content to a CDN, or move the database to its own server.
Once again, cloud vs non-cloud is not the issue. The point is to bring more resources to your hosting problems.
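For instance, with a WordPress/BuddyPress site, pointing the install at a separate database server is a one-line change in wp-config.php (the hostname below is a placeholder):

```php
<?php
// wp-config.php: use a dedicated database server instead of localhost.
// 'db1.example.internal' stands in for your DB server's address.
define( 'DB_HOST', 'db1.example.internal' );
```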
Pick the Right Application Stack
I cannot stress enough picking the right underlying technology for your needs. For example, I recently helped a client switch from an Apache/PHP stack to a Varnish/Nginx/PHP-FPM stack for a very busy WordPress operation (>100 million page views/mo). This change boosted capacity by nearly 5x with modest hardware changes.
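For a rough idea of what the Nginx/PHP-FPM half of such a stack looks like, here is a minimal sketch; the port, document root, and socket path are all assumptions, and Varnish would sit in front on port 80 as the cache:

```nginx
# Minimal Nginx server block handing PHP requests to PHP-FPM.
# Varnish listens on port 80 and forwards cache misses here.
server {
    listen 8080;
    root /var/www/html;
    index index.php;

    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;
    }
}
```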
Same App. Different Story
Also, just because you are using a specific application, it does not mean the same hosting setup will work for you. I don't know about the specific app you are using, but with Drupal, WordPress, Joomla, vBulletin and others, the plugins, site design, themes and other items are critical to overall performance.
To complicate matters, user behavior is something to consider as well. Consider a discussion forum with a 95:1 read:post ratio. What if you do something in the design that encourages more posts and the ratio moves to 75:1? That means more database writes, less caching, etc.
In short, details matter, so get a good understanding of your application before you start to scale out hosting.
A hosting service is part of the solution. Another part is proper server configuration.
For instance, this guy optimized his setup to serve 10 million requests a day from a micro instance on AWS.
I think you should look at your server config first, then shop for other hosts. If you can't control server configuration, try AWS, Rackspace or other cloud services.
Just an FYI: you can sign up for AWS and use a micro instance free for one year. In the link I posted, he just optimized on the same server. You might have to upgrade to a small instance, though, because Amazon has stated that micro instances are meant to handle spikes, not sustained traffic.
Good luck.
I have a Drupal site with a lot of calculations and database requests on each page load (running on an Amazon EC2 server). I am curious how my site would hold up if it became popular or in some other way received heavy traffic. Perhaps the most important thing for me is to locate potential bottlenecks in my code.
What are the best tools for stress testing and finding bottlenecks in a Drupal site? Right now I'm not using any cache module. I've read about the Memcached module and a bit about Varnish.
Anyone who can share their experiences?
Generally, PHP does not do great in really large-scale projects. That is why Google does not support PHP in their App Engine.
Facebook, which was built in PHP, had to compile it to C++ (via its HipHop compiler) for better performance.
Having said that, here are some of the tools (I have not used them):
http://www.webload.org/
http://xdebug.org/ - to profile your PHP code in addition to debugging (a minimal config sketch follows below)
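If you do try Xdebug for profiling, a minimal configuration might look like this (Xdebug 3 setting names; the output directory is a placeholder):

```ini
; php.ini: enable Xdebug's profiler (Xdebug 3 settings).
; Profiles are only written when a request includes the trigger,
; e.g. ?XDEBUG_TRIGGER=1, so normal traffic is unaffected.
xdebug.mode = profile
xdebug.start_with_request = trigger
xdebug.output_dir = /tmp/xdebug
```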
cheers,
Vishal
ApacheBench (ab). Three runs cover the three layers (examples after this list):
Send no cookies to simulate anonymous traffic and stress the Varnish caching layer.
Send a random cookie to stress the Drupal caching layer (where you are hopefully using Memcached).
Send a login cookie to stress the DB layer.
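A hedged sketch of those three runs; the URL, request counts, and cookie values are placeholders, and a real Drupal session cookie is named SESS followed by a hash, copied from a logged-in browser:

```sh
# 1. Anonymous traffic: no cookie, should be answered by Varnish.
ab -n 1000 -c 50 http://example.com/

# 2. Junk cookie: typical VCLs pass cookied requests to the backend,
#    so this lands on Drupal's own page cache.
ab -n 1000 -c 50 -C "nocache=1" http://example.com/

# 3. Real session cookie: the authenticated path, stresses the DB.
ab -n 1000 -c 50 -C "SESSd41d8cd98f00b204=placeholder-session-id" http://example.com/
```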
We run massive Drupal sites on PHP. Scaling does cost resources (multiple web heads, database clustering, separate Memcached and file servers), but you have to balance that cost against hiring developers to refactor your code into a different language, plus ongoing code maintenance.
I've been doing a good bit of research into website performance lately, and I'd say I've gained a fair amount of knowledge about best practices for improving performance and reducing bandwidth requirements through tweaks such as gzipping, content caching, and image and script optimization.
My problem is that I've found plenty of case studies from hugely popular sites such as Facebook, Google and Amazon, but what I really want is findings and figures for somewhat smaller sites, say 50-250k visitors a month.
I'm looking for what was gained from investing time into performance optimization e.g. significant speed improvements, reduced bounce rate, reduced running costs, and all the analytics stuff.
For Facebook or Google, a 5% performance improvement can save a lot of money. I have done a lot of performance analysis for clients, and they often start with tuning questions, but more than 90% of the time the greatest performance gain comes from looking into the application itself. You cannot tune a tanker to run like a Porsche. These are some findings I collected on top J2EE web application performance problems.

If the website is running Drupal or WordPress, turn on the built-in caching before going to production. Those packages can also combine JavaScript and CSS into one file, reducing network round-trip time.

If you have a site with a lot of content, increase the memory allocated to the various buffers in your database. For a site with very static content, configure the web server, such as Apache, to compress the HTML, and set the content expiration policy correctly. Try to optimize image file sizes: I find that images on a lot of websites can be reduced further without losing much image quality. And make sure the web server has enough physical memory.

Most out-of-the-box server configurations are reasonably optimized, so usually there are only a few things to do, and for the type of site you are looking at, I don't think you need to worry too much. If you have some files, like Flash or PDF, that are extremely popular, you can consider putting them on a CDN (cloud) so other expertise takes care of the bandwidth for you. Those solutions have become pretty affordable even for small and mid-size sites now.
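To illustrate the compression and expiration advice, here is a minimal Apache sketch; it assumes mod_deflate and mod_expires are enabled, and the cache lifetimes are arbitrary placeholders:

```apache
# Compress text responses before sending them (mod_deflate).
AddOutputFilterByType DEFLATE text/html text/css application/javascript

# Let browsers cache static assets instead of re-requesting them
# on every page view (mod_expires).
ExpiresActive On
ExpiresByType image/png  "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css   "access plus 1 week"
```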