I'm currently developing my portfolio website using Nuxt 3 on the frontend and Netlify for hosting. The site contains a fair number of videos, and although most MP4 files are not excessively large (1.2 to 1.4 MB), requesting them directly from my server has put a strain on the site's loading times.
Aside from lazy-loading and compressing, what further steps could I take to optimize the loading speed of my videos? I am aware of CDNs such as Amazon CloudFront and Cloudinary, but I'm uncertain which would be most suitable for a small portfolio project.
Since this is quite a general question, any pointers to other techniques and best practices are much appreciated. Thank you for the help!
Like images, video has a billion things you can optimize and fine-tune.
If it's a small portfolio project, just use Cloudinary. It will be super simple, highly optimized for you, will probably fall under the free tier, and won't require reading a 400-page book on codecs, containers, buffering, etc.
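For a sense of how little code that involves, here is a minimal sketch of URL-based video delivery through Cloudinary. f_auto and q_auto are real Cloudinary transformation parameters; the cloud name and asset ID below are placeholder assumptions.

```ts
// Sketch only: "my-cloud" and the public ID are placeholders for your
// own Cloudinary cloud name and an uploaded video's identifier.
const cloudName = "my-cloud";
const publicId = "portfolio/demo-reel";

// f_auto lets Cloudinary pick the best codec/container per browser;
// q_auto picks a quality level that balances size and visual fidelity.
const videoUrl =
  `https://res.cloudinary.com/${cloudName}/video/upload/f_auto,q_auto/${publicId}.mp4`;

// In a Nuxt 3 template you could then lazy-load it, e.g.:
// <video :src="videoUrl" preload="none" controls />
```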
The setup:
A blog with posts, built with Laravel, where:
Every post can have at most one image (nullable).
The blog is capped at 1,000 posts; let's assume there are 1,000 posts for this discussion.
Every post has a comment section where registered users can comment and include an image in their comment. Let's assume every post has 2 images in its comments.
So in total that comes to 3,000 images* that need to be stored (and resized, I guess), served, etc.
This is the ideal amount in the long run; I'm not looking for a "scalable" solution, since there is not going to be crazy exponential growth.
*In reality it is currently less, and I assume that at these volumes it doesn't really matter whether it's 1,000/1,500/2,000 or 3,000. Correct me if that's wrong.
A few extra things to note:
I'm hosting it on shared hosting (I can store up to 300k files).
I want it to be secure, so no malicious file gets uploaded disguised as an image file.
I'm looking for a budget solution (so if S3 starts charging heavily after 12 months, that rules it out), preferably free.
So the dilemma is between storing all images locally in the storage folder (manipulating them with some Laravel package) and Cloudinary, which I don't know much about beyond the fact that it lets you store, manipulate, and back up images, and present them through its API.
If I choose to do it locally: is it safe to store a user-uploaded image locally? How do I make sure it's not malware disguised as an image file?
With this amount of images/content, can storing locally cause performance issues on shared hosting?
What would be the advantages of using Cloudinary for me?
Thanks.
Cloudinary can actually help a lot in this case.
Instead of storing the resources locally and writing something up to manipulate them, you could integrate Cloudinary in the project.
This would free up server space. Storing images locally may or may not impact performance, depending on the architecture, but freeing server resources is always a good practice.
Also, manipulation and delivery of images can be done on the fly when an image is first requested (or eagerly, before it's requested, if you prefer) with a simple API call. So you don't need to build something new; you can leverage an existing API.
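As a rough illustration of that URL-based, on-the-fly model (the cloud name and public ID below are placeholders; w_, c_limit, f_auto, and q_auto are standard Cloudinary URL transformation parameters):

```ts
// Sketch only: Cloudinary applies the transformation encoded in the URL
// the first time the image is requested, then serves the cached derivative.
// "my-cloud" and "posts/cover-42" are placeholder values.
function postImageUrl(publicId: string, width: number): string {
  // w_<n>,c_limit scales down to the given width without upscaling;
  // f_auto/q_auto pick the best format and quality per browser.
  return `https://res.cloudinary.com/my-cloud/image/upload/w_${width},c_limit,f_auto,q_auto/${publicId}`;
}

console.log(postImageUrl("posts/cover-42", 800));
```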
Cloudinary also has a fully featured free tier that you could use. If you don't expect exponential growth at the moment, that tier would be more than enough for the project.
Full disclosure: I'm currently working at Cloudinary (but the above still holds :) ).
Currently I am working on a website that is using the Magento eCommerce platform.
Although the website has only seen about 1,000 visitors over a 30-day period, it is using over 70 GB of bandwidth. The website has caching enabled to help reduce the resources it takes to load each page, but it does not appear to be helping. I was hoping to find some pro tips on how to reduce bandwidth usage and avoid costly overage fees with the hosting provider.
The website is http://fantasyfootballdraftboard.net if you would like to review it. The primary purpose of the website is to sell fantasy football draft boards online, so I would prefer not to remove the large image on the home page. I've used Pingdom's speed test, and it claims the site only uses roughly 2.5 MB of bandwidth to load each page. After a pretty in-depth analysis of Google Analytics, page views, and per-page bandwidth (2.5 MB according to Pingdom), the numbers just do not add up: 70 GB at 2.5 MB per page would mean roughly 28,000 page loads, far more than 1,000 visitors should account for.
Does anybody have any suggestions or ideas for me? Does Magento use a lot more resources and bandwidth than other eCommerce platforms?
Thanks in advance,
I ran your site through WebPageTest and there are a few recommendations from there. You should certainly compress your transfer. Personally, I would recommend you avoid PNG files for images unless you really need them (e.g. for transparency effects). JPEGs compress photographic content much more efficiently.
Go through WebPageTest and I'd bet you could knock off at least a megabyte!
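On the compression point, a hedged sketch of what that can look like on an Apache host, assuming mod_deflate is available (adjust the MIME types to your site):

```apache
# Sketch: compress text responses before sending them over the wire.
# Assumes Apache with mod_deflate enabled.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```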
I checked your website but couldn't find any problem.
Bandwidth usage like that isn't caused by the Magento system itself.
Maybe your images are being hotlinked by another site, or a cracker stole your FTP password.
Add an HTTP_REFERER check in your Apache config (or .htaccess file):
http://www.webmasterworld.com/apache/4515652.htm
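A minimal sketch of that referer check, assuming mod_rewrite is available (replace example.com with your own domain):

```apache
# Sketch: block hotlinking of images by checking the Referer header.
# An empty referer is allowed so direct visits still work.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC,L]
```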
I have a website with lots of huge GIF images. I have limited each page to 5 images, but the loading time is still very high (over 60 seconds). The images are around 2 MB in size.
Is there a way of speeding up loading? Because of the nature of the images, I think they cannot be compressed (again) without decreasing quality significantly. The images are "soundless mini videos" of funny situations.
I also thought about creating multiple connections to download images faster (as many download accelerators do), but I doubt that's possible on the client side.
I also tried loading images one by one (i.e., waiting for the first image to download and then adding the next through the DOM), but the total time increased (fewer connections = slower total download speed); roughly, that attempt looked like the sketch below.
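A simplified sketch of that one-by-one attempt (the container selector and file names are placeholders):

```ts
// Wait for each image to finish downloading before appending the next.
// "#gif-list" and the URL list are placeholders for this sketch.
async function loadSequentially(urls: string[]): Promise<void> {
  const container = document.querySelector("#gif-list")!;
  for (const url of urls) {
    const img = new Image();
    img.src = url;
    await img.decode();          // resolves once the image has downloaded and decoded
    container.appendChild(img);  // only then attach it to the page
  }
}

loadSequentially(["/gifs/clip-1.gif", "/gifs/clip-2.gif"]);
```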
Do you have any ideas?
UPDATE: Solved by using Cloudflare (see answer).
I solved the problem by using Cloudflare. From their site:
"CloudFlare protects and accelerates any website online. Once your website is a part of the CloudFlare community, its web traffic is routed through our intelligent global network. We automatically optimize the delivery of your web pages so your visitors get the fastest page load times and best performance."
Now my website is loading in seconds instead of minutes; it looks like my hosting service was the real bottleneck.
Is cloud hosting the way to go? Or is there something better that delivers fast page loads?
The reason I ask is that I run a BuddyPress site on a Bluehost dedicated server, but it seems to run slow at most times of the day. This scares me because at the moment the site's not live, and I'm afraid that when it gets traffic it'll become worse and my visitors will lose interest. I use Amazon's cloud to handle all my media, JS, and CSS files, along with a caching plugin, but it still loads slow at times.
I feel like the problem is Bluehost, because I visit other sites running BuddyPress and they seem to load instantly. I'm not web-hosting savvy, so can someone please point me in the right direction here?
The hosting choice depends on many factors such as technical requirements, growth rates, burst rates, budgets and more.
Bigger Hardware
To scale up a hosting operation, your first choice is often just using a more powerful server, VPS, or cloud instance. The point is not so much cloud vs. dedicated, but that you simply bring more compute power to the problem. Cloud can make scaling up easier, often with a few clicks.
Division of Labor
The next step is often division of labor: you offload the database, static content, caching, or other items to dedicated servers or services. For example, you could offload static content to a CDN, or move the database to its own dedicated server.
Once again, cloud vs non-cloud is not the issue. The point is to bring more resources to your hosting problems.
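As a rough illustration of the static-content offload idea (the CDN hostname below is a placeholder, not any specific provider's):

```ts
// Serve static assets from a CDN host instead of the application server,
// moving static-file bandwidth off the box that runs your application.
const CDN_BASE = "https://cdn.example.com"; // placeholder CDN hostname

function assetUrl(path: string): string {
  // Pages keep their relative asset paths; this rewrites them to the CDN.
  return `${CDN_BASE}/${path.replace(/^\/+/, "")}`;
}

console.log(assetUrl("/images/hero.png")); // https://cdn.example.com/images/hero.png
```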
Pick the Right Application Stack
I cannot stress enough picking the right underlying technology for your needs. For example, I recently helped a client switch from an Apache/PHP stack to a Varnish/Nginx/PHP-FPM stack for a very busy WordPress operation (>100 million page views/mo). This change boosted capacity by nearly 5X with modest hardware changes.
Same App. Different Story
Also, just because you are using a specific application, it does not mean the same hosting setup will work for you. I don't know about the specific app you are using, but with Drupal, WordPress, Joomla, vBulletin, and others, the plugins, site design, themes, and other items are critical to overall performance.
To complicate matters, user behavior is something to consider as well. Consider a discussion forum that has a 95:1 read:post ratio. What if you do something in the design that encourages more posts, and that ratio moves to 75:1? That means more database writes, less effective caching, etc.
In short, details matter, so get a good understanding of your application before you start to scale out hosting.
A hosting service is part of the solution. Another part is proper server configuration.
For instance, this guy optimized his setup to serve 10 million requests in a day off a micro instance on AWS.
I think you should look at your server config first, then shop for other hosts. If you can't control server configuration, try AWS, Rackspace or other cloud services.
Just an FYI: you can sign up for AWS and use a micro instance free for one year. The link I posted describes optimizing on that same server. You might have to upgrade to a small instance, because Amazon has stated that micro instances are only meant to handle spikes, not sustained traffic.
Good luck.
I've been doing a good bit of research into website performance lately, and I'd say I've gained a fair amount of knowledge about best practices to improve website performance and reduce bandwidth requirements through tweaks such as gzipping, content caching, and image and script optimization.
My problem is that I've found plenty of case studies from hugely popular sites such as Facebook, Google, and Amazon, but what I really want is findings and figures for somewhat smaller sites, say 50-250k visitors a month.
I'm looking for what was gained from investing time into performance optimization e.g. significant speed improvements, reduced bounce rate, reduced running costs, and all the analytics stuff.
For Facebook or Google, a 5% performance-tuning improvement can save a lot of money. I have done a lot of performance analysis for clients, and they often start with tuning questions. But 90%+ of the time, the greatest performance gain comes from looking into the application itself. You cannot tune a tanker to run like a Porsche. These are some findings I collected on top J2EE web application performance problems.

If the website uses Drupal or WordPress, turn on the built-in caching before going to production. Those packages can also combine JavaScript and CSS into one file, reducing network round trips. If you have a site with a lot of content, increase the memory allocated to the different buffers in your DB. For a site with very static content, configure the web server (e.g. Apache) to compress the HTML data, and set the content expiration policy correctly (a sketch follows at the end of this answer).

Try to optimize image file sizes; I've found that images on a lot of websites can be further reduced in size without losing much quality. Make sure the web server has enough physical memory. Most out-of-the-box server configurations are reasonably optimized, so usually there are only a few things to do.

For the type of website you are looking at, I don't think you need to worry too much. If you have some files, such as Flash or PDF, that are extremely popular, you can consider putting them on a CDN (cloud) and letting someone with that expertise take care of the bandwidth for you. Those solutions have become pretty affordable even for small and mid-size websites now.
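On the expiration-policy point, a minimal Apache sketch assuming mod_expires is enabled (the lifetimes are example values, not recommendations):

```apache
# Sketch: tell browsers how long they may cache each content type,
# so repeat visits skip re-downloading unchanged assets.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```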