A better solution to host static files besides Amazon S3 - ajax

I made a mobile application in static HTML that mirrors my WordPress site.
The first version was completely static; all of the text lived inside the mobile HTML application.
Today I updated the application to pull data from WordPress with AJAX.
The problem is that now, with so many requests being made, the S3 bucket is no longer enough.
Despite having decreased from 6 KB to 83 KB, it is still slower because of the AJAX calls.
Is it possible to host static applications on some other Amazon service?

For the static content, you should probably be looking at AWS CloudFront instead of S3. As per the page itself:
Amazon CloudFront is a content delivery web service. It integrates with other Amazon Web Services products to give developers and businesses an easy way to distribute content to end users with low latency, high data transfer speeds, and no minimum usage commitments.
Another thing you can leverage is AJAX caching; that will make your page load much faster on repeat visits. You may also want to use nginx on your server as a cache (this will reduce your server load).
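For example, a minimal boto3 sketch (bucket name and file paths are placeholders, not from the question) that uploads static assets with a long-lived Cache-Control header so CloudFront edges and browsers can cache them:

```python
# Sketch: upload static assets to S3 with cache headers for CloudFront.
# Bucket name and file paths are hypothetical.
import mimetypes
import boto3

s3 = boto3.client("s3")
BUCKET = "my-static-site-bucket"  # placeholder

def upload_static(path, key):
    content_type, _ = mimetypes.guess_type(path)
    s3.upload_file(
        path,
        BUCKET,
        key,
        ExtraArgs={
            "ContentType": content_type or "application/octet-stream",
            # A long max-age lets CloudFront edges and browsers cache the file.
            "CacheControl": "public, max-age=31536000",
        },
    )

upload_static("build/app.js", "app.js")
upload_static("build/app.css", "app.css")
```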

Related

Can I set up a website to retrieve data from my own back-end server

I've made a website for an arts organisation. The website lets people browse a database of artists' work. The database is large, and the image files for the artists' work come to about 150 GB. I have my own server that is currently just being used to keep the images on its hard drive.
I'm going to purchase hosting so I don't have to worry about bandwidth etc., but would it be better to purchase hosting that allows me to upload my entire image database, or should the website fetch the images from my own server? If so, how would I do that?
Sorry, I am very new to this.
I think it would be better to have the data on the same server, so you avoid calls to another server for images which, as you say, are quite big; those extra calls can slow everything down overall.
I assume you will need to set up some API on your server to deliver the images, or at least URLs for them, but then you must make sure they are accessible.
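For example, a minimal sketch of such an API, assuming Flask and a local image directory (paths and route names are hypothetical):

```python
# Sketch of a tiny image API on your own server; directory and routes are placeholders.
from flask import Flask, send_from_directory, jsonify

app = Flask(__name__)
IMAGE_DIR = "/var/www/artwork"  # wherever the 150 GB of images live

@app.route("/images/<path:filename>")
def get_image(filename):
    # Serves a single image file; Flask sets the Content-Type and handles 404s.
    return send_from_directory(IMAGE_DIR, filename)

@app.route("/api/artists/<artist_id>/images")
def list_images(artist_id):
    # In a real app this would query the artists database; here it's a stub.
    return jsonify([f"/images/{artist_id}/example-01.jpg"])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```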
You'll want the image files on the same server as your website, as requests elsewhere to pull in images will definitely hinder your site's performance - especially if you have large files.
Given the large size of the database and the bandwidth requirements, a dedicated server would be suitable, as these include large disk space and bandwidth allowances. You can install the web server and the database server on the same machine instead of managing them separately, and managing database backups and service monitoring becomes much easier.
For instance, you can review dedicated server configurations and resources here: https://www.accuwebhosting.com/dedicated-servers

Amazon AWS and usage model for S3 storage

There is this example on Amazon of a high-traffic web application. I noticed that they are using S3 as their content delivery method. I was wondering if I need to have a web server for the content delivery and a web app for my application. I don't understand why they have two web servers and two app servers in the diagram.
Also, what is the best way to set up a website that serves images and static content through S3 and the rest of the content through regular storage?
My last question is: can I consider S3 as a main storage, reliable enough that I can keep my static content only there and not have normal storage as a backup?
That is a very general diagram; specific diagrams will vary depending on the specifics of the overall architecture.
Having said that, I believe the Web Server represents something like Apache or Nginx, and the App Server represents something like Rails, a Rack server, Unicorn, Gunicorn, Django, Sinatra, Flask, Jetty, Tomcat, etc. In some cases you can merge the Web Server and the App Server together, for example by deploying Apache with Python's mod_wsgi to run your Django app (so it depends on the architecture).
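As a rough illustration of that merged setup, the WSGI entry point that Apache/mod_wsgi (or Gunicorn) is pointed at is just a small Python module; the project name below is a placeholder:

```python
# wsgi.py - the entry point the web server hands requests to.
# "myproject" stands in for the actual Django project name.
import os
from django.core.wsgi import get_wsgi_application

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

# Apache/mod_wsgi (the "Web Server" box) calls this WSGI application,
# which is the "App Server" side of the diagram.
application = get_wsgi_application()
```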
"What is the best way to set up a website that serves images and static content through S3 and the rest of the content through regular storage?"
There's no real best way other than to point your dynamic content at your databases (SQL and NoSQL) and your static files (images, CSS, jQuery code, etc.) at an S3 bucket. You can also use third-party modules depending on your application stack; for example, you can accomplish this in Django with the django-storages module, and you can find similar modules for other app stacks like Rails.
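A hedged settings sketch for django-storages (bucket name and CloudFront domain are placeholders; assumes the s3boto3 backend):

```python
# settings.py sketch: serve static/media files from S3 via django-storages.
INSTALLED_APPS = [
    # ...
    "storages",
]

AWS_STORAGE_BUCKET_NAME = "my-app-assets"          # hypothetical bucket
AWS_S3_CUSTOM_DOMAIN = "dxxxxxxxx.cloudfront.net"  # CloudFront in front of S3

# Static files (CSS, jQuery code, images) come from S3/CloudFront;
# dynamic content still comes from your databases as usual.
STATICFILES_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
STATIC_URL = f"https://{AWS_S3_CUSTOM_DOMAIN}/"
```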
"Can I consider S3 as a main storage, reliable enough that I can keep my static content only there and not have normal storage as a backup?"
S3 is pretty reliable; Amazon advertises 99.999999999% (eleven nines) durability for your data. That goes down if you use RRS (Reduced Redundancy Storage), but if you want to use RRS you would probably want to back up your data in a non-RRS bucket anyway. In any case, if it's extremely critical data, you are free to back it up somewhere else just in case.
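If you do want that second copy, here is a hedged boto3 sketch that mirrors every object into a backup bucket (bucket names are hypothetical; S3's built-in cross-region replication can do the same thing without code):

```python
# Sketch: mirror objects from the primary bucket into a backup bucket.
import boto3

s3 = boto3.client("s3")
SRC, DST = "my-static-content", "my-static-content-backup"  # placeholders

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SRC):
    for obj in page.get("Contents", []):
        # Server-side copy; the data never leaves AWS.
        s3.copy({"Bucket": SRC, "Key": obj["Key"]}, DST, obj["Key"])
```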
Notice in the diagram that they also recommend using CloudFront for your static files and this is especially useful if your users will be accessing your application from different geographical areas.
Hope this helps.

Can Azure CDN propagate changes to all nodes with just one "miss"?

I'm using an Azure CDN endpoint on a hosted service (meaning, not a Blob Storage CDN endpoint).
The service renders images lazily, and once they are rendered they are practically static (I can safely use Cache-Control: public, max-age=31536000).
In the naive implementation, there will be up to** X misses (i.e., the service will render each image up to X times), where X is the number of CDN nodes around the world.
There are two workarounds, as I see it:
1. The lazily created images are stored in Blob Storage and later pulled from there.
2. Implement a cache in the Cloud Service.
Is there a way to propagate files to all nodes? Is there a better solution than having two caching layers (Cloud Service cache / Blob Storage + CDN)?
** "Up to", depending on the geographical location of web requests. In my case, all around the world.
There is currently no way for you to push something to the remote CDN nodes; this is a feature that many folks have asked Microsoft for in the CDN product.
Both workarounds would work. The first one has the benefit of never re-creating the images for the other CDN nodes, and it reduces the load on the server, since it wouldn't be serving them up again after rendering them the first time. However, if the request has to hit the server anyway before redirecting to a version already in Blob Storage, you could just as easily return the cached image directly. I think it depends on how many images you are talking about; if you have a LOT of them, I'd lean more toward the first option.
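As a rough sketch of that first workaround, assuming a Flask front end and the azure-storage-blob SDK (container name, connection string, and render_image() are placeholders):

```python
# Sketch: render lazily, park the result in Blob Storage, and redirect so the
# CDN caches the blob URL instead of hitting this service on every edge miss.
from azure.storage.blob import BlobServiceClient, ContentSettings
from flask import Flask, redirect

app = Flask(__name__)
blob_service = BlobServiceClient.from_connection_string("<connection-string>")
CONTAINER = "rendered-images"  # hypothetical container

def render_image(name: str) -> bytes:
    ...  # stands in for the existing lazy rendering code

@app.route("/images/<name>")
def image(name):
    blob = blob_service.get_blob_client(CONTAINER, name)
    if not blob.exists():
        blob.upload_blob(
            render_image(name),
            content_settings=ContentSettings(
                content_type="image/png",
                cache_control="public, max-age=31536000",
            ),
        )
    # Redirect to the blob (or the CDN endpoint fronting it); after the first
    # miss per edge node, the CDN serves it without touching this service.
    return redirect(blob.url, code=302)
```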

Not getting improvements by using CDN

I've just added a CDN distribution using Amazon Cloudfront to my Rails application on Heroku, it's working OK.
My homepage serves around 11 static assets. I've made some tests using http://www.webpagetest.org/ and there is no difference in performance or load times between using the CDN and not using it.
Is there any particular reason why this could be happening?
My region is Latin America btw, so it's using the All locations edge option.
Thanks.
The main benefit of using a CDN from Amazon or others is that the assets are hosted on fast, reliable servers and the traffic is offloaded from your own server; if you already have a fast dedicated server, you won't see a considerable boost.
Another benefit is that the assets may already be cached in the user's browser (from visiting other websites that used the same CDN), so visitors can have a better experience even the first time they visit your site.
A couple of suggestions.
If the site CSS is one of the static assets that you have moved to CloudFront then I would try moving it back to your main server.
Since page display can't start until the site CSS is downloaded, you want to serve it as fast as possible. If it's coming from a CDN, it requires a request to a second domain (an extra DNS lookup and connection).
Also, use the waterfall display from webpagetest.org to pinpoint where the bottlenecks are.
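As a rough complement to the waterfall, a quick timing check with Python's requests library (URLs are placeholders); the second request per host should show the effect of a warm CDN edge cache:

```python
# Quick-and-dirty timing comparison between origin and CDN; URLs are placeholders.
# webpagetest's waterfall view gives far more detail than this does.
import requests

urls = {
    "origin": "https://myapp.herokuapp.com/assets/application.css",
    "cdn": "https://dxxxxxxxx.cloudfront.net/assets/application.css",
}

for label, url in urls.items():
    # Two requests each, so the second one hits a warm edge cache.
    for attempt in range(2):
        elapsed = requests.get(url).elapsed.total_seconds()
        print(f"{label} request {attempt + 1}: {elapsed:.3f}s")
```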
Good luck!

Image storage on different server

I can see that all big sites store their images on a completely different server. What are the benefits of this practice?
Load balancing.
Separation of dynamic and static content.
Static content is served from servers that are geographically (or in network distance) close to the client.
(Update) I forgot to mention that browsers used to limit the number of concurrent requests to the same server or domain (I don't know if this is still the case), and using different domain names allowed the site to bypass this limitation.
This way each kind of server serves the resources it is tuned for, so clients get pages faster.
This way, the browser won't send cookies when requesting images.
It also enables the use of location-aware CDNs for images only.
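As a tiny illustration of the separate-domain idea, a hedged helper that rewrites image paths onto a cookie-less static host (the domain is a placeholder; most frameworks expose this as an "asset host" setting rather than a hand-rolled function):

```python
# Sketch: rewrite image paths to a separate, cookie-less static/CDN domain.
STATIC_HOST = "https://static.example-images.com"  # placeholder domain

def image_url(path: str) -> str:
    """Turn a relative image path into a URL on the static/CDN domain."""
    return f"{STATIC_HOST}/{path.lstrip('/')}"

print(image_url("/images/photo.jpg"))
# https://static.example-images.com/images/photo.jpg
```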
