What's a good alternative to page caching on Heroku?

I understand page caching isn't a good option on Heroku, since each dyno has an ephemeral file system (so dynos wouldn't share cached files, and the cache would get wiped out on every restart).
So I'm wondering what the best alternative is. I have a large number of files that could potentially get generated in a traditional page caching scenario (say 10GB-100GB), so Redis/memcached don't seem like good options here. Redis can write out to disk, but my understanding is that once you exceed its memory capacity, it's not the right tool for serving reads from disk.
Has anyone found a good solution here? I'm thinking maybe MongoStore. (And some way to run this in conjunction with redis since I'm using redis for some other scenarios.) Thanks!

If your site is 100% static content and never going to be dynamic, S3 may be a good option. You can then create a CNAME to the S3 domain, which also lets you put CloudFront in front of it should you need it. Otherwise, 100GB of generated pages would have to go into the database, which your application would then have to pull from.
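If the pages are generated by your app rather than being static files to begin with, the same idea still works: write each rendered page out to S3 and let S3 (or CloudFront in front of it) serve it. A minimal sketch in Node.js/TypeScript with the AWS SDK v3; the bucket name, key layout, and cache lifetime are placeholders, not anything from the answer above:

```typescript
// Push a rendered page to S3 so it can be served from there (optionally via
// CloudFront) instead of from the dyno's ephemeral filesystem.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

export async function cachePage(path: string, html: string): Promise<void> {
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-page-cache-bucket",                // placeholder bucket name
      Key: path.replace(/^\//, "") || "index.html",  // e.g. "products/42/index.html"
      Body: html,
      ContentType: "text/html; charset=utf-8",
      CacheControl: "public, max-age=300",           // let the CDN/browser reuse it briefly
    })
  );
}
```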
Heroku's Cedar stack allows custom buildpacks, and there's one that vendors Nginx into the dyno. That would be a good fit if you envision transitioning to a more dynamic site.

Related

Slow MEAN Stack Web applications

I have three web applications built with the MEAN stack, hosted on a shared hosting plan. They run really slowly (logging in and database calls take minutes) and I'm not sure how to optimise the performance. I have created three backend servers so that each application can call its backend separately. I have ensured that my files are gzipped and served over HTTP/3. What else should/can I do on top of that? I can't seem to find much related information online. Please give me any suggestions that you may have!
Would implementing lazy loading help? If so, please share some easy examples because I'm still new. Much appreciated!
I'd suggest moving off of shared hosting and using one of the newer generation developer-focused hosting platforms like Render or Adaptable.io. Adaptable includes MongoDB, so it's great for MEAN stack. With Render, you'd probably use MongoDB Atlas. Both provide free tiers that smaller apps can fit within.
With any of the next-gen hosting platforms, you just connect a GitHub repo with your source code and they automatically deploy your app to the cloud. You don't have to deal with keeping servers up to date, optimizing database performance or anything like that.
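On the lazy-loading part of the question: in the Angular front end (the "A" in MEAN), the most common form is route-level lazy loading, where a feature's JavaScript bundle is only downloaded when the user first navigates to it. A minimal sketch; the ReportsModule name and route path are hypothetical:

```typescript
// Route-level lazy loading in Angular: the reports bundle is fetched on first
// navigation to /reports instead of being part of the initial download.
import { Routes } from "@angular/router";

export const routes: Routes = [
  {
    path: "reports", // hypothetical feature route
    loadChildren: () =>
      import("./reports/reports.module").then((m) => m.ReportsModule),
  },
];
```

Keep in mind this only trims the initial bundle download; it won't fix database calls that take minutes on the server side.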

Are CDNs that don't cache useful?

Is there any benefit to (HTTP-) serving a non-cacheable resource over a CDN?
(my use case: I'm serving a static Single Page App and I'd like to improve its load time, but I don't want index.html to get cached, because I want every new release to be reflected immediately. Specifically, this static site is hosted on AWS S3, and the CDN is AWS CloudFront.)
I assume that most of the performance benefits of CDNs are achieved through caching, but I could imagine other benefits due to, say, privileged network infrastructure. As I don't know the first thing about networks, this may sound like a silly question.
Yes, it can be useful by moving the content closer to the user. Most CDNs will serve your static file from a geographical location as close to the user as possible, typically providing better latency.
Of course, you need to have users across the globe for this to make sense to you.
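For the S3 + CloudFront setup described in the question, the usual pattern is to mark index.html as not cacheable (or very short-lived) while giving the fingerprinted assets a long max-age, since their filenames change on every release. A rough sketch with the AWS SDK for JavaScript v3; the bucket name and file names are placeholders:

```typescript
// Upload a SPA build to S3 with per-object Cache-Control headers so CloudFront
// always revalidates index.html but keeps hashed bundles for a long time.
import { readFileSync } from "node:fs";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });
const BUCKET = "my-spa-bucket"; // placeholder

async function upload(key: string, contentType: string, cacheControl: string) {
  await s3.send(
    new PutObjectCommand({
      Bucket: BUCKET,
      Key: key,
      Body: readFileSync(`dist/${key}`),
      ContentType: contentType,
      CacheControl: cacheControl,
    })
  );
}

async function main() {
  // index.html: revalidated on every request, so a new release shows up immediately.
  await upload("index.html", "text/html", "no-cache");
  // Fingerprinted bundle: its name changes each release, so it can be cached "forever".
  await upload("main.abc123.js", "application/javascript",
               "public, max-age=31536000, immutable");
}

main().catch(console.error);
```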

Is Heroku a replacement for a VPS?

We're currently evaluating Heroku to replace our initial workflow of renting a VPS for a small web app (since we're working with NodeJS, cPanel hosting plans aren't enough; ergo, a VPS).
The confusion lies in how Heroku is actually used: even though it's clearly a platform as a service, there is no disk (HDD/SSD) limit described anywhere.
Web App requirement includes file upload capabilities (profile picture, etc) so I'm not sure Heroku is what we need. Can I get a clear explanation on this?
Not a Heroku expert, but...
You could always use one of the various add-ons that offer database support for storing your images, at least until that no longer works.
As the usage of your site scales out, you'd probably want to place static content into a CDN.
I wouldn't consider placing files into Heroku that weren't related to running code and honestly I don't even know if you can.
(I originally just wanted to comment, but need a higher rep :/)
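To make the file-upload part concrete: the usual approach on Heroku is to send uploads straight to S3 (or a similar object store) rather than writing them to the dyno's filesystem. A minimal sketch with Express, multer and the AWS SDK v3; the bucket name, route and field name are placeholders:

```typescript
// Accept a profile-picture upload and store it in S3 instead of on the dyno's
// ephemeral disk.
import express from "express";
import multer from "multer";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const app = express();
const upload = multer({ storage: multer.memoryStorage() }); // keep the file in RAM, never on disk
const s3 = new S3Client({ region: "us-east-1" });

app.post("/profile-picture", upload.single("avatar"), async (req, res) => {
  if (!req.file) {
    res.status(400).send("no file");
    return;
  }
  const key = `avatars/${Date.now()}-${req.file.originalname}`;
  await s3.send(
    new PutObjectCommand({
      Bucket: "my-uploads-bucket", // placeholder
      Key: key,
      Body: req.file.buffer,
      ContentType: req.file.mimetype,
    })
  );
  res.json({ key }); // store this key in your database, serve the file via S3/CloudFront
});

app.listen(Number(process.env.PORT) || 3000); // Heroku injects PORT
```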

Is memcache(d) necessary when using Cloudflare/Incapsula

If you need caching in your website to reduce database load, do you have to do it using memcache or memcached (in PHP, for example), or can you achieve this by using services like CloudFlare, Incapsula or others that do some caching for you?
Services like Cloudflare cache your HTML and/or assets like images and CSS files in a CDN, so that your entire server is hit less often. This is great for semi-static sites but may not be the best fit for highly dynamic sites.
Local caches like memcached just store any data in a way that's fast to access. You can use that to cache database queries and lower your database activity, but you can also use it to store pre-computed data that would be expensive to re-create all the time or whatever else you may want to store non-permanently in a fast-to-access way.
Both solutions solve different problems. You may use both together, or either, or neither. It really depends on where exactly your bottleneck is and which solution fits your problem better.
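As an illustration of the second kind of cache, here's a rough read-through sketch. The question mentions PHP, but the idea is the same everywhere; this uses Node.js with the memjs memcached client, and lookupUserFromDb is a hypothetical slow query:

```typescript
// Read-through caching of a database query in memcached: hit the database only
// when the key is missing or expired.
import memjs from "memjs";

const cache = memjs.Client.create(); // defaults to localhost:11211 unless configured

// Hypothetical slow query; stands in for whatever ORM/driver you actually use.
async function lookupUserFromDb(id: string): Promise<{ id: string; name: string }> {
  return { id, name: "example" };
}

export async function getUser(id: string) {
  const key = `user:${id}`;
  const { value } = await cache.get(key);
  if (value) return JSON.parse(value.toString()); // cache hit: no database round trip

  const user = await lookupUserFromDb(id);                     // cache miss: query once
  await cache.set(key, JSON.stringify(user), { expires: 60 }); // keep it for 60 seconds
  return user;
}
```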
I'm the CEO of CloudFlare and I'd say: more (intelligent) caching is almost always a good thing. While we can significantly decrease the load coming to your web server, to get the best performance it's still extremely important to optimize your web application and its interaction with your database. To that end, memcache and other fast caching layers can play an important role, and I'd never discourage them.
PS - we work great with dynamic sites. 95%+ of our sites are highly dynamic web applications.

Caching images in Memcached

The user profile images are stored on a separate file server, and I am thinking of caching them in memcached. The memcached server is local to the app, and each image is less than 1MB.
But I saw over here that using memcached for images is a bad idea. Is it really? I am really not convinced.
Any best practices and suggestions? I am using SpyMemcached Java Client.
Linux automatically caches files that are read from disk. Caching proxies like Squid are also good at caching images.
So... there certainly are better tools for the job. On the other hand, nginx recently added memcached support. Without context, it's really hard to judge that recommendation.
They might mean "Don't serve images from memcached via a PHP script", in which case, they're absolutely correct -- PHP adds tons of overhead. But I don't necessarily see how using Nginx's memcache feature to store and serve images would be a bad thing.
Edit: It appears that Facebook may have cached profile images in memcached at one point.
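For what it's worth, if you do cache image bytes in memcached, guard the writes against memcached's default 1 MB item-size limit. The question uses the SpyMemcached Java client; this sketch shows the same idea in Node.js with memjs, and fetchFromFileServer is a hypothetical stand-in for the separate file server:

```typescript
// Cache image bytes in memcached, skipping anything at or above memcached's
// default 1 MB item-size limit.
import memjs from "memjs";

const cache = memjs.Client.create();
const MAX_ITEM_BYTES = 1024 * 1024; // memcached's default item size limit

// Hypothetical stand-in for the separate file server holding profile images.
async function fetchFromFileServer(path: string): Promise<Buffer> {
  const res = await fetch(`http://files.internal${path}`); // placeholder host
  return Buffer.from(await res.arrayBuffer());
}

export async function getProfileImage(path: string): Promise<Buffer> {
  const cached = await cache.get(`img:${path}`);
  if (cached.value) return cached.value; // served straight from memory

  const bytes = await fetchFromFileServer(path);
  if (bytes.length < MAX_ITEM_BYTES) {
    await cache.set(`img:${path}`, bytes, { expires: 3600 }); // cache for an hour
  }
  return bytes;
}
```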
