Caching on Heroku CI

I am setting up Heroku CI with the Elixir Phoenix buildpack, and I want to start using Dialyzer.
Dialyzer is a static analysis tool; before its first run it takes at least a couple of minutes to build a "persistent lookup table" (PLT) of types from Erlang, Elixir, and the project's dependencies. Later project analyses are much quicker, so I want to cache the PLT.
I've found this section on caching during build: https://devcenter.heroku.com/articles/buildpack-api#caching but I can't find anything about caching in the test-setup or test scripts.
Is there a test/CI cache, or is caching only available to buildpacks?

(Tomasz, I know that you have already found a way to approach this problem, but I will share publicly here what I shared with you privately, so that others might also benefit.)
Is there test/CI cache or is it only usable in buildpacks?
It seems that in test/CI you cannot do it; you have to use a buildpack. Or you could hold the cache somewhere outside of Heroku (which doesn't seem like a good approach to me, though).
Have you seen https://github.com/tsloughter/heroku-buildpack-erlang-dialyzer? It seems dated, but it may contain hints that are useful to you.
Setting up buildpacks is rather straightforward, and for your need this seems to be the only option that supports caching. A minimal sketch of what the caching part of such a buildpack script could look like follows.
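Heroku CI runs a buildpack's bin/test-compile with the same build-directory and cache-directory arguments that bin/compile receives, and anything written to the cache directory survives to the next CI run. A sketch under that assumption; the priv/plts path is a common dialyxir convention, not a given:

#!/usr/bin/env bash
# bin/test-compile <build-dir> <cache-dir> <env-dir>
set -euo pipefail

BUILD_DIR="$1"
CACHE_DIR="$2"

# Restore the PLT cached by a previous CI run, if there is one.
if [ -d "$CACHE_DIR/plts" ]; then
  mkdir -p "$BUILD_DIR/priv"
  cp -R "$CACHE_DIR/plts" "$BUILD_DIR/priv/plts"
fi

# ... the usual compile / test-setup steps go here ...

# Stash the (re)built PLT for the next run.
if [ -d "$BUILD_DIR/priv/plts" ]; then
  rm -rf "$CACHE_DIR/plts"
  cp -R "$BUILD_DIR/priv/plts" "$CACHE_DIR/plts"
fi

With dialyxir configured to keep its PLT under priv/plts, mix dialyzer then finds a warm PLT instead of rebuilding it from scratch on every run.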

Related

Use Laravel to refactor old, large PHP application partially over time?

From what I've read, this should be possible due to the modular nature of Laravel, but I need assurance from people with more Laravel experience:
I have a very large (500k LOC) ancient PHP app. So ancient that some parts of it date from PHP 3 times (ca. 2000; PHP 4 had already been released, but PHP 3 was kept for backwards-compatibility reasons).
Refactoring this is a huge project, and the only reasonable way to do it is in parts: replace this part, then that part, etc. Fortunately, the "ancient" part comes in handy, as no framework was used and basically every page is its own script, with a few central libraries for shared functionality.
Is it possible to spin up a Laravel app that routes new/refactored pages to the new site and everything else (a wildcard, if possible) to the ancient code? All data is stored in a database, so there will be no sync issues between the two except for user authentication and session info.
Is it possible to get Eloquent running on an ancient DB design, or to refactor the DB in such a way that it works for both? There was a previous attempt to move the DB interface to Doctrine which, from what I know, was aborted after partial success (i.e. many DB objects are accessed through Doctrine, but there is also a lot of straight SQL code in parallel).
It's a huge mess, but the software in question is still in use, and successfully so; a previous attempt to replace it with something else has already failed.
Additional details:
Thanks @maiorano84 for the good questions:
First, does your legacy application have tests?
Negative on that. PHPUnit was released in 2004; at that time, this app had already been in production for four years.
Second, are you able to get it to work on a more recent version of PHP?
Yes, the current codebase runs on PHP 5.6.33. It has been maintained throughout the years, and a major update was made in the transition from PHP 4 to PHP 5.
If it runs on PHP 5.3+, you can use Instant Refactoring
I'm the author of Rector, a tool that can migrate a huge number of PHP files in a few seconds, e.g. upgrade PHP 5.3 to PHP 7.4, upgrade Symfony 2.8 to 4.2, or migrate from Nette to Symfony (read the case study).
1. Install Rector
composer require rector/rector --dev
2. Create rector.php with PHP sets, since you have old PHP code
<?php
// rector.php
use Rector\Core\Configuration\Option;
use Rector\Set\ValueObject\SetList;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::SETS, [
        SetList::PHP_52,
        // use one set at a time, as sets build on each other
        // SetList::PHP_53,
        // SetList::PHP_54,
    ]);
};
3. Then run Rector on your code (e.g. the src directory). Use --dry-run first to preview the changes without writing them:
vendor/bin/rector process src --dry-run
vendor/bin/rector process src
You can also write your own rules to convert the code to a Laravel/MVC approach. The idea is to write one rule that, for example, converts 100+ files to controllers.
Read more in the GitHub repository.
Is it possible? Yes.
Is it going to take a short amount of time? Absolutely not.
With any kind of legacy codebase, you're going to need to take the time to figure out all of its moving parts and which portions will need to change in order to work on a modern platform at all.
The most recent version of Laravel requires PHP 7.1.3, so even attempting to just dump the entire codebase into a Laravel application is very likely going to result in failure.
First, does your legacy application have tests? These can be unit tests, integration tests, or functional tests. If not, and you want to modernize the application without breaking things, you're going to want to write tests to ensure that nothing breaks as you begin upgrading. This alone can take a long time, especially with a codebase that is difficult to test in the first place. Having a fully tested application lets you see which tests begin to fail as you start reworking things, and that information is extremely valuable. Even a crude characterization test per page is a start (see the sketch below).
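A minimal sketch of such a characterization ("golden master") test, written against a PHPUnit version that still runs on PHP 5.6; the legacy script path, query parameter, and asserted fragment are hypothetical placeholders:

<?php
// tests/LegacyInvoiceListTest.php

use PHPUnit\Framework\TestCase;

final class LegacyInvoiceListTest extends TestCase
{
    public function testInvoiceListStillRenders()
    {
        // Run the legacy one-script-per-page file and capture its output.
        $_GET = ['year' => '2017'];
        ob_start();
        require __DIR__ . '/../legacy/invoice_list.php';
        $html = ob_get_clean();

        // Pin down today's behaviour; any refactoring that changes this
        // output fails the test and gets looked at.
        $this->assertContains('<table', $html);
    }
}

Tests like this assert nothing about correctness, only that behaviour hasn't changed, which is exactly what you want while reworking twenty-year-old code.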
Second, are you able to get it to work on a more recent version of PHP? If this code is already in production, then you're going to need to use some hardware virtualization through Vagrant, or better yet, containerization through Docker to get a local installation up and running without breaking your production code.
Once that's ready, then you should be able to begin refactoring. Taking entire pages of code and dumping them right into a Laravel application is not going to work straight out of the gate. You're going to want to start smaller. Find all of your moving parts, figure out what each one is responsible for, and encapsulate them in classes with the appropriate methods.
Use Composer's PSR-4 autoloader to remove all of those extra include and require statements and load your new classes throughout the application (a composer.json autoload entry mapping your namespace to a source directory, e.g. "App\\": "src/", is all it takes).
Use a decent router to change all of your URLs into SEO-friendly paths, and have a clearly defined entrypoint for all requests.
Move all of your business logic out of the webroot: create a /public folder containing just your index.php entrypoint and all public-facing assets (images, CSS, JavaScript, etc.). Since all requests are routed to this file by this point, you can process the request and return your response; a minimal sketch of such an entrypoint follows this list.
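A minimal sketch of that single entrypoint, combining the router idea with a wildcard fallback to the untouched legacy scripts (the exact split the question asks about). It assumes PHP 7.1+, per the upgrade path above; the route table, controller class, and legacy directory are hypothetical placeholders:

<?php
// public/index.php

// A refactored page, expressed as a class. In a real setup this would live
// in src/ and be loaded through Composer's PSR-4 autoloader.
class ReportController
{
    public function index(): string
    {
        return 'report listing';
    }
}

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH) ?: '/';

// Pages that have been refactored so far.
$routes = [
    '/reports' => [ReportController::class, 'index'],
];

if (isset($routes[$path])) {
    [$class, $method] = $routes[$path];
    echo (new $class())->{$method}();
    exit;
}

// Wildcard fallback: everything not yet refactored is handed to the old
// one-script-per-page code, untouched.
$legacyRoot = realpath(__DIR__ . '/../legacy');
$legacy = realpath($legacyRoot . $path);
if ($legacy !== false && strpos($legacy, $legacyRoot) === 0 && is_file($legacy)) {
    require $legacy;
    exit;
}

http_response_code(404);
echo 'Not found';

The same pattern carries over to Laravel itself: register refactored routes normally and add a catch-all route whose handler includes the legacy script.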
Once you get to a point where you've actually gotten the application into a system of well-defined components and modules, then migrating over to Laravel - or any other well-established framework - should be much easier.
This is going to take you a long time if you plan on doing it right. Hopefully this helps, and best of luck to you.
Refactoring is of course possible, but I have some doubts about whether it is doable partially in this case. By "partially" I mean that, in production, parts of the app would sometimes run on old code and sometimes on new code.
I did this once for an old project, though not one as ancient and big as yours.
In my case it was a custom app (without any framework) running on PHP 5.3, and I converted it to Laravel 4.2.
I must admit there are some real challenges on that path.
This is only the tip of the iceberg, but I'll try to name a few of them, from what I remember:
PHP version compatibility, or rather incompatibility in this case. You could rewrite the existing code to run on the latest PHP 7 versions first, but that might be a lot of work which is not used in the end.
Routing and asset handling: you need to check whether you can modify the URLs so that they fit into Laravel's routing engine. That may be really hard, especially if the old app's URLs don't follow Laravel's standard paths and you don't want to break Google indexing, for example. I have also seen systems with custom URL generators that were heavily used in the views; trying to match those routes perfectly would be a nightmare.
Authentication. Changing auth must be done in one step, because adapting Laravel to work properly with sessions from the old system (although doable) would clutter the new code.
Database. You will be lucky if the database is well designed, but I doubt it will be even close to Laravel's Eloquent conventions. Although you can run it in Laravel without any DB schema modifications (see the sketch below), the new code will also get bloated. This and other things can be refactored again in the complete system, but that's another load of work.
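On that database point: Eloquent does not actually require conventional schemas; a model can be pointed at any legacy table explicitly. A minimal sketch (table and column names are hypothetical placeholders):

<?php

use Illuminate\Database\Eloquent\Model;

class Customer extends Model
{
    // Legacy table name instead of the conventional "customers".
    protected $table = 'tbl_kunden';

    // Legacy primary key instead of the conventional auto-incrementing "id".
    protected $primaryKey = 'kunden_nr';

    // The old schema has no created_at/updated_at columns.
    public $timestamps = false;
}

The old schema stays untouched while new code gets to use Eloquent; the bloat mentioned above shows up as these per-model overrides.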
Considering the number of possible, sub-optimal workarounds needed in order to end up with a properly designed app (built with best practices), it will probably be better to rebuild from scratch.
Hope it helps a bit...

sqlite on Heroku (production, in memory)

We want to use SQLite, via the sqlite3 gem, in a production Ruby app on Heroku. However, Heroku detects the gem and blocks our deploys.
We're aware that Heroku's filesystem is ephemeral, but we're using SQLite's in-memory mode for a short-lived database in a background worker. Heroku blocks the gem because they worry people will attempt to use it as a persistent database and be surprised when their data disappears (see the link below). I can appreciate their concern, but we have a legitimate use case and are still blocked.
Are there any workarounds to the nanny check Heroku adds, for those with a legitimate use case for SQLite?
Edit: Please note, we are not looking for alternative tool suggestions. We already have a "real" database with terabytes of data. We're loading data into a local temp DB as a legitimate optimization. SQLite works great on Heroku with any other language binding; I'm just looking for a way around Heroku's nanny check so I can use Ruby + SQLite.
https://devcenter.heroku.com/articles/sqlite3
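For readers unfamiliar with the pattern being defended here: an in-memory SQLite database lives entirely in the process's RAM and dies with it, so the ephemeral filesystem never comes into play. A minimal sketch of the idea, shown with PHP's PDO binding because the concept is identical across bindings (the question itself is about Ruby's sqlite3 gem):

<?php
// Open a database that exists only in this process's memory.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Stage some short-lived working data.
$db->exec('CREATE TABLE staging (id INTEGER PRIMARY KEY, payload TEXT)');
$db->prepare('INSERT INTO staging (payload) VALUES (?)')->execute(['example row']);

// ... run the batch job against $db, then let it vanish with the process ...
echo $db->query('SELECT COUNT(*) FROM staging')->fetchColumn(), PHP_EOL; // prints 1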
Unfortunately, Heroku doesn't support SQLite because of its architecture; the official advice is to migrate to PostgreSQL. But I know a solution that avoids migrating: Dokkur, a Heroku analogue.
I don't think you have a lot of options here. Heroku doesn't allow the use of SQLite.
You can still use a PG table as a generic, short-lived database. Or, if you just need simple storage and don't need advanced querying mechanisms, you can use Redis.
Both are available from Heroku.

Is Heroku a replacement for a VPS?

We're currently evaluating Heroku to replace our initial workflow of renting a VPS for a small web app (since we're working in Node.js, cPanel hosting plans aren't enough; ergo, a VPS).
The confusion lies in Heroku's actual usage: even though it's clearly a platform as a service, there is no disk (HDD/SSD) limit described.
The web app's requirements include file-upload capabilities (profile pictures, etc.), so I'm not sure Heroku is what we need. Can I get a clear explanation on this?
Not a Heroku expert, but...
You could always use one of the various add-ons that offer database support for storing your images, until that no longer works.
As your site's usage scales out, you'd probably want to place static content in a CDN.
I wouldn't consider placing files into Heroku that aren't related to running code, and honestly I don't even know if you can.
(I originally just wanted to comment, but need a higher rep :/)

Using org.springframework.cache.support.SimpleCacheManager in the cloud

I noticed that the Spring reference application (Sagan) uses the SimpleCacheManager implementation. See here for the source code of Sagan.
I was surprised by this choice, because I thought that all but small single-node applications would use something like a Redis cache manager rather than the simple cache manager.
How can a large application like Sagan (which I assume runs on Cloud Foundry) use this simple implementation?
Any comment welcome.
Well, the SimpleCacheManager choice was made because it was the simplest solution that could possibly work. Note that Sagan, at least for now, does not store a lot of data in that cache; it merely uses it to respect various APIs' rate limits and to get better performance in some parts of the application.
Yes, Sagan is running on Cloud Foundry (see this presentation) and uses CF marketplace services.
Even though cache consistency between instances is not a constraint for now, we could definitely add another marketplace service, in this case a Redis Cloud instance, and use it as a central cache repository.
Now that we're considering using that cache for more features, it makes sense to at least consider that use case, since it could lower our monthly bill (pay a small fee for a Redis service and use less memory in our CF instances).
In any case, thanks a lot balteo for this insightful question; we've created a GitHub issue for it.

What's a good alternative to page caching on Heroku?

I understand page caching isn't a good option on Heroku, since each dyno has an ephemeral filesystem (so dynos wouldn't share files, and the cache would get wiped out on every restart).
So I'm wondering what the best alternative is. I have a large number of potential files that could get generated in a traditional page-caching scenario (say 10GB-100GB), so Redis/Memcached don't seem like good options here. Redis can write out to disk, but my understanding is that once you exceed its memory capacity, it's not the right solution to start reading off of disk.
Has anyone found a good solution here? I'm thinking maybe MongoStore (and some way to run this in conjunction with Redis, since I'm using Redis for some other scenarios). Thanks!
If your site is 100% static content and never going to be dynamic, S3 may be a good option. You can then create a CNAME to the S3 domain, which also allows you to leverage CloudFront should you need it. Otherwise, 100GB would have to go into the database, which is in turn pulled up by your application.
Heroku's Cedar stack allows for custom buildpacks. This one vendors nginx, which would be good if you envision transitioning to a more dynamic site.
