How to properly add custom database support to Laravel 5?

Suppose you want to add custom database support to Laravel 5, whether it is MongoDB, Oracle, or an SQLite mod (SQLCipher). How would you go about it?
What I'm looking for is a proper, elegant solution, without core-hacking: that would be easy to do, but it would force you to manually analyze the codebase and redo the changes on every Laravel update.
Background
I actually came to this question while trying to implement SQLCipher support in Laravel. I analyzed alternative crypto solutions and found no good enough way to manage MySQL or Postgres keys, so I decided to go with encrypted SQLite. In fact, one could probably copy the SQLite routine and just change the lib and add the key request, but as I said before, I don't want to hack the core. It's just plain wrong.
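For context, the extension point I have been looking at is the database manager's extend() method, which lets a service provider register a driver without touching the core. Below is a rough, untested sketch of what I have in mind; the 'sqlcipher' driver name and the 'key' config option are my own invention, and it assumes pdo_sqlite is compiled against SQLCipher:

// app/Providers/SqlCipherServiceProvider.php (sketch only)
use Illuminate\Support\ServiceProvider;
use Illuminate\Database\SQLiteConnection;
use PDO;

class SqlCipherServiceProvider extends ServiceProvider
{
    public function register()
    {
        // register a resolver for the 'sqlcipher' driver instead of editing the core
        $this->app['db']->extend('sqlcipher', function (array $config, $name) {
            $pdo = new PDO('sqlite:'.$config['database']);
            // SQLCipher takes the key via PRAGMA before any other statement
            $pdo->exec("PRAGMA key = '".$config['key']."'");

            // reuse the stock SQLite grammar and post-processor
            return new SQLiteConnection($pdo, $config['database'], $config['prefix'], $config);
        });
    }
}

The matching connection entry in config/database.php would then declare 'driver' => 'sqlcipher' plus the database path, prefix and key, and the provider would be registered in config/app.php.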

Related

Use Laravel to refactor old, large PHP application partially over time?

From what I've read this should be possible due to the modular nature of Laravel, but I need assurance from people with more Laravel experience:
I have a very large (500k loc) ancient PHP app. So ancient that some parts of it date from PHP3 times (ca. 2000, PHP4 was released already but PHP3 was used for backwards compatibility reasons).
Refactoring this is a huge project, and the only way to reasonably do it is in parts. Replace this part, then that part, etc. Fortunately, the "ancient" part comes in handy as no framework was used and basically every page is its own script, with a few central libraries for shared functionality.
Is it possible to spin up a Laravel app that can route new/refactored pages to the new site and everything else (wildcard if possible) to the ancient code? All data is stored in a database, so there will be no sync issues between them except for user authentication and session info.
Is it possible to get Eloquent running on an ancient DB design, or to refactor the DB in such a way that it works for both? There was a previous attempt to move the DB interface to Doctrine which, from what I know, was aborted after partial success (i.e. many DB objects are accessed through Doctrine, but there is also a lot of raw SQL in parallel).
It's a huge mess, but the software in question is still being used and successfully so and a previous attempt to replace it with something else has already failed.
Additional details:
Thanks @maiorano84 for the good questions:
First, does your legacy application have tests?
Negative on that. PHPUnit was released in 2004. At that time, this app had already been in production for four years.
Second, are you able to get it to work on a more recent version of PHP?
Yes, the current codebase is running on PHP 5.6.33 - it has been maintained throughout the years, and a major update was made on the transition between PHP 4 and PHP 5.
If it runs on PHP 5.3+, you can use Instant Refactoring
I'm the author of Rector, a tool that can migrate huge amounts of PHP files in a few seconds, e.g. upgrade PHP 5.3 to PHP 7.4, upgrade Symfony 2.8 to 4.2, or migrate from Nette to Symfony (read the case study).
1. Install Rector
composer require rector/rector --dev
2. Create rector.php with PHP sets, since you have old PHP code
// rector.php
use Rector\Core\Configuration\Option;
use Rector\Set\ValueObject\SetList;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::SETS, [
        SetList::PHP_52,
        // use one set at a time, as sets build on each other
        // SetList::PHP_53,
        // SetList::PHP_54,
    ]);
};
3. Then run Rector on your code (e.g. /src directory)
vendor/bin/rector process src
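If you want to preview the diff before anything is written to disk, Rector also supports a dry run:
vendor/bin/rector process src --dry-run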
You can also write your own rules, so you can convert the code to a Laravel/MVC approach. The idea is to write one rule that, e.g., converts 100+ files to controllers.
Read more in the GitHub repository.
Is it possible? Yes.
Is it going to take a short amount of time? Absolutely not.
With any kind of legacy codebase, you're going to need to take the time in figuring out all of its moving parts and figuring out what portions are going to need to change in order to even be able to work on a modern platform.
The most recent version of Laravel requires PHP 7.1.3, so even attempting to just dump the entire codebase into a Laravel application is very likely going to result in failure.
First, does your legacy application have tests? These can be unit tests, integration tests, or functional tests. If not, and you want to be able to modernize your application without breaking things in the future, then you're going to want to write tests to ensure that nothing breaks as you begin upgrading. This alone can take a long time, especially with a codebase that makes it difficult to even test in the first place. Having a fully tested application will allow you to see which tests begin to fail as you start reworking your application, so this information will be extremely valuable.
Second, are you able to get it to work on a more recent version of PHP? If this code is already in production, then you're going to need to use some hardware virtualization through Vagrant, or better yet, containerization through Docker to get a local installation up and running without breaking your production code.
Once that's ready, then you should be able to begin refactoring. Taking entire pages of code and dumping them right into a Laravel application is not going to work straight out of the gate. You're going to want to start smaller. Find all of your moving parts, figure out what each one is responsible for, and encapsulate them in classes with the appropriate methods.
Use Composer's PSR-4 Autoloader to help remove all of those extra include and require statements and load your new classes throughout the application.
Use a decent Router to change all of your URLs into SEO-friendly paths and have a clearly defined entrypoint for all requests.
Move all of your business logic out of the webroot: create a /public folder in which you have just your index.php entrypoint and all public-facing assets (images, css, javascript, etc.). Since all requests are being routed through this file by this point, you should be able to process the request and return your response.
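To make those last steps concrete, here is a deliberately tiny sketch of such an entrypoint. The controller class names and routes are invented, and in practice you would reach for an established router (FastRoute, Symfony Routing, etc.) instead of a hand-rolled array:

<?php
// public/index.php (illustration only)
require __DIR__.'/../vendor/autoload.php'; // Composer's PSR-4 autoloader replaces the old include/require chains

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

// minimal routing table: URL path => [controller class, method]
$routes = array(
    '/'         => array('App\Controllers\HomeController', 'index'),
    '/invoices' => array('App\Controllers\InvoiceController', 'listAll'),
);

if (isset($routes[$path])) {
    list($class, $method) = $routes[$path];
    echo (new $class())->$method();
} else {
    http_response_code(404);
    echo 'Page not found';
}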
Once you get to a point where you've actually gotten the application into a system of well-defined components and modules, then migrating over to Laravel - or any other well-established framework - should be much easier.
This is going to take you a long time if you plan on doing it right. Hopefully this helps, and best of luck to you.
Refactoring is of course possible, but I have some doubts about whether it is doable partially in this case. By partially I mean that parts of the app will run on old code and parts on new code in production.
I did this once for an old project, but not one as ancient and big as yours.
In my case it was custom app (without any framework) running on php 5.3 and I was converting it to Laravel 4.2.
I must admit that there are some real challenges on the path.
This is only the tip of the iceberg, but I'll try to name a few of them, from what I remember:
PHP version compatibility, or rather incompatibility in this case. You can rewrite the existing code to run on the latest PHP 7 versions; that might be a lot of work, however (we did not take that approach in the end).
Routing and asset handling: you need to check whether you can modify the URLs so they fit into Laravel's routing engine. It may be really hard, especially if the old app's URLs don't follow Laravel-style paths and you don't want to break Google indexing, for example. I have also seen systems with custom URL generators which were then heavily used in views; trying to match those routes perfectly would be a nightmare. (One common fallback approach is sketched below.)
Authentication. Changing auth must be done in one step, because adapting Laravel to properly work with sessions from the old system (although doable) will clutter the new code.
Database. You will be lucky if the database is well designed, but I don't think it will be even close to Laravel's Eloquent conventions. Although you can run it on Laravel without any DB schema modifications (there is a short sketch of this below), the new code will still get somewhat bloated. This and other things can be refactored again once the whole system has been migrated, but it's another load of work.
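To illustrate the routing point, one way to run both apps side by side is to give the refactored pages normal Laravel routes and let everything else fall through to the old scripts. This is only a rough sketch: it assumes Laravel 5.5+ (for Route::fallback), an invented legacy/ directory, and a URL-to-file mapping that a real version would have to validate much more carefully:

// routes/web.php (sketch only)
Route::get('/invoices', 'InvoiceController@index'); // an already refactored page

Route::fallback(function () {
    // map /foo/bar to legacy/foo/bar.php; adjust to however the old app maps URLs to scripts
    $script = base_path('legacy/'.trim(request()->path(), '/').'.php');
    abort_unless(is_file($script), 404);

    ob_start();
    require $script;                 // the legacy page renders itself
    return response(ob_get_clean());
});

On the database point, Eloquent does not force the schema to follow its conventions; each model can be told explicitly what the legacy table looks like. A minimal sketch with invented table, column, and connection names:

use Illuminate\Database\Eloquent\Model;

class LegacyCustomer extends Model
{
    protected $table = 'TBL_CUSTOMER';      // legacy table name
    protected $primaryKey = 'CUSTOMER_NO';  // non-standard primary key
    public $incrementing = false;           // e.g. keys come from a sequence or are strings
    public $timestamps = false;             // no created_at / updated_at columns
    protected $connection = 'legacy';       // a second connection defined in config/database.php
}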
Considering the number of suboptimal workarounds required, it will probably be better to rebuild from scratch if you want a properly designed app built with best practices.
Hope it helps a bit...

Token based authentication using Dapper micro-orm

I am looking for a tutorial or sample for Dapper using token-based authentication in Web API 2. I would appreciate it if anyone could suggest where to start. I have found a tutorial at http://www.c-sharpcorner.com/UploadFile/ff2f08/token-based-authentication-using-Asp-Net-web-api-owin-and-i/ but the sample uses EF, and I haven't tried EF, only Dapper. I am also using MySQL for my database. Thanks in advance and good day.
Dapper is a very different tool to EF (which is the DbContext described in your step 3 / step 4). It simply will not be compatible with those steps, and isn't designed to be used with those steps.
But here's the thing: dapper is just a tool. EF is just a tool. It is ok to use more than one tool. If it suits your purposes, then use EF to do one set of jobs (for example, to help you use a particular library that is designed with that in mind), and use another tool (such as dapper) elsewhere in the same project. That's OK. No one will mind.
If you really really don't want to use EF at all, then you'll need to find out everything that the library needs to support what you are doing, and implement it manually. If the library is designed around IQueryable<T> etc, then this may be very difficult.

Multitenant app with single database

I'm developing a multitenant application using Laravel. I've read different blogs, posts, sites for this and I decided to do it with a single database.
So, I know that I only need to filter every query by the tenant_id and that's it! But if I have to remember to do that in every query, there will probably be a mistake someday, and I don't want to cause any information-security issue for my tenants.
I read a probably outdated article about it, culttt.com/2014/03/31/multi-tenancy-laravel-4, and found many concepts that I still don't understand because I'm new to Laravel.
Is this approach still the best way to do it? Or does Laravel now have its own solution for it?
I'd like something similar to this: stackoverflow.com/questions/33219951/php-pdo-add-filter-to-all-queries but from Eloquent. How can I do this?
Thanks.
If I were you I would not go this way. I would create a separate database for each client/app. It's a much safer solution, and if you ever need to create database backups or restore some client's data, it will be much simpler than dealing with one huge database that holds all of your clients.
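That said, if the single-database route is taken anyway, the Eloquent mechanism the question is reaching for is a global scope, which appends the tenant filter to every query on a model automatically. A minimal sketch; the tenant_id column and the way the current tenant is resolved are assumptions:

use Illuminate\Database\Eloquent\Builder;
use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Scope;

class TenantScope implements Scope
{
    public function apply(Builder $builder, Model $model)
    {
        // assumes the logged-in user carries a tenant_id attribute
        $builder->where($model->getTable().'.tenant_id', auth()->user()->tenant_id);
    }
}

class Invoice extends Model
{
    protected static function boot()
    {
        parent::boot();
        static::addGlobalScope(new TenantScope); // every Invoice query now carries the tenant filter
    }
}

Keep in mind that inserts still need tenant_id set (e.g. in a creating event) and that raw DB:: queries bypass Eloquent scopes entirely, which is exactly the kind of gap that makes the separate-database approach attractive.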

How does Node.JS and/or Meteor get a callback from the database when a 3rd party software update the database

I would like to use Meteor (Node.JS) to develop an application that will be used by 3,000+ concurrent users on a large database.
I have looked at the nice examples, and the idea of pushing changed data to the clients is very nice and very useful, but before I start development I want to understand how it works behind the scenes, so I can be sure that when the application is running with all these users it will be fast on standard hardware.
I also require this to use Oracle as a database, but not sure that it is supported and if not, what are the requirements from an Oracle package to enable this facility.
I think the server keeps an active, ongoing, non-blocking query on the oplog in MongoDB, and that is how it gets callbacks for all the changes in the database. Is that correct? If so, is there a similar way to do it in Oracle?
Thanks Roni.
I also require this to use Oracle as a database, but not sure that it is supported and if not, what are the requirements from an Oracle package to enable this facility.
Nope, Meteor is currently Mongo-only, as they have implemented an in-browser library called minimongo. My guess is this project will never support Oracle, but who knows. There is no mention of Oracle support on the Meteor project roadmap.
I just happened to come across this question while searching on Google.
However, if there is no native solution, we can always use an intermediary to handle the publishing.
Example case:
Python is used to synchronise data between MongoDB and Oracle (a 24/7 operation using the cx_Oracle and MongoDB drivers for Python).
The Meteor server keeps watch on what to publish.
Meteor clients/browsers subscribed to the channel are then updated with the Oracle data.

Is CodeIgniter suitable for large intranet applications that do not use MySQL?

I don't really plan on using Active Record or any of the built-in database constructs native to CodeIgniter for database access. I have Oracle, SQL Server, and others. I want to use PHP PDO (unless anyone thinks that's bad) because of its universal nature.
I mainly want CI because of some of the built in libraries and MVC. I also like that it is small and easy to work with.
2.x if it matters.
I did see other questions but none exactly about databases.
Thanks.
Edit: it's not that I don't think CI and PHP can handle large websites. This is solely about using multiple databases from different vendors. I have mostly seen MySQL used with CI. I know I can use other databases, but again, I don't know whether that is more trouble than it's worth.
MySQL is the default just because of how widely-adopted it is, especially in the PHP world. Almost everyone has a *AMP stack to work on so it ends up being the main driver used in almost every example out there.
If you're not planning on using the database class, then it really doesn't matter what type of database you are using, just don't load the class. You can still use routing, helpers, libraries, and other CI features.
So yes, I do think it is suitable for your purposes.
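For illustration, bypassing the database class and talking to PDO directly from a controller might look like the sketch below. The DSN, credentials, and query are placeholders, and in practice you would wrap the PDO handle in a small library or pull it from config rather than constructing it inline:

class Report extends CI_Controller {

    public function index()
    {
        // plain PDO, no CI database class loaded; the oci DSN is just an example
        $pdo = new PDO('oci:dbname=//dbhost:1521/ORCL', 'user', 'pass');
        $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        $stmt = $pdo->prepare('SELECT id, name FROM employees WHERE dept_id = :dept');
        $stmt->execute(array(':dept' => 42));

        $this->load->view('report', array('rows' => $stmt->fetchAll(PDO::FETCH_ASSOC)));
    }
}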
CodeIgniter was built with the idea of being the framework closest to native PHP that doesn't tell you what to do. The entire framework is modular and you are not required to use any single component.
Yes, it is absolutely suited to what you are doing. You can plug and play whatever DB driver you want and CI will not complain one bit.
I think CI is more suited for this role than any other of the 'big' frameworks.
