Is CodeIgniter suitable for large intranet applications that do not use MySQL?

I don't really plan on using Active Record or any of the built-in database constructs native to CodeIgniter for database access. I have Oracle, SQL Server, and others. I want to use PHP PDO (unless anyone thinks that's a bad idea) because of how universal it is.
I mainly want CI because of some of the built-in libraries and MVC. I also like that it is small and easy to work with.
I'm on 2.x, if it matters.
I did see other questions but none exactly about databases.
Thanks.
edit: It's not that I doubt CI and PHP can handle large websites. This is solely about using multiple databases from different vendors. I have mostly seen MySQL used with CI. I know I can use other databases, but I don't know whether doing so is more trouble than it's worth.

MySQL is the default just because of how widely-adopted it is, especially in the PHP world. Almost everyone has a *AMP stack to work on so it ends up being the main driver used in almost every example out there.
If you're not planning on using the database class, then it really doesn't matter what type of database you are using; just don't load the class. You can still use routing, helpers, libraries, and other CI features.
So yes, I do think it is suitable for your purposes.
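For example, nothing stops you from wrapping PDO in your own CI library and bypassing the database class entirely. A minimal sketch, assuming an Oracle backend; the class name, DSN, credentials, and table are placeholders:

// application/libraries/Oracle_store.php -- hypothetical library name
class Oracle_store {

    private $pdo;

    public function __construct()
    {
        // DSN and credentials are placeholders; substitute your own
        $this->pdo = new PDO(
            'oci:dbname=//dbhost:1521/ORCL',
            'app_user',
            'secret',
            array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
        );
    }

    public function find_employee($id)
    {
        $stmt = $this->pdo->prepare('SELECT * FROM employees WHERE id = :id');
        $stmt->execute(array(':id' => $id));
        return $stmt->fetch(PDO::FETCH_ASSOC);
    }
}

In a controller you would then load it the usual CI way: $this->load->library('oracle_store'); and call $this->oracle_store->find_employee(42);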

CodeIgniter was built with the idea of being the framework closest to native PHP that doesn't tell you what to do. The entire framework is modular and you are not required to use any single component.
Yes, it is absolutely suited to what you are doing. You can plug and play whatever DB driver you want and CI will not complain one bit.
I think CI is better suited to this role than any of the other 'big' frameworks.
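If you later decide to go through CI's database layer after all, CI also ships a pdo driver. Here is a hedged sketch of application/config/database.php; as far as I remember, in 2.x the full PDO DSN goes into the hostname field, but check the driver for your exact version:

// application/config/database.php (sketch; verify against your CI version)
$db['default']['hostname'] = 'sqlsrv:Server=dbhost;Database=intranet'; // full PDO DSN
$db['default']['username'] = 'app_user';
$db['default']['password'] = 'secret';
$db['default']['database'] = 'intranet';
$db['default']['dbdriver'] = 'pdo';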

Related

Use Laravel to refactor old, large PHP application partially over time?

From what I've read this should be possible due to the modular nature of Laravel, but I need assurance from people with more Laravel experience:
I have a very large (500k LOC) ancient PHP app. So ancient that some parts of it date from PHP3 times (ca. 2000; PHP4 had already been released, but PHP3 was kept for backwards-compatibility reasons).
Refactoring this is a huge project, and the only way to reasonably do it is in parts. Replace this part, then that part, etc. Fortunately, the "ancient" part comes in handy as no framework was used and basically every page is its own script, with a few central libraries for shared functionality.
Is it possible to spin up a Laravel app that can route new/refactored pages to the new site and everything else (wildcard if possible) to the ancient code? All data is stored in a database, so there will be no sync issues between them except for user authentication and session info.
Is it possible to get Eloquent running on an ancient DB design, or to refactor the DB in such a way that it works for both? There was a previous attempt to move the DB interface to Doctrine which, from what I know, was aborted after partial success (i.e. many DB objects are accessed through Doctrine, but there is also a lot of straight SQL code in parallel).
It's a huge mess, but the software in question is still being used and successfully so and a previous attempt to replace it with something else has already failed.
additional details:
Thanks @maiorano84 for the good questions:
First, does your legacy application have tests?
Negative on that. PHPUnit was released in 2004. At that time, this app had already been in production for four years.
Second, are you able to get it to work on a more recent version of PHP?
Yes, the current codebase is running on PHP 5.6.33 - it has been maintained throughout the years, and a major update was made on the transition between PHP 4 and PHP 5.
If it runs on PHP 5.3+, you can use Instant Refactoring
I'm an author of Rector, a tool that can migrate a huge number of PHP files in a few seconds - e.g. upgrade PHP 5.3 to PHP 7.4, upgrade Symfony 2.8 to 4.2, or migrate from Nette to Symfony (read the case study).
1. Install Rector
composer require rector/rector --dev
2. Create rector.php with PHP sets, since you have old PHP code
<?php
// rector.php
use Rector\Core\Configuration\Option;
use Rector\Set\ValueObject\SetList;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::SETS, [
        SetList::PHP_52,
        // use one set at a time, as sets build on each other
        // SetList::PHP_53,
        // SetList::PHP_54,
    ]);
};
3. Then run Rector on your code (e.g. /src directory)
vendor/bin/rector process src
You can also write your own rules, so you can convert the code to a Laravel/MVC approach. The idea is to write one rule that, for example, converts 100+ files to controllers.
Read more in the GitHub repository.
Is it possible? Yes.
Is it going to take a short amount of time? Absolutely not.
With any kind of legacy codebase, you're going to need to take the time in figuring out all of its moving parts and figuring out what portions are going to need to change in order to even be able to work on a modern platform.
The most recent version of Laravel requires PHP 7.1.3, so even attempting to just dump the entire codebase into a Laravel application is very likely going to result in failure.
First, does your legacy application have tests? These can be unit tests, integration tests, or functional tests. If not, and you want to be able to modernize your application without breaking things in the future, then you're going to want to write tests to ensure that nothing breaks as you begin upgrading. This alone can take a long time, especially with a codebase that makes it difficult to even test in the first place. Having a fully tested application will allow you to see which tests begin to fail as you start reworking your application, so this information will be extremely valuable.
Second, are you able to get it to work on a more recent version of PHP? If this code is already in production, then you're going to need to use some hardware virtualization through Vagrant, or better yet, containerization through Docker to get a local installation up and running without breaking your production code.
Once that's ready, then you should be able to begin refactoring. Taking entire pages of code and dumping them right into a Laravel application is not going to work straight out of the gate. You're going to want to start smaller. Find all of your moving parts, figure out what each one is responsible for, and encapsulate them in classes with the appropriate methods.
Use Composer's PSR-4 Autoloader to help remove all of those extra include and require statements and load your new classes throughout the application.
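As a hedged illustration (the namespace and directory are placeholders), the composer.json autoload entry and the single require that replaces scattered include/require statements might look like this:

{
    "autoload": {
        "psr-4": { "Legacy\\": "src/" }
    }
}

After running composer dump-autoload, your entrypoint only needs:

require __DIR__ . '/vendor/autoload.php';
$invoices = new \Legacy\Billing\InvoiceService(); // resolved from src/Billing/InvoiceService.php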
Use a decent Router to change all of your URLs into SEO-friendly paths and have a clearly defined entrypoint for all requests.
Move all of your business logic out of the webroot: Create a /public folder in which you have just your index.php entrypoint and all public-facing assets (images, CSS, JavaScript, etc.). Since all requests are being routed to this file by this point, you should be able to process the request and return your response.
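A minimal sketch of what that single entrypoint could look like while legacy pages still exist; the paths and class names are invented, and a real router (Laravel's, FastRoute, etc.) would eventually replace the switch:

// public/index.php -- illustrative front controller only
require __DIR__ . '/../vendor/autoload.php';

$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

switch ($path) {
    case '/invoices':
        // an already-refactored page goes through a new controller class
        (new \Legacy\Controller\InvoiceController())->index();
        break;
    default:
        // everything else falls back to the old scripts (sanitize $path in real code!)
        $legacy = __DIR__ . '/../legacy' . $path . '.php';
        if (is_file($legacy)) {
            require $legacy;
        } else {
            http_response_code(404);
            echo 'Not found';
        }
}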
Once you get to a point where you've actually gotten the application into a system of well-defined components and modules, then migrating over to Laravel - or any other well-established framework - should be much easier.
This is going to take you a long time if you plan on doing it right. Hopefully this helps, and best of luck to you.
Refactoring is of course possible, but I have some doubts about whether it can be done partially in this case. By partially, I mean that parts of the app will run sometimes on old and sometimes on new code in production.
I did this once for an old project, though not one as ancient and big as yours.
In my case it was a custom app (without any framework) running on PHP 5.3, and I converted it to Laravel 4.2.
I must admit that there are some real challenges on that path.
This is only the tip of the iceberg, but I'll try to name a few of them, from what I remember:
PHP version compatibility, or rather incompatibility in this case. You can rewrite the existing code to run on the latest PHP 7 versions, but that might be a lot of work which may end up not being used.
Routing and asset handling - you need to check whether you can modify the URLs so they fit into Laravel's routing engine. It may be really hard, especially if the old app is not using Laravel-standard paths and you don't want to break Google indexing, for example. I have also seen systems with custom URL generators which were then heavily used in views. Trying to match these routes perfectly would be a nightmare.
Authentication. Changing auth must be done in one step, because adapting Laravel to work properly with sessions from the old system (although doable) will clutter the new code.
Database. You will be lucky if the database is well designed, but I don't think it will be anywhere close to Laravel's Eloquent conventions. Although you can run it under Laravel without any DB schema modifications, the code in your new app will get bloated as a result. This and other things can be refactored again once the system is complete, but that's another load of work.
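For what it's worth, Eloquent can usually be bent around a legacy schema by overriding its conventions per model. A minimal sketch; the table and column names are made up:

// app/LegacyUser.php -- hypothetical model over an old table
namespace App;

use Illuminate\Database\Eloquent\Model;

class LegacyUser extends Model
{
    protected $table = 'TBL_USERS';       // non-conventional table name
    protected $primaryKey = 'USER_ID';    // non-conventional key column
    public $incrementing = false;         // e.g. keys come from a sequence or legacy code
    public $timestamps = false;           // no created_at / updated_at columns
}

This keeps the schema untouched, at the cost of models that carry the old naming everywhere, which is part of the bloat mentioned above.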
Considering the number of suboptimal workarounds you would need in order to end up with a properly designed app (built with best practices), it will probably be better to rebuild from scratch.
Hope it helps a bit...

How does Node.JS and/or Meteor get a callback from the database when 3rd-party software updates the database

I would like to use Meteor (Node.JS) to develop an application that will be used by 3,000+ concurrent users on a large database.
I have looked at the nice examples, and the idea of pushing changed data to the clients is very nice and very useful, but before I start development I want to understand how it works behind the scenes, so I can be sure that when the application is running with all these users it still performs well on standard hardware.
I also need this to use Oracle as the database, but I'm not sure that is supported and, if not, what would be required from an Oracle package to enable this facility.
I think the server keeps an active, ongoing, non-blocking query on the oplog in MongoDB, and that is how it gets callbacks for all changes in the database. Is that correct? If so, is there a similar way to do it in Oracle?
Thanks Roni.
I also need this to use Oracle as the database, but I'm not sure that is supported and, if not, what would be required from an Oracle package to enable this facility.
Nope, Meteor is currently Mongo-only, as they have implemented an in-browser library called minimongo. My guess is this project will never support Oracle, but who knows. There is no mention of Oracle support on the Meteor project roadmap.
Just happened to come across this question while searching on Google.
However, if there is no native solution, we can always figure out a way for an intermediate program to drive the publishing.
Example Case:
Python is used to synchronise data between MongoDB and Oracle (a 24/7 process using the cx_Oracle and MongoDB drivers for Python).
The Meteor server keeps watch on what to publish.
Meteor clients/browsers subscribed to the channel are then updated with the Oracle data.

Object persistence without ORM or DB Engine

I learned to love how LINQ enables set operations on collections. I'm not saying that I plan to shun traditional RDBMSs, because I do need one for reporting. There are NoSQL alternatives out there, but they all seem to need to fire up a separate service.
What I'm looking for is something local, where a DLL can create a database and perform CRUD on it. As mentioned, I'm not going to report out of this; it's just an internal data store. The main application that will be using it is in C#.
I'm hoping that someone can give me a lead. If not, if there is anyone willing, we can start an open-source project for it. I'm not interested in commercial products.
Thanks,
You can run RavenDB in embedded mode inside your .NET application - no need for external services or anything.
And RavenDB supports LINQ...

MVC and its limitations

MVC sets up a clear distinction between Model, View, and Controller.
For the model, web frameworks nowadays provide the ability to map the model directly to database entities (ORM), which, IMHO, ends up causing performance issues at runtime due to direct database I/O.
The thing is, if that's really the case, why is ORM for the model so popular, and why does every web framework want to support it, natively or otherwise?
For a web site with a huge amount of traffic it definitely won't work. But what's the workaround? Connecting directly to the database is definitely not a wise solution here.
What's your question?
Is it a good idea to use direct db access from webpages?
A: No.
Is it a good idea to use ORM's?
A: Debatable : See How can I design a Java web application without an ORM and without embedded SQL
Is it a good idea to use MVC model?
A: Yes - it has nothing to do with "Direct" database access - it's about separating your application logic from your model and your display. (Put simply).
And the rationale for not putting database logic inside webpages has nothing to do with performance - it's about security, maintainability, etc. Calling a stored procedure from a webpage is likely to be MORE performant than using an ORM, but it's bad because the performance gain is negligible and the cons are significant.
As to workaround: if you mean how do you hook up a database to a web application...?
The simplest way is to use something like Entity Framework or LINQ to SQL with your Model - there are plenty of examples of this in tutorials on the web.
A better method, IMO, is to have a separate services layer (which may be WCF based) and have all the database access inside that, with DTOs transferring the data to your web application, which has its own ViewModel.
MVC is not about ORM but about separating display logic from business logic. There is no reason your exposed model needs to be identical to your database model, and there are many reasons to ensure that the exposed model closely matches what is to be displayed.
The other part of scaling well would be to implement caching in the controller and to be able to distribute load across several instances.
I think @BonyT has given a good answer (and I've voted for it :) ), I'd just add that:
"web frameworks provide the ability to map the model directly to database entities (ORM), which, IMHO, ends up causing performance issues at runtime due to direct database I/O"
Even if this is true, using an ORM solves a lot of problems by keeping the model easy to update and easy to translate back and forth to the database. Solving a performance hit by buying extra web servers or cloud instances is much cheaper than having to buy extra developers or extra development hours to solve things other people have already written ORMs to do for you.

Would SQLite be a 'better' choice for Joomla than MySQL, if it would be available?

Since this doesn't touch a real problem of mine, I'm somewhat uncertain whether it is even worth asking here. However, maybe some of you would like to share your opinion on it.
In general I have to admit that 'better' means everything and nothing at the same time, so I probably should be more specific, but I tried not to overload the topic. In a regular hosted environment on one of those cheap webhosts (like Dreamhost), with around 1000 articles in Joomla, a couple of users, and a few hundred visitors a day, would an SQLite database with a persistent connection (sqlite_popen) perform noticeably faster than the MySQL equivalent (with the TCP/IP overhead etc.)?
Or in short: Would it be wise to call on Joomla to support SQLite?
I have never used SQLite on a website, but I have used it extensively for other purposes and I quite like it. The truth is, you won't know till you try. If you try, I recommend creating a DB abstraction layer first so that you can easily swap in other databases.
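A hedged sketch of such an abstraction layer using PDO, so that the backend is just a DSN string (the class, DSN values, and table are illustrative):

// A thin PDO wrapper; swapping SQLite for MySQL only changes the DSN.
class Db
{
    private $pdo;

    public function __construct($dsn, $user = null, $pass = null)
    {
        // e.g. 'sqlite:/var/www/site/data/joomla.sqlite'
        //  or  'mysql:host=localhost;dbname=joomla'
        $this->pdo = new PDO($dsn, $user, $pass, array(
            PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
            PDO::ATTR_PERSISTENT => true, // rough analogue of sqlite_popen
        ));
    }

    public function fetchAll($sql, array $params = array())
    {
        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($params);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }
}

Usage: $db = new Db('sqlite:' . __DIR__ . '/data/joomla.sqlite'); $articles = $db->fetchAll('SELECT id, title FROM articles WHERE state = ?', array(1));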
The downside to SQLite is that it's not really meant to be a multi-user database. If you rarely write to the DB but do lots of reading, SQLite will probably be fine. If you find that you need multiple processes writing to the same DB, I believe SQLite uses file-level locking to maintain database consistency. So if all your tables are in the same file, you'll lock the whole file while it's being written to, even if another process wants to modify a completely different table.
In my opinion it's not the big multi-user databases of the world that should be worried about competition from SQLite... it's all the regular files out there (and their custom file formats) that applications create and use that should be shaking in their boots about SQLite...
Linux ISPs, for whatever reason, seem to have settled on MySQL. This is what they offer, and you will lock yourself into a limited number of service providers if you wander outside the norm.
