Separating different parts of the project in Git - Laravel

How can I efficiently separate different parts of a project in Git? I have a Laravel web application that includes an admin panel plus an API for a mobile app. To increase performance, I thought it would be a good idea to separate the admin part from the API, so I can disable a service provider in the API and even run the admin panel on a different server (connecting to the database via remote MySQL) while dedicating a server to the API. How can I separate these parts without duplicating the changes I make in common parts like models? I thought of creating them as two branches in a Git repository. Is there a better way to do this separation, or a different optimization altogether that is easier to maintain?
Update: The issue I'm facing is the response time. I put the following code into my routes, and it takes 400-600 ms to respond.
Route::any('/test2', function () {
    return "test";
});
I tested it on two different servers, and the configuration is good enough, I think (10 GB RAM, 4 CPU cores at 3.6 GHz). By the way, I have fewer than 1k requests per hour for now, and soon I'm looking at 5k-20k at most.

I think dividing your source code into modules is good enough. Take a look at Laravel Modules.
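Assuming this refers to the widely used nwidart/laravel-modules package, splitting out the two areas from the question could look like this:

composer require nwidart/laravel-modules
php artisan module:make Admin
php artisan module:make Api

Each module then gets its own controllers, routes, and service providers, while shared models stay in the main app.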

I suggest you do what the creator of the framework (Taylor) does: packages, managed with Composer.
In the Laravel community you have many packages available, like Horizon, Nova, Telescope, the Spatie packages, etc.
If you want to add one, you just add a Composer dependency and it works out of the box.
You can do the same with the code that will live in both projects, like your models.
Every package has its own Git repo.
This is a more Laravel-like way to do it than separating into modules (an approach from the Symfony world); Laravel doesn't ship with modules at its core.
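For example, shared models could live in their own Git repo as a package, and each app would pull it in via composer.json. A sketch (the package name and URL are hypothetical):

{
    "repositories": [
        { "type": "vcs", "url": "git@example.com:acme/shared-models.git" }
    ],
    "require": {
        "acme/shared-models": "dev-master"
    }
}

A fix to a model is then committed once, in the package repo, and both the API and the admin panel pick it up with a composer update.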
Now about separating projects:
As I read your needs, I am not sure you will have performance issues if you run the API and the admin panel in the same project, unless you have millions of HTTP calls per hour.
I am currently working on a project with a lot of client-side code; we also have an API with thousands of calls per hour, and everything is fine. We also run Nova for the internal backend.
Consider that by the time you face those scaling problems, you will probably have database problems too, and maybe server problems (bandwidth, memory, cost, etc.).
Being 100% scalable is not an easy task.
In my opinion, when you face it, solve it. Separating the API and admin panel at the beginning could add too much overhead to starting and maintaining the project.

Related

CodeIgniter: separate front-end and back-end

I'm currently teaching myself the CodeIgniter framework. I want to create a project that will have a front-end (which will be used by users) and a back-end (which will be used by administrators).
I have looked through different articles, and all of them suggest using HMVC to separate the public and admin controllers/views. I have also considered creating two separate projects, one for the public side and one for the admin side, both using the same database.
I have tried to research which of the methods mentioned above would be the best solution for a potentially large project, but could not come up with any sustainable answer.
Is it possible that two separate CodeIgniter projects can access and use the same database simultaneously?
Edit:
The client project would mostly just query the database for results, whereas the admin project would be full CRUD.
If creating multiple projects is indeed the recommended way to go, the admin project would run on a subdomain, i.e. admin.example.com, whilst the client project would run on example.com.
It is valid to use any of the approaches you mention. It is a matter of personal preference (read: opinion). I have used each singly and in combination with more or less the same outcome. I have settled on using none of the above. Instead, I use a single project, no HMVC, no subdomains, standard CI file structure. I feel keeping it simple ultimately makes it easier to build and maintain. YMMV.
What separates the public-users from admin-users is authentication and authorization (A&A). Nobody gets into an admin controller without the proper login credentials and permissions. You're going to need A&A anyway to keep the public from accidentally discovering the admin area. IMO, a "special" file structure and subdomains actually make implementing A&A harder.
Depending on your deadline for this project you might want to look at using CodeIgniter Version 4. It's a thoroughly modern revamp of the framework. It is still in beta test mode, but I've found it to be quite stable. They are working hard to get to the release version. There is no published release date yet, but all indications are it will be sooner rather than later.
The answer as to how to configure CI is really dependent on your needs and what you feel is best. There is no right answer or "acceptable" way of doing things in this regard.
When I first started with CodeIgniter, I had just a sub-folder for backend controllers called admin, as well as an Admin base/core controller that all admin classes extended rather than CI_Controller (a sketch of that pattern follows below). Models and views can be similarly organized into sub-folders. This was a perfectly acceptable solution, in my opinion, for small-scale applications.
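A minimal sketch of that base-controller pattern in CodeIgniter 3 (the class and session-key names are illustrative; it assumes the session library and URL helper are autoloaded):

<?php
// application/core/MY_Controller.php

class MY_Controller extends CI_Controller
{
    // behaviour shared by all controllers goes here
}

class Admin_Controller extends MY_Controller
{
    public function __construct()
    {
        parent::__construct();

        // every admin controller is gated behind a login check
        if (!$this->session->userdata('is_admin')) {
            redirect('auth/login');
        }
    }
}

Admin controllers then simply extend Admin_Controller and inherit the check.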
I moved on to HMVC and found that it really isn't that much different in terms of keeping the two separate. Although you can easily drag-and-drop modules between projects so long as they are decoupled, you'll still have to jump through hoops to get the front and back ends separate. I believe I used this as a starting point: https://github.com/jmtolibas/HMVC-CI3-with-Separate-Backend-and-Frontend
In terms of what you mentioned, having 2 separate projects wouldn't necessarily be a bad idea. You could even share the same system folder by modifying the system path in each project's index.php. Multiple database connections shouldn't be an issue.
So basically, all 3 approaches will work; it is up to you to determine which one you like working with the most.
If you want my opinion, I would use Laravel or Lumen on any new project; separation of front and back end is rather easy with packages, namespacing, etc.

What is the right way to realize a large web app

I have to build a web app that must guarantee the execution of two main actions:
The first action allows a small number of users ("A users") to upload ads or posts. These A users can upload as many ads as they want, but photos only up to an overall five-megabyte threshold. The total number of ads will be approximately 10k.
The second action allows a broad public (1k users per day) to search for and view the articles published by the A users; searches can be refined with advanced filters.
I would like to know whether it is strictly necessary to build my app in a scalable way, or whether I could simply use an MVC approach.
The app will be developed using the Laravel framework and will be hosted on an Amazon server.
What do you recommend I do?
I would like some advice, tips, and tricks for doing it the best way.
Thanks in advance
The MVC approach is fine. Laravel, or any platform, can be scaled in multiple ways. The simplest is separating the functions (database, Laravel app, cache, queue) onto separate servers; each of those pieces can then be scaled independently.
There is a great online video series about this: https://serversforhackers.com/scaling-laravel/forge.
But unless you know you will have a large amount of traffic right away, it's better to start with a simpler structure; you will save on cost, and it's not hard to scale later. I mean, start with one server for now, then separate the functions (cache, DB, etc.) onto their own servers as you find they need to be scaled.
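In Laravel, moving the cache, queue, and database onto their own servers is mostly a configuration change. A hedged sketch of the relevant .env keys (the hosts are invented; on Laravel 5.6 and earlier the queue key is QUEUE_DRIVER rather than QUEUE_CONNECTION):

# dedicated database server
DB_HOST=10.0.0.11

# cache and queue on their own Redis box
CACHE_DRIVER=redis
QUEUE_CONNECTION=redis
REDIS_HOST=10.0.0.12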
If you want to save a little hassle, though, I do recommend Laravel Forge and Envoyer. They make deploying and managing servers a lot easier. Envoyer is for deployments; it's great for automating all of that.

Use Laravel to refactor old, large PHP application partially over time?

From what I've read, this should be possible due to the modular nature of Laravel, but I need assurance from people with more Laravel experience:
I have a very large (500k LOC) ancient PHP app, so ancient that some parts of it date from PHP 3 times (ca. 2000; PHP 4 had already been released, but PHP 3 was used for backwards-compatibility reasons).
Refactoring this is a huge project, and the only way to reasonably do it is in parts: replace this part, then that part, and so on. Fortunately, the "ancient" part comes in handy, as no framework was used and basically every page is its own script, with a few central libraries for shared functionality.
Is it possible to spin up a Laravel app that routes new/refactored pages to the new code and everything else (via a wildcard, if possible) to the ancient code? All data is stored in a database, so there will be no sync issues between the two except for user authentication and session info.
Is it possible to get Eloquent running on an ancient DB design, or to refactor the DB in such a way that it works for both? There was a previous attempt to move the DB interface to Doctrine, which from what I know was aborted after partial success (i.e., many DB objects are accessed through Doctrine, but there is also a lot of straight SQL code in parallel).
It's a huge mess, but the software in question is still being used, and successfully so; a previous attempt to replace it with something else has already failed.
Additional details:
Thanks @maiorano84 for the good questions:
First, does your legacy application have tests?
Negative on that. PHPUnit was released in 2004. At that time, this app had already been in production for four years.
Second, are you able to get it to work on a more recent version of PHP?
Yes, the current codebase runs on PHP 5.6.33. It has been maintained throughout the years, and a major update was made at the transition from PHP 4 to PHP 5.
If it runs on PHP 5.3+, you can use Instant Refactoring
I'm the author of Rector, a tool that can migrate a huge number of PHP files in a few seconds, e.g. upgrade PHP 5.3 to PHP 7.4, upgrade Symfony 2.8 to 4.2, or migrate from Nette to Symfony (read the case study).
1. Install Rector
composer require rector/rector --dev
2. Create rector.php with PHP sets, since you have old PHP code
<?php
// rector.php

use Rector\Core\Configuration\Option;
use Rector\Set\ValueObject\SetList;
use Symfony\Component\DependencyInjection\Loader\Configurator\ContainerConfigurator;

return static function (ContainerConfigurator $containerConfigurator): void {
    $parameters = $containerConfigurator->parameters();
    $parameters->set(Option::SETS, [
        SetList::PHP_52,
        // use one set at a time, as sets build on each other
        // SetList::PHP_53,
        // SetList::PHP_54,
    ]);
};
3. Then run Rector on your code (e.g. /src directory)
vendor/bin/rector process src
You can also write your own rules to convert the code to a Laravel/MVC approach. The idea is to write one rule that, for example, converts 100+ files to controllers.
Read more in the GitHub repository.
Is it possible? Yes.
Is it going to take a short amount of time? Absolutely not.
With any kind of legacy codebase, you're going to need to take the time in figuring out all of its moving parts and figuring out what portions are going to need to change in order to even be able to work on a modern platform.
The most recent version of Laravel requires PHP 7.1.3, so even attempting to just dump the entire codebase into a Laravel application is very likely going to result in failure.
First, does your legacy application have tests? These can be unit tests, integration tests, or functional tests. If not, and you want to be able to modernize your application without breaking things in the future, then you're going to want to write tests to ensure that nothing breaks as you begin upgrading. This alone can take a long time, especially with a codebase that makes it difficult to even test in the first place. Having a fully tested application will allow you to see which tests begin to fail as you start reworking your application, so this information will be extremely valuable.
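A minimal characterization ("golden master") smoke test is one way to start. This sketch assumes PHPUnit 7.5+ and the legacy app being served locally; the URL and page paths are made up and should be replaced with real legacy pages:

<?php
use PHPUnit\Framework\TestCase;

final class LegacyPagesSmokeTest extends TestCase
{
    private const BASE_URL = 'http://localhost:8080'; // local legacy install

    /** @dataProvider pageProvider */
    public function testPageStillRenders(string $path): void
    {
        $html = @file_get_contents(self::BASE_URL . $path);

        $this->assertNotFalse($html, "Request to {$path} failed");
        $this->assertStringNotContainsString('Fatal error', $html);
    }

    public function pageProvider(): array
    {
        return [
            ['/index.php'],
            ['/search.php?q=test'], // list your real legacy pages here
        ];
    }
}

It proves nothing about correctness, but it will scream immediately when a refactored page stops rendering.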
Second, are you able to get it to work on a more recent version of PHP? If this code is already in production, then you're going to need to use some hardware virtualization through Vagrant, or better yet, containerization through Docker to get a local installation up and running without breaking your production code.
Once that's ready, then you should be able to begin refactoring. Taking entire pages of code and dumping them right into a Laravel application is not going to work straight out of the gate. You're going to want to start smaller. Find all of your moving parts, figure out what each one is responsible for, and encapsulate them in classes with the appropriate methods.
Use Composer's PSR-4 Autoloader to help remove all of those extra include and require statements and load your new classes throughout the application.
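A minimal autoload section for that (the App\ namespace and src/ directory are placeholders):

{
    "autoload": {
        "psr-4": {
            "App\\": "src/"
        }
    }
}

After running composer dump-autoload, a single require of vendor/autoload.php replaces the scattered include statements.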
Use a decent router to change all of your URLs into SEO-friendly paths and to have a clearly defined entry point for all requests.
Move all of your business logic out of the webroot: create a /public folder containing just your index.php entry point and all public-facing assets (images, CSS, JavaScript, etc.). Since all requests are routed to this one file, you can process each request and return your response.
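Putting the last two points together, a front controller might look roughly like this. A sketch, not the author's code: nikic/fast-route stands in for "a decent router", and the controller class, its handle() convention, and the ../legacy directory are invented:

<?php
// public/index.php

use FastRoute\Dispatcher;
use FastRoute\RouteCollector;

require __DIR__ . '/../vendor/autoload.php';

$dispatcher = FastRoute\simpleDispatcher(function (RouteCollector $r) {
    // refactored pages get explicit routes...
    $r->addRoute('GET', '/articles/{id:\d+}', 'App\Controllers\ArticleController');
});

$uri = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$routeInfo = $dispatcher->dispatch($_SERVER['REQUEST_METHOD'], $uri);

if ($routeInfo[0] === Dispatcher::FOUND) {
    [, $class, $vars] = $routeInfo;
    echo (new $class())->handle($vars); // handle() is an assumed convention
} else {
    // ...everything else falls through to the old scripts.
    // NOTE: sanitize $uri against path traversal in real code.
    require __DIR__ . '/../legacy' . $uri;
}

This also answers the "wildcard to the ancient code" part of the question: anything the router doesn't know about is handed to the legacy scripts untouched.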
Once you get to a point where you've actually gotten the application into a system of well-defined components and modules, then migrating over to Laravel - or any other well-established framework - should be much easier.
This is going to take you a long time if you plan on doing it right. Hopefully this helps, and best of luck to you.
Refactoring is of course possible, but I have some doubts about whether it is doable partially in this case. By partially, I mean that parts of the app would run sometimes on old and sometimes on new code in production.
I did this once for an old project, but one not as ancient and big as yours.
In my case it was a custom app (without any framework) running on PHP 5.3, and I was converting it to Laravel 4.2.
I must admit that there are some real challenges on the path.
This is only the tip of the iceberg, but I'll try to name a few of them, from what I remember:
PHP version compatibility, or rather incompatibility in this case. You can rewrite the existing code to run on the latest PHP 7 versions, but that might be a lot of work on code that may not even be used in the end.
Routing and asset handling: you need to check whether you can modify the URLs so they fit into Laravel's routing engine. It may be really hard, especially if the old app is not using Laravel-standard paths and you don't want to break Google indexing, for example. I have also seen systems with custom URL generators that were heavily used in views; trying to match those routes perfectly would be a nightmare.
Authentication: changing auth must be done in one step, because adapting Laravel to work properly with sessions from the old system (although doable) will clutter the new code.
Database: you will be lucky if the database is well designed, but I don't think it will be even close to Laravel's Eloquent conventions. Although you can run it in Laravel without any DB schema modifications, your new code will get bloated as a result. This, like other things, can be refactored again in the complete system, but that's another load of work.
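To illustrate the "run it without schema modifications" part: Eloquent's conventions can be overridden per model. A sketch with invented table and column names:

<?php
use Illuminate\Database\Eloquent\Model;

class LegacyUser extends Model
{
    protected $table = 'tblUsers';     // legacy table name
    protected $primaryKey = 'usr_id';  // legacy primary key
    public $timestamps = false;        // no created_at / updated_at columns
}

It works, but every model carries this boilerplate, which is part of the bloat described above.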
Considering the number of suboptimal workarounds needed in order to end up with a properly designed app (built with best practices), it will probably be better to rebuild from scratch.
Hope it helps a bit...

What's the traditional way to build a RESTful App via Laravel?

I'm going to build my first REST app with Laravel and a client-side framework I'm not sure of yet (probably React or Vue.js).
I did some research on how I should build my app, and unfortunately it left me even more confused.
I've come to the conclusion that I can build my app in 2 ways:
Build the app in the same project, but without Laravel Blade.
Separate the app into 2 projects (front and back).
On the one hand, the pros of building the app in the same project:
Grants me the option to use Laravel Mix.
Laravel offers out of the box Vue support.
On the other hand, the pros of building the app with front and back ends separated:
Each side has its own single responsibility and can be refactored easily.
As I've heard from my friends, it's more convenient (even though to me it sounds too complex).
I wanted to know what the most popular way to build a RESTful app is when Laravel is part of it. Is there another way besides the ones I mentioned?
Personally, I like to keep them apart because it's easier to maintain. Yes, you have to keep track of 2 different projects/folders/repositories, but they are pieces of the same cake.
Scaffolding an API in Laravel is very easy and simple; I assume you already know how to do that. You are worried about losing the advantages offered by Laravel Mix, but believe me, you are losing nothing.
Since your preference is Angular, just clone any seed-project repository with everything set up, e.g.:
1. AngularJS: https://github.com/angular/angular-seed
2. Angular 2: https://github.com/mgechev/angular-seed
As you can see, these seed projects already have all the build tools you need, and now things actually seem easier. That's what frameworks are made for.
Now imagine that later you want to add a mobile app to the stack. You don't need to change a single thing: your API already runs independently of the frontend, and vice versa.
The question is opinion-based, so here is my opinionated answer.
TL;DR: for speed of development and arguably more satisfaction, build it as one project. Don't overcomplicate things unnecessarily early. When the project gets big enough and starts to generate some money, then think about splitting the projects; you will know when it is time.
The Laravel Ecosystem is just great for small, medium and even large applications.
Laravel gives you a resources folder where you can put all your JavaScript and front-end assets. You have Envoy to deploy your application and write your deployment scripts, and Mix to build your assets. You don't have to use Mix; you could write your own gulp/webpack/grunt setup instead.
By keeping everything together as one project, you are able to use the same IDE project for both front-end and backend work, yet keep separation of concerns, because all backend code is completely separated from front-end code. You can tweak the payloads being sent from Angular and tweak how they are handled in the PHP API nice and easy, so you only need one IDE, one browser, and a terminal client.
The nicest thing about keeping the project together is that, assuming you are using VCS (Git), and you really should be, your front-end and backend will always be in sync with each other. Otherwise, you need to manage and coordinate deployments of your front-end and backend code.
When your application gets big enough, it won't take long to separate the projects, as the frontend and backend should already be extremely loosely coupled.
Just think of all the added layers of complexity that you are introducing to your application. When you deploy a change to your REST API, you will probably also need to deploy a change to your Angular application. Which version of the Angular app is compatible with which version of the API? If you have a team of devs working on specific projects, then this complexity pays off, but most teams have processes in place to manage, synchronize, and automate deployments.
I think you should go with 2 projects. I would.
I will give an example using the rate at which complexity grows. It is just from my own experience. (X means the number of features; Y means how complex they are to implement.)
With a single project, it is super simple at first: no communication with a server, no hard stuff, but everything is tangled together. Then, nearing the end, it gets harder and harder to create more features/pages, because everything is tangled.
But if you start with 2 projects, sure, it will be harder at first (communication, synchronization, whatever), but the growth in complexity will not be as steep. Everything has its own responsibility; everything does what it needs to do. Tests are simpler, expansion is simpler, refactoring is simpler, and you can complete the project with ease.
Clearly, complexity grows much faster in a single project. (These are, of course, not actual numbers; I did not measure anything or track such projects. This is just from my own experience.)

Running Magento for multiple clients - single installation vs. multiple installations

I am looking to set up a Magento (Community Edition) installation for multiple clients and have researched the matter for a few days now.
I can see that the Enterprise Edition has what I need in it, but, surprisingly, I am not willing to shell out the $12,000-odd yearly subscription.
It seems there are a few options available to me, but I am worried about the performance I will get out of the various options.
Option 1) Single install using AITOC advanced permissions module
So this is really what I am after: one installation, so that I can update my core files all at the same time and also manage all my store users from one place. The problems here are that I don't know anything about the reliability of this extra product, and that I have to pay a bit extra. I am also worried that if I have 10 stores running off this one installation, it might all slow down so much that it keels over, as I have heard a lot about Magento's slowness.
Module Link: http://www.aitoc.com/en/magentomods_advanced_permissions.html
Option 2) Multiple installations of Magento on one server, one for each shop
So here I would have 10 Magento installations on one server, all running happily away, not costing any extra money, but I would now have 10 separate stores to update and maintain, which could be annoying. Also, I haven't been able to find many other people using this method, and when I have, they are usually asking how to stop their servers from dying. So this route seems like it could be even harder on my server, as there will be more going on, but if the server can take it, each installation would be simpler and less likely to slow down, since none of them has to run 10 shops on its own.
Option 3) Use lots of servers and lots of Magento installations
I just so do not want to do this.
Option 4) Buy Magento Enterprise
I do not have the money to do this.
So which route is less likely to blow up my server? And does anyone have experience with this holy grail of a module?
Thanks for reading and thanks in advance for any help - Chris Hopkins
Let's get the non-options out of the way right away. You don't want to do #3 and #4 is a non-solution. Magento Enterprise Edition doesn't add any features that will let you run multiple customers from one store.
Now, on to the possible options. As you state, #1 will allow you to update one version of the code, but of course this comes with some risks. As I understand it, your customers will need to access the stores? If you have multiple customers running on one database and one codebase, you will always run into issues with them affecting each other. For instance, who will control product attributes, which are by nature global? If one store deletes a product attribute, other stores may lose data as a result. And if you solve this issue, what about catalog promotions, product categories, and so on? Magento was built to handle multiple websites, but not to insulate them from each other, and you will experience problems because of this. As for performance, a large product catalog or customer base will tend to slow down the site, but you can mitigate this by using the flat product catalog and making good use of caching.
For option #2, you can run multiple Magento stores, which brings up two main problems. First, as you state, is updating the sites. If you are using a vanilla installation of Magento and not modifying core files, this should be a non-issue; Magento's updater is pretty simple for such installations, with difficulty increasing as you make more modifications and have to rely on more manual processes for updating.
As far as performance goes, running multiple Magento sites might be slower, but it depends on how you structure them. Regardless of having one site or many, you'll have to load data for each site, so database size won't be terribly different, and file size on a server is pretty much a non-issue. In either case, when a customer requests a page, Magento has to spin up the entire framework to serve the request, which is where performance issues start to show themselves. One of the big mitigations for this is to use an opcode cache like XCache, but with multiple installations you need to give XCache much more memory to hold all of the installations' code. A legitimate problem.
My recommendation? Start with one machine and multiple installs. Work your way up in the number of installs, and when the server doesn't support any more, move on. Keep your code changes out of the core and use extensions that can be easily updated, so updates stay easy. That should mitigate as many of the concerns as possible.
Hope that helps!
Thanks,
Joe
We handle a couple dozen Magento "installs" using a single code base but multiple databases. Essentially, we've done a rough job of creating a multi-tenanted Magento.
Here's how we're doing it:
Use nginx as a reverse proxy to handle some basic routing rules, and to set some server variables (fastcgi_params) based on the request.
We hardcode routing rules into Nginx Config based on the requested domain, browser language, and visitor location.
Set a server variable using Nginx fastcgi_params as "client-id"
Make copies of the app/etc folder with the convention of app/[client-id]/etc
Override the Mage.php variable $etcDir to $etcDir = self::getRoot() . DS . $_SERVER['CLIENT_ID'] . '/' . 'etc'; (you'll have to apply some logic here to ensure that this can fail elegantly; see the sketch after this list)
Edit app/[client-id]/etc/local.xml to point to a fresh DB with Magento tables and base content already imported. (You'll also have to set the URL in the core_config_data table, or in the local.xml file, for anything to work.)
Modify the include path of app/code/local/ to be app/code/local/[client-id]/ in Mage.php (yes, shoot me for overriding core code, but it's the only way we could find)
Set up session handling in a Redis DB, with a unique DB number per client
Override getVarDir() in Mage_Core_Model_Config_Options to include [client-id] in the path. (This is to ensure that you aren't sharing cache between clients)
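For the "fail elegantly" logic in the $etcDir override above, one possible sketch (not the poster's actual code; basename() is used so a malformed CLIENT_ID can't escape the app root):

// inside app/Mage.php, replacing the hardcoded $etcDir assignment
$clientId = isset($_SERVER['CLIENT_ID']) ? basename($_SERVER['CLIENT_ID']) : '';
$clientEtc = self::getRoot() . DS . $clientId . DS . 'etc';

if ($clientId !== '' && is_dir($clientEtc)) {
    $etcDir = $clientEtc;                    // per-client configuration
} else {
    $etcDir = self::getRoot() . DS . 'etc';  // fall back to the stock location
}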
You probably get the gist so far.
Other things you'll want to consider:
Isolation of media by client-id,
Consolidating all "Admin Panel" urls, and asking Admin user to select [client-id],
Configuring Varnish in a way that is sensible,
Setting up CDN in a way that is sensible,
Modifying Magento installer to support this method, and handle basic configuration automatically.
I think getting a VPS account and scaling it up when it becomes necessary will give you the best options for your cost requirements.
For my two cents, I think you are going to run into more problems than benefits by throwing everyone into a single installation of Magento, with everyone bumping into one another. Not to mention that Customer X on Website Y can't seem to figure out why he can't create an account on Website Z, which he has never been to before (that is a configuration issue, but it could happen).
What I would recommend is setting up a Git repository that holds your "base" Magento installation, and then creating each client's install as a clone of that main repository.
This gives you only one real code base to update (database changes are a different story), and everyone stays separate. For example:
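A sketch of that workflow (the repository URL and names are hypothetical):

git clone git@git.example.com:you/magento-base.git client-a
cd client-a
git remote rename origin base

# later, pull core updates from the base into this client:
git fetch base
git merge base/master

Each client clone can carry its own theme and configuration commits on top, while core updates flow in from the shared base repo.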
We run multiple clients on a single Magento CE installation and use AITOC's Advanced Permissions module to control visibility for our different clients. The module works well, although it has several hiccups and lacks functionality in a handful of areas, which we've had to handle with our own in-house modules. It doesn't seem to have any noticeable effect on performance, as we've been running it this way for months now without any issues. (That said, we do use Amazon EC2 and auto-scaling.)
As I understand it, EE does provide advanced role permissions that would render AITOC's module useless. However, I have also heard that the EULA for EE requires only 1 client per installation. I haven't been able to find hard facts on this, but if it's true, it's truly a deal-breaker, as having an additional EE installation for each client would get tremendously expensive, tremendously quickly. (Maybe someone can confirm yes/no on this, though?)
