Laravel keep public folder in sync - composer-php

When using Composer in Laravel development, is there a way to keep <root>/public/ in sync with <root>/vendor/mymodule/src/public?
I use php artisan vendor:publish to publish my assets to the root, but while developing I have to copy and paste the changes back into my vendor folder to keep them in sync, which is really error-prone.
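For reference, the copy that php artisan vendor:publish performs is driven by a mapping registered in the package's own service provider. A minimal sketch of that registration follows; the MymoduleServiceProvider class name, the paths and the "public" tag are assumptions for illustration, not details from the question:

use Illuminate\Support\ServiceProvider;

class MymoduleServiceProvider extends ServiceProvider
{
    public function boot()
    {
        // Map the package's public assets onto the application's public folder.
        // Tagging the group lets you republish (and overwrite) just these files with:
        //   php artisan vendor:publish --tag=public --force
        $this->publishes([
            __DIR__.'/public' => public_path('vendor/mymodule'),
        ], 'public');
    }

    public function register()
    {
        //
    }
}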
Any best practices would be appreciated.
Thanks a lot
Best

Related

How to properly duplicate a Laravel project in the same computer? [duplicate]

I have a Laravel 5.3 project that was created 5 months ago. Today I made a duplicate of the project and made some changes to the code.
When I edit the views in a .blade.php file, the project I edited still shows me the old project's view. I added a new route in the new Laravel project and the route itself works fine, but it still shows the old project's view.
It's funny because the JS files work fine, but the views don't. For example, I edit the profile.blade.php file and it shows the content from the old project, and if I write something new in the corresponding view of the old project, it shows up in the new project.
Any ideas?
Thanks in advance.
Your views/routes are compiled/cached.
The storage directory contains your compiled Blade templates, file based sessions, file caches, and other files generated by the framework.
The bootstrap directory contains files that bootstrap the framework and configure autoloading. This directory also houses a cache directory which contains framework generated files for performance optimization such as the route and services cache files.
Run these commands
php artisan view:clear - Clear all compiled views
php artisan optimize --force - Optimize the framework for better performance
php artisan config:cache - Create a cache file for faster configuration loading
php artisan route:cache - Create a route cache file for faster route registration
Disable OPcache in php.ini or use:
ini_set('opcache.enable', 0);
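To see what that view cache actually is: the compiled Blade templates are just PHP files on disk, and view:clear simply deletes them so they get regenerated on the next request. A rough sketch of the equivalent, assuming the default compiled-view path (check config('view.compiled') if you have changed it):

// Rough equivalent of php artisan view:clear: delete the compiled Blade
// templates so Laravel recompiles them from the .blade.php sources.
foreach (glob(storage_path('framework/views') . '/*.php') as $compiled) {
    unlink($compiled);
}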

File session driver not working properly on production (Laravel on shared hosting)

Sessions worked fine in my local environment, but when I published the site to a shared host I started noticing some strange behaviour in the app. After a while I realized it had to do with the session. Specifically, when I was working locally the storage/framework/sessions folder only had one file that kept updating on every change. On production, monitoring the same folder, I realized that on every change, instead of updating the file (or creating a new one and deleting the old one), it created a new file while also keeping the old ones, which made the app start behaving incorrectly.
Is this normal, or should there be only one file per session, as in my local environment?
Update
After logging in, the user is asked to select the business they want to work with, and they can also switch businesses later. To store the chosen business I use the session, and that is where the problem pops up: on every change to that session value a new session file is created without the old one being deleted. Again, when I do exactly the same thing locally it works, but for some reason on the shared hosting it doesn't.
SOLUTION
After days of trying to figure it out, I finally found the solution.
Instead of using Laravel's global session() helper to store the data, I did it through the request, and apparently that sorted it out.
So basically instead of doing this:
session(['clienteElegido' => $client]);
I changed it to this:
$request->session()->put('clienteElegido', $client);
I still don't understand what the difference is, or why it worked fine in my local environment and not on the shared host, but it's working now, so all good.
Thank you for all the quick replies.
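As a side note on the difference mentioned above: the global helper and the request session write to the same underlying store, but only some call forms actually store anything. A short sketch, using the key from the question and assuming $client and a Request $request are in scope:

// Both of these store a value in the session:
session(['clienteElegido' => $client]);                 // global helper, array form = write
$request->session()->put('clienteElegido', $client);    // request session, explicit put

// Whereas passing a plain key (with an optional default) only reads:
$cliente = session('clienteElegido');                   // read via the helper
$cliente = $request->session()->get('clienteElegido');  // read via the request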
Try clearing the cache, routes, config and views
php artisan cache:clear
php artisan route:clear
php artisan config:clear
php artisan view:clear
and let's see if your session issue will be fixed.
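If the host doesn't give you a shell to run artisan from (an assumption here, not something stated in the answer above), one workaround is to trigger the same clears from a temporary route and remove it afterwards:

// routes/web.php (or app/Http/routes.php on older versions) --
// temporary route to clear caches on a host without SSH access.
use Illuminate\Support\Facades\Artisan;

Route::get('/clear-caches', function () {
    Artisan::call('cache:clear');
    Artisan::call('route:clear');
    Artisan::call('config:clear');
    Artisan::call('view:clear');

    return 'Caches cleared.';
});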

What is the most direct way to push a site update to Laravel?

I've SSH'd into the server and into the Laravel folder. I updated one of the HTML footer files, but the changes aren't reflected on the website. I feel like I probably need to recompile something.
I tried deleting and re-creating the .env file (I backed it up first).
I've tried running the following commands:
php artisan clear-compiled
php artisan optimize
php artisan cache:clear
The only way I seem to be able to update the site is by editing the main.min.js file, located at /laravel/public/assets/js/main.min.js, which is a terrible way to update the site.
How do I force Laravel to recreate this file, or recompile the site based on changes I made to the HTML template files within the site?
What am I missing here? I don't have any Laravel experience and am trying to update this site for a client.
edit:
I think I need to clarify a bit more...
The site appears to be rendered from this file: /public/assets/js/main.min.js
Most of the site's homepage, for example, lives in this JS file. But the file is minified and therefore unwieldy to edit directly.
I am assuming (and I could be completely wrong here) that the file is generated from the HTML files located in the Laravel folder. To support this notion, I have found HTML files in other directories that correspond to the HTML in main.min.js.
My assumption is that the previous developer would update the HTML files and then run something to compile the site into JavaScript files. But maybe this has nothing to do with Laravel per se, and more to do with some frontend framework?
Try clearing the cached views...
php artisan view:clear
Laravel's frontend assets live in
resources/assets/js
in your project root; you can have a look there.
If your main.min.js file is built with Laravel Mix, have a look at webpack.mix.js, which defines how all of your files are compiled; you can get an idea from that. Make sure to run
npm run prod
whenever you change any of those files.
Hope this helps.

Laravel FileZilla upload/download

Is there a proper explanation as to why, when I either pull the complete project from the server or push it to the server, the views don't get updated? I always have to go to the resources folder manually and transfer the views after everything else has already been uploaded/downloaded.
The solution is to clear your cache, as stated in the comments.
php artisan view:clear
The reason this is happening is that by default the cache driver is set to file, which means the view cache is stored as files. When you upload everything, the old cache is uploaded along with it. The files are stored in storage/framework/views.
If you upload a single folder, Laravel notices the change and the cache is busted.
PS: it is recommended that you use Git for this kind of thing; Laravel ships with .gitignore files to prevent exactly this behaviour.

How to avoid using php composer dump-autoload with laravel 4?

I have uploaded my project to a web host and I use FTP to edit my code.
The problem is that I added Eloquent models for my database, and to get them working I have to download my project, run composer dump-autoload, then re-upload it. Otherwise it says class not found ... Doing this every time is just heavy.
Is there any other solution?
My web host does not offer SSH or anything similar to connect to the server, nor can I use rsync-like tools.
Maybe I should use another framework than Laravel 4 to avoid using Composer?
In your case, you should add the directories you need autoloaded to the ClassLoader::addDirectories() call in app/start/global.php. Laravel gives you multiple ways to accomplish the same thing depending on your personal needs.
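For reference, the stock app/start/global.php in Laravel 4 already ships with such a call, so it is only a matter of appending your own directory to it. A sketch; the repositories entry is purely an illustrative addition, not something from the answer:

// app/start/global.php (Laravel 4)
ClassLoader::addDirectories(array(
    app_path().'/commands',
    app_path().'/controllers',
    app_path().'/models',
    app_path().'/database/seeds',
    app_path().'/repositories', // illustrative extra directory to autoload
));

Classes in these directories are located by Laravel's own class loader at runtime (one class per file, with the file named after the class), so they do not depend on composer dump-autoload being re-run.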
You can remove or not upload the bootstrap/compiled.php file.
I'm not sure if this completely fixes your problem, because I'm not sure if dump-autoload generates multiple files.
[edit] Another approach is to work on your local machine and upload once you are finished.
I'm not sure, sometimes it works, sometimes not...
But I upload:
bootstrap/autoload.php
vendor/autoload.php
vendor/composer/*
And I don't have a bootstrap/compiled.php
