How to deploy a Laravel 5 project using Composer and FTP

I built a project using Laravel 5 on my dev machine and now I'd like to deploy it.
One solution that came to my mind is to upload everything using FTP but I guess there is a better way.
I uploaded the composer.json but I receive tons of errors.
I have SSH/root access, but using Git is not an option.

Make sure you can use the composer binary on your server and you are set (a shell sketch of these steps follows the list):
Upload every file except the vendor folder (you can use an FTPS manager that reads the .gitignore file and does not upload ignored files).
Set permissions on the ./storage folder (browse through this Server Fault thread).
Make sure your web server's document root is ./public.
Create the .env file on the server (it will never be changed again until you want to) and do not overwrite it with your "local" .env file.
$ composer install (installs everything from composer.lock)
$ composer update (updates from the repositories again; test locally before updating on production)
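A minimal sketch of the command-line part of those steps, assuming SSH access and that the files were uploaded to /var/www/myapp (a hypothetical path, adjust to your server):

cd /var/www/myapp

# Install exactly the versions recorded in composer.lock.
composer install --no-dev --optimize-autoloader

# Create the production .env once and never overwrite it from your local copy.
if [ ! -f .env ]; then
    cp .env.example .env
    php artisan key:generate
fi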

Related

How to deploy a Laravel project to a shared server

I have a Laravel project that I want to deploy to the server. The thing is that normally we have index.php and .htaccess inside the public folder, but in my case I have moved these two files into the root. So I want to know: what changes are needed on the server?
How can I upload this to the server?
Solution 1
Somehow you need to get SSH access as a shared user from your hosting provider, and then you can use Git to clone your repository onto your server.
Solution 2
You can copy all of your project files to the server using FTP from cPanel or the relevant control panel.
Solution 3
Use Amazon (AWS) as your hosting, as it gives one year of free-tier access and also gives you SSH access. Follow solution 1 after getting this.
Put the files back into the public folder. You can then change the document root of your server to your project's public folder.
Follow these steps (an example command sequence follows):
Go to /etc/apache2/sites-enabled/
Open the .conf file inside this folder
Change the DocumentRoot to /var/www/html/project_name/public
Restart Apache using the following command:
sudo systemctl restart apache2
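On a default Apache setup, for example, this can be done by editing the site configuration directly (000-default.conf and project_name are assumptions; use your actual vhost file and folder name):

# Point the vhost at the project's public/ directory.
sudo sed -i 's|DocumentRoot /var/www/html|DocumentRoot /var/www/html/project_name/public|' \
    /etc/apache2/sites-enabled/000-default.conf

# Check the result, test the config, and restart Apache.
grep DocumentRoot /etc/apache2/sites-enabled/000-default.conf
sudo apachectl configtest && sudo systemctl restart apache2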

500 issue with Laravel

I have seen this answer in many posts but they have not helped me at all. I followed the regular steps to create the laravel project like this:
I cloned from my repository.
I ran composer update.
I added 777 permissions to storage and bootstrap folders.
I have a .env file.
I verified the .htaccess and it's OK.
It works on localhost, but when I try to replicate it on Hostinger it does not work; it displays the 500 server error. So I wonder what the problem is.
I checked the logs, by the way, and they were empty. I also set the Laravel debug option to true.
The website URL is xellin.com. (Screenshots of the debug output and the logs folder were attached.)
Thanks.
I think this is a good opportunity to point out how PHP, Laravel, and the underlying server interact with each other.
First:
The HTTP server inspects the document root and .htaccess to get instructions.
If the file is .php (as with Laravel), it calls the PHP handler.
The PHP handler could be an FPM version or a FastCGI version.
-> If an error occurs while parsing the .htaccess, or in the initial interaction between the HTTP server and PHP, Laravel never actually runs. Everything ends up in a PHP error log.
To find out what's wrong, you need to inspect what PHP and the HTTP server said about the error in their respective logs.
In short: at this point it is not a Laravel error, but a server/PHP one.
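A quick way to tell the two apart is to check both sets of logs; a sketch assuming an Ubuntu-style Apache setup (log paths vary by host):

# Server / PHP level: if the error shows up here, Laravel never ran.
sudo tail -n 50 /var/log/apache2/error.log

# Application level: errors here come from Laravel's own error handler.
tail -n 50 storage/logs/laravel.log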
Second:
If Apache/PHP runs well, then PHP executes the Laravel application lifecycle... if Laravel encounters a problem, you will see the usual output of the Laravel error handler.
I think this is a must-know for working with web apps in general, because developers often fail to notice whether the problem was with Laravel or with PHP / the server itself.
As a side note, that's why it is important to know how to choose a proper hosting service for Laravel.
Thanks for reading.
You can try to clear the cache, like this:
php artisan optimize
Or you can manually delete the cache files, which are located in the bootstrap/cache folder: delete all files inside it except the .gitignore file and your issue should be fixed (a shell sketch of this cleanup follows).
If this error shows up again on the live server, you can update your Composer dependencies and then run
php artisan optimize
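A minimal sketch of that manual cleanup, run from the project root:

# Remove the compiled cache files but keep the .gitignore.
find bootstrap/cache -type f -name '*.php' -delete

# Related artisan commands that clear the various Laravel caches:
php artisan config:clear
php artisan route:clear
php artisan view:clear
php artisan cache:clear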
First of all, if you give any of your folders 777 permissions, you are allowing ANYONE to read, write and execute any file in that directory. What this means is that you have given anyone (any hacker or malicious person in the entire world) permission to upload any file, virus or other file, and THEN execute that file. So please be careful: IF YOU ARE SETTING YOUR FOLDER PERMISSIONS TO 777 YOU HAVE OPENED YOUR SERVER TO ANYONE THAT CAN FIND THAT DIRECTORY. Please read the full explanation from here.
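A safer setup is to make only the writable directories owned by the web server user; a sketch assuming that user is www-data (check what your host actually uses):

# Only storage/ and bootstrap/cache/ need to be writable by the web server.
sudo chown -R www-data:www-data storage bootstrap/cache

# 775 on directories and 664 on files is enough -- avoid 777.
sudo find storage bootstrap/cache -type d -exec chmod 775 {} \;
sudo find storage bootstrap/cache -type f -exec chmod 664 {} \;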
Second, here are the detailed steps I use to deploy my projects to the server (a shell sketch of the copy and permission steps follows the list):
Run npm run production, then update your GitHub repo.
Clone the project from GitHub onto the server; clone it into an outside folder (not the public_html folder).
Run cd <cloned folder name>.
Run composer install.
Run npm install.
Copy and configure the .env file in the cloned folder (be sure the name is .env, not env).
Copy all content of cloned_project_folder_name/public to the public_html folder.
In index.php inside the public_html folder, edit as below:
$app = require_once __DIR__.'/../cloned_project_folder_name/bootstrap/app.php';
require __DIR__.'/../cloned_project_folder_name/vendor/autoload.php';
Set up your .htaccess properly.
Change permissions to 755 for index.php and all files in the public_html folder.
Run composer install --optimize-autoloader --no-dev.
Run php artisan config:cache.
Run php artisan route:cache.
I think that states it all; hope it helps.
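A rough sketch of the copy and permission steps, assuming the clone lives at ~/cloned_project_folder_name and the web root is ~/public_html (hypothetical paths; adjust to your host):

# Copy the contents of the clone's public/ folder into the web root.
cp -r ~/cloned_project_folder_name/public/. ~/public_html/

# 755 on index.php and the other copied files, as described above.
chmod -R 755 ~/public_html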

DigitalOcean: I want to download the .env file from server where laravel application installed

I am working on a Laravel application on a DigitalOcean server. To speed up my programming I am going to clone my project to my local machine, but I cannot find the .env file on the server.
Can anyone please help me get the .env file from the DigitalOcean server?
The .env file is not tracked by git on the server, so when you git clone the repository from the server, it will not be included. Depending on the complexity of your project on the server, the .env file might include custom variables that are required for the project to function properly. So your best bet is to get a copy of the .env file from the server.
To manually copy the .env file from the server to your local machine, use scp or a similar tool:
$ scp username@server:/repo/.env /some/local/directory
Or, if you are using Laravel Forge to deploy to the DigitalOcean server, you can log in to Forge and copy the contents of the .env from the interface.
But don't forget to customize the .env afterwards for your local environment (database credentials, etc).
In a Laravel project, the .env file sits in the project root; that does not matter whether the project is local or live.
So if you cannot find the .env file on your DigitalOcean server, it is likely that you don't have it in your application.
What you can do is make a copy of .env.example and rename it to .env.
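A short sketch of that, run from the project root on the server (the key:generate step is an addition that a freshly created .env needs):

# Create the .env from the shipped example and generate an application key.
cp .env.example .env
php artisan key:generate

# Then fill in the real values (database credentials, APP_URL, etc.).
nano .env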

Composer on live server

I have moved my Composer-based project to a live server. Accountrix is complaining about my composer.json and composer.lock files. Do I need these files on the live server, or is it OK to delete them?
There is no general answer.
If you plan to use composer on that live server then yes you need these files.
If you just upload your project files to your live server and you don't plan to use composer directly on your live server, then you can delete these files safely. Your project will run without composer.
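For reference, keeping the files means you can run Composer on the live server like this; composer install reads both composer.json and composer.lock from the project root:

cd /path/to/project   # the root containing composer.json and composer.lock

# Install exactly the locked versions, without dev dependencies.
composer install --no-dev --optimize-autoloader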

Keeping PHPStorm files in sync with the ones generated on the server via php artisan

I am using Laravel with PHPStorm and a custom server that I connect to via SFTP. The problem is that, being SFTP, it's not kept in sync, so every time I generate files via a php artisan command, I have to download the file(s) with PHPStorm. I know that I can get around this by using Homestead and shared folders, but this project requires a custom VPS.
I know that no SFTP "drive" currently works well with Windows. Also, the server is remote, not on the same network, so Samba can't do the job.
Thank you!
This is a workflow I use; you may simply need to do the following, assuming you have already set up a default deployment server.
Editing remote files
If you are editing the remote files instead of a local copy, don't; instead:
create a local copy/git clone/etc. of your project files.
create a new phpstorm project with the local copy.
Setting up a sync
If you already are working off a local copy but just need sync setup:
ctrl+shift+a
type deployment
select options
change the option: Upload changed files automatically [..] to always
enable upload external changes
As an added bonus, this also automatically syncs assets from say gulp watch too.
If you haven't setup a deployment server
ctrl+shift+a
type deployment
select configuration
create a new server with your method of connecting to it.
enable it as the default server (last icon in the top left of the column)
Important: if you don't select the server as the default, it will not be able to auto-upload changes.
Also don't forget to set up the excludes in the configuration menu; I usually exclude bower_components and node_modules from deploying to my servers, and only send the built assets. (But it's up to you.)
EDIT: Don't run commands remotely, run them locally and let them sync back to the server.
I execute the artisan commands on both sides... I do it this way on my Linux machine:
<?php
// Drop the script name so only the artisan arguments remain.
unset($argv[0]);
$params = implode(' ', $argv);

// Run the command on the remote server first (sshpass supplies the password non-interactively).
$remoteOutput = shell_exec("sshpass -p password ssh -o StrictHostKeyChecking=no user@1.1.1.1 'php /path/to/artisan $params'");

// If the remote run produced output, run the same command locally as well.
if (!empty($remoteOutput)) {
    shell_exec("php artisan $params");
}
Save it and add it as a command-line tool in PHPStorm... on Windows I think you can use a PHP SSH library or something else.
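For example, if you save the script as artisan-both.php (a hypothetical name), you can invoke it with the same arguments you would pass to artisan, and they are forwarded to both machines:

php artisan-both.php migrate --force
php artisan-both.php make:model Post -m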
