Laravel Envoy on fortrabbit

I've set up a Laravel app on a fortrabbit server. I can do the following:
$ ssh user@server
$ cd htdocs
$ php artisan migrate
which works perfectly fine.
But I'm trying to use Envoy for tasks like this, so I've made a simple task:
@servers(['test' => 'user@server'])
@task('task:test', ['on' => 'test'])
cd htdocs
php artisan migrate
@endtask
However, I encounter the following issue when doing so:
$ envoy run task:test
[user@server]: \(><)/ Welcome to the rabbit hole. \(><)/
[user@server]: [PDOException] SQLSTATE[HY000] [2002] No such file or directory
It might be worth noting that the DB connection uses credentials from env variables set in the fortrabbit interface. If I SSH into the server and run env, all the variables are listed as they should be. But when doing
$ ssh user@server env
I only get a small portion of the fortrabbit server env variables.
I can't figure out what could be the cause of the error, and I haven't been able to find anything when asking Google. Could anybody shed a bit of light on this? Thanks a lot.

As @ukautz said, the session scripts are not executed by Envoy; that's why some of your environment variables are missing.
Here's what I did:
@servers(['dev' => 'user@server'])
@task('list')
. /etc/profile.envvars
env
@endtask
That showed all env variables, including those set by you on the dashboard!
Your script should work with this:
@servers(['test' => 'user@server'])
@task('task:test', ['on' => 'test'])
. /etc/profile.envvars
cd htdocs
php artisan migrate
@endtask
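The underlying issue is general: non-interactive SSH sessions (which Envoy uses) don't run login/profile scripts, so anything they export is missing. A minimal sketch of the same pattern outside Envoy (the profile path below is a throwaway stand-in for fortrabbit's /etc/profile.envvars):

```shell
# Non-interactive shells skip login scripts, so variables exported
# there are invisible unless we source the file ourselves.
profile=/tmp/profile.envvars.demo      # stand-in for /etc/profile.envvars
echo 'export APP_SECRET=demo-secret' > "$profile"

run_with_profile() {
    # Source the profile in the current shell, then run the given command.
    . "$profile" && "$@"
}

run_with_profile sh -c 'echo "APP_SECRET is $APP_SECRET"'
# prints: APP_SECRET is demo-secret
```

Without the sourcing step, the child command would see an empty APP_SECRET, which is exactly why artisan could not find its DB credentials.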

I suggest checking whether your environment detection works as expected:
$ php artisan env
If not, you can force the environment for artisan commands, like so:
$ php artisan migrate --env=test_server

Related

Sail - Initiate Xdebug session from command line

So the latest version of Sail makes it very easy to use Xdebug. Basically I just had to define SAIL_XDEBUG_MODE in the .env file, configure path mapping in PhpStorm, and activate listening - it works perfectly from the browser.
Now, how should I go about it if I want Xdebug to activate when I'm using the command line? Like when using artisan commands for seeding, or better yet when using custom artisan commands created to run scripts that update some data... I can't find any argument to add to my sail artisan myOwnCommand that would tell Xdebug it has to activate.
I'm working on Windows 11 with WSL2.
Thanks ahead!
Thanks to Derick's suggestion, I found a solution. Prepending the sail call with anything wouldn't help, since Sail runs scripts inside the Docker container, where your environment variable wouldn't be set. But since it was just a matter of setting an environment variable, it can easily be done in the docker-compose file.
I just had to add PHP_IDE_CONFIG: 'serverName=0.0.0.0' to the environment section of my Laravel service. Of course, replace 0.0.0.0 with your own server name. Then, instead of running sail artisan test or sail artisan my:command, you replace artisan with debug, as stated in the docs.
Now you can use the debug command to run with Xdebug (e.g. sail debug myOwnCommand).
Here is the documentation: https://laravel.com/docs/9.x/sail#xdebug-cli-usage
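For reference, the relevant docker-compose.yml fragment might look like this (a sketch: the service name laravel.test is Sail's default and may differ in your setup, and the serverName value must match the server configured in PhpStorm):

```yaml
services:
  laravel.test:                 # your Laravel service name may differ
    environment:
      # Tells Xdebug which PhpStorm server config (and therefore which
      # path mapping) to use for CLI runs; replace 0.0.0.0 with the
      # server name you configured in PhpStorm.
      PHP_IDE_CONFIG: 'serverName=0.0.0.0'
```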

Cannot write to images directory | cloud hosting with Laravel forge

I have a problem when deploying a Laravel application using Laravel Forge. I try to generate fake images using the Faker package in Laravel, but:
Cannot write to directory "/home/forge/my.domain/public/storage/images/products/cover_img"
at vendor/fakerphp/faker/src/Faker/Provider/Image.php:98
94▕ ) {
95▕ $dir = null === $dir ? sys_get_temp_dir() : $dir; // GNU/Linux / OS X / Windows compatible
96▕ // Validate directory path
97▕ if (!is_dir($dir) || !is_writable($dir)) {
➜ 98▕ throw new \InvalidArgumentException(sprintf('Cannot write to directory "%s"', $dir));
99▕ }
100▕
101▕ // Generate a random filename. Use the server address so that a file
102▕ // generated at the same time on a different server won't have a collision.
in the factory file,
'cover_img' => $this->faker->image(public_path('storage/images/products/cover_img'), 640, 480, null, false),
This is my first time using cloud hosting. When using shared hosting I can give permission to create a folder, or can create the folder manually. Please help me solve this problem. Thank you!
*** UPDATE ***
I changed a few lines of code and tried again:
if (!File::exists(public_path().'/storage/images/products/cover_img')) {
    File::makeDirectory(public_path().'/storage/images/products/cover_img', 0777, true);
}
Now it throws an error when deploying with Laravel Forge:
ErrorException
mkdir(): Permission denied
If someone can help me solve this, I'd really appreciate it.
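An alternative to creating the directory at runtime is to create it once over SSH as the deploy user, then let Faker write into it. A sketch of that idea (the /tmp path here is illustrative; on a real Forge box you would use the site path under /home/forge):

```shell
# Create the nested directory tree up front; -p creates missing parents
# and is a no-op if the directory already exists.
site=/tmp/demo-site                    # stand-in for /home/forge/my.domain
mkdir -p "$site/public/storage/images/products/cover_img"

# Make sure the user running PHP can write there (u+rwX adds execute
# only to directories, not plain files).
chmod -R u+rwX "$site/public/storage"

# Verify before letting Faker use it.
[ -w "$site/public/storage/images/products/cover_img" ] && echo writable
# prints: writable
```

If the web server runs as a different user than the deploy user, ownership (chown) rather than mode bits is usually the real fix.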
Finally, I've figured out the issue. I think I first went about it the totally wrong way, because I tried to seed during the Forge deployment with php artisan migrate:fresh --seed, and that's when the error appeared. Below is the wrong deploy script:
cd /home/forge/dev.mydomain.com
git pull origin $FORGE_SITE_BRANCH
$FORGE_COMPOSER install --no-interaction --prefer-dist --optimize-autoloader
( flock -w 10 9 || exit 1
    echo 'Restarting FPM...'; sudo -S service $FORGE_PHP_FPM reload ) 9>/tmp/fpmlock
if [ -f artisan ]; then
    $FORGE_PHP artisan migrate --seed
fi
So, finally, I just deployed with the default settings in Forge, logged in to the server using the terminal, then ran the php artisan migrate --seed command, and it succeeded without any error. These were the steps I followed:
Generate an SSH key using Git Bash:
ssh-keygen
Add the SSH key into Forge.
Run this command in the Git Bash terminal:
ssh forge@<server ip address>
Then install Oh My Zsh using this code:
sh -c "$(curl -fsSL https://raw.githubusercontent.com/ohmyzsh/ohmyzsh/master/tools/install.sh)"
Then go to my site directory on the host using the terminal:
cd dev.mydomain.com
And finally run the seed:
php artisan migrate --seed

Unable to run migration via artisan command

I have a Laravel 7 project and pushed it to a live server for production, but I can't run migrations successfully. I always get an 'access denied' error.
I can confirm that the command sees the .env file and the connection details are all correct. When I SSH into the server and run the mysql command using the same parameters saved in the .env file, the connection is successful. Adding the details into Workbench and Sequel Pro also works, so I am not sure why php artisan migrate doesn't work.
Run the following command:
php artisan tinker
Tinker is Laravel's own REPL.
It will prompt you to enter commands. Here you can check and print the values of the environment variables by passing the variable name to the env() method:
>>> env('DB_DATABASE')
and so on for the other DB parameters.
Hope this helps.
For more help you can check out the official GitHub repository of Tinker:
https://github.com/laravel/tinker
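To complement the Tinker check, here is a shell-side sketch that verifies the same variables are present in the CLI environment before running artisan (variable names follow Laravel's defaults; note that values Laravel loads only from .env won't appear in the shell environment, so this checks server-level variables):

```shell
# Fail fast when a required DB variable is missing from the environment;
# connecting with empty credentials is a common cause of "access denied".
check_db_env() {
    for key in DB_HOST DB_DATABASE DB_USERNAME DB_PASSWORD; do
        eval "value=\${$key:-}"        # portable indirect variable lookup
        if [ -z "$value" ]; then
            echo "missing: $key" >&2
            return 1
        fi
    done
    echo "all DB variables present"
}

# Demo with throwaway values:
export DB_HOST=127.0.0.1 DB_DATABASE=app DB_USERNAME=app DB_PASSWORD=secret
check_db_env
# prints: all DB variables present
```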

Laravel does not read env variables

Has anyone had issues with env variables? For some reason, the helper env('VARIABLE') returns null every time I use it. It happened very unexpectedly and I don't really know the reason. Restarting the Apache/IDE/computer does not help.
Run these commands:
composer dump-autoload
php artisan cache:clear
php artisan config:clear
php artisan view:clear
Now try to read
$value = env('VARIABLE_NAME');
If it's still not working, try another syntax to read the env variable:
$value = getenv('VARIABLE_NAME');
The solution is simple, but neither the IDE nor the debugger says anything about it; env() just returns null. It happens when you use php artisan config:cache. According to the documentation:
If you execute the php artisan config:cache command during your deployment process, you should be sure that you are only calling the env() function from within your configuration files.
Obviously I had env variables outside the config files, so after caching I was no longer able to use them there. Running php artisan config:clear puts it back to work.
What I've also found about env() is that it should only be used within config files. You can access env values from the rest of the project using the config() helper. Be sure to assign the value to a key in a config file, e.g. 'key' => env('CACHE_DRIVER').
What's more, you have to remember to run php artisan config:cache each time you change the .env file; Laravel will not load the new values until the config is re-cached. If the config is not cached, there's no need to do it.
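The behaviour can be illustrated with a small simulation (this is only an analogy, not Laravel's actual mechanism): once values are snapshotted to a cache file, later environment changes stay invisible until the snapshot is rebuilt or removed.

```shell
# Simulate config caching: snapshot a value to a file ("config:cache"),
# change the live environment, and show that reads still use the snapshot.
cache=/tmp/config-cache.demo
export CACHE_DRIVER=redis
echo "CACHE_DRIVER=$CACHE_DRIVER" > "$cache"   # like: php artisan config:cache

export CACHE_DRIVER=file                       # like: editing .env afterwards
cat "$cache"
# prints: CACHE_DRIVER=redis - the stale snapshot wins over the live value
# removing the file plays the role of: php artisan config:clear
```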
Please try this; it worked for me. Use Git Bash or cmd and paste this command:
$ rm bootstrap/cache/config.php
This command deletes the cached config file.
Just run
php artisan config:clear
It solved my issue.
In my case, it was just because I was using php artisan serve; just restart it - the serve command reads the env at startup and keeps those values.

Access denied 'homestead'@'localhost' when running a Laravel Command

I'm having some trouble getting Laravel to connect to my database when using a custom Artisan command.
I could post my command, but I'll skip to my DB settings, as I suspect that is what's wrong. In start.php I have:
$env = $app->detectEnvironment(array(
    'local' => array('homestead'),
));
and then I have no local/database.php file in my config. Instead I have a .env.local.php, which works great for everything except this. All database settings are set as 'DB_NAME' => getenv('DB_NAME'), etc.
When I run php artisan custom:command I get the following:
[Illuminate\Database\QueryException]
SQLSTATE[3D000]: Invalid catalog name: 1046 No database selected (SQL: select * from `users` where `paused_until` = 3)
Then if I run
php artisan fdb:reactivate-paused --env=local
I seem to get much closer but still get:
Access denied for user 'homestead'@'localhost' (using password: YES)
Is it that Laravel doesn't know to use the .env.local.php file when I run commands in the terminal? All my migrate and db:seed queries seem to work fine. Can anyone point me in the right direction?
Looking at it again this morning, I must have been 100% burnt out. I was running the command from my Local Machine, not the Homestead VM.
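A cheap guard against this class of mistake is to check the hostname before running environment-sensitive commands (a sketch; 'homestead' is the default Homestead hostname, adjust it to your box):

```shell
# Guard: only proceed when running on the machine the environment
# was configured for (e.g. inside the Homestead VM).
on_host() {
    [ "$(hostname)" = "$1" ]
}

if on_host homestead; then
    echo "inside the VM - safe to run artisan"
else
    echo "not on 'homestead' - run 'vagrant ssh' first" >&2
fi
```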
