Laravel queue job does not work in background

Hi! I have an application that has to send some emails on certain actions (such as user creation, etc.). The problem is that they are not running in the background; instead I have to wait until the process is done before it redirects me to another page.
I use the database queue driver, with Laravel 5.2.
My code for sending an email, for example after user creation:
$this->dispatch(new WelcomeEmail($user));
Artisan::call('queue:work');
where WelcomeEmail is the job that is pushed onto the queue. This kind of code is placed everywhere I want an email to be sent. What is wrong?

First, you do not want to use Artisan::call on queue commands from your dispatching code.
You should open your terminal, execute php artisan queue:listen --timeout=0 --tries=1, and leave it running.
Then you can visit the page where $this->dispatch (or, even better, the dispatch helper) is called. The code on that page should be:
dispatch(new WelcomeEmail($user));
On your production server, you should use supervisord to monitor your php artisan queue:listen command, to make sure it is up and running all the time.
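A minimal supervisord program entry might look like this (a sketch; the paths, program name, and log file location are placeholders, not from the original answer):

[program:laravel-worker]
command=php /path/to/your/app/artisan queue:listen --timeout=0 --tries=1
autostart=true
autorestart=true
redirect_stderr=true
stdout_logfile=/path/to/your/app/storage/logs/worker.log

With something like this in place, supervisord restarts the listener automatically if it ever dies.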
For further reading please visit: https://laravel.com/docs/5.2/queues

I don't know why changing .env is not enough to fix the issue, but after changing this line from
'default' => env('QUEUE_CONNECTION', 'sync'),
to
'default' => env('QUEUE_CONNECTION', 'database'),
in the config/queue.php file, everything works fine.
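One possible explanation (my assumption; the answer does not say): if the configuration has been cached with php artisan config:cache, Laravel stops reading .env at runtime, so a stale config cache would ignore the .env change until it is cleared or rebuilt:

php artisan config:clear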

I had a similar problem, but since it was a single job I didn't want a daemon running all the time; there is also the problem of long-running workers executing stale code after an update. So I solved it by running the command directly from PHP, like:
exec('nohup php /my_folder/artisan queue:work --once > /dev/null 2>&1 &');
This will process one job and then exit, without waiting for the result. But be careful with the Laravel log file permissions: under Linux, the OS user the command runs as can change depending on context and configuration.
Hope that can help someone.

Related

When and how do Laravel daily logging files get deleted?

I'm running Laravel 5.8 and using the built-in Laravel Log functionality, and I've created a few custom log channels.
I set each of them to keep daily log files for 30 days, like so:
'reconciliation' => [
    'driver' => 'daily',
    'path' => storage_path('logs/reconciliation/reconciliation.log'),
    'days' => 30,
],
However, the logs don't stay for 30 days, like I've set them to. They stay for 7.
I've heard that there is another configuration called log_max_files, which might be what I'm looking for.
But all the references to log_max_files are from Laravel 5.4 and below. And I know that the logging functionality was revamped in 5.6. So I'm not sure if log_max_files even works anymore and I'd like to test it.
I've created a new log file with a date far outside the current 7-day range and I want to confirm that it gets deleted when the logs get cleared.
I'm just not sure when Laravel clears out the old logs. I couldn't find any in-depth information like that in the docs and my Google searching hasn't turned anything up.
I've tried restarting my server, running php artisan config:clear and php artisan cache:clear with no luck.
Does anyone actually know the process by which these log files get deleted and when that process runs?
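One quick way to check what Laravel actually resolves at runtime (a sketch, assuming the channel is named reconciliation as in the config above) is to inspect the merged configuration in tinker:

php artisan tinker
>>> config('logging.channels.reconciliation.days')

If this doesn't print 30, the channel definition isn't being picked up (a cached config is one common cause).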

Laravel 5.8 Run PHP Artisan Command in Background while Clicking on a button from Blade View

I need to run a scheduler task from a blade view by clicking a button (Sync), and it should be processed in the background.
I have created an artisan command, php artisan projects:get, and scheduled it to run once a day via cron. In some cases, though, we need to run it at the user's choice: when they are logged in to the CMS, they should be able to click a Sync button to run it from there. I know I can run the command from the command line (a terminal via PuTTY or the cPanel terminal window), but the client can't log in to cPanel and run commands, so we need to give them a simple button that syncs in the background, using queues or Process from Symfony. Right now, when the user clicks that button, the request is delayed and they can't continue working on other things while it fetches all the projects from the APIs that I use in that command. We need a queue/background process.
php artisan projects:get
As you pointed out, you can run artisan commands from your PHP code, as described in the documentation.
Since the artisan command will likely take some time to execute, it is a good practice to use a queue for this.
You said in the comments that you are on XAMPP. Locally, you need to run the php artisan queue:work command once you have started XAMPP. After you execute the command, the worker will pick up jobs and execute them. However, you first need to configure the queue; this will get you up and running. On a production server, you need to configure a supervisor to run the queue command.
You can run an artisan command programmatically like this:
Route
Route::get('/run/command', 'SomeController@runCommand');
SomeController
public function runCommand()
{
    $exitCode = \Illuminate\Support\Facades\Artisan::call('projects:get');
    return $exitCode;
}
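Note that Artisan::call runs the command synchronously inside the web request, so the button would still block. A minimal sketch of wrapping it in a queued job instead, so a queue worker runs it in the background (SyncProjects is a hypothetical job name, not from the original answer):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Artisan;

class SyncProjects implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // Runs on the queue worker, not in the web request
        Artisan::call('projects:get');
    }
}

The controller then dispatches the job and returns immediately:

public function runCommand()
{
    dispatch(new \App\Jobs\SyncProjects());
    return back()->with('status', 'Sync started');
}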

How to find out if a Laravel Command has been started via the scheduler?

When the Laravel scheduler starts a command like:
$schedule->command('test:testcommand')->hourly();
I need to find out, inside the command, whether it has been started via
artisan test:testcommand
or
artisan schedule:run
I looked into $_SERVER['argv'] but I don't see any info that helps me identify this.
Maybe Laravel has some fancy internal functions, but I wasn't able to find them.
The only way to do this is to communicate through arguments. So you could do as follows:
$schedule->command('test:testcommand',['--scheduler'])->hourly();
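For this to work, the command also has to declare the option and check it in handle(); a minimal sketch (the --scheduler option name is just the example used above):

protected $signature = 'test:testcommand {--scheduler}';

public function handle()
{
    if ($this->option('scheduler')) {
        // started via schedule:run
    } else {
        // started manually
    }
}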
You can do this with events:
php artisan make:event OnCommandRun
Then in the handle() of your testcommand class fire it:
event(new OnCommandRun());
Then within the event handle() function do whatever you want
More on Events

Laravel artisan command from controller

I update my .env file using a function in my controller.
After I save the settings I need to update, I call Artisan::call('config:cache') to clear the cache of my site's configuration.
Everything works fine on localhost, but when I try to clear config cache on production, it doesn't work. (No warnings or errors.)
I even tried with --no-interaction option attached to this CLI command.
Did anyone have this problem and know what causes it?
Check your PHP security settings and make sure you can run the exec, passthru, and shell_exec functions on your server (shared hosts often block them via the disable_functions directive in php.ini).
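A quick way to check (a sketch; run it with the same PHP binary and configuration the site uses):

php -r "var_dump(ini_get('disable_functions'));"

An empty string means nothing is blocked at that level.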

Getting strange errors installing a Laravel application

The first problem I'm running into is that when installing I receive a MySQL error stating that a table cannot be found. Of course it can't: I haven't finished installing the dependencies, much less run the migrations. The error was being triggered by an Eloquent query in a view composer. After commenting out the entirety of my routes file, Composer let me continue.
I proceeded to uncomment my routes file, and I got the error once again when trying to run any artisan commands (I can't migrate my database because I haven't migrated my database). Repeating the solution from step one, I've migrated my database.
Artisan serve is now serving me my layout file in the terminal and exiting. I'm at a bit of a loss to troubleshoot this. I assumed it was possibly a plugin; disabling plugins one by one results in:
Script php artisan clear-compiled handling the pre-update-cmd event returned with an error
and being served up my layout file in the terminal.
It seems that the error is directly related to this function in my routes file:
View::composer('layouts.main', function ($view) {
    $things = Thing::where('stuff', 1)->orderBy('stuff')->get();
    $view->with(compact('things'));
});
This isn't a new addition to the application, however, so the underlying cause is coming from somewhere else.
As I said in the comment, if you are getting database errors on the production server but not locally, then:
check the database credentials; if they are OK, then...
check for config differences between the environments.
Using a profiler (any) will let you know which environment you are in.
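To see which environment the application is actually running in, Laravel's built-in env command works (core Laravel, not specific to this answer):

php artisan env

or, from code, App::environment().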
