Recently added json language file values are not updated in email blade - laravel

I send mail as a cron job with Laravel. When I use the most recently added value from my resources/lang/de.json file in the mail Blade template (resources/views/mails/...blade.php), the output looks as if that key is not defined. However, if I use the same key in a Blade file I created earlier, it works without errors. Likewise, the keys I added to the same file (de.json) earlier work without errors in the same mail Blade file.
Thinking it was some kind of cache issue, I did some research and found that restarting the queue worker might fix the problem. However, even though I ran
php artisan queue:restart
both locally and on the server over SSH, there was no improvement.
Do you have any ideas?

Since queue workers are long-lived processes, they will not notice changes to your code without being restarted. So, the simplest way to deploy an application using queue workers is to restart the workers during your deployment process. https://laravel.com/docs/9.x/queues#queue-workers-and-deployment
However, php artisan queue:restart only instructs all queue workers to gracefully exit after they finish processing their current job, so that no existing jobs are lost. And I have seen plenty of reports of this command failing to actually restart the worker after a deploy.
So, the simplest way:
try to stop the worker manually (Ctrl+C)
start the worker again with php artisan queue:work.
This might help.
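If you want to rule out the translation file itself, a quick check like the sketch below (the key name is hypothetical) can be run in php artisan tinker: if it prints the German text there but the queued mail still shows the raw key, the stale long-lived worker process is almost certainly the culprit.

// Run in php artisan tinker; the key name is only an example.
app()->setLocale('de');

// Should print the German translation if resources/lang/de.json contains the key.
echo __('My recently added key');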

Related

Running commands from Controller async

There is a migration task: the user uploads a file to the server, then it should be saved and a migration command should be run asynchronously. The first part works well; the issue is with the second part.
I've tried putting all the code into a console command and running it with
Artisan::call('user:migrate', ['user_id' => $userId]);
or
Artisan::queue('user:migrate', ['user_id' => $userId]);
The script works, but not asynchronously; the controller's function waits for it to finish.
Also I've tried to create a Job and call it via:
$this->dispatch(new UserMigration($user));
and got the same result: the script works, but not asynchronously. Please help me understand how queues work and which approach is better for my task.
I haven't created any queue migrations or configuration, because all I need from this step is the async call.
In order to run tasks asynchronously, the general idea in Laravel is to push jobs to a queue (a database table, for instance) and have a background process pick them up.
See https://laravel.com/docs/8.x/queues for information directly from the source.
You can start a queue worker using:
php artisan queue:work
Note that this is an ongoing process that doesn't stop unless it's told to do so. This means that any changes you make to the code will only be reflected once you restart that queue worker. It is therefore important to run php artisan queue:restart (or kill and restart the running task) when you deploy your code.
So now your queue worker is running, you can for instance queue an email to be sent (like upon registration), and your controller will respond immediately instead of having to wait for the email to be sent.
Most if not all info can be found in the link above. If you are going to have lots and lots of background tasks, take a look at Laravel Horizon.
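As a rough illustration of that flow, here is a minimal sketch of a queued job, assuming a reasonably recent Laravel version with a database queue connection, the jobs table migrated, and a worker running. UserMigration and the user:migrate command are taken from the question; the body is only an example.

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Artisan;

class UserMigration implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $user;

    public function __construct($user)
    {
        $this->user = $user;
    }

    public function handle()
    {
        // The heavy work runs inside the queue worker, not in the web request.
        Artisan::call('user:migrate', ['user_id' => $this->user->id]);
    }
}

In the controller, UserMigration::dispatch($user); (or dispatch(new UserMigration($user));) then only stores a row in the jobs table and returns immediately. Note that with QUEUE_CONNECTION=sync (or QUEUE_DRIVER=sync on older versions) the job is executed inline, which is exactly the "works but not async" behaviour described in the question.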

Job not dispatching to jobs table Laravel

I'm dispatching a job, which should go to the jobs table according to my .env config (QUEUE_DRIVER=database).
But what happens is that nothing appears in the jobs table, and the job doesn't even run in sync mode, for example. I've watched queue:listen and the Laravel log file, and the failed_jobs table is empty as well. Please help me, my job is gone.
Note: I'm running artisan config:clear after changing the .env file, and then I restart the PHP-FPM service.
One thing I noticed is that when I set QUEUE_DRIVER to sync, it runs all the jobs I've dispatched, even though they didn't go to the jobs table.
Am I missing something?
When you set QUEUE_DRIVER to sync, dispatched jobs are run immediately and are not inserted into the jobs table.
So you should set it to database to get the desired behaviour.
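If the driver really is database, two things are worth checking: the jobs table must exist (created with php artisan queue:table followed by php artisan migrate), and the cached config must actually reflect the .env change. Below is a small sketch, with a hypothetical job class, that pins the connection explicitly to rule out a lingering sync default.

use App\Jobs\ExampleJob; // hypothetical job class using the Queueable trait

// Force the database connection for this dispatch, bypassing the default
// taken from QUEUE_DRIVER / QUEUE_CONNECTION. If a row now shows up in the
// jobs table, the problem is the cached/default configuration, not the job.
dispatch((new ExampleJob)->onConnection('database'));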

How to stop killing the queue command in laravel?

I want the php artisan queue:work command to stay active and not get killed for a long time...
When we have queues and we run the server, if we only use the command php artisan queue:work, it can get killed for some reason and our queues stop working. What should I do in this case?
Your question is quite ambiguous, but I'll assume that you need the command to keep running while still having access to that same command-line session, without closing or stopping the process.
I would recommend using screen for this, which effectively lets you have virtual terminals open within the one you already have.
Give the following article a read
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-screen-on-an-ubuntu-cloud-server

Notifications not added to queue

I've provisioned a Laravel Forge server and configured it to use redis for queues via .env:
QUEUE_DRIVER=redis
My settings for Redis in both config/queue.php and config/database.php are the defaults found in a new laravel project.
The problem is that when a mail notification is triggered, it is never added to the queue. It never gets to the processing stage.
I've tried using Forge's queue interface as well as SSHing into the server and running a simple
php artisan queue:listen
without any parameters. In both cases, no results (using the artisan command confirms no job is added to the queue).
Interestingly, I tried Beanstalkd:
QUEUE_DRIVER=beanstalkd
and suffered the same problem.
As a sanity check, I set the queue driver to sync:
QUEUE_DRIVER=sync
and the notification was delivered without issue, so there isn't a problem with my code in the notification class; the issue is somewhere between calling the notify method and the job being added to the queue.
The same configuration running locally works fine. I can use
php artisan queue:listen
and the notifications go through.
After an insane amount of time trying to address this, I discovered it was because the app was in maintenance mode. To be fair, the documentation does state that queued jobs aren't fired in maintenance mode, but unless you knew maintenance mode was the culprit you probably wouldn't be looking in that section.
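For context, a mail notification only goes onto the queue at all if it implements ShouldQueue; the sketch below (hypothetical class name) shows the usual shape. Per the docs, those queued jobs are not handled while the application is in maintenance mode, so php artisan up has to be run before anything moves.

<?php

namespace App\Notifications;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Notifications\Messages\MailMessage;
use Illuminate\Notifications\Notification;

// Hypothetical notification: implementing ShouldQueue is what makes
// $user->notify(new InvoicePaid) push a job onto the configured queue
// (redis, beanstalkd, ...) instead of sending the mail inline.
class InvoicePaid extends Notification implements ShouldQueue
{
    use Queueable;

    public function via($notifiable)
    {
        return ['mail'];
    }

    public function toMail($notifiable)
    {
        return (new MailMessage)->line('Your invoice has been paid.');
    }
}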

How to fire Laravel Queues with beanstalkd

I'm pretty new to the whole queued-jobs thing in Laravel 4. I have some process-heavy tasks I need the site to run in the background after they are fired by the user performing a particular action.
When I was doing the local development for my site I was using this:
Queue::push('JobClass', array('somedata' => $dataToBeSent));
And I was using the local "sync" driver to do it. (The jobs would just fire automatically, impacting the user experience, but I assumed that when going into production I could switch it to beanstalkd and they would then be run in the background.)
Which brings me to where I'm at now. I have beanstalkd set up with the dependencies installed with composer and the beanstalkd process listening for new jobs. I installed a beanstalk admin interface and can see my jobs going into the queue, but I have no idea how to actually get them to run!
Any help would be appreciated, thanks!
This is actually a really badly documented feature in Laravel.
What you actually need to do is have JobClass.php in a folder that is autoloaded. I use app/commands, but it can also be in app/controllers or app/models if you like. The class also needs a fire method that takes the $job and $data arguments, as in the sketch below.
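A minimal Laravel 4 style handler matching the Queue::push('JobClass', ...) call above might look like this (class name and data key taken from the question; the body is illustrative):

<?php

// Lives somewhere that is autoloaded, e.g. app/commands/JobClass.php
class JobClass {

    // The queue worker calls fire() with the job instance and the payload
    // that was passed to Queue::push().
    public function fire($job, $data)
    {
        $somedata = $data['somedata'];

        // ... do the process-heavy work here ...

        // Tell beanstalkd the job has been handled so it is removed
        // from the queue instead of being retried.
        $job->delete();
    }

}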
To run these, simply execute php artisan queue:listen --timeout=60 in your terminal, and it will keep emptying the queue until it is empty or it has been running for longer than 60 seconds. (Small note: the timeout is the time limit for starting jobs, so it may run for 69 seconds if one job takes 10 seconds.)
If you only want to run 1 job (perfect for testing), run php artisan queue:work
There are tools like Supervisord that make sure your job handlers keep running, but I recommend just making a cron task that starts every X minutes, based on how fast the data needs to be processed and how much data comes in.
Keep in mind that you need to use the full path to artisan:
php /some/path/to/artisan queue:work
