I'm facing an issue in my production environment. One specific job does not work as expected when it is dispatched via the scheduler (php artisan schedule:run). The queue worker does not appear to run the code in the job's handle() method; instead, the job is simply marked as completed in Horizon. The strange thing is that if I open Laravel Tinker on the production server and push the job onto the queue manually, it works as expected.
See below for my setup and code snippets.
Does anyone have an idea what the issue is? No code around this specific job has been touched in months, and the issue just showed up out of nowhere last Friday. The server, Docker, and Horizon have all been restarted several times without any change in behavior.
Server Setup
Laravel Version: v8.78.1
Laravel Horizon Version: v5.7.18
Docker PHP Image: php:8.1.0-fpm-alpine3.15
App\Console\Kernel.php
protected function schedule(Schedule $schedule)
{
$schedule->job(new AdExportFile(
fileName: 'adwords.csv',
daysToExport: 10,
header: [
'Google Click ID',
'Conversion Name',
'Conversion Time',
'Conversion Value',
'Conversion Currency'
],
timezone: config('app.timezone'),
origin: Visit::SOURCE_GOOGLE
))->hourly();
}
App\Jobs\AdExportFile.php
class AdExportFile implements ShouldQueue
{
use Dispatchable;
use InteractsWithQueue;
use Queueable;
public const QUEUE = 'default';
/**
* Create a new job instance.
*
* @return void
*/
public function __construct(
protected string $fileName,
protected int $daysToExport,
protected array $header,
protected string $timezone,
protected string $origin,
protected string $delimiter = ',',
protected int $minBaseCommission = 5000
) {
$this->onQueue(static::QUEUE);
\Log::info('AdExportFile: Running __construct');
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
\Log::info('AdExportFile: Running the handle function');
}
}
Log output
AdExportFile: Running __construct
Edit: This has now spread to a second job, one that is scheduled via the "$schedule->call" method.
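One thing worth ruling out is a mismatch between where the scheduler pushes the job and where Horizon listens. The $schedule->job() helper accepts an explicit queue and connection as its second and third arguments, which makes the dispatch target unambiguous (a sketch; 'redis' is an assumed connection name, use whatever Horizon's supervisors are actually configured for):

```php
// App\Console\Kernel::schedule() with the queue and connection pinned.
// 'default' and 'redis' are assumptions; match Horizon's supervisor config.
$schedule->job(
    new AdExportFile(/* same arguments as above */),
    'default',  // queue
    'redis'     // connection
)->hourly();
```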
Related
I have always used events and listeners to add tasks to the queue. Now I'm trying to use jobs directly. I do it like this:
My job:
class eventJob implements ShouldQueue
{
use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;
public $message;
/**
* Create a new job instance.
*
* @return void
*/
public function __construct($message)
{
$this->message = $message;
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
Log::alert($this->message);
}
}
My .env file: QUEUE_CONNECTION=database
In my controller, I dispatch the job like this:
eventJob::dispatch('my message');
A new record appears in the jobs table, and to execute it I run php artisan queue:work.
The record is removed from the jobs table, but nothing appears in the log file.
I tried throwing an exception, throw new \Exception("Error Processing the job", 1);, in both the handle method and the constructor, but nothing is written to the failed_jobs table, so I assume that neither the handle method nor the constructor executes.
I also tried running my job like this:
$job = new eventJob('my test message'); dispatch($job);
But it does not change anything
I don't know why, but when I changed config/queue.php from 'default' => env('QUEUE_CONNECTION', 'sync') to 'default' => env('QUEUE_CONNECTION', 'database'), everything started working as it should.
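For reference, the line in question is the fallback default in config/queue.php; when QUEUE_CONNECTION is missing from the environment (or a stale cached config is in effect), Laravel silently falls back to the second argument of env():

```php
// config/queue.php
// 'sync' runs jobs inline in the dispatching process, so nothing is
// ever written to the jobs table and queue:work has nothing to do.
'default' => env('QUEUE_CONNECTION', 'database'),
```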
Currently I have a cron job that calls a command which adds a job to my queue.
This works normally up to a point; then the command runs but doesn't add anything to the queue, and I have to log into the server and run php artisan config:clear to get everything working again.
Does anyone have an idea what it could be? I'm using Forge for server deployments and management; my queue uses the Redis driver, with Laravel 9, Horizon, Octane, PHP 8.1, and MySQL.
Just to be clear: my problem is not in running the jobs. Once a job arrives in the queue, Horizon processes it perfectly. The problem is adding items to the queue: when the cron fires, the command suddenly no longer finds the settings for which queue connection it should use and adds nothing to the queue. :(
Example of the command run via crontab:
namespace App\Console\Commands;
use App\Jobs\MyJob;
use Illuminate\Console\Command;
class MyCommand extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'cron:myCommand';
/**
* The console command description.
*
* @var string
*/
protected $description = '';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return int
*/
public function handle()
{
MyJob::dispatch()->onQueue('my_queue');
return Command::SUCCESS;
}
}
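This "works until it suddenly stops" pattern is often a stale cached configuration: once config:cache has run, changes to .env are ignored, and a cache built at the wrong moment can lose the queue settings. A hedged deployment sketch (assuming the Forge deploy script can be edited) is to rebuild the cache and restart the long-lived processes on every deploy:

```shell
php artisan config:cache       # rebuild the cached config from the current .env
php artisan queue:restart      # workers restart after finishing their current job
php artisan horizon:terminate  # Horizon exits; the process monitor restarts it
```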
On my application, users have lists of emails they can send to. Their accounts have settings for the time of day they want emails automatically sent and the timezone they're in.
I would like to test certain scenarios around when my queues are triggered, since each user's send time may differ drastically.
I'd like to globally set a fake time with Carbon.
In public/index.php, I tried to set:
$time = Carbon::create('2020-09-16 00:00:00');
Carbon::setTestNow($time);
but pieces of my application are not affected.
Is there a global way to set a fake time?
Original question below:
On my application, users have lists of emails they can send to. Their accounts have settings for the time of day they want emails automatically sent and the timezone they're in.
I have a command that will trigger an event that sends email.
Inside the listener, the handle method looks like:
public function handle(ReviewRequested $event)
{
$time = Carbon::create(2020, 9, 15, 0);
Carbon::setTestNow($time);
$reviewRequest = $event->reviewRequest;
Log::info('email sending at ' . $reviewRequest->sent_at . ' and current time is ' . Carbon::now());
Mail::to($reviewRequest->customer->email)
->later($reviewRequest->sent_at, new ReviewRequestMailer($reviewRequest));
}
Note that I'm faking the time with Carbon and setting it to midnight. In this example, the emails should be sent at 9 AM. The logged info is as follows:
local.INFO: email sending at 2020-09-15 09:00:00 and current time is 2020-09-15 00:00:00
So the current time is 12 AM and I'm queuing these up to be sent at 9 AM.
As soon as I run php artisan queue:work, the pending jobs (emails) are immediately run and sent. Why is this happening? They should remain queued until 9 AM.
Perhaps queuing uses the system time and doesn't care what I set in Carbon? How can I resolve this?
Edit: I forgot to mention that I'm using Redis
Check which queue driver you're using in your .env file. QUEUE_CONNECTION=sync does not allow any delaying (sync stands for synchronous).
The quickest way to fix this would be doing the following:
change the driver to database QUEUE_CONNECTION=database
clear the cached configuration php artisan config:clear
publish the migration for the jobs table php artisan queue:table
migrate this new table php artisan migrate
After following these steps, you will have delayed execution in your queues when you run php artisan queue:work.
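With a driver that supports delays in place, delayed execution can be exercised like this (a sketch; the SendReviewRequest job class is hypothetical, while ReviewRequestMailer is taken from the question):

```php
// Relative delay: the worker holds the job until the timestamp passes.
SendReviewRequest::dispatch($reviewRequest)
    ->delay(now()->addHours(9));

// Queued mail with an absolute send time, as in the question:
Mail::to($reviewRequest->customer->email)
    ->later($reviewRequest->sent_at, new ReviewRequestMailer($reviewRequest));
```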
I think you should use a Laravel cron job (scheduled command) for this purpose. Create a file at app/Console/Commands/YourCronJobFile.php:
<?php
namespace App\Console\Commands;
use App\TestingCron;
use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;
class TestingCronJob extends Command
{
/**
* The name and signature of the console command.
*
* @var string
*/
protected $signature = 'send:Mail';
/**
* The console command description.
*
* @var string
*/
protected $description = 'This command is use for test cron jobs.';
/**
* Create a new command instance.
*
* @return void
*/
public function __construct()
{
parent::__construct();
}
/**
* Execute the console command.
*
* @return mixed
*/
public function handle()
{
DB::table('testing_cron')->insert(['created_at' => now(),'updated_at' => now()]);
}
}
Then go to app/Console/Kernel.php:
<?php
namespace App\Console;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;
class Kernel extends ConsoleKernel
{
/**
* The Artisan commands provided by your application.
*
* @var array
*/
protected $commands = [
Commands\TestingCronJob::class
];
/**
* Define the application's command schedule.
*
* @param \Illuminate\Console\Scheduling\Schedule $schedule
* @return void
*/
protected function schedule(Schedule $schedule)
{
$schedule->command('send:Mail')->dailyAt('09:00');
}
/**
* Register the commands for the application.
*
* @return void
*/
protected function commands()
{
$this->load(__DIR__.'/Commands');
require base_path('routes/console.php');
}
}
https://laravel.com/docs/7.x/scheduling
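For the dailyAt('09:00') schedule to actually fire, the server also needs the single cron entry from the scheduling docs (the project path is a placeholder):

```shell
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
```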
I know, the question is very strange...
Scenario:
I have a Job class that sends an email, but the content of this email varies: the email template is selected before the job is dispatched.
I don't know if this is true, but apparently Laravel keeps a cache of the code it ran the first time. Even after changing the values of the properties, the job sends exactly the same email.
If this is the case, I would like to know how to use the same job class to send different emails, or what the best alternative would be.
app/Jobs/SenderEmail001.php
/**
* Create a new job instance.
*
* @return void
*/
public function __construct($template_id, $subject)
{
$this->template_id = $template_id;
$this->subject = $subject;
}
/**
* Execute the job.
*
* @return void
*/
public function handle()
{
$template = Template::findOrFail($this->template_id);
\Mail::send([], [], function($message) use ($template)
{
$message
->replyTo('source@domain.com', 'Email Source')
->from('source@domain.com', 'Email Source')
->to('target@domain.com', 'Email Target')
->subject($this->subject)
->setBody($template->markup, 'text/html');
});
}
MyController
\App\Jobs\SenderEmail001::dispatch(6, 'subject subject subject')
->delay(now()->addSeconds(100))
->onQueue('default');
Because queue workers are long-lived processes, they will not pick up code changes without being restarted. To gracefully restart the workers during deployment, run:
php artisan queue:restart
See more: https://laravel.com/docs/5.7/queues#queue-workers-and-deployment
UPDATE: This has been narrowed down to beanstalkd; the sync driver works.
I am receiving the following error when attempting to run queued commands in my production environment:
exception 'ErrorException' with message 'unserialize(): Function spl_autoload_call() hasn't defined the class it was called for'
in /home/forge/default/vendor/laravel/framework/src/Illuminate/Queue/CallQueuedHandler.php:74
I have tried both the beanstalkd and database drivers, no change. For simplicity, I am using the following command:
<?php namespace App\Commands;
use App\Commands\Command;
use App\User;
use Illuminate\Queue\SerializesModels;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Contracts\Bus\SelfHandling;
use Illuminate\Contracts\Queue\ShouldBeQueued;
class TestQueueCommand extends Command implements SelfHandling, ShouldBeQueued {
use InteractsWithQueue, SerializesModels;
/**
* #var User
*/
private $user;
/**
* Create a new command instance.
*
* @param User $user
*/
public function __construct(User $user)
{
//
$this->user = $user;
}
/**
* Execute the command.
*
* @return void
*/
public function handle()
{
\Log::info("You gave me " . $this->user->fullName());
}
}
Dispatch code:
get('queue-test', function()
{
Bus::dispatch(new TestQueueCommand(User::first()));
});
This works in my Homestead environment but fails in production (DigitalOcean, Forge). I have several beanstalkd workers, and I have tried restarting them. I have also run php artisan queue:flush.
Here is the code where the error is occurring (from source):
/**
* Handle the queued job.
*
* @param \Illuminate\Contracts\Queue\Job $job
* @param array $data
* @return void
*/
public function call(Job $job, array $data)
{
$command = $this->setJobInstanceIfNecessary(
$job, unserialize($data['command'])
);
$this->dispatcher->dispatchNow($command, function($handler) use ($job)
{
$this->setJobInstanceIfNecessary($job, $handler);
});
if ( ! $job->isDeletedOrReleased())
{
$job->delete();
}
}
In the past, I ran into a similar issue while unserializing. The problem was the default beanstalkd job size (65,535 bytes), which might not be big enough if the class being serialized contains lots of properties that need to be kept, since those push the serialized string past the 64 KB limit.
To solve this, try raising the size to 131,072 or even 262,144 bytes using the -z option in the configuration file (/etc/default/beanstalkd):
BEANSTALKD_EXTRA="-z 262144"
After that, you should restart the service.
Also note that the configuration file path might differ depending on the distribution you're using.
And since you're using Digital Ocean, you might find their documentation useful.
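To check whether a payload is actually hitting that limit, the serialized size can be measured directly. A minimal sketch, with a made-up class standing in for the real command:

```php
<?php

// Hypothetical stand-in for a queued command whose serialized form may
// exceed beanstalkd's default 65,535-byte job size.
class FakeCommand
{
    public $rows;

    public function __construct(array $rows)
    {
        $this->rows = $rows;
    }
}

// 5,000 x 32-byte strings serialize to well over 64 KB.
$command = new FakeCommand(array_fill(0, 5000, str_repeat('x', 32)));
$size = strlen(serialize($command));

echo $size > 65535
    ? "payload exceeds the default beanstalkd job size\n"
    : "payload fits in the default job size\n";
```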