Queue doesn't start in Laravel

I tried:
php artisan queue:listen
But the result is empty; the command just sits there with no output.
What should happen?
I want it to execute this code:
$job = (new SendEmail())->delay(10);
$this->dispatch($job);

That's exactly what should happen: it means the listener is waiting for something to be pushed onto the queue.
If you carry out an action that pushes something onto the queue, through an event or job etc., then you will see something like:
-bash-4.1$ php artisan queue:listen
[2016-07-22 09:27:57] Processed: App\Listeners\Users\SendWelcomeEmail#handle
Have you definitely set up the correct queue driver (e.g. database) in your .env or config/queue.php file?
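For example, to use the database driver (a minimal sketch; QUEUE_DRIVER is the Laravel 5.x setting, and note that the default sync driver runs dispatched jobs inline, so queue:listen never prints anything):
# .env
QUEUE_DRIVER=database
With the database driver you also need the jobs table, which artisan can generate:
php artisan queue:table
php artisan migrate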

Related

How to run different Beanstalkd Laravel queues from the same server?

I have two different Laravel queues on the same server. In my Supervisord.d folder I have two ini files for those queues. The job names differ between the queues. But every time I run a job and expect the result from one queue, the other queue also interferes. Here is a sample of the ini files:
[program:queue_runner]
command = php /path_to_prod/artisan queue:work --daemon --queue=default,smsInt,smsIntLow --tries=1 --timeout=30
stdout_logfile = /path_to_prod/storage/logs/supervisor.log
redirect_stderr = true
numprocs = 5
process_name = %(program_name)s%(process_num)s

[program:queue_runner_test]
command = php /path_to_test/artisan queue:work --daemon --queue=default,smsIntTest,smsIntTestLow --tries=1 --timeout=30
stdout_logfile = /path_to_test/storage/logs/supervisor.log
redirect_stderr = true
numprocs = 50
process_name = %(program_name)s%(process_num)s
Could you please help me solve it?
Found the solution to my problem. The jobs were being dispatched from the test site on the smsIntTest queue and from the other site on the smsInt queue from the beginning, but they were getting picked up by the wrong workers every time.
As the following post suggested: Why is Laravel or Beanstalkd skipping jobs?
I assigned 'queue' => 'smsInt' in the 'connections' array of the app/config/queue.php file for one site, and 'queue' => 'smsIntTest' for the other one. This solved the problem.
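In other words, the relevant part of app/config/queue.php looks something like this on the production site (host and ttr values here are assumptions; only the 'queue' key matters for the fix):
'connections' => array(
    'beanstalkd' => array(
        'driver' => 'beanstalkd',
        'host'   => 'localhost',  // assumed
        'queue'  => 'smsInt',     // 'smsIntTest' on the test site
        'ttr'    => 60,
    ),
),
Since both sites apparently share one Beanstalkd server, jobs pushed without an explicit queue name land on the connection's default tube, where either site's workers can grab them; giving each site its own default queue keeps them apart.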

Laravel Jobs: Can't delete file after queue processed

I just can't figure out why this is happening. When I dispatch a job which uploads a file to my Amazon S3, I want to delete it and update a relation (channel). Everything works except that I can't delete the file. I can't even delete it manually; Windows says some process is still using it, and to delete it manually I have to stop the queue worker first. The weird part is that the file is uploaded to my S3, the relation is updated in the database, and the queue worker in the terminal says the job was processed. There are no failed or running jobs in the jobs table. Does anyone know what's going on?
My job's handle() method is below.
public function handle()
{
    $path = storage_path() . "/uploads/" . $this->fileId;
    $fileName = $this->fileId . ".png";

    if (Storage::disk("s3images")->put("profile/" . $fileName, fopen($path, "r+"))) {
        File::delete($path);
    }

    $this->channel->image_filename = $fileName;
    $this->channel->save();
}
Try
unlink($path)
It worked for me.
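A likely explanation for the lock (my reading, not part of the original answer): the stream opened with fopen($path, "r+") is never explicitly closed, so the long-lived worker process keeps a handle on the file and Windows refuses to delete it until the worker exits. Closing the stream before deleting should work too, along these lines:
public function handle()
{
    $path = storage_path() . "/uploads/" . $this->fileId;
    $fileName = $this->fileId . ".png";

    $stream = fopen($path, "r");
    $uploaded = Storage::disk("s3images")->put("profile/" . $fileName, $stream);

    // Some storage adapters close the stream themselves, so guard the fclose()
    if (is_resource($stream)) {
        fclose($stream);
    }

    if ($uploaded) {
        File::delete($path); // succeeds once the handle is released
    }

    $this->channel->image_filename = $fileName;
    $this->channel->save();
}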

Multiple Sidekiq queues for a Sinatra application

We have a Ruby Sinatra application. We use Sidekiq and Redis for queue processing.
We have already implemented Sidekiq to queue up jobs that insert into the database, and it has worked fine so far.
Now I want to add another job which reads bulk data from the database and exports it to a CSV file.
I do not want both of these jobs in the same queue. Is it possible to create a different queue for these jobs within the same application?
Please suggest a solution.
You probably need advanced queue options. Read about them here: https://github.com/mperham/sidekiq/wiki/Advanced-Options
Create the csv queue from the command line (it can also be done in a config file):
sidekiq -q csv -q default
Then in your worker:
class CSVWorker
  include Sidekiq::Worker
  sidekiq_options :queue => :csv

  # perform method
end
Take a look at the Sidekiq wiki: https://github.com/mperham/sidekiq/wiki/Advanced-Options
By default everything goes into the 'default' queue, but you can specify a queue in your worker:
sidekiq_options :queue => :file_queue
and to tell Sidekiq to process your queue, you have to either declare it in the configuration file:
:queues:
  - file_queue
  - default
or pass it as an argument to the sidekiq process: sidekiq -q file_queue

Laravel 4.1 can't render template when uploaded to server

It's weird: when developing on localhost everything works fine, and the default page shows.
After uploading to the server, it just shows a blank page!
It's driving me crazy!
echo 'outside route';

Route::get('/', function()
{
    echo 'inside route';
    return View::make('hello');
});
Both echoes work, but View::make('hello') just doesn't render anything, and views/hello.php is the default file.
You might have to fix your permissions on the remote server, as it might be a cache issue.
1) Run a recursive chmod on your storage path (*assuming you already have proper file ownership)
cd /path/to/laravel
chmod -R 755 app/storage
2) Clear the cache with Artisan
php artisan cache:clear
3) Refresh the page; it should work now.
*If you are running the HTTP server as a different user (for example, you're on Ubuntu and Apache runs as user www-data), you might want to set file ownership for the Laravel app files as well:
chown -R www-data .
EDIT:
Just a remark about your code example: remember that if you want to use the Blade templating engine, you have to name your files accordingly. If you want a Blade template called 'something', you place your code in app/views/something.blade.php and then refer to it with View::make('something').
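A minimal sketch of that convention (the 'something' template and the $name variable are made up for illustration): app/views/something.blade.php might contain
<h1>Hello, {{ $name }}</h1>
and be rendered from a route with:
Route::get('/demo', function()
{
    return View::make('something', array('name' => 'World'));
});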

CodeIgniter Cron Job on Shared Hosting?

I am trying to learn how to do my first cron job using CodeIgniter. In the past, it seemed the only way to do this with CI was to use the wget command instead of php.
The CodeIgniter User Guide, however, says that now you can do this from the command line, for example by running:
$ cd /path/to/project;
$ php index.php controller method
This works great using Terminal on my local setup. But when I use a similar command in the cron section of cPanel on my shared hosting, the task just returns the contents of index.php.
I'm not entirely sure what cPanel does with this command, so unsure as to whether it's using the command line at all.
Could someone explain how I might be able to set up a cron job on shared hosting using CodeIgniter please?
Here is the example code from the CodeIgniter user guide:
tools.php
<?php
class Tools extends CI_Controller {

    public function message($to = 'World')
    {
        echo "Hello {$to}!".PHP_EOL;
    }
}
?>
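One thing worth checking: if the cron output is the raw contents of index.php, the file was never run through the PHP interpreter. A cPanel cron command that calls the PHP CLI binary explicitly usually fixes that; with paths adjusted for your account it would look something like
/usr/local/bin/php /home/username/public_html/index.php tools message
(the binary may live elsewhere on your host, e.g. /usr/bin/php).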
It's going to depend on your host. Cron jobs can really screw stuff up if you're not careful, so a lot of shared hosts don't allow them. You probably need to be on some virtual container (like a VPS, Virtuozzo, etc.) to do this. This isn't a CodeIgniter issue but a hosting-provider issue. Call them first.
We worked around this exact issue as follows:
Set up a normal PHP file that is scheduled by cron; nothing to do with CodeIgniter yet.
Inside it, make an fsockopen or cURL request to perform your regular CodeIgniter call, just as you would from the web.
Here's an example (say, cron.php):
#!/usr/local/bin/php.cli
<?php
define('CRON_CALL_URL', 'https://my_server/');
define('CRON_HTTPS_PORT', 443);       // port to use for the fsockopen connection
define('CRON_SSL_PREFIX', 'ssl://');  // prefix to prepend to the host when using SSL

$current_time = time(); // plain PHP here; CodeIgniter's now() helper is not available in this standalone script
$md5_hash = md5('somevalue'.$current_time);

$url = CRON_CALL_URL.'MYCTRL/MYMETHOD';
$parts = parse_url($url);
$parts['query'] = 'md5_hash='.$md5_hash.'&time='.$current_time;

$fp = fsockopen(CRON_SSL_PREFIX.$parts['host'],
                isset($parts['port']) ? $parts['port'] : CRON_HTTPS_PORT,
                $errno, $errstr, 30);
if (!$fp) {
    // could not connect; $errno and $errstr explain why
} else {
    $out  = "POST ".$parts['path']." HTTP/1.1\r\n";
    $out .= "Host: ".$parts['host']."\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: ".strlen($parts['query'])."\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $parts['query'];
    fwrite($fp, $out);
    fclose($fp);
}
?>
NOTE: Make sure that in your MYCTRL/MYMETHOD function you have
ignore_user_abort(true);
so that when the fsockopen connection is closed, your script still runs to the end.
We actually have a bunch of these fsockopen calls for various reasons. If you need to make sure that the call to that controller/method came from the cron script, pass an additional hash value that only cron and the script know. Once the script is called, it has access to any CodeIgniter functions. Works like a charm.
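On the receiving end, that check might look like the sketch below (MYCTRL/MYMETHOD and the 'somevalue' secret come from the script above; the rest is hypothetical):
// inside the MYCTRL controller
public function MYMETHOD()
{
    ignore_user_abort(true); // keep running after cron.php closes the socket

    // cron.php sends these as form fields in the POST body
    $time = $this->input->post('time');
    $hash = $this->input->post('md5_hash');

    // reject callers that don't know the shared secret
    if ($hash !== md5('somevalue'.$time)) {
        show_404();
        return;
    }

    // ... the actual scheduled work goes here ...
}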
I've set up hundreds of CI cron jobs on shared hosting like this: create a short PHP script which calls the CI controller as if it were a web browser.
So, script.php contains this:
#!/usr/local/bin/php -f
<?php
file_get_contents('http://example.com/cronjob/');
?>
Then set your cron job in cPanel to call /home/example/public_html/script.php.
When it runs, script.php will call the CodeIgniter cronjob controller, and there you have the entire CI framework at your disposal.
If you are going to call it like a web browser, why not skip creating something new and replace the cron job command with:
wget http://example.com/cronjob/
or simply:
curl --silent http://example.com/cronjob/
