I'm trying to set up a CRON job for my Laravel 4.2 app and am struggling to get things to work.
I've created a command which works successfully from the command line. I first tried creating a CRON task with my service provider but was unable to get it to work. I tried:
/usr/bin/php /var/www/vhosts/mydomin.co.uk/subdomains/golfmanager/httpdocs/artisan reminder:week
This does not appear to work
I then tried:
/usr/bin/lynx -dump /var/www/vhosts/mydomain.co.uk/subdomains/golfmanager/httpdocs/artisan reminder:week
That failed to work either. My understanding is that Lynx is a browser, but I assume that because all the traffic is re-routed this approach won't work for a Laravel app?
So I then installed the liebig/cron package with a view to getting that up and running. I created a cron task with an external provider, 'cronservice', which appears to be triggering, but I'm not getting the expected results from the task.
I have configured the package as described and have currently placed the following in bootstrap/start.php:
Event::listen('cron.collectJobs', function() {
    Cron::add('reminder-week', '*/15 * * * *', function() {
        echo "Running Task";
        Artisan::call('reminder:week');
        return true;
    });
});
The package logs activity to a database. I can see a log entry suggesting it has fired, but I can't see an entry showing the job has worked. The Laravel log files suggest there is a NotFoundHttpException.
I've not created a route for CRON - the readme suggests it's using an internal one?
I'm quite confused. I'd like to stick with the package approach and the external provider, but I'm not sure whether I now need to create a route, and how I can test that the setup is correct and the jobs will work.
I've tried running the script from the browser (`http://mydomain.com/cron.php?key=xxxxx`) but that also throws a NotFoundHttpException.
Any help appreciated to get this working.
I moved the Event::listen code to app/start/global.php and it's all working now.
Re-read the readme more carefully!
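For anyone hitting the same thing, the working setup is simply the same listener as above, registered in app/start/global.php instead of bootstrap/start.php:

// app/start/global.php
Event::listen('cron.collectJobs', function() {
    Cron::add('reminder-week', '*/15 * * * *', function() {
        echo "Running Task";
        Artisan::call('reminder:week');
        return true;
    });
});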
Related
I'm making a custom package with custom data kept in migration/seeder files on a Laravel 9 site, and reading the docs: https://laravel.com/docs/9.x/packages
I did not find a way to run some data checking on plugin installation.
In the file packages/companyname/Mypackage/src/Providers/MypackageProvider.php
I have loaded the migration files:
$this->loadMigrationsFrom(__DIR__.'/../database/migrations');
I have a custom class which, when run, gives me a list of errors in case the data in the related tables has logical errors.
I am thinking of adding functionality to run this checking process from the dashboard of my site.
Also, in case of such errors, I would like to show them when I run this checking process manually, and maybe (if that is possible)
when I run commands in the console of the site:
composer install mypackage
composer dump-autoload
and when migrating the migration/seeder files of this plugin...
Also, in cases of logical errors, I would like to have a condition in the app like:
if (class_exists(mypackage::class)) {
and make the functionality of this plugin unavailable...
How can that be implemented?
"laravel/framework": "^9.41"
Thanks in advance!
If I understand correctly, what you want to do is check some conditions on the user's app before your package may be used. You can check those conditions before the user installs the package by executing a script with a Composer pre-install-cmd script. After installing your package, you can use the boot method of your service provider to execute all the conditions that have to be met before the functionality is used, and abort if they are not present.
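A rough sketch of what that could look like inside the package's service provider; the table name, config key and checker call below are placeholders, not part of your actual package:

<?php

namespace Companyname\Mypackage\Providers;

use Illuminate\Support\Facades\Schema;
use Illuminate\Support\ServiceProvider;

class MypackageProvider extends ServiceProvider
{
    public function boot(): void
    {
        $this->loadMigrationsFrom(__DIR__.'/../database/migrations');

        // Only run the data checks once the package tables exist, so
        // composer/artisan commands don't fail before migrations have run.
        // 'mypackage_items' is a placeholder table name.
        if (Schema::hasTable('mypackage_items') && $this->dataHasLogicalErrors()) {
            // Flag the package as unavailable; the host app can read this
            // flag in addition to its class_exists() check.
            config(['mypackage.enabled' => false]);
        }
    }

    protected function dataHasLogicalErrors(): bool
    {
        // Placeholder: call your existing checker class here and return
        // true when it reports logical errors in the related tables.
        return false;
    }
}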
Hope this helps.
I have the following code in Kernel.php. The purpose is to run the command between 23:00 and 04:00:
$schedule->command('moving:vehicles -vvv')
    ->between('23:00', '04:00')
    ->everyTenMinutes();
However, the cron starts executing the command at 17:00. I have tried to replicate the behaviour by sending emails into Mailtrap and I get different results.
https://github.com/laravel/framework/issues/28943
The link above explains why the schedule was not running as intended. It was a bug in Laravel 5.x that was resolved in version 6. I changed direction and used https://crontab.guru/#*/10_0-4,23_*_* to solve the issue I faced.
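In practice the schedule entry ends up looking roughly like this (a sketch, applying the expression from the crontab.guru link via the scheduler's cron() method):

// app/Console/Kernel.php - every 10 minutes during hours 23 and 00-04,
// instead of between('23:00', '04:00')->everyTenMinutes()
protected function schedule(Schedule $schedule)
{
    $schedule->command('moving:vehicles -vvv')
        ->cron('*/10 0-4,23 * * *');
}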
I'm trying to use laravel-request-logger.
I followed the installation steps, changed the log level to debug, ran composer update and composer dump-autoload, and tried several good and bad requests.
First, I can't even find the http.log file mentioned anywhere in my app folder, nor in my storage folder (where the only logs folder is).
So I created the http.log file myself within the storage\logs folder and gave all users write permissions. But nothing is written after several good and bad requests.
Second, I notice the default laravel.log hasn't logged anything since installing this package, probably because this package overrules the default?
Because Laravel's logging features are managed by Monolog, you have a wide array of additional logging options at your disposal, including the ability to write log messages to this log file, set logging levels, send log output to the Firebug console via FirePHP, to the Chrome console using Chrome Logger, or even trigger alerts via e-mail, HipChat or Slack.
public function index()
{
    $items = [
        'one',
        'two'
    ];
    \Log::debug($items);
    return view('about');
}
A customized log message similar to the following will
be appended to storage/logs/laravel.log:
[2015-12-14 01:51:56] local.DEBUG: array (
  0 => 'one',
  1 => 'two',
)
There is of course more, but it would be too difficult to explain it all here. :) Cheers
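As a small illustration of one of those options (not from the original answer): pushing an extra Monolog handler so that, for example, anything at WARNING and above also goes to its own file. This assumes a Laravel version where Log::getMonolog() is available (4.x/5.x):

// Rough sketch: route WARNING-and-above messages to a separate file
// by pushing a handler onto the underlying Monolog instance.
\Log::getMonolog()->pushHandler(
    new \Monolog\Handler\StreamHandler(
        storage_path('logs/warnings.log'),
        \Monolog\Logger::WARNING
    )
);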
The first problem I'm running into is that when installing I receive a MySQL error stating that a table cannot be found. Of course it can't; I haven't even finished installing the dependencies, much less run the migrations. The error was being triggered by an Eloquent query in a view composer. After commenting out the entirety of my routes file, Composer let me continue.
I proceed to uncomment my routes file and I get the error once again when trying to run any artisan commands (can't migrate my database because I haven't migrated my database). I repeat the solution from step one and my database is migrated.
Artisan serve is now serving me my layout file in the terminal and exiting. I'm at a bit of a loss to troubleshoot this. I assumed that it was possibly a plugin, but trying to disable plugins one by one results in:
Script php artisan clear-compiled handling the pre-update-cmd event returned with an error
and being served up my layout file in the terminal.
It seems that the error is directly related to this function in my routes file:
View::composer('layouts.main', function($view) {
    $things = Thing::where('stuff', 1)->orderBy('stuff')->get();
    $view->with(compact('things'));
});
This isn't a new addition to the application, however, so the underlying cause must be coming from somewhere else.
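Guarding the query so it can't run before the table exists would look something like this (a sketch only; it assumes the Thing model's table is called 'things'):

// Sketch: skip the query when the table isn't there yet, so composer/artisan
// hooks that load the routes file don't blow up mid-install.
View::composer('layouts.main', function($view) {
    if (!Schema::hasTable('things')) {
        return $view->with('things', array());
    }

    $things = Thing::where('stuff', 1)->orderBy('stuff')->get();
    $view->with(compact('things'));
});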
As I said in the comment, if you are seeing database errors on the production server but not locally, then:
check the database credentials. If they are OK, then...
check the different configs in each environment.
Using a profiler (any) will let you know which environment you are in.
I've been searching for a while and think I have part of the information I need but just need some assistance putting it all together.
What I'm trying to achieve is to call a URL (a CodeIgniter controller) on a regular basis, e.g. every 5 minutes, which will go through my database mail queue and send the mail using Amazon SES.
So far I have successfully created the controller, model, DB and SES is working just fine. The controller sends 10 emails at a time and it all works fine when I manually hit the URL.
I'm not too familiar with cron jobs, but think this is where I need to head.
My application is set up on Elastic beanstalk on AWS.
I think that I need a folder called .ebextensions in my web root, with a file called something.config in it, where I can put some 'container commands'. I also think I will need to include 'leader_only: true' in there somewhere to avoid my replicated instances doing the same jobs.
What I don't understand is what my container command should be, considering my controller is at 'http://myapplication/process_mail'. From the examples I've seen I couldn't work out how it determines the frequency, or even the code that 'calls' the URL.
In my controller, I previously had the following code to ensure it could only be called from the command line. Is this something I can keep, or will the container command just hit the URL like any other user?
if (!$this->input->is_cli_request()) {
    echo "Access Denied";
    return;
}
Thanks in advance for any help at all. I think I just need help with what should go in the config file, but then again I may have gone down completely the wrong path altogether!
UPDATE:
So far I've got as far as this:
I believe I need to run the application from the command line, as described here: http://ellislab.com/codeigniter/user-guide/general/cli.html
So my command would be php index.php process_mail.
What I actually need is help with running this command every 5 minutes. This is what I have so far:
container_commands:
  send_mail:
    command: php index.php process_mail
    leader_only: true
But what I don't understand is how I get this to run every 5 minutes, rather than just when the instance is set up. Do I need to create a cron job file on instance creation, with the php command in it instead?
Update 2:
To anyone else with the same problem, I got this sorted in the end like this:
An .ebextensions file that looks like this (.ebextensions/mail_queue.config):
container_commands:
  01_send_mail:
    command: "cat .ebextensions/process_mail.txt > /etc/cron.d/process_mail && chmod 644 /etc/cron.d/process_mail"
    leader_only: true
A file called process_mail.txt in the same folder that looks like this:
# The newline at the end of this file is extremely important. Cron won't run without it.
*/5 * * * * root /usr/bin/php /var/app/current/index.php process_mail > /dev/null
So, every 5 minutes it runs the CodeIgniter main index file via the command line, passing in the controller name.
Thanks to this: https://stackoverflow.com/a/15233848/2604392
I would set up the cron job to talk to the URL, then store the result in a MySQL database. Then regular PHP or any other app can connect to MySQL and access the data. That has been the suggested way to connect to Twitter for a few months now, so you can find info on how to do this by searching for Twitter connectivity.
Hope this helps
By the way, while writing an email-generating PHP script, I noticed that I had to slow down the pace of email sending to avoid being flagged as a spammer. I added a delay of 2 seconds between emails and it did the job. My database was only 2,500 entries, so no big deal (except taking care of raising PHP's max_execution_time)...
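The throttling described above boils down to something like this sketch ($recipients and send_one_email() are placeholders, not the actual script):

// Illustrative only: pause ~2 seconds between messages and lift the
// execution-time limit so a long queue doesn't hit max_execution_time.
set_time_limit(0);

foreach ($recipients as $recipient) {
    send_one_email($recipient); // placeholder for the real sending code
    sleep(2);                   // 2-second delay between emails
}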