I have set up my cron job for Magento to run every two hours, which is the quickest my host can set it to. However, newsletters don't get sent until I actually go into my host's control panel and click the 'Run' button for that particular cron job.
What did I do wrong? My cron path is set as: /bin/sh /usr/www/users/FTP_USER/cron.sh
This is because event observers are loaded based on the context (adminhtml, frontend, or, in your case, crontab). The newsletter-sending observer is not loaded for the crontab area, so it cannot send the emails.
See this article: http://www.aschroder.com/2010/01/magento-events-explained-and-a-few-gotchas-avoided/
I imagine this is highly likely to be a permissions/PATH problem of one form or another. When you manually trigger the event by clicking something in the control panel, it is probably being run as the Apache user (www-data or equivalent, depending on the platform). The cron will most likely be running as a different user.
Assuming you're referring to the core newsletter cron Mage_Newsletter_Model_Observer::scheduledSend, it's unlikely there's any problem with the cwd being incorrect for relative include paths. That leaves the most likely culprits as: a) the cron user doesn't have execute permission on your cron.sh, or b) the cron user doesn't have access to the mail application on the server because it isn't included in that user's PATH.
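If you have shell access, a quick way to check both is something like the following sketch; the user name and script path are taken from your question and are only examples, and the sendmail check assumes sendmail is the mail binary your setup uses:
# Does the cron user have read/execute permission on the script?
ls -l /usr/www/users/FTP_USER/cron.sh
# What PATH does that user get, and can it find a mail binary on it?
sudo -u FTP_USER sh -c 'echo "$PATH"; command -v sendmail'
If the PATH turns out to be the problem, you can set an explicit PATH= line at the top of the crontab so the job doesn't depend on the user's login environment.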
In my experience the cron.sh script hasn't been up to it. Consequently I just run cron.php directly. This requires the PHP CLI to be set up properly, with enough RAM and sensible timeouts.
In your crontab try:
* * * * * /usr/bin/php /home/USER/public_html/cron.php >> /home/USER/public_html/var/log/cron.log 2>&1
I also keep a log file at var/log/cron.log, as in the line above, so that I can see errors raised during a cron job.
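To verify the setup without waiting for cron, you can run the same command once by hand and then check the log, using the same paths as in the crontab line above:
/usr/bin/php /home/USER/public_html/cron.php
tail -n 50 /home/USER/public_html/var/log/cron.log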
I have a .sh file that is stored in GCS. I am trying to schedule the .sh file through Google Cloud Shell.
I can run the same file using the gsutil cat gs://miptestauto/baby.sh | sh command, but I am not able to schedule it.
Following is my code for scheduling the file:
16 17 * * * gsutil cat gs://miptestauto/baby.sh | sh
It displays the message "auto saving..done", but the scheduled job does not get displayed when I use crontab -l.
# contents of .sh file
#!/bin/bash
bq load --source_format=CSV babynames.baby_destination13 gs://testauto/yob2010.txt name:string,gender:string,count:integer
Can anyone please tell me how to schedule it using Google Cloud Shell?
I am not using Compute Engine/App Engine; I just want to schedule it using Cloud Shell.
thank you in advance :)
As per the documentation, Cloud Shell is intended for interactive use only. The Cloud Shell instances are provisioned on a per-user, per-session basis and sessions are terminated after an hour of inactivity.
In order to schedule a daily cron job, the instance needs to be up and running at all times, but this doesn't happen with Cloud Shell, and I believe your jobs are not running because of this.
When you start Cloud Shell, it provisions an f1-micro instance, which is the same machine type you can get for free if you are eligible for “Always Free”. Therefore you can create an f1-micro instance, configure the cron job on it, and leave it running so it can execute the daily job.
You can check free usage limits at https://cloud.google.com/compute/pricing#freeusage
You can also use the Cloud Scheduler product https://cloud.google.com/scheduler which is a serverless, managed, cron-like scheduler.
To schedule a script you first have to create a project if you don’t have one. I assume you already have a project so if that’s the case just create the instance that you want for scheduling this script.
To create the new instance:
At the Google Cloud Platform Console click on Products & Services which is the icon with the four bars at the top left hand corner.
On the menu go to the Compute section and hover on Compute Engine and then click on VM Instances.
Go to the menu bar above the instance section and there you will see a Create Instance button. Click it and fill in the configuration values that you want your new instance to have. The values that you select will determine your VM instance features. You can choose, among other values, the name, zone and machine type for your new instance.
In the Machine type section click the drop-down menu tab to select an “f1-micro instance”.
In the Identity and API access section, give access scope to the Storage API so that you can read and write to your bucket in case you need to do so; the default access scope only allows you to read. Also enable BigQuery API.
Once you have the instance created and access to the bucket, just create your cron job inside your new instance: in the user account under which the cron job will execute, run crontab -e and edit this file to add the job that will execute your baby.sh script. The following documentation link should help you with this.
Please note, if you want to view output from your script you may need to redirect it to your current terminal.
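For example, the crontab entry on that instance could look something like the line below, reusing the schedule and bucket from your question; the log path is just a placeholder, and this assumes gsutil is available on the instance (it ships with the Cloud SDK on the standard Compute Engine images):
16 17 * * * gsutil cat gs://miptestauto/baby.sh | sh >> /home/USER/baby_cron.log 2>&1
That way the job runs on a VM that stays up, instead of inside a Cloud Shell session that gets terminated.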
I am looking for a way to link an Azure Scheduler or WebJob to the Laravel schedule.
My understanding is that, to set up an Azure schedule, I would need an endpoint to link to my Laravel application, and I am not sure how to achieve that.
TL;DR
You can use WebJobs under Web Apps with a command-line script to trigger the Laravel scheduler.
Full reference
Azure provides WebJobs that can fire on various triggers, including cron-like schedules. In order to run the Laravel scheduler you need to trigger the schedule:run command every minute. For now I'll assume artisan lives in D:\home\site\wwwroot\artisan, which is the default location for PHP-based deployments.
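As a reminder, schedule:run only executes whatever you have defined in app/Console/Kernel.php. A minimal sketch of such a kernel is shown below; the carts:prune command is purely illustrative and not something that exists in your app:
<?php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // Hypothetical example task: prune stale carts once an hour.
        $schedule->command('carts:prune')->hourly();
    }
}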
Create a new file called runsched.cmd (or anything else, as long as it has the .cmd extension). Edit the file with Notepad and add:
php %HOME%\site\wwwroot\artisan schedule:run
Save the file and go to the Azure portal.
Select your WebApp and find WebJobs under the application settings. Click Add and a side panel will appear.
Give the WebJob a name, for example LaravelScheduler, and upload the runsched.cmd file from the first step.
Set Type to Triggered and make sure Triggers is set to Scheduled.
Now you can specify how often the command must be triggered. Even though the portal says 'CRON Expression', the format is not the same as the Linux notation. If you set the expression to all asterisks, as shown in the Laravel documentation, your command will be triggered every second, which is far too often for most applications. The correct CRON expression is:
0 * * * * *
If your job looks something like this, click OK.
The Laravel scheduler will now be triggered every minute. To verify that everything is working correctly, you can trigger the job once yourself (select the LaravelScheduler job and click Run) and check the job status. If Azure reports Failed under status, check the logs and make sure you've entered the correct paths.
Hope that explains it.
Quick note: Microsoft likes to change Azure Portal on a regular basis, so any of these instructions may have changed by now.
I am using my dev site to test an abandoned cart email through MageMonkey/Mandrill. I believe I already have the cron job configured, as other transactional emails send without a problem (maybe this assumption is wrong?).
I also installed the AOE Scheduler and it displays all of the correct cron jobs. After I manually run the heartbeat and generate a schedule - nothing else runs and I get the notice that the "heartbeat is older than xx minutes."
I'm honestly not sure where my issue is - whether it is because I am in the dev site (shouldn't be because other emails send), the cron job configuration or the AOE Scheduler, etc.
In my Magento admin, under configuration, I have the following:
generate schedules every 15
schedule ahead for 30
missed if not run within 45
success history lifetime 1440
failure history lifetime 1440
heartbeat task */5 * * * *
I am using Magento 1.7
Thanks everyone! This is pretty new to me
Here is my cron.php file -
<?php

require 'app/Mage.php';

if (!Mage::isInstalled()) {
    echo "Application is not installed yet, please complete install wizard first.";
    exit;
}

// Only for urls
// Don't remove this
$_SERVER['SCRIPT_NAME'] = str_replace(basename(__FILE__), 'index.php', $_SERVER['SCRIPT_NAME']);
$_SERVER['SCRIPT_FILENAME'] = str_replace(basename(__FILE__), 'index.php', $_SERVER['SCRIPT_FILENAME']);

Mage::app('admin')->setUseSessionInUrl(false);
umask(0);

try {
    // Load the observers registered for the crontab area and kick off the cron run
    Mage::getConfig()->init()->loadEventObservers('crontab');
    Mage::app()->addEventArea('crontab');
    Mage::dispatchEvent('default');
} catch (Exception $e) {
    Mage::printException($e);
}
I had the same issue with cron jobs not working. I searched and found a solution that worked for me. My Magento is ver. 1.9.1, though.
http://support.xtento.com/wiki/Setting_up_the_Magento_cronjob
I added the following line in cron.php
$isShellDisabled = true;
after the line
$isShellDisabled = (stripos(PHP_OS, 'win') === false) ? $isShellDisabled : true;
Hope this helps someone who has same issue.
Most transactional emails are triggered synchronously at runtime through Magento's events system. I can't ask any follow-up questions about your development environment, but are you sure that your system cron is set up to trigger Magento's cron service? AOE Scheduler can generate the cron schedules, but you still need the system cron to invoke Magento's cron service.
To execute all these configured tasks, the cron.php file located in the Magento root will need to be run periodically, for example every 15 minutes. Basically, this script will check if it needs to run any tasks, and if it needs to schedule any future tasks.
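If the system cron isn't set up yet, a typical crontab entry for this looks something like the line below; the PHP binary location and Magento root here are placeholders and should be adjusted to your server:
*/15 * * * * /usr/bin/php /path/to/magento/cron.php >> /path/to/magento/var/log/cron.log 2>&1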
While setting up the system cron service is crucial for getting all of Magento’s scheduled tasks to run normally, and for testing purposes I would still recommend this, you can also use AOE Scheduler to run specific jobs immediately from the Admin Panel. Check out the screenshot in the linked article that shows the screen where you can do this. Simply select the job you need to run and choose “Run now” from the Actions box.
You can also choose to run the task directly. Be careful with that, as the execution might last longer than a few seconds or might depend on some other command line environment settings. For testing small tasks this might still be a comfortable option.
Using CodeIgniter 2.2.0 for my project. Is $this->input->is_cli_request() validation necessary for a cron job?
It is recommended to protect your cron job so that it does not execute when someone types the URL in their browser. However, if you don't have any problem with your cron job being invoked by anyone, you can skip this check.
Refer https://ellislab.com/codeigniter/user-guide/general/cli.html for more details.
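A minimal sketch of such a guard inside a controller; the controller and method names here are only placeholders, not something from your project:
<?php
defined('BASEPATH') OR exit('No direct script access allowed');

class Cron extends CI_Controller {

    public function send_reminders()
    {
        // Abort when the URL is hit from a browser instead of the command line.
        if (!$this->input->is_cli_request()) {
            show_error('This task can only be run from the command line.', 403);
            return;
        }

        // ... the actual cron work goes here ...
    }
}
From cron you would then invoke it in the CLI style described on that user guide page, for example php index.php cron send_reminders.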
It is recommended to run cron jobs from the command line.
There are many reasons for running CodeIgniter from the command-line, but they are not always obvious.
Run your cron-jobs without needing to use wget or curl
Make your cron-jobs inaccessible from being loaded in the URL by checking for $this->input->is_cli_request()
Make interactive "tasks" that can do things like set permissions, prune cache folders, run backups, etc.
Integrate with other applications in other languages. For example, a random C++ script could call one command and run code in your models!
More info can be read here.
But you can also prevent it from being called via a URL on your server.
I have a script that pulls some data from a web service and populates a mysql database. The idea is that this runs every minute, so I added a cron job to execute the script.
However, I would like the ability to occasionally suspend and re-start the job without modifying my crontab.
What is the best practice for achieving this? Or should I not really be using crontab to schedule something that I want to occasionally suspend?
I am considering an implementation where a global variable is set and checked inside the script, but I thought I would canvass for more apt solutions first. The simpler the better - I am new to both scripting and Ruby.
If I were you, my script would look at a static switch, like you said with your global variable, but would test for the existence of a file instead of a global variable. This seems clean to me.
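As an illustration, the check can even live in the crontab entry itself rather than inside the Ruby script; all paths and the script name below are just placeholders:
# run every minute, but skip silently while the pause file exists
* * * * * [ ! -f /home/USER/cron.paused ] && /usr/bin/ruby /home/USER/populate_db.rb >> /home/USER/cron.log 2>&1
Suspending the job is then just touch /home/USER/cron.paused, and resuming is rm /home/USER/cron.paused.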
Another solution is to have a service, not using crontab, that calls your script every minute. This service would be like the other services in /etc/init.d (or /etc/rc.d, depending on your distribution) and would have start, stop and restart commands like other services.
These two solutions can be mixed:
1. The service only creates or deletes the switching file, and the crontab line is always active.
2. Or your service directly edits the crontab like this, but I prefer not editing the crontab via a script, and the technique described in the article is not atomic (if you change your crontab between the script's read and its write, your change is lost).
So in your place I would go for 1.