I have to run a couple of scripts every 10 minutes; they crawl a few thousand web pages and save some information.
I am using DreamHost shared hosting for my PHP site.
What would be the appropriate way to configure these scripts in cron so that they execute 24x7?
Please also let me know which host I can use for this.
If you can ssh into your server, you would need to run "crontab -e" to edit your cron jobs and then add a line like this:
*/10 * * * * /path/to/php /path/to/your/script.php
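If you also want a record of each run for debugging, you can redirect the script's output to a log file (all paths here are placeholders):
*/10 * * * * /path/to/php /path/to/your/script.php >> /path/to/cron.log 2>&1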
I'm trying to figure out if I can schedule a command to run across all servers.
Currently I have a command clean:directories and I run it like this:
$schedule->command('clean:directories')->daily();
It scans the filesystem and removes files older than a set date. I need it to run on a queue server rather than just on the main server.
Update: for now I've added an entry to the crontab on the specific server I would like this run on.
If your Laravel application is deployed to multiple servers, you should think about adding the schedule:run command on each of them as a Linux cron job.
Laravel Docs:
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
Running the Scheduler (Laravel Docs)
Before you add the schedule:run command on your other servers, bear in mind that all of your kernel-defined jobs will be executed on all servers. If you have commands that should only be executed once, you should guard them with the onOneServer() method. (See requirements below.)
Official Laravel Docs:
If your application's scheduler is running on multiple servers, you may limit a scheduled job to only execute on a single server. For instance, assume you have a scheduled task that generates a new report every Friday night. If the task scheduler is running on three worker servers, the scheduled task will run on all three servers and generate the report three times. Not good!
Running Tasks on One Server (Laravel Docs)
onOneServer Requirements:
To utilize this feature, your application must be using the database, memcached, dynamodb, or redis cache driver as your application's default cache driver. In addition, all servers must be communicating with the same central cache server.
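For instance, a minimal sketch of the clean:directories schedule from the question above, extended with onOneServer() (the method comes from the Laravel docs quoted here; the command name is taken from the question):
$schedule->command('clean:directories')
         ->daily()
         ->onOneServer();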
I currently have a VPS and I fear something happening to it, either by me or by the hosting company, so I need a daily backup sent to servers unrelated to the hosting company.
Essentially I need my server to automatically export my database into an SQL file and then send it to a third-party server (Google or whatever) on a daily basis, or even a few times a day, so that if something happens to the server the SQL files will be accessible regardless.
How can I achieve that?
We are not supposed to write you a solution, only to help you with coding errors etc.
Here's what you can do:
Create a shell script on the remote server where you want to save the database; this can be a Mac or a Linux box. We need cron and a shell.
Create a cron job to run daily.
Shell script example [dbBackup.sh]:
#!/bin/bash
# Date stamp for the backup file name
today=$(date '+%Y-%m-%d')
# Dump the database on the remote host over SSH and save the dump locally
ssh root@remoteServer.com mysqldump -u root --password=SomeDiffPassword databaseName > /home/user/DailyDatabaseBackups/database_$today.sql
Cron example (daily at midnight):
0 0 * * * /home/user/dbBackup.sh
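Also make sure the script is executable before the first scheduled run:
chmod +x /home/user/dbBackup.sh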
I'm really struggling with this issue, which is a little frustrating because it seems to be so simple on a Linux server. I have a Windows Azure Web App and I want to run "php artisan queue:listen" on the server continuously to take care of dispatched jobs. From what I read in the documentation, on Linux you just use Supervisor to run the command constantly and revive it in case it dies. From what I found online, Azure has similar functionality called WebJobs, where you give them a script to run and then decide whether it should run on a schedule or continuously (kind of like the Scheduler in Laravel). With this I have two questions:
1 - Is this the right solution? Place a script to run the command on a WebJob and have the WebJob run continuously?
2 - I'm not experienced in writing PHP scripts that run command-line programs, so all I can do is something like this:
echo shell_exec('php artisan queue:work');
Problem is, this does not give me the output of the command (I don't see anything like the "processed" result that I see when I run the command by hand in my console and a job is processed). It is important to me to be able to read the output of the command, because I want to check the logs for errors in case a job can't be processed. According to the documentation, shell_exec returns null when an error occurs, so I'm completely clueless about how to deal with this.
Thank you so much in advance!
Instead of using shell_exec() you can directly upload a .cmd file that includes your command php artisan queue:work, and then you can find the output log on the WebJob Details page.
For how to do that, check out Ernesto's answer.
For Azure you can add a new WebJob to your web app and upload a .cmd file including a command like this:
php %HOME%\site\wwwroot\artisan queue:work --daemon
Then define it as a triggered WebJob with a 0 * * * * * frequency cron.
That way worked for me.
For more information, please refer to Run Background tasks with WebJobs.
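If you still want to invoke the command from PHP, exec() captures the output lines that shell_exec() discards on failure. A minimal sketch, assuming the --once flag so the worker exits after processing a single job instead of blocking forever:
// $output receives each line the command prints; $exitCode its exit status
$output = [];
$exitCode = 0;
// 2>&1 folds stderr into stdout so error messages are captured as well
exec('php artisan queue:work --once 2>&1', $output, $exitCode);
echo implode(PHP_EOL, $output) . PHP_EOL;
echo "Exit code: $exitCode" . PHP_EOL;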
I need to set up cron jobs on a CodeIgniter site on a shared host that uses cPanel. The cron script works when run via a browser; however, when I tried running it from cron using curl and then wget, neither worked. Ultimately I will want to run the jobs via php/cli.
As for why the curl and wget methods don't work, could it have anything to do with the fact that the site is completely SSL, and htaccess is used to rewrite all http requests to https? To be honest, I haven't actually ruled out the possibility that the host has disabled cron for some strange reason.
EDIT: Have checked with the host and cron is running fine!
I read an article here about cron and the CI CLI, and it gives this example:
/usr/local/bin/php -f /home/clinic/public_html/index.php cron foo
I have tried that method, but my controller is inside a subdirectory, e.g. /controllers/utility/cron.php, and I have CI set up to not use index.php. So how would I run cron in this way?
You can use subdirectories in your parameters to index.php, like this, to reach the controller and method you want:
php index.php utility/cron method_in_controller
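Putting that together with the crontab entry from the article (the hourly schedule is an assumption; the path is taken from the question):
0 * * * * /usr/local/bin/php -f /home/clinic/public_html/index.php utility/cron method_in_controller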
OK, this is really very embarrassing. Despite checking the script several times and confirming it worked when run in a browser, I overlooked the fact that an authentication function had inadvertently been pasted in. As I was logged in in the browser I was able to execute the cron script, but that's why it was failing when cron tried to run it. Sorry for wasting your time, complex857, and thanks very much for your help anyway!
I have set up my cron job for Magento to run every two hours; that is the quickest my host can set it to. However, newsletters don't get sent until I actually go into my host's control panel and click the 'Run' button for the particular cron job.
What did I do wrong? My cron path is set as: /bin/sh /usr/www/users/FTP_USER/cron.sh
This is because the event observers are loaded based on the context (adminhtml, frontend or, in your case, crontab). The newsletter-sending observer is not on the crontab list, so it cannot send emails.
See this article: http://www.aschroder.com/2010/01/magento-events-explained-and-a-few-gotchas-avoided/
I imagine this is highly likely to be a permissions/PATH problem of one form or another. When you manually trigger the event by clicking something in the control panel, it is probably getting run as the Apache user (www-data or equivalent, depending on the platform). The cron job will most likely be running as a different user.
Assuming you're referring to the core newsletter cron Mage_Newsletter_Model_Observer::scheduledSend, it's unlikely there's any problem with cwd being incorrect for relative include paths. This leaves the most likely culprits: a) the cron user doesn't have execute permissions on your cron.sh, or b) the cron user doesn't have access to the mail application on the server because it isn't included in the user's PATH.
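A quick way to rule out culprit a) from the shell (path taken from the question):
ls -l /usr/www/users/FTP_USER/cron.sh
chmod +x /usr/www/users/FTP_USER/cron.sh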
In my experience the cron.sh script hasn't been up to it, so I just run cron.php directly. This requires the PHP CLI being set up properly with enough RAM and sensible timeouts.
In your crontab try:
* * * * * /usr/bin/php /home/USER/public_html/cron.php >> /home/USER/public_html/var/log/cron.log 2>&1
I also keep a log file in var/log/cron.log so that I can see errors raised during a cron job.