I need to set up cron jobs on a CodeIgniter site on a shared host that uses cPanel. The cron script works when run via a browser, but when I tried running it from cron, first with curl and then with wget, neither worked. Ultimately I want to run the jobs via the PHP CLI.
As for why the curl and wget methods don't work, could it have anything to do with the fact that the site is served entirely over SSL, with .htaccess rewriting all HTTP requests to HTTPS? To be honest, I haven't ruled out the possibility that the host has disabled cron for some strange reason.
EDIT: I have checked with the host and cron is running fine!
I read an article about cron and the CI CLI, and it gives this example:
/usr/local/bin/php -f /home/clinic/public_html/index.php cron foo
I have tried that method, but my controller is inside a subdirectory, e.g. /controllers/utility/cron.php, and I have CI set up to not use index.php. So how would I run cron in this way?
You can use subdirectories in your parameters to index.php, like this, to reach the controller and method you want:
php index.php utility/cron method_in_controller
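Putting that together with the article's example, the full crontab entry might look like the following sketch (the PHP binary path and document root are taken from the example above; the schedule and method name are placeholders):

# Run the job every day at 02:00 via the PHP CLI.
0 2 * * * /usr/local/bin/php -f /home/clinic/public_html/index.php utility/cron method_in_controller

Note that even if .htaccess removes index.php from your URLs, the CLI invocation still goes through index.php; URL rewriting only applies to web requests.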
OK, this is really very embarrassing. Despite checking the script several times and confirming it worked when run in a browser, I overlooked the fact that an authentication function had inadvertently been pasted in. Because I was logged in in the browser I could execute the cron script there, but that is why it failed when cron tried to run it. Sorry for wasting your time, complex857, and thanks very much for your help anyway!
I'm using crontab on a server to run a shell script that uses pgloader to load data into PostgreSQL every day, and I have a Bitbucket pipeline with a Python script that runs every week. However, I want the Bitbucket pipeline to run only if the cron job was successful.
I thought of 2 possible ways to solve the problem:
Using hc-ping to get the status of the cron job, but I'm not sure I understood the hc-ping documentation correctly. As I understood it, you can only check whether crontab itself is functioning properly, not the status of the jobs themselves?
Another method I thought of was to check whether the data migration with pgloader was successful, and create a file depending on the result; a second cron job would then use that file, and I would get the hc-ping of that second job. If the file was not created, that job would fail and I could tell via hc-ping that the migration did not run successfully.
I appreciate any help I can get.
Best regards
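For what it's worth, Healthchecks.io pings are per check, not per crontab, so each job can report its own outcome by hitting its own ping URL; appending /fail to the URL signals an explicit failure. A minimal sketch of the daily script (the check UUID and the pgloader command file are placeholders):

#!/bin/bash
# Hypothetical Healthchecks.io check UUID; one check per job.
PING_URL="https://hc-ping.com/your-check-uuid"

if pgloader migration.load; then
    curl -fsS --retry 3 "$PING_URL" > /dev/null        # report success
else
    curl -fsS --retry 3 "$PING_URL/fail" > /dev/null   # report failure
fi

The weekly pipeline could then query the check's current status through the Healthchecks.io management API before deciding whether to proceed.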
I've spent 3 days beating my head against this before coming here in desperation.
So, long story short, I thought I'd fire up a simple PHP site to give moderators of a gaming group I'm in the ability to start GCP servers on demand. I'm no developer, so I'm looking at this from a systems perspective to find the simplest solution that does the job.
I fired up an Ubuntu 18.04 machine on GCP, set it up with the Google SDK, authorised it for access to the project, and was able to run gcloud commands, which worked fine. I had some issues with the PHP file calling the shell script that runs the same commands, but with some testing I can see it now calls the shell script without trouble: it broadcasts wall "test" to the console every time I click the button on the PHP page.
However, the gcloud command is never executed. If I run the shell script manually it starts the instance and broadcasts wall with no trouble; if I click the button it broadcasts, but that's it. I've given the files execute permissions, and I've even granted sudo rights to the user nginx runs as; putting sudo sh in front of the command in the PHP file also made no difference. Please find the bash script below:
#!/bin/bash
# Start the game server instance, then announce to all logged-in terminals.
/usr/lib/google-cloud-sdk/bin/gcloud compute instances start arma3s1-prod --zone=australia-southeast1-b
wall "test"
Any help would be greatly appreciated; this, coupled with an automated shutdown, would allow our gaming group to save money by only running the servers people want to play on.
If you want any more detail about the underlying system, please let me know.
So I asked a PHP dev at work about this, and in two seconds flat she pointed out the issue; now I feel stupid. In /etc/passwd the www-data user had /usr/sbin/nologin as its shell. After I fixed that and ran the script, gcloud wanted permission to write a log file to /var/www; I fixed those permissions too and it works fine. I'm not terribly worried about the page or even the server being hacked and destroyed, as I can recreate them pretty easily.
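For anyone reconstructing those two fixes, they amount to something like the following sketch (the user name and paths are assumed from the thread):

# Give www-data a usable shell so it can run the wrapper script.
sudo usermod -s /bin/bash www-data
# Let it write the log files gcloud asked for under /var/www.
sudo chown -R www-data:www-data /var/www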
Thanks for the help though! Sometimes I think I just need to take a step back and get a fresh set of eyes on the problem.
When you launch a command while logged in, you have your own account's access rights to the Google Cloud API, but the PHP account doesn't have those.
Even adding the www-data user to root won't fix the problem; it may create some security issues, but nothing more.
If you really want to do this, you should create a service account that has rights only on the Compute Engine instances inside your project, and point the GOOGLE_APPLICATION_CREDENTIALS environment variable at its JSON key file. This way your PHP process has just enough rights to do what you are asking of it.
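One caveat worth noting: GOOGLE_APPLICATION_CREDENTIALS is read by the Google client libraries, while the gcloud CLI authenticates through its own credential store. A minimal sketch of the wrapper script using a service account with gcloud (the key file path is an assumption):

#!/bin/bash
# Hypothetical key file path; the service account should only have
# instance start/stop rights on this project.
KEY_FILE=/etc/gcloud/instance-starter.json

# Client libraries pick up credentials from this variable.
export GOOGLE_APPLICATION_CREDENTIALS="$KEY_FILE"

# The gcloud CLI needs the account activated explicitly.
gcloud auth activate-service-account --key-file="$KEY_FILE"
gcloud compute instances start arma3s1-prod --zone=australia-southeast1-b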
Note that the issue with this method is that if you are hacked, there is a chance the instance hosting your PHP could be deleted too.
You could also make a call to a prepared Cloud Function which creates the instance; this way, even if your instance is deleted, the Cloud Function would still be there.
Using CodeIgniter 2.2.0 for my project. Is $this->input->is_cli_request() validation necessary for a cron job?
It is recommended, to protect your cron job from being executed when someone types the URL into their browser. However, if you don't have a problem with your cron job being invoked by anyone, then you can skip this check.
Refer to https://ellislab.com/codeigniter/user-guide/general/cli.html for more details.
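For illustration, a minimal sketch of that check in a CI 2.x controller (the Cron class and nightly method are hypothetical names):

<?php
class Cron extends CI_Controller {

    public function nightly()
    {
        // Refuse to run unless the request came from the command line.
        if ( ! $this->input->is_cli_request())
        {
            show_error('This controller can only be invoked from the CLI.', 403);
        }

        // ... the actual cron work goes here ...
    }
}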
It is recommended to run cron jobs from the command line.
There are many reasons for running CodeIgniter from the command-line, but they are not always obvious.
Run your cron-jobs without needing to use wget or curl
Make your cron-jobs inaccessible from being loaded in the URL by checking for $this->input->is_cli_request()
Make interactive "tasks" that can do things like set permissions, prune cache folders, run backups, etc.
Integrate with other applications in other languages. For example, a random C++ script could call one command and run code in your models!
More info is in the CodeIgniter CLI user guide linked above.
But you can also prevent the URL from being called at the server level.
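For example, since the site already uses .htaccess rewriting, one possible server-level block is a rewrite rule that returns 403 Forbidden for web requests to the cron route; CLI invocations never pass through Apache, so they are unaffected. A sketch (the utility/cron route is taken from the earlier question and may differ in your app):

RewriteEngine On
# Forbid any web request for the cron controller.
RewriteRule ^(index\.php/)?utility/cron - [F,L]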
We are using Akeeba Backup to back up our Joomla website. It is possible to start a backup just by calling a URL, as described here: https://www.akeebabackup.com/documentation/quick-start-guide/automating-the-backup.html. To automate the backup of our site, we want to call this URL from a daily cron job.

Our web host supports creating cron jobs, but you cannot use shell scripts or anything similar; only the execution of a PHP script is supported. So we have to call this URL from a PHP script. I created the script, and it works fine when I call it directly from my browser. But when I try to execute it via the cron job, I only receive error 302, which means the document has temporarily moved. I don't know what to do with that. This is the script I want to execute:
<?php
// Trigger Akeeba's front-end backup URL. A 302 response means the server
// redirected the request (e.g. from http:// to https://).
$result = file_get_contents("http://www.mysite.net/index.php?option=com_akeeba&view=backup&key=topsecret&format=r");
?>
I am not experienced with Cron jobs or PHP so any help would be nice.
Thanks for your time.
It suffices to read the documentation. It tells you exactly how to use wget or curl with a cron job. Moreover, there is a section called "A PHP alternative to wget". I write the Akeeba Backup documentation and make it available free of charge for a good reason: to be read, and to prevent such questions ;)
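Not Akeeba's official snippet, but as an illustrative sketch: since the asker's 302 suggests a redirect (likely http to https), a cURL-based trigger that follows redirects might look like this (the URL and key are the placeholders from the question):

<?php
// Hypothetical backup trigger; use your own site URL and secret key.
$url = "https://www.mysite.net/index.php?option=com_akeeba&view=backup&key=topsecret&format=r";

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the response
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow 301/302 redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 0);            // backups can take a while
$result = curl_exec($ch);

if ($result === false) {
    echo "Backup request failed: " . curl_error($ch) . "\n";
}
curl_close($ch);
?>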
I have enabled the inbuilt logging within CodeIgniter. This works fine.
However, I'm running a particular script through the command-line interface. The script itself works fine, but none of the normal logs are updated. If I call the same script over HTTP, the logs update fine.
Is it some built-in feature of CI that running through the CLI won't update the logs, or do I have a problem somewhere?
Well, the difference is normally the user CI writes the logs as.
If you are using the browser, it may be running as an Apache web user, like ours does.
If you do it through the command line you are normally logged in as yourself. Do you have write permissions to the log file?
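One common fix, sketched below under the assumption of a default CI 2.x layout with the web server running as www-data: make the log directory group-writable by both users.

# Paths, group, and user name are assumptions; adjust to your setup.
sudo chgrp -R www-data application/logs
sudo chmod -R 775 application/logs
# Then make sure your CLI user is in the www-data group.
sudo usermod -aG www-data youruser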