Check successful cronjob/pgloader - shell

I'm using crontab on a server to run a shell script that uses pgloader to load data into a PostgreSQL database every day, and I have a Bitbucket pipeline with a Python script that runs every week. I want the Bitbucket pipeline to run only if the cron job was successful.
I thought of two possible ways to solve the problem:
Using hc-ping to get the status of the cron job. I'm not sure I understood the hc-ping documentation correctly, though; as I read it, you can only check whether crontab ran the job at all, not the status of the job itself?
Another method I thought of was to check whether the data migration with pgloader was successful and create a file depending on the result. That file would then be used by another cron job, whose status I'd check with hc-ping: if the file was not created, that job would fail, and hc-ping would show that the job did not run.
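The second idea can also be folded into the pgloader job itself. Below is a minimal sketch of such a wrapper, assuming a Healthchecks-style service: the ping URL (the UUID is a placeholder) is hit with /fail appended on a non-zero exit, so the check's status reflects the outcome of the pgloader run rather than just crontab firing:

```shell
#!/bin/sh
# Sketch: run a command and report its outcome to a Healthchecks check.
# PING_URL is a placeholder for your check's UUID URL.
PING_URL="https://hc-ping.com/your-uuid-here"

run_and_ping() {
    # Run the given command; on a non-zero exit ping the /fail endpoint,
    # otherwise the plain ping URL. Propagate the command's exit code.
    "$@"
    status=$?
    if [ "$status" -ne 0 ]; then
        curl -fsS -m 10 --retry 3 "$PING_URL/fail" >/dev/null
    else
        curl -fsS -m 10 --retry 3 "$PING_URL" >/dev/null
    fi
    return "$status"
}

# Example use from cron (command line file name is a placeholder):
#   run_and_ping pgloader my.load
```

The weekly Bitbucket pipeline can then query the check's status via the service's API before deciding whether to proceed.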
I appreciate any help I can get.
Best regards

Related

How to run a K6 script locally and send data to remote InfluxDB instance (No Docker)

I'm extremely new to k6 + InfluxDB + Grafana, and I was given a task that requires executing certain k6 scripts locally but saving/passing the data to a remote InfluxDB instance.
As of now I'm stuck, since I'm not sure what I'm missing in the needed configuration: every time I run the script pointing at the InfluxDB instance, I get an error.
The command that I'm executing is:
k6 run --out influxdb="https://my_influxdb_url/write" //sampleScript.js
But the original URL that was handed over to me was something like this:
https://my_influxdb_url/write?db=DB_NAME&u=USERNAME&p=PASSWORD
And when I execute the first mentioned script I'm getting the following error:
ERRO[000X] Couldn't write stats error="404 page not found\n" output=InfluxDB1
So I've tried creating K6_INFLUXDB_USERNAME and K6_INFLUXDB_PASSWORD as environment variables but I'm still getting the same error.
I'm not sure if I might be missing some .yaml file like a datasource in which I should fill those 3 values? (DB_NAME, USERNAME, PASSWORD)
Or maybe I'm just doing it all wrong and not calling the execution command properly for this scenario.
Another weird thing that I noticed is that OUTPUT is throwing InfluxDB1 instead of my actual InfluxDB url which I guess might be where my issue lies.
Any kind of tip would be greatly appreciated, since all the documentation I've found so far either runs everything in Docker containers (Grafana + InfluxDB) or runs it all locally, which is not my case :(
Thanks a lot in advance as always!!
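For what it's worth, with k6's InfluxDB v1 output the `InfluxDB1` in the error message is just the name of the output, not a URL. The database name normally goes in the path of the `--out` URL (k6 talks to the `/write` endpoint itself), and credentials can come from environment variables. A hedged sketch, where DB_NAME, USERNAME and PASSWORD stand for the values from the URL that was handed over:

```shell
# Placeholders: substitute the values from the original URL
#   https://my_influxdb_url/write?db=DB_NAME&u=USERNAME&p=PASSWORD
export K6_INFLUXDB_USERNAME="USERNAME"
export K6_INFLUXDB_PASSWORD="PASSWORD"

# Database name in the URL path; no /write suffix -- k6 handles that itself.
OUT_URL="https://my_influxdb_url/DB_NAME"

# k6 run --out influxdb="$OUT_URL" sampleScript.js
```

If the 404 persists, it is worth checking that the path prefix of the InfluxDB endpoint (everything before /write in the original URL) is carried over into OUT_URL unchanged.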

Running a PHP file 24x7 on Heroku. Is that possible?

I have a PHP file which has some JS in it. I run a script on this file which checks whether the date in a Firebase database matches the current date and, if not, updates a few values. So I want this PHP file to run 24x7. I am completely new to Heroku. Is it possible to do this on Heroku, or is there any other, simpler solution? Any help would be appreciated.
I don't see why not. You can set a worker process in your Procfile that will run a non-web process. In that PHP script you can set a timer to perform this action daily, and if it were to crash for whatever reason, Heroku would restart the process.
Here's an explanation of how they work: https://devcenter.heroku.com/articles/background-jobs-queueing
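As a sketch of the worker approach described above (the file name worker.php is a placeholder), the Procfile declares a worker process type whose script is expected to loop forever and sleep between daily checks:

```
worker: php worker.php
```

After deploying, the worker dyno is started with `heroku ps:scale worker=1`; it receives no web traffic, and Heroku restarts it if it exits.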

Is using is_cli_request() necessary for a cron job in CodeIgniter?

I'm using CodeIgniter 2.2.0 for my project. Is the $this->input->is_cli_request() check necessary for a cron job?
It is recommended, to protect your cron job from being executed when someone types the URL into their browser. However, if you don't have a problem with your cron job being invoked by anyone, you can skip this check.
Refer https://ellislab.com/codeigniter/user-guide/general/cli.html for more details.
It is recommended to run cron jobs from the command line.
There are many reasons for running CodeIgniter from the command line, but they are not always obvious.
Run your cron-jobs without needing to use wget or curl
Make your cron-jobs inaccessible from being loaded in the URL by checking for $this->input->is_cli_request()
Make interactive "tasks" that can do things like set permissions, prune cache folders, run backups, etc.
Integrate with other applications in other languages. For example, a random C++ script could call one command and run code in your models!
More info can be found in the CLI user guide linked above.
But you can also prevent these URLs from being called at the server level.
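As a hedged sketch of that setup (the paths and the "cron"/"run" controller and method names are placeholders), the crontab entry invokes the controller through the PHP CLI, so $this->input->is_cli_request() returns TRUE and plain browser requests can be rejected:

```
# m h dom mon dow  command
0 3 * * * /usr/bin/php /var/www/myapp/index.php cron run
```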

Crontab job as a service

I have a script that pulls some data from a web service and populates a mysql database. The idea is that this runs every minute, so I added a cron job to execute the script.
However, I would like the ability to occasionally suspend and re-start the job without modifying my crontab.
What is the best practice for achieving this? Or should I not really be using crontab to schedule something that I want to occasionally suspend?
I am considering an implementation where a global variable is set and checked inside the script. But I thought I would canvass for more apt solutions first. The simpler the better - I am new to both scripting and Ruby.
If I were you, my script would look at a static switch, like you said with your global variable, but testing for a file's existence instead of a global variable. This seems clean to me.
Another solution is to have a service, not using crontab, that calls your script every minute. This service would live alongside the other services in /etc/init.d (or /etc/rc.d, depending on your distribution) and have start, stop and restart commands like any other service.
These two solutions can be mixed:
the service only creates or deletes the switch file, and the crontab line is always active.
Or your service directly edits the crontab like this, but
I prefer not editing the crontab via a script, and the technique described in the article is not atomic (if you change your crontab between the reading and the writing by the script, your change is lost).
So in your place I would go with the first option.
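The first option can be sketched in a few lines of shell (the switch-file path is a placeholder): the crontab entry stays active every minute, and the script itself exits early while the switch file exists:

```shell
#!/bin/sh
# Switch-file sketch: the job runs only while the switch file is absent.
SWITCH_FILE="${SWITCH_FILE:-/tmp/myjob.suspend}"

job_enabled() {
    [ ! -e "$SWITCH_FILE" ]
}

if job_enabled; then
    : # ... pull data from the web service and populate MySQL here ...
fi

# Suspend:  touch /tmp/myjob.suspend
# Resume:   rm /tmp/myjob.suspend
```

The "service" then reduces to two trivial commands (touch and rm), which can be wrapped as start/stop actions in an init script if desired.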

Start Akeeba Backup Using a Cron Job

We are using Akeeba Backup for backing up our Joomla website. It is possible to start a backup just by calling a URL, as described here: https://www.akeebabackup.com/documentation/quick-start-guide/automating-the-backup.html. To automate the backup of our site, we want to call this URL from a daily Cron job. Our web host supports the creation of Cron jobs, but you cannot use shell scripts or anything like that; only the execution of a PHP script is supported. So we have to call this URL from a PHP script.
I created this script, and it works fine when I call it directly from my browser. But when I try to execute it via the Cron job, I only receive error 302, which means the document has temporarily moved. I don't know what to do with that. This is the script I want to execute:
<?php
// Trigger the remote backup; exit non-zero if the call fails.
$result = file_get_contents("http://www.mysite.net/index.php?option=com_akeeba&view=backup&key=topsecret&format=r");
if ($result === false) {
    exit(1);
}
I am not experienced with Cron jobs or PHP, so any help would be nice.
Thanks for your time.
It suffices to read the documentation. It tells you exactly how to use wget or curl with a cron job. Moreover, there is a section called "A PHP alternative to wget". I write the documentation of Akeeba Backup and make it available free of charge for a good reason: to be read, and to prevent such questions ;)
