deleteDirectory not working as Cron job - bash

I have a job that I want cron to run at a given interval. It removes some old entries from my database and then removes the files that were uploaded with each entry.
If I run the task from the terminal it works fine (removing both the database entries and the uploaded files). But when I leave the task to cron, it does remove the entries in the database, but it doesn't remove the uploaded folders in my public directory.
The code that removes the files looks like this:
$machtiging = File::files('icec/'.$icecconsult->accesstoken.'/machtiging');
if(count($machtiging) > 0){
    File::deleteDirectory(public_path() . '/icec/'.$icecconsult->accesstoken);
}
So it checks whether the files are there and, if so, deletes them. But this just doesn't work. I've tried running the cron job as root, as www-data, and as my own user, all with the same result. To be sure, I set the file and folder permissions to 777, but that doesn't seem to be the problem.
I've also tried adding SHELL=/bin/bash, but that didn't do the trick either.
Any help on solving this issue would be much appreciated.
Update
The cron line looks like this:
* * * * * /bin/bash /home/ice/verlopenicec.sh >> /tmp/output 2&>1
also tried
* * * * * /home/ice/verlopenicec.sh >> /tmp/output 2&>1
and
* * * * * /usr/bin/php /var/www/wemedic/artisan verwijder-verlopen-icec-consult 1>> /dev/null 2>&1
All of them seem to run; it's just that the directory never gets deleted or moved.
I'm trying to get some debug data, but nothing is showing up.
The verlopenicec.sh script itself is just a wrapper around the original Laravel command; I thought it might be handy to have a script for testing why Laravel isn't deleting the directory.
The script looks like this:
#!/bin/bash
SHELL=/bin/bash
USER=ice
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
/usr/bin/php /var/www/wemedic/artisan verwijder-verlopen-icec-consult
which runs a Laravel command that looks like this:
$icecconsult = Icecconsult::where('id', '=', $consult_id)->firstOrFail();
$icecconsult->expire = Carbon::now();
$icecconsult->status = 'Gesloten';
$icecconsult->save();
$icec = Icec::where('id', '=', $icecconsult->icec_id)->firstOrFail();
$icec->delete();
$machtiging = File::files('icec/'.$icecconsult->accesstoken.'/machtiging');
if(count($machtiging) > 0){
    // File::deleteDirectory(public_path() . '/icec/'.$icecconsult->accesstoken);
    $move = 'mv /var/www/wemedic/public/icec/'.$icecconsult->accesstoken.' /tmp';
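    // note: each shell_exec() call below starts its own shell, so setting SHELL and PATH
    // this way does not carry over to the later shell_exec($move) call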
    shell_exec('SHELL=/bin/bash');
    shell_exec('PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games');
    shell_exec($move);
    // File::deleteDirectory('/var/www/wemedic/public/icec/'.$icecconsult->accesstoken);
}
return;
(I've commented out the delete call and tried moving the directory to /tmp instead, but I get the same result whether moving or deleting: both work if I run the command directly, but not when cron runs it. Cron does run the task, because I can see it being fired in /var/log/syslog and the database entry does get changed.)
I've tried deleting and also moving the directory to the temp folder; both work if I run them directly, but neither works when I leave it to cron / the Laravel cron scheduler.
I've also tried to get the response (true/false) from the delete function, but when I try to save that to the database to inspect it, the function seems not to execute.
dd($machtiging) returns an array like the one below, showing the files in the folder. After confirming there are files in the folder, it should go ahead and delete the complete folder along with any files/subdirectories in it:
array:1 [
0 => "icec/a89ce4c9010e0a745308b29782b5eeae/machtiging/machtiging.pdf"
]
Thanks for your help

Try with this crontab (note that the user field, ice, is only valid in /etc/crontab or a file under /etc/cron.d, not in a per-user crontab):
*/1 * * * * ice /bin/bash /home/ice/verlopenicec.sh >> /tmp/output.log
Change your bash script to:
#!/bin/bash
moment="`/bin/date +%y_%m_%d`"
echo "--- The script has been executed on $moment ---"
/usr/bin/php /var/www/wemedic/artisan verwijder-verlopen-icec-consult
It should work better, but if not, could you paste the content of the generated /tmp/output.log here?
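If the job runs but the directory still is not removed, it is also worth checking that the existence check and the delete use the same absolute base path. A relative path such as 'icec/...' passed to File::files() is resolved against the working directory of the PHP process, and under cron that is usually not the project root. A minimal sketch of the same check-and-delete using public_path() for both calls (same Laravel File facade as above):
$dir = public_path('icec/'.$icecconsult->accesstoken); // absolute path under public/
$machtiging = File::files($dir.'/machtiging'); // check against the same absolute path
if(count($machtiging) > 0){
    File::deleteDirectory($dir); // remove the whole access-token directory
}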

Related

Schedule Joomla Site Offline

I'm wondering if anyone has more details on how to schedule Joomla to be set to "Offline/Maintenance Mode" at a specific date and time. I found this post here on stack overflow and I'm wondering if anyone has been successful in either (1) implementing a custom plugin to add this functionality, or (2) via a script that sets the site into offline/maintenance mode and perhaps a cronjob to run the script at specific time or if (3) maybe there is an extension that already exists that simply adds this offline/maintenance mode scheduling feature.
Based on the previous post I linked to above, I'm not sure if a plugin would work, or how best to go about the script-and-cronjob technique. From my understanding of the responses in that post, it sounded like a script and cron job would be the only way to accomplish this. If someone can let me know whether they were successful implementing this, and how, that would be great; any suggestions or direction on how to go about it would be helpful.
Using a plugin for such a small task would not be worth it, in my opinion.
I would rather use a little script like:
<?php
// Make sure this is only called through command line
if (php_sapi_name() !== "cli") die('Only command line');
// Replace by your joomla configuration file path
$configuration_file_path = '/var/www/joomla/configuration.php';
if (!empty($argv[1])) {
    $offline = 1;
} else {
    $offline = 0;
}
// Retrieve configuration file content
$configuration_content = file_get_contents($configuration_file_path);
// Replace the offline line by the calculated value
$configuration_content = preg_replace('/(.*)public \$offline =(.*)/m', '$1public $offline = \'' . $offline . '\';' , $configuration_content);
// Write back the configuration file
file_put_contents($configuration_file_path, $configuration_content);
This script can be called through the command line:
php offline.php 1 #to enable offline status
php offline.php 0 #to disable offline status
If you need to run it through a cron job, edit /etc/crontab or add it in your hosting settings:
# Offline at 4AM each day
0 4 * * * www-data php /path/of/your/script/offline.php 1 >> /dev/null 2>&1
# Online at 4:05AM each day
5 4 * * * www-data php /path/of/your/script/offline.php 0 >> /dev/null 2>&1
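For context, the preg_replace above rewrites the offline line of configuration.php, which in a stock Joomla install looks roughly like this (exact quoting can vary between Joomla versions):
public $offline = '0';   // site online, e.g. after running: php offline.php 0
public $offline = '1';   // offline/maintenance mode, e.g. after running: php offline.php 1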

Codeigniter url in crontab

How do I set a CodeIgniter URL in crontab?
I have tried with:
* * * * * php-cli /var/www/html/msp/index.php partner cron_test
But it did not work
My application URL is:
localhost/msp/index.php/partner/cron_test
My PHP code:
public function cron_test()
{
    if($this->input->is_cli_request())
    {
        $fo = fopen("/var/www/cron_test".microtime().".txt", "w") or die("Unable to open file!");
        fwrite($fo, "Cron tab test");
        fclose($fo);
    }
    else
    {
        echo "You dont have access";
    }
}
The project is located under /var/www/html/.
Set the correct PHP installation path, or use php directly in the command to execute your cron. If you want to find your installation path, try this command:
which php
It will give you installation path like used in below crontab command
* * * * * /usr/bin/php /var/www/html/msp/index.php partner cron_test
If you want to debug your cron command, you can run /usr/bin/php /var/www/html/msp/index.php partner cron_test directly from the terminal. It will show you the output, so you can check whether it's working and then set up the crontab accordingly.
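If it works when run by hand but still seems to do nothing under cron, redirecting the job's output to a file usually shows what is going wrong (a debugging variant; the log path is just an example):
* * * * * /usr/bin/php /var/www/html/msp/index.php partner cron_test >> /tmp/cron_test.log 2>&1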

How to run a bash script from cron, passing it a greater argument every 15 minutes?

I have a simple script that I need to run every 15 minutes every day (until I get to the last record in my database), giving it a greater argument each time. I know how to do this with a constant argument, for example:
*/15 * * * * ./my_awesome_script 1
But I need this, let's say, we start from 8:00 AM:
at 8:00 it should run ./my_awesome_script 1
at 8:15 it should run ./my_awesome_script 2
at 8:30 it should run ./my_awesome_script 3
at 8:45 it should run ./my_awesome_script 4
at 9:00 it should run ./my_awesome_script 5
...
How can I make something like this?
I came up with a temporary solution:
#!/bin/bash
start=$1
stop=$2
for i in $(seq "$start" "$stop")
do
    ./my_awesome_script "$i"
    sleep 900
done
Writing a wrapper script is pretty much necessary (for sanity's sake). The script can record the previous value of the number in a file, increment it, and record the new value ready for next time. Then you don't need the loop. How are you going to tell when you've reached the end of the data in the database? You need to decide how you want to handle that, too.
New cron entry:
*/15 * * * * ./wrap_my_awesome_script
And wrap_my_awesome_script might be:
#!/bin/bash
crondir="$HOME/cron"
counter="$crondir/my_awesome_script.counter"
[ -d "$crondir" ] || mkdir -p "$crondir"
[ -s "$counter" ] || echo 0 > "$counter"
count=$(<"$counter")
((count++))
echo "$count" > "$counter"
"$HOME/bin/my_awesome_script" "$count"
I'm not sure why you use ./my_awesome_script; it likely means your script is in your $HOME directory. I'd keep it in $HOME/bin and use that name in the wrapper script — as shown.
Note the general insistence on putting material in some sub-directory of $HOME rather than directly in $HOME. Keeping your home directory uncluttered is generally a good idea. You can place the files and programs where you like, of course, but I recommend being as organized as possible. If you aren't organized then, in a few years time, you'll wish you had been.
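One way to handle the "end of data" question raised above is to have my_awesome_script exit with a non-zero status once its argument is past the last record, and let the wrapper stop advancing the counter (a sketch; the exit-status convention is an assumption, not part of the original script):
#!/bin/bash
counter="$HOME/cron/my_awesome_script.counter"
count=$(<"$counter")
((count++))
# only persist the new value if the script confirms the record existed
if "$HOME/bin/my_awesome_script" "$count"; then
    echo "$count" > "$counter"
else
    echo "no record for argument $count; counter left unchanged" >&2
fi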

Cronjob with Jelastic and Glassfish

I am running a web application (MyCronTest) on a Glassfish server in a Jelastic environment. This web application contains a servlet (/test) that I would like to call regularly with a cron job.
So I followed this tutorial from the Jelastic docs, but it uses Tomcat instead of Glassfish and I am not so sure about the paths and where to put which file... and now I am lost ;)
the servlet
When calling the servlet directly in my browser it prints out the following line to System.out:
test executed at 05/03/2014 15:00
the bash file to execute
I created a bash script called myCronJob.sh and put it in the directory glassfish3/temp:
#!/bin/bash
curl http://myGlassfish.jelastic.dogado.eu/MyCronTest/test;
I tested it of course, it is executable and it works (at least when I execute it on my computer).
the cron event scheduler
According to the tutorial there is a file /cron/tomcat I need to edit. Well, I found a /cron/glassfish which (I am guessing) should do the same.
# IMPORTANT NOTE!
# Please make sure there is a blank line after the last cronjob entry.
*/1 * * * * /opt/glassfish3/temp/myCronJob.sh
I added an empty line at the end, as they told me to. I even tried it with
*/1 * * * * /bin/bash /opt/glassfish3/temp/myCronJob.sh
as they suggested in the tutorial. But still no output, no error... just empty log files.
Does anyone have an idea what I am missing here? Am I doing something wrong?
Solution / Edit
Thanks to Damien's answer I was finally able to narrow down my problem. It was actually this line in my bash script that caused it:
curl http://myGlassfish.jelastic.dogado.eu/MyCronTest/test;
should have been
curl http://localhost/MyCronTest/test;
since I was blocked by a firewall. Lucky for me, my Glassfish is running on the same machine / environment, so localhost works.
Everything else is correct.
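Putting that together, the corrected myCronJob.sh becomes:
#!/bin/bash
# call the servlet via the local interface; the external hostname was blocked by the firewall
curl http://localhost/MyCronTest/test;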
Well, I found a /cron/glassfish which (I am guessing) should do the same.
Correct.
But still no output. No error.. just empty log files.
Assuming that you have correctly uploaded your file to /opt/glassfish3/temp/myCronJob.sh, I recommend that you try to direct the cron output to your own log file, or have it emailed to you:
MAILTO="your@email.com"
*/1 * * * * /opt/glassfish3/temp/myCronJob.sh > /opt/glassfish3/glassfish/nodes/localhost-domain1/instance-168458181/logs/cronoutput.log 2>&1
Note that the email may be filtered by your spam filters due to things like missing PTR (reverse DNS) and so on - but it's ok to use like this for testing/debugging purposes (just don't rely on these mails getting through for anything critical!)
If these tips don't help you, then I recommend contacting your hosting provider's support team to verify the .sh file's permissions, output when executed manually, and the cron log file contents (all of which only they can help you with).

BASH - how do I detect that a file's date/time has changed, so I can copy or move it once it's done?

I have a static directory: /var/tmp/files
This directory is only shared with users for upload/download via SFTP, it has some static file names such as:
recording-security.frontdoor.avi
recording-security.backdoor.avi
recording-security.parkingspace.avi
....
From another PC, via SFTP, those files are getting removed/edited/updated/added, etc.
Now another path: /var/www/html/livevideo-stream/
Those files are copied or moved there from /var/tmp/files.
How can I, using bash, detect which files were edited, newly added, or overwritten, so that my script moves only the modified or newly added contents from /var/tmp/files to livevideo-stream?
The crontab I have:
0 7 * * * /var/tmp/finishit.sh
0 8 * * * /var/tmp/finishit.sh
0 9 * * * /var/tmp/finishit.sh
0 19 * * * /var/tmp/finishit.sh
0 20 * * * /var/tmp/finishit.sh
$ cat /var/tmp/finishit.sh
#!/bin/bash
cd /var/tmp/files
while :
do
    # how do we now validate which files were modified, changed, or newly added,
    # and place them in that directory?
    # echo $1 $2
    cp -R /var/tmp/files/* /var/www/html/livevideo-stream/
    sleep 1
done
Your cron will launch your script several times a day, and the script never terminates because of the while : loop... You will end up with a lot of overlapping scripts trying to copy stuff every second.
You should simply replace the whole script with
rsync -vaq /var/tmp/files/* /var/www/html/livevideo-stream/
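So finishit.sh can shrink to just that one call (a sketch; rsync -a only transfers files that are new or have changed since the last run, which covers the "modified or newly added" requirement, and the script exits when the copy is done instead of looping forever):
#!/bin/bash
# copy only new or changed files from the SFTP upload area to the web directory
rsync -vaq /var/tmp/files/* /var/www/html/livevideo-stream/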
