CakePHP Shell Cron Job Controller Action Media Temple Server - shell

I'm trying to create a cron job that will send a weekly newsletter. I tried creating a shell task following what the CakePHP manual says. Then I go to the Media Temple cron jobs panel and type in the following:
php /home/#####/domains/domain.com/html/cake/console/cake -app /home//#####//domains/domain.com/html/vendors/shells newsletter
I created the shell task in the vendors/shell folder, named it newsletter.php, and here's the code for it:
class NewsletterShell extends Shell {
    function main() {
        $this->sendEmailTo("Newsletter", "subject", "email#gmail.com");
    }
}
sendEmailTo is a controller method I have in my AppController, so all my controllers have access to it.
My problem is every time the Cron Job runs I get this message:
Could not open input file: /home/#####/domains/domain.com/html/cake/console/cake
I even gave all the console files (cake.php, cake.bat, etc.) 0777 permissions, as well as vendors/shell/newsletter.php.
The ##### is the site number that Media Temple gives you, but I'm not really sure I have it correct. They show an example of a cron job like this: /home/50838/data/script-name.sh
So my questions are:
Is my cake shell task correct, and is the way I'm running it as a cron job accurate?
Also, does anyone know where I can confirm my Media Temple site number, so I can rule that out as a possible error?
Thanks in advance,
Fabian

You can try to var_dump(ROOT) or any of the other core definition constants to find your directory. Just put it in a controller method somewhere, but make sure to remove it again. Or, if you have SSH access, run pwd on the command line.
Other than that, when invoking the cake console task, the -app parameter is supposed to point to the app directory, not the shells directory.
Try using this script to run your shell as a cron job; there may be some missing shell vars.

ls -l /home/#####/domains/domain.com/html/cake/console/cake
What does that say? Does the cron job run as user #####? If not, the problem is probably permissions on /home/#####/; check them with:
ls -ld /home/#####/

Have you given the "cake" file in the "cake/console" directory executable permissions, as well as cake.php and cake.bat?
The cron command should be:
php /home/#####/domains/domain.com/html/cake/console/cake newsletter -app /home/#####/domains/domain.com/html/app
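For example, a minimal sketch of adding the execute bit over SSH (the path is the one from the question, with ##### left as your site number):
chmod +x /home/#####/domains/domain.com/html/cake/console/cake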

Related

Have relative paths in my automator script

I am attempting to create an automator app that executes a bash script which in turn uses some resource files (e.g. it deploys a docker-compose.yml file that I created).
I intend to share this app with my colleagues but I am having problems accessing the resource files. Regardless of where the app is located, if its script does this:
MYPATH=$(pwd) && osascript -e 'display alert "'"$MYPATH"'"'
the home directory ~ gets displayed every time. This is a problem because I have no way of knowing in which directory my colleagues are going to place the app, and its script needs to know where the resource files are located.
Can anyone suggest a good approach to solve this? E.g. a bash command that returns the app's location would be nice. I am not interested in any approach that requires instructing someone to place the app in a specific directory.
I solved it using path to me in the Run AppleScript action:
Its return value is passed as argument $1 to the next step.
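In the Run Shell Script step that follows, a minimal sketch of using that argument (this assumes the AppleScript returns the POSIX path of the .app bundle and that the resource files live in Contents/Resources inside it; the docker-compose call is only an illustration):
#!/bin/bash
# $1 is the app bundle path handed over from the Run AppleScript step
APP_PATH="$1"
RESOURCES="$APP_PATH/Contents/Resources"   # assumed location of the bundled files
# use the bundled compose file instead of relying on the working directory
docker-compose -f "$RESOURCES/docker-compose.yml" up -d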

Running a command before every execution of a script/program

I'm using Bash on Windows and what I'm missing is a good IDE. I've tried running a GUI app with the following tutorial but it doesn't work well every time and it's frustrating.
What I want is to run a script that would copy the files from a folder on Windows into a folder on the Unix subsystem, but only the files that are different. And the same for the other direction (if I change something from the terminal, it should be updated in the Windows folder). I want that script to be run every time I call ./SOME_EXECUTABLE in that folder. To check whether a file was changed or not I can use hg status, because I'm mostly working with Mercurial.
Is there a way to do this without making a separate shell script that would combine those calls? Something like a macro.
You could use a function in .bashrc to achieve this, running the copy script before the executable. Assuming you have the script in place, let's say copyScript.sh, you can add a function like:
function copyOnExecute() {
    ./copyScript.sh
    ./EXECUTABLE
}
This way you can call the function copyOnExecute every time you want to run your executable.
You could add an alias to .bash_aliases. Bash aliases can't take arguments directly, so wrap a small function inside the alias:
alias execute='f() { ./copyScript.sh && ./"$1"; }; f'
You can replace ./ with your script's path.
This runs the executable after your script has finished, and only if the script finished successfully.
It is easier than writing a function in one of the .rc files: if you wrote a function called execute, in the future you might forget where it was written; keeping it in .bash_aliases helps you avoid this.
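As for copyScript.sh itself, here is a minimal sketch that syncs only changed files in both directions with rsync rather than hg status (the /mnt/c path is an assumption about where the Windows folder is mounted under Bash on Windows; adjust both paths):
#!/bin/bash
# two-way sync of changed files between the Windows and Unix folders
WIN_DIR="/mnt/c/Users/me/project"   # assumed Windows-side path
UNIX_DIR="$HOME/project"            # assumed Unix-side path
# -a preserves attributes, -u skips files that are newer on the receiving
# side, so each direction only copies files that actually changed there
rsync -au "$WIN_DIR/" "$UNIX_DIR/"
rsync -au "$UNIX_DIR/" "$WIN_DIR/"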

Magento log file permission issues

I have a standard Magento application where actions are performed either through the web (www-data) or through cron scripts (executed by a cron user). By default, Magento creates log files with chmod 0640, which gives us a problem: whoever logs an exception first (www-data or cron), the other won't be able to append. So if an exception occurs on the web, var/log/exception.log will be created with www-data as owner, and the cron scripts won't be able to log exceptions to the same file (cron and www-data are not in the same group, but even if they were, it wouldn't help, since 0640 gives the group no write permission).
Possible solutions:
1. Run cron as the same www-data user (the sysadmin won't budge; he doesn't agree with this solution)
2. Change Mage.php to generate the log files with a more suitable chmod (maybe even 0777). Doable, but this means modifying a Magento core file (Mage.php), and that's not really allowed by the license.
The Mage class is final, and I noticed there are no pre- or post-logging events that would let me change the chmod in a hook.
Has anybody encountered the same problem or has any advice on how to properly handle this?
Your proposed first solution sounds like the valid one to me. The cron job should run as the www-data user; only then can you guarantee that the file permissions match, regardless of whether the web server or the cron job created the file.
Otherwise, how can you guarantee that running the cron job as a different user than the web server will give you the same result?
Modifying Mage.php is not the right solution, as you already stated. Never modify core files directly, as you are going to run into problems when updating Magento, e.g. files being overwritten.
From my point of view the web user should not be the same user as the shell/cron job user. They can share a group, but separate users are more secure.
In our case web/http-user is:
httpd:site
and shell/cron user is:
prod:site
So the users are different, but belong to the same group. In our case the log permission 0640 is too restrictive.
It is set in Mage.php:
chmod($logFile, 0640);
We change the permissions from 0640 to 0660 with a simple script:
$command = 'find '.realpath(dirname(__FILE__)).'/var/log/ -type f -exec chmod 0660 {} +';
exec($command);
which is executed via cron shell and wget.
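For the cron side, a minimal sketch of a crontab entry that runs the same fix-up directly (the Magento root path and the schedule are placeholders, not taken from the answer above):
# relax log permissions every 5 minutes so both users can append
*/5 * * * * find /var/www/magento/var/log/ -type f -exec chmod 0660 {} +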

How can I execute a script on AIX after my own .profile/.kshrc takes effect?

background:
My colleagues and I all log in to an AIX server as user "root"; after login, everyone loads their own .profile/.kshrc/.netrc etc., then starts their work, executing their own shell scripts.
problem:
When I run a script from crontab, it fails because some commands in it are only defined in my own environment.
The failure remains even if I add lines to source .profile/.kshrc/.netrc in the script. It appears it just cannot pick up my usual environment settings.
question:
How can I edit the script so the task runs in my own environment?
A script run by cron should set its own PATH to ensure it's starting from a known situation.
Make an inventory of all external commands used by the script, list the directories where they live, then add a line to the top of the script:
PATH=/first/dir:/second/dir
Etc...
In most cases you want to include /usr/bin and/or /bin; for scripts run as root, /usr/sbin is another favourite.
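For example, a minimal sketch of the top of such a cron script (the directories listed are just common defaults; replace them with wherever your commands actually live):
#!/bin/ksh
# set an explicit PATH so the script does not depend on the invoking
# user's .profile/.kshrc environment
PATH=/usr/bin:/bin:/usr/sbin:/usr/local/bin
export PATH
# ... rest of the script, using only commands found on the PATH above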

Ruby cron job in Ubuntu fails silently when trying to back up MySQL to S3

I have two Ruby script cron jobs that I'm trying to run under Ubuntu 10.04.2 LTS on an AWS EC2 instance. They are both failing silently: I see them being run in /var/log/syslog, but there are no resulting files, and redirecting the output into a file produces nothing.
The scripts are based on the ruby sql backups here:
http://pauldowman.com/2009/02/08/mysql-s3-backup/
(It's a full backup of the db and an incremental bin-log output. Not sure that matters.)
The script works fine if run from the command line by either root or another user: it runs, and I see the files appearing in the S3 repo.
I've tested cron with a simple "touch ~/foo" type entry and that worked fine.
My cron entry under root is this:
*/5 * * * * /home/ubuntu/mysql_s3_backup/incremental_backup.rb
Appreciate any help or debugging suggestions. My thought is that some of the Ruby library dependencies might not be available when cron runs the job. But I don't understand why I can't seem to get any output at all returned to me. Very frustrating. Thanks.
The full_backup.rb script you link to contains this:
cmd = "mysqldump --quick --single-transaction ...
#...
run(cmd)
Notice that there is no full path on mysqldump. Cron jobs generally run with a very limited PATH in their environment and I'd guess that mysqldump isn't in that limited PATH. You can try setting your own PATH in your crontab:
PATH='/bin:/usr/bin:/whatever/else/you/need'
*/5 * * * * /home/ubuntu/mysql_s3_backup/incremental_backup.rb
Or in your Ruby script:
ENV['PATH'] = '/bin:/usr/bin:/whatever/else/you/need'
Or specify the full path to mysqldump (and any other external executables) in your backup script.
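To find the full path to use, a quick check from a shell where the backup already works (the output shown is only an example; use whatever your system reports):
which mysqldump
# e.g. /usr/bin/mysqldump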
I'd go with one of the latter two options (i.e. specify ENV['PATH'] in your script or use full paths to executables), as that will reduce your dependence on external factors; it will also help avoid issues with people having their own versions of the commands you need in their PATH.
A bit of error checking and handling on the run call might also be of use.
If any of the necessary Ruby libraries weren't accessible (either due to permissions or path issues) then you'd probably get complaints from the script.
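For instance, a minimal sketch of a crontab entry that captures both stdout and stderr, so any library or PATH complaints end up somewhere readable (the script path is the one from the question; the log path is just an example):
*/5 * * * * /home/ubuntu/mysql_s3_backup/incremental_backup.rb >> /tmp/incremental_backup.log 2>&1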
