I've written a bash script that executes a python script to write a file to a directory, then sends that file to Amazon S3. When I execute the script from the command line it executes perfectly, but when I run it with cron, the file writes to the directory, but never gets sent to S3. I must be doing something wrong with cron.
Here is the bash script:
#!/bin/bash
#python script that exports file to home directory
python some_script.py
#export file created by python to S3
s3cmd put /home/bitnami/myfile.csv s3://location/to/put/file/myfile.csv
Like I said before, manually executing works fine using ./bash_script.sh. When I set up the cron job, the file writes to the directory, but never gets sent to S3.
My cron job is:
18 * * * * /home/bitnami/bash_script.sh
Am I using cron incorrectly? Please help.
Cron looks OK; however, your path to the .py file will not be found. You will have to add a path or home variable, like:
location=/home/bitnami
python "$location/some_script.py"
Also, s3cmd needs to be called with its full path, for example:
/bin/s3cmd
(run which s3cmd to confirm where it is actually installed on your system).
Alternatively, you might need to load your user environment first before executing the script, so that s3cmd can find its credentials (by default it reads them from the .s3cfg file in your home directory).
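Putting that together, a cron-safe version of the script might look like this (a sketch: the /usr/bin locations are assumptions, so check them with which python and which s3cmd first):

#!/bin/bash
# use absolute paths everywhere, since cron starts with a minimal environment
location=/home/bitnami

# python script that exports the file to the home directory
/usr/bin/python "$location/some_script.py"

# export the file created by python to S3
/usr/bin/s3cmd put "$location/myfile.csv" s3://location/to/put/file/myfile.csv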
Set-up
I have 3 .txt files containing commands to be executed each day.
The berlin_run.txt file executes, among other things, the two other .txt files. The file is the following:
#!/bin/bash
cd /path/to/folder/containing/berlin_run.txt
PATH=$PATH:/usr/local/bin
export PATH
./spider_apartments_run.txt
./spider_rooms_run.txt
python berlin_apartments_ads.py;python berlin_rooms.py
When I cd to /path/to/folder/containing/berlin_run.txt in my MacOS Terminal, and execute the ./berlin_run.txt command, everything works fine.
It is my understanding that ./ executes berlin_run.txt, and that #!/bin/bash ensures that the subsequent lines in berlin_run.txt are automatically executed.
Problem
I want to automate the execution of berlin_run.txt.
I have written the following cronjob,
10 13 * * * /path/to/folder/containing/berlin_run.txt
It is my understanding that this cronjob should run berlin_run.txt each day at 13:10. Assuming that is correct, #!/bin/bash should execute all the subsequent lines, but nothing seems to happen.
Where and what am I doing wrong here?
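One likely culprit (an assumption, since cron gives no feedback by itself): cron starts with a minimal environment and discards output by default, so redirecting the job's output to a log file usually reveals what fails, e.g. a command that is not on cron's PATH:

10 13 * * * /path/to/folder/containing/berlin_run.txt >> /tmp/berlin_run.log 2>&1

The >> and 2>&1 append both normal output and error messages to /tmp/berlin_run.log (a hypothetical location), so after 13:10 the log will show any "command not found" or Python errors.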
I have a script to convert files, but each time I have to copy that script to the directory where my files are located, open a terminal, and run the script whenever I need to perform that action.
My idea is to run a cron job that watches a specific directory for new files; whenever a new file is added, the script should start executing.
Help me out!
Thanks in advance
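One way to do this with plain cron (a sketch under assumptions: the watched directory, the converter script, and the .done marker-file convention are all hypothetical placeholders):

#!/bin/bash
# watch_and_convert.sh - convert any file in the watched directory not yet processed
WATCH_DIR=/path/to/watch            # hypothetical watched directory
CONVERT=/path/to/convert_script.sh  # hypothetical converter script

for f in "$WATCH_DIR"/*; do
    [ -f "$f" ] || continue                 # skip anything that is not a regular file
    case "$f" in *.done) continue ;; esac   # skip the marker files themselves
    [ -e "$f.done" ] && continue            # skip files already converted
    "$CONVERT" "$f" && touch "$f.done"      # mark as done only if conversion succeeds
done

Make it executable with chmod +x and run it every minute from cron:

* * * * * /path/to/watch_and_convert.sh

If installing an extra package is an option, inotifywait from inotify-tools reacts to new files immediately instead of polling once a minute.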
I have Hadoop installed on a CentOS system. I have a shell script which merges all the small HDFS files generated in a particular hourly folder into one single file at another HDFS location.
The script works perfectly when invoked manually.
I then set the script up to run as a cron job at 01:30 AM every day.
I typed crontab -e and pasted this:
30 1 * * * /home/hadoop/tmp/cron-merge-files.sh > /home/hadoop/tmp/cron-merge-files.txt
But the merge operation does not happen. I see in the /var/log/cron file that the entry appears at 01:30 AM, but I can't see the files merged on HDFS. When I simply execute the shell script myself, it works perfectly and performs the operation written inside the script.
Jul 8 01:30:01 ip-10-1-3-111 CROND[2265463]: (hadoopuser) CMD (/home/hadoop/tmp/cron-merge-files.sh > /home/hadoop/tmp/cron-merge-files.txt)
The content of /home/hadoop/tmp/cron-merge-files.txt is a single echo statement which is written inside a loop. The loop is supposed to run 24 times, and it prints it 24 times.
I am not sure what is happening.
I got the solution to this problem from another forum. The problem was that the environment variables were not getting picked up when the same script was run through crontab. Sourcing my .bash_profile inside the script fixed it.
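In other words, the fix amounts to this (a sketch; the merge logic is whatever the original script already contains):

#!/bin/bash
# load the user environment (JAVA_HOME, HADOOP_HOME, PATH, ...) that cron does not provide
source ~/.bash_profile

# ... existing merge logic follows ...

It can also help to append 2>&1 to the crontab entry, so error messages land in the same file instead of being lost:

30 1 * * * /home/hadoop/tmp/cron-merge-files.sh > /home/hadoop/tmp/cron-merge-files.txt 2>&1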
I'm running a cronjob that calls a php script. I get "failed to open stream" when the file is invoked by cron. When I cd to the directory and run the file from that location, all is well. Basically, the include_once() file that I want to include is two directories up from where the php script resides.
Can someone please tell me how I can get this to work from a cronjob?
There are multiple ways to do this: You could cd into the directory in your cron script:
cd /path/to/your/dir && php file.php
Or point to the correct include file relative to the current script in PHP:
include dirname(__FILE__) . '/../../' . 'includedfile.php';
cron is notorious for starting with a minimal environment. Either:
have your script set up its own environment;
have a special cron script which sets up the environment then calls your script; or
set up the environment within crontab itself.
An example of the last (which is what I tend to use if there aren't too many things that need setting up) is:
0 5 * * * (export PATH=/mydir:$PATH; myexecutable)
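The same thing can be done with plain variable assignments at the top of the crontab (supported by Vixie cron and its descendants); note that cron does not expand $PATH in such assignments, so the full value has to be written out:

PATH=/mydir:/usr/local/bin:/usr/bin:/bin
0 5 * * * myexecutable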
You need to see what path cron runs from:
$path_parts = pathinfo($_SERVER["PATH_TRANSLATED"]);
and then do the include relative to that:
include $path_parts['dirname']."/myfile.php";
I wrote a shell script where I copy my .bashrc file as well as custom dotfiles to a backup folder and then replace them in my home folder with another .bashrc file which will then source my custom dotfiles.
However, after the script does its job, if I try to execute the aliases I included in the new files I get a "command not found" error. Only after I source the .bashrc file manually in the terminal do I have access to them.
From what I understand, the script I'm running executes in a sub-shell which terminates when the script finishes, so nothing it sets up carries over to my interactive shell.
How can I run the script and have new commands/aliases/functions available without having to source the .bashrc file myself or restarting the terminal?
Solution
Well, it appears that instead of running my script via sh script.sh, I can source it like source script.sh, which behaves exactly as I wanted: the script then runs in the current shell, so the new aliases and functions are immediately available.
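The difference in a nutshell (standard shell behaviour, sketched here):

sh script.sh        # runs in a child process; aliases and functions are lost when it exits
source script.sh    # runs in the current shell; aliases and functions persist
. script.sh         # POSIX spelling of source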