On a CentOS 7.2 server the following command runs successfully manually -
scp usrname@storage-server:/share/Data/homes/AWS/2301KM-$(date +"%Y-%m-%d").csv /home/vyv/data/AWS/ 2>&1 >> /home/vyv/data/AWS/scp-log.txt
This command simply takes the file whose name contains the current date from a directory on a remote server, stores it in a directory on the local server, and appends the output to a log file in the same directory.
Public key authentication is set up, so there is no password prompt when the command is run manually.
I have it configured in crontab to run 3 minutes after the top of every hour, in the following format -
3 * * * * scp usrname@storage-server:/share/Data/homes/AWS/2301KM-$(date +"%Y-%m-%d").csv /home/vyv/data/AWS/ 2>&1 >> /home/vyv/data/AWS/scp-log.txt
However, I wait patiently and don't see any files being downloaded automatically.
I've checked the /var/log/cron logs and see an entry on schedule like this -
Feb 9 17:30:01 intranet CROND[9380]: (wzw) CMD (scp usrname@storage-server:/share/Data/homes/AWS/2301KM-$(date +")
There are other similar jobs set in crontab that work perfectly.
Can anyone offer suggestions/clues on why this is not working?
Gratefully,
Rakesh.
Use the full path for scp (or any other binary) in crontab. Also note that cron treats % as a line terminator, which is exactly why the /var/log/cron entry is truncated at date +" - every % in the date format must be escaped with a backslash:
3 * * * * /usr/bin/scp usrname@storage-server:/share/Data/homes/AWS/2301KM-$(date +"\%Y-\%m-\%d").csv /home/vyv/data/AWS/ >> /home/vyv/data/AWS/scp-log.txt 2>&1
(The redirections are also reordered here: >> file 2>&1 sends stderr to the log as well, whereas 2>&1 >> file leaves stderr on the terminal.)
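Alternatively, moving the command into a small wrapper script keeps the crontab entry free of % characters entirely. A minimal sketch, using the paths and host from the question (the script name is hypothetical, and echo stands in for the real scp as a dry run):

```shell
#!/bin/sh
# fetch_aws_csv.sh -- hypothetical wrapper, so the crontab entry is just:
#   3 * * * * /home/vyv/bin/fetch_aws_csv.sh
today=$(date +%Y-%m-%d)
src="usrname@storage-server:/share/Data/homes/AWS/2301KM-$today.csv"
dest="/home/vyv/data/AWS/"
cmd="/usr/bin/scp $src $dest"
# Dry run: print the command instead of executing it.
# In the real script, replace the echo below with:
#   $cmd >> /home/vyv/data/AWS/scp-log.txt 2>&1
echo "$cmd"
```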
Related
I'm attempting to have my Raspberry Pi use rsync to upload files to an SFTP server periodically throughout the day. To do so, I created a bash script and installed a crontab entry to run it every couple of hours during the day. If I run the bash script directly, it works perfectly, but it never seems to run via crontab.
I did the following:
"sudo nano upload.sh"
Create the following bash script:
#!/bin/bash
sshpass -p "password" rsync -avh -e ssh /local/directory host.com:/remote/directory
"sudo chmod +x upload.sh"
Test running it with "./upload.sh"
Now, I have tried all the following ways to add it to crontab ("sudo crontab -e")
30 8,10,12,14,16 * * * ./upload.sh
30 8,10,12,14,16 * * * /home/picam/upload.sh
30 8,10,12,14,16 * * * bash /home/picam/upload.sh
None of these work: new files are not uploaded. I have another bash script running via method 2 above without issue. I would appreciate any insight into what might be going wrong. I have done this on eight separate Raspberry Pi 3Bs that are all taking photos throughout the day; the crontab upload works on none of them.
UPDATE:
Upon logging the crontab job, I found the following error:
Host key verification failed.
rsync error: unexplained error (code 255) at rsync.c(703) [sender=3.2.3]
This error also occurred if I ran the bash script without first connecting to the server via scp and accepting the host key. How do I get around this when calling rsync from crontab?
Check whether the script works at all (paste it into a shell).
Check whether your cron daemon is running properly: systemctl status crond.service.
The output should be "active (running)".
Then you can try adding a simple test job to cron: * * * * * echo "test" >> /path/whichyou/want/file.txt
and check whether that job runs properly.
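When a job misbehaves, it also helps to capture the environment cron actually runs with (the output path here is just an example):

```
* * * * * env > /tmp/cron-env.txt 2>&1
```

Comparing /tmp/cron-env.txt with the output of env in an interactive shell usually reveals missing PATH or HOME entries.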
Thanks to logging recommendation from Gordon Davisson in comments, I was able to identify the problem.
Logging revealed the error mentioned in the question update above: rsync was choking on host key verification.
My solution: tell rsync not to check host keys. I changed the upload.sh bash file as follows:
#!/bin/bash
sshpass -p "password" rsync -avh -e "ssh -o StrictHostKeyChecking=no" /local/directory host.com:/remote/directory
Working perfectly now -- hope this helps someone. (Note that StrictHostKeyChecking=no disables host key verification entirely; with OpenSSH 7.6 or later, StrictHostKeyChecking=accept-new accepts unknown hosts but still rejects changed keys.)
Following the Integrating Amazon SES with Sendmail guide, I configured SES to allow it to send emails from a verified email address. I was able to successfully send email from the command line using the verified email address:
sudo /usr/sbin/sendmail -f from@example.com to@example.com < file_to_send.txt
Next I setup a bash script to gather some daily report information.
#!/bin/bash
# copy the cw file
cp /var/log/cwr.log /cwr_analysis/cwr.log
# append the cw info to the subject file
cat /cwr_analysis/subject.txt /cwr_analysis/cwr.log > /cwr_analysis/daily.txt
# send the mail
/usr/sbin/sendmail -f from@example.com to@example.com < /cwr_analysis/daily.txt
If I run the bash script manually from the command line the report is gathered and emailed as it should be. I changed the permissions on the file to allow it to be executed by root (similar to other CRON jobs on the AWS instance):
-rwxr-xr-x 1 root root 375 Jan 6 17:37 cwr_email.sh
PROBLEM
I setup a CRON job and set it to run every 5 minutes for testing (the script is designed to be run once per day once production starts):
*/5 * * * * /home/ec2-user/cwr_email.sh
The bash script copies and then appends the daily.txt file properly but does not send the email. There is no bounce in the email spool or any other errors.
I have spent the better part of today searching for an answer, and many of the searches end at dead ends with little to no information about using cron to send email via AWS SES.
How can I fix this issue?
One "problem" with cron is the lack of environment variables (for obvious security reasons). You are probably missing PATH and HOME. You can define these in the script directly or in the crontab file.
Add PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin to the script before sendmail is called and it should work:
#!/bin/bash
#Adding the path
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# copy the cw file
cp /var/log/cwr.log /cwr_analysis/cwr.log
# append the cw info to the subject file
cat /cwr_analysis/subject.txt /cwr_analysis/cwr.log > /cwr_analysis/daily.txt
# send the mail
/usr/sbin/sendmail -f from@example.com to@example.com < /cwr_analysis/daily.txt
You'll have to test until all the necessary variables are defined as required by the script.
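As an alternative to editing the script, most crons (including Vixie cron on Amazon Linux) accept variable assignments at the top of the crontab file itself, so the same PATH can be set once for every job:

```
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
*/5 * * * * /home/ec2-user/cwr_email.sh
```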
I'm currently using Sikuli to upload a PDF file to a website server. This seems inefficient. Ideally I would like to run a shell script and get it to upload this file on a certain day/time (i.e Sunday at 5AM) without the use of Sikuli.
I'm currently running Mac OS Yosemite 10.10.1 and the FileZilla FTP Client.
Any help is greatly appreciated, thank you!
Create a shell script like this (replace all [variables] with actual values):
#!/bin/sh
cd [source directory]
ftp -n [destination host]<<END
user [user] [password]
put [source file]
quit
END
Name it something like upload_pdf_to_server.sh
Make sure it has the right permissions to be executed:
chmod +x upload_pdf_to_server.sh
Set a cron job based on your need to execute the file periodically using command crontab -e
0 5 * * * /path/to/script/upload_pdf_to_server.sh >/dev/null 2>&1
(This one will execute the bash file every day at 5AM)
I need to upload a file (a bash script) to a remote server using the scp command. After the file has been copied to the remote server, I want to create a cron entry in the crontab file on the remote server.
However, the file upload and the writing of the cron entry need to happen within a single bash script, so that I only have to execute the script on my local machine: the script is copied to the remote host and the cron entry is written to its crontab.
Is there a way to use an ssh command within the script that logs me into the remote server, opens the crontab file, and writes the cron entry?
Any help is very welcome
I would:
extract the user's crontab with crontab -l > somefile
modify that file with the desired job
import the new crontab with crontab somefile
I just did something like this where I needed to create a multiline line crontab on a remote machine. By far the simplest solution was to pipe the content to the remote crontab command through ssh like this:
echo "$CRON_CONTENTS" | ssh username@server crontab
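For instance (hypothetical job, paths, and host), the whole table can be assembled locally and pushed in one shot; crontab with no arguments reads the new table from standard input:

```shell
#!/bin/sh
# Build a multi-line crontab in a variable (contents are hypothetical).
CRON_CONTENTS='PATH=/usr/bin:/bin
14 * * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1'
# Shown locally here; in practice, pipe it to the remote crontab command:
#   echo "$CRON_CONTENTS" | ssh username@server crontab
echo "$CRON_CONTENTS"
```

Note that this replaces the remote user's entire crontab, so fetch the existing one first (ssh username@server crontab -l) if you need to preserve other jobs.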
mailo's answer seemed almost right, but the remote command needs to be passed as the second argument to the ssh command, like this:
ssh username@server 'echo "* * * * * /path/to/script/" >> /etc/crontab'
Or if your system doesn't automatically load /etc/crontab you should be able to pipe to the crontab command like this:
ssh username@server 'echo "* * * * * myscript" | /usr/bin/crontab'
Say you want to copy $local to $remote on $host and add an hourly job there to run at 14 past every hour, using a single SSH session:
ssh "$host" "cat >'$remote' &&
chmod +x '$remote' &&
( crontab -l;
echo '14 * * * * $remote' ) | crontab" <"$local"
This could obviously be much more robust with proper error checking etc, but hopefully it should at least get you started.
The two keys here are that the ssh command accepts an arbitrarily complex shell script as the remote command, and gets its standard input from the local host.
(With double quotes around the script, all variables will be interpolated on the local host; so the command executed on the remote host will be something like cat >'/path/to/remote' && chmod +x '/path/to/remote' && ... With the single quotes, you could have whitespace in the file name, but I didn't put them in the crontab entry because it's so weird. If you need single quotes there as well, I believe it should work.)
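The double-quote interpolation described above is easy to verify locally (the path is a placeholder):

```shell
#!/bin/sh
# With double quotes, $remote expands on the local host before ssh runs,
# so the remote shell would receive the literal, single-quoted path.
remote='/path/to/remote file'
script="cat >'$remote' && chmod +x '$remote'"
echo "$script"
```

This prints the exact command string the remote shell would execute, whitespace safely preserved inside the single quotes.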
You meant something like
ssh username@username.server.org && echo "* * * * * /path/to/script/" >> /etc/crontab
?
I have a bash script that does ssh to a remote machine and executes a command there, like:
ssh -nxv user@remotehost echo "hello world"
When I execute the command from the command line it works fine, but it fails when it is executed as part of crontab (error code 255: cannot establish SSH connection). Details:
...
Waiting for server public key.
Received server public key and host key.
Host 'remotehost' is known and matches the XXX host key.
...
Remote: Your host key cannot be verified: unknown or invalid host key.
Server refused our host key.
Trying XXX authentication with key '...'
Server refused our key.
...
When executing locally I'm acting as root, and crontab runs as root as well.
Executing 'id' from crontab and command line gives exactly the same result:
$ id
> uid=0(root) gid=0(root) groups=0(root),...
I ssh from a local machine to the machine running crond. I have an SSH key and credentials to ssh to the crond machine and to any other machine that the script connects to.
PS. Please do not ask/complain/comment that executing anything as root is bad/wrong/etc - it is not the purpose of this question.
keychain
solves this in a painless way. It's in the repos for Debian/Ubuntu:
sudo apt-get install keychain
and perhaps for many other distros (it looks like it originated from Gentoo).
This program will start an ssh-agent if none is running, and provide shell scripts that can be sourced and connect the current shell to this particular ssh-agent.
For bash, with a private key named id_rsa, add the following to your .profile:
keychain --nogui id_rsa
This will start an ssh-agent and add the id_rsa key on the first login after reboot. If the key is passphrase-protected, it will also ask for the passphrase. No need to use unprotected keys anymore! For subsequent logins, it will recognize the agent and not ask for a passphrase again.
Also, add the following as a last line of your .bashrc:
. ~/.keychain/$HOSTNAME-sh
This will let the shell know where to reach the SSH agent managed by keychain. Make sure that .bashrc is sourced from .profile.
However, it seems that cron jobs still don't see this. As a remedy, include the line above in the crontab, just before your actual command:
* * * * * . ~/.keychain/$HOSTNAME-sh; your-actual-command
I am guessing that normally when you ssh from your local machine to the machine running crond, your private key is loaded in ssh-agent and forwarded over the connection. So when you execute the command from the command line, it finds your private key in ssh-agent and uses it to log in to the remote machine.
When crond executes the command, it does not have access to ssh-agent, so cannot use your private key.
You will have to create a new private key for root on the machine running crond, and copy the public part of it to the appropriate authorized_keys file on the remote machine that you want crond to log in to.
Don't expose your SSH keys by leaving them without a passphrase. Use ssh-cron instead, which allows you to schedule tasks using SSH agents.
So I had a similar problem. I came here and saw various answers, but with some experimentation, here is how I got it to work with passphrase-protected SSH keys, ssh-agent, and cron.
First off, my SSH setup uses the following script in my bash init script.
# JFD Added this for ssh
SSH_ENV=$HOME/.ssh/environment
# start the ssh-agent
function start_agent {
    echo "Initializing new SSH agent..."
    # spawn ssh-agent
    /usr/bin/ssh-agent | sed 's/^echo/#echo/' > "${SSH_ENV}"
    echo succeeded
    chmod 600 "${SSH_ENV}"
    . "${SSH_ENV}" > /dev/null
    /usr/bin/ssh-add
}

if [ -f "${SSH_ENV}" ]; then
    . "${SSH_ENV}" > /dev/null
    ps -ef | grep ${SSH_AGENT_PID} | grep ssh-agent$ > /dev/null || {
        start_agent;
    }
else
    start_agent;
fi
When I login, I enter my passphrase once and then from then on it will use ssh-agent to authenticate me automatically.
The ssh-agent details are kept in .ssh/environment. Here is what that script will look like:
SSH_AUTH_SOCK=/tmp/ssh-v3Tbd2Hjw3n9/agent.2089; export SSH_AUTH_SOCK;
SSH_AGENT_PID=2091; export SSH_AGENT_PID;
#echo Agent pid 2091;
Regarding cron, you can set up a job as a regular user in various ways.
If you run crontab -e as the root user, it will set up a root cron job. If you run crontab -u davis -e, it will add a cron job for user davis. Likewise, if you run crontab -e as user davis, it will create a cron job which runs as user davis. This can be verified with the following entry:
30 * * * * /usr/bin/whoami
This will mail the result of whoami every 30 minutes to user davis. (I did crontab -e as user davis.)
If you try to see what keys are used as user davis, do this:
36 * * * * /usr/bin/ssh-add -l
It will fail; the log sent by mail will say:
To: davis@xxxx.net
Subject: Cron <davis@hostyyy> /usr/bin/ssh-add -l
Could not open a connection to your authentication agent.
The solution is to source the env script for ssh-agent above. Here is the resulting cron entry:
55 10 * * * . /home/davis/.ssh/environment; /home/davis/bin/domythingwhichusesgit.sh
This will run the script at 10:55. Notice the leading . in the entry: it sources the ssh-agent environment file into the job's shell, just as the bash init script above does.
Yesterday I had a similar problem...
I have a cron job on one server which starts some action on another server using ssh... The problem was user permissions and keys...
in crontab I had
* * * * * php /path/to/script/doSomeJob.php
And it simply didn't work (it didn't have permissions).
I tried running cron as the specific user who connects to the other server
* * * * * user php /path/to/script/doSomeJob.php
But with no effect.
Finally, I changed into the script's directory and then executed the php file, and it worked:
* * * * * cd /path/to/script/; php doSomeJob.php