I'm currently using Sikuli to upload a PDF file to a website server. This seems inefficient. Ideally I would like to run a shell script that uploads this file on a certain day/time (e.g. Sunday at 5 AM) without the use of Sikuli.
I'm currently running OS X Yosemite 10.10.1 and the FileZilla FTP client.
Any help is greatly appreciated, thank you!
Create a bash file like this (replace all [variables] with actual values):
#!/bin/sh
# change to the directory that contains the file to upload
cd [source directory]
# open a non-auto-login FTP session and feed commands to it via a heredoc
ftp -n [destination host] <<END
user [user] [password]
put [source file]
quit
END
Name it something like upload_pdf_to_server.sh
Make sure it has the right permissions to be executed:
chmod +x upload_pdf_to_server.sh
Set up a cron job to execute the script periodically by running crontab -e and adding a line like:
0 5 * * * /path/to/script/upload_pdf_to_server.sh >/dev/null 2>&1
(This one will execute the script every day at 5 AM; for Sundays only, as in your example, use 0 5 * * 0 instead.)
For other schedules, see a cron syntax reference or an online cronjob generator.
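If plain FTP is acceptable, curl can also do the upload in a single line, which is easier to script and to error-check (a sketch; the bracketed values are placeholders as above):
curl -T [source file] --user [user]:[password] ftp://[destination host]/[remote directory]/
curl exits non-zero on failure, so the script can log or retry if the upload does not complete.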
I'm attempting to have my Raspberry Pi use rsync to upload files to an SFTP server sporadically throughout the day. To do so, I created a bash script and installed a crontab entry to run it every couple of hours during the day. If I run the bash script by hand, it works perfectly, but it never seems to run from crontab.
I did the following:
"sudo nano upload.sh"
Create the following bash script:
#!/bin/bash
sshpass -p "password" rsync -avh -e ssh /local/directory host.com:/remote/directory
"sudo chmod +x upload.sh"
Test running it with "./upload.sh"
Now, I have tried all the following ways to add it to crontab ("sudo crontab -e")
30 8,10,12,14,16 * * * ./upload.sh
30 8,10,12,14,16 * * * /home/picam/upload.sh
30 8,10,12,14,16 * * * bash /home/picam/upload.sh
None of these work, judging by the fact that new files are not uploaded. I have another bash script running using method 2 above without issue. I would appreciate any insight into what might be going wrong. I have done this on eight separate Raspberry Pi 3Bs that are all taking photos throughout the day, and the crontab upload works on none of them.
UPDATE:
Upon logging the crontab job, I found the following error:
Host key verification failed.
rsync error: unexplained error (code 255) at rsync.c(703) [sender=3.2.3]
This error also occurred if I ran the bash script without first connecting to the server via scp and accepting the host key. How do I get around this when calling rsync from crontab?
Check that the script works properly at all (paste it into a shell).
Check that your cron service is running properly: systemctl status cron.service (crond.service on some distributions).
The output should say "active (running)".
Then you can try adding a simple test job to cron:
* * * * * echo "test" >> /path/which/you/want/file.txt
and check whether that job runs properly.
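If the test job fires but upload.sh still does nothing, redirect the real job's output to a file so the failure becomes visible (a sketch; the log path is arbitrary):
30 8,10,12,14,16 * * * /home/picam/upload.sh >> /home/picam/upload.log 2>&1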
Thanks to the logging recommendation from Gordon Davisson in the comments, I was able to identify the problem.
The log showed the error mentioned in the question update above: rsync was choking on host key verification.
My solution: tell the ssh that rsync invokes not to check host keys. I changed the upload.sh bash file to the following:
#!/bin/bash
sshpass -p "password" rsync -avh -e "ssh -o StrictHostKeyChecking=no" /local/directory host.com:/remote/directory
Working perfectly now -- hope this helps someone.
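A note of caution: StrictHostKeyChecking=no trusts whatever machine answers at that address, so it is open to man-in-the-middle attacks. An alternative that keeps verification on is to record the server's host key once for the user the cron job runs as (a sketch, assuming the remote is host.com as above):
ssh-keyscan host.com >> ~/.ssh/known_hosts
After that, the original script works from cron without changing any ssh options.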
I'm using Ubuntu and Apache 2. Long story short, I'm password-protecting a website, but the username and password will change every day. So I've added an entire month of passwords to the /etc/apache2/.htpasswd file, and I'm going to run a cron job at midnight every day that runs a bash file to delete the password line for that day. My bash file looks like this:
#!/bin/bash
> cat /etc/apache2/.htpasswd
> sudo sed '1d' /etc/apache2/.htpasswd
When I run the script, it shows this:
./deleteline.sh: line 2: /etc/apache2/.htpasswd: Permission denied
I changed the permissions on the bash script as well, and it works fine with other files.
So I would really appreciate it if you could tell me how to get rid of the permission error.
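In case it helps: the leading > characters look like the culprit. In bash, > cat on line 2 redirects output into a file literally named cat, and the remaining word /etc/apache2/.htpasswd is then executed as a command, which yields exactly this Permission denied message. A minimal sketch of what the script presumably intends, assuming it runs from root's crontab so sudo is unnecessary:
#!/bin/bash
# delete the first (= current day's) line of .htpasswd in place
sed -i '1d' /etc/apache2/.htpasswd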
I am a web dev trying to do a little bit of Linux admin and could use some help. My server needs to retrieve a file daily from a remote location over SFTP, give it a name and date/time stamp, and push it to a directory for archiving.
I have adapted a shell script that I had working when doing this over ftp, but sftp is causing me some issues.
I can successfully connect to the server in Filezilla when I have it set to the sftp protocol and choose the "Longon Type" as "Interactive" where it prompts for a password.
When I use the command line to call my script, it seems to resolve but hangs on the logging in step and provides the following error before retrying: "Error in server response, closing control connection. Retrying."
Here is the output:
https://i.imgur.com/dEXYRHk.png
These are the contents of my script, where I've replaced any sensitive information with placeholders in ALL CAPS.
#!/bin/bash
# Script Function:
# This bash script backups up the .csv everyday (dependent on cron job run) with a file name time stamp.
#[Changes Directory]
cd /THEDIRECTORY
wget --no-passive-ftp --output-document=completed`date +%Y-%m-%d`.csv --user=THEUSER --password='THEPASSWORD' ftp://sftp.THEDOMAIN.com:22 completed.csv
Anyone wanna help a newb to get some of them internet points?! :-)
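For anyone hitting the same wall: wget speaks HTTP and FTP but not SFTP, so pointing it at port 22 (the SSH port) will hang at login exactly like this. A minimal sketch of the same fetch using the OpenSSH sftp client, with sshpass supplying the password non-interactively (the host, user, and file names mirror the placeholders above):
#!/bin/bash
cd /THEDIRECTORY
# fetch completed.csv over SFTP and save it with a date stamp
echo "get completed.csv completed$(date +%Y-%m-%d).csv" | sshpass -p 'THEPASSWORD' sftp -oPort=22 THEUSER@sftp.THEDOMAIN.com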
I'm new to Mikrotik scripting and missing something really obvious. When I create a new script with
/system script add name=mail
/system script edit mail source
save the script and run it, everything is just fine.
Now, if I want to push scripts via scp, I hit a roadblock. I upload the .rsc files but then don't know how to make, for example, the uploaded script.rsc be used as the source for a new script. And my google-fu fails me. Any help appreciated!
To push a file and execute commands on RouterOS/Mikrotik:
Use a Linux server:
Prepare variables:
ROUTEROS_USER=$1
ROUTEROS_HOST=$2
ROUTEROS_SSH_PORT=$3
FILE=somescript.rsc
Push a file using:
scp -P $ROUTEROS_SSH_PORT "$FILE" "$ROUTEROS_USER"@"$ROUTEROS_HOST":"$FILE"
Execute the command that runs the import on RouterOS:
ssh $ROUTEROS_USER@$ROUTEROS_HOST -p $ROUTEROS_SSH_PORT "/import file-name=$FILE"
The exact /import file-name=$FILE syntax may differ depending on your RouterOS version.
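Putting the pieces together, a complete wrapper might look like this (a sketch; the argument order and file name follow the snippets above, and SSH access to the router is assumed):
#!/bin/bash
# usage: ./push_to_routeros.sh <user> <host> <ssh port>
ROUTEROS_USER=$1
ROUTEROS_HOST=$2
ROUTEROS_SSH_PORT=$3
FILE=somescript.rsc
# copy the .rsc file onto the router, then import it
scp -P "$ROUTEROS_SSH_PORT" "$FILE" "$ROUTEROS_USER@$ROUTEROS_HOST:$FILE"
ssh "$ROUTEROS_USER@$ROUTEROS_HOST" -p "$ROUTEROS_SSH_PORT" "/import file-name=$FILE"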
This is Srikanth from Hyderabad.
I am the Linux administrator at a corporate company. We have a squid server, so I prepared a backup squid server, so that when the LIVE squid server goes down I can put the backup server into LIVE.
My squid servers run CentOS 5.5. I have prepared a script to back up all configuration files from /etc/squid/ of the LIVE server to the backup server, i.e. it copies all files from the LIVE server's /etc/squid/ to the backup server's /etc/squid/.
Here's the script, saved as squidbackup.sh in the directory /opt/ with permissions 755 (rwxr-xr-x):
#!/bin/sh
username="<username>"
password="<password>"
host="<Server IP>"

expect -c "
spawn /usr/bin/scp -r $username@$host:/etc/squid /etc/
expect {
    \"*password:*\" {
        send \"$password\r\"
        interact
    }
    eof {
        exit
    }
}
"
Kindly note that this will be executed on the backup server, and it logs in as the user mentioned in the script; I have created that user on the LIVE server and put the same one in the script.
When I execute the script using the command below:
[root@localhost ~]# sh /opt/squidbackup.sh
everything works fine: the script downloads all the files from /etc/squid/ of the LIVE server to /etc/squid/ of the backup server.
The problem arises when I set this in crontab like below (or with other timings):
50 23 * * * sh /opt/squidbackup.sh
I don't know what's wrong, but it does not download everything: the cron job downloads only a few files from /etc/squid/ of the LIVE server to /etc/squid/ of the backup server. If I run the script manually, it downloads all files perfectly, without any errors or warnings.
If you have any more questions, please go ahead and post them. Any solutions would be much appreciated. Thank you in advance.
Thanks for your interest. I tried what you said; it shows the output below, though previously I used to get the same output in the mail of the user on the squid backup server.
Even the cron logs show the same thing, but I was not able to work out the exact error from the lines below.
Please note that only a few files get downloaded with cron.
spawn /usr/bin/scp -r <username>@ServerIP:/etc/squid /etc/
<username>@ServerIP's password:
Kindly check whether you can suggest anything else.
Try the simple options first. Capture stdout and stderr as shown below; those files should point to the problem.
Looking at the script, you may also need to specify the full path to expect. cron runs with a minimal PATH, so that could be the issue.
50 23 * * * sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1
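If the log shows that expect is not found, either call it by absolute path in the script or set PATH at the top of the crontab (a sketch; check where expect lives with which expect):
PATH=/usr/local/bin:/usr/bin:/bin
50 23 * * * sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1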