New to MikroTik scripting, and I'm missing something really obvious. When I create a new script with
/system script add name=mail
/system script edit mail source
save the script, and run it, everything is just fine.
Now, if I want to push scripts via scp, I hit a roadblock. I can upload the .rsc files, but I don't know how to make the uploaded script.rsc be used as the source for a new script, and my google-fu fails me. Any help appreciated!
To push a file and execute commands on RouterOS/MikroTik:
Use a Linux server:
Prepare variables:
ROUTEROS_USER=$1
ROUTEROS_HOST=$2
ROUTEROS_SSH_PORT=$3
FILE=somescript.rsc
Push a file using:
scp -P $ROUTEROS_SSH_PORT "$FILE" "$ROUTEROS_USER"@"$ROUTEROS_HOST":"$FILE"
Execute the command that will run the import on RouterOS:
ssh $ROUTEROS_USER@$ROUTEROS_HOST -p $ROUTEROS_SSH_PORT "/import file-name=$FILE"
The /import file-name=$FILE command may differ depending on your RouterOS version.
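Putting the pieces together, here is a minimal sketch of a push-and-import wrapper; somescript.rsc is a placeholder name, and the /import syntax may need adjusting per the note above:
#!/bin/bash
# usage: ./push-script.sh <user> <host> <ssh-port>
ROUTEROS_USER=$1
ROUTEROS_HOST=$2
ROUTEROS_SSH_PORT=$3
FILE=somescript.rsc   # placeholder script name

# upload the .rsc file, then import it on the router only if the copy succeeded
scp -P "$ROUTEROS_SSH_PORT" "$FILE" "$ROUTEROS_USER@$ROUTEROS_HOST:$FILE" &&
ssh "$ROUTEROS_USER@$ROUTEROS_HOST" -p "$ROUTEROS_SSH_PORT" "/import file-name=$FILE"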
Related
I am a web dev trying to do a little bit of Linux admin and could use help. My server needs to retrieve a file daily from a remote location over sftp, name and date/time stamp it, and push it to a directory for archive.
I have adapted a shell script that I had working when doing this over ftp, but sftp is causing me some issues.
I can successfully connect to the server in FileZilla when I have it set to the SFTP protocol and choose the "Logon Type" as "Interactive", where it prompts for a password.
When I use the command line to call my script, it seems to resolve but hangs on the logging in step and provides the following error before retrying: "Error in server response, closing control connection. Retrying."
Here is the output:
https://i.imgur.com/dEXYRHk.png
This is the contents of my script where I've replaced any sensitive information with a placeholder in ALL CAPS.
#!/bin/bash
# Script Function:
# This bash script backs up the .csv every day (dependent on the cron job run) with a file name time stamp.
#[Changes Directory]
cd /THEDIRECTORY
wget --no-passive-ftp --output-document=completed`date +%Y-%m-%d`.csv --user=THEUSER --password='THEPASSWORD' ftp://sftp.THEDOMAIN.com:22 completed.csv
Anyone wanna help a newb to get some of them internet points?! :-)
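For what it's worth, wget speaks HTTP and FTP but not SFTP, which would explain the hang; curl can fetch over SFTP when built with libssh2. A minimal sketch under that assumption (check curl -V for sftp support; same placeholders as the script above):
#!/bin/bash
# fetch completed.csv over sftp and date-stamp the local copy
# (assumes a curl build with sftp support; THEUSER/THEPASSWORD/THEDOMAIN
# are the same placeholders used in the question)
cd /THEDIRECTORY || exit 1
curl --user 'THEUSER:THEPASSWORD' \
    "sftp://sftp.THEDOMAIN.com/completed.csv" \
    --output "completed$(date +%Y-%m-%d).csv"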
I have a requirement at hand in Unix where I need to build a shell script.
The requirement is below:
I need to SFTP a file (let's say CSV file) from my dev server to uat server.
After the SFTP to that server is done, as soon as the file arrives there and the exit code of the SFTP is 0, I need to trigger a task (this task I can take care of).
I have a basic idea of SFTP but I am not aware of how to trigger the next task as soon as the file arrives on the uat server.
I just need some pseudo code to start my exploration.
If you want to copy from somewhere to your local machine and run a command locally
If you have ssh access, it can be done easily; this is what I usually do.
For example, I have a backup file on one of my servers. We can get a copy this way using scp:
scp root@server:/home/weekly.sql.zip .
The trailing . means: put the file, with its original name, into the directory I am in now.
The problem with this command is that it prompts interactively for a password, so to eliminate that we can install sshpass and use it this way:
sshpass -p'your-password' scp root@server:/home/weekly.sql.zip .
Since we are using bash, which takes care of exit codes, adding the && operator lets you chain a second command that is triggered only after the first one finishes successfully:
sshpass -p'your-password' scp root@server:/home/weekly.sql.zip . && unzip weekly.sql.zip
The first task copies the file; the second unzips it.
Installing sshpass:
sudo apt install -y sshpass
If you want to copy from your local machine to somewhere and run a command remotely
sshpass -p'your-password' scp test.txt root@address:/home/ && sshpass -p'your-password' ssh root@address cat /home/test.txt
Which does this:
copies the file test.txt to the server
then reads it with the cat command
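The same && pattern also covers the sftp-plus-trigger requirement from the question above. A minimal sketch, assuming non-interactive (e.g. key-based) authentication; user@uat-server, the paths, and trigger_task.sh are all placeholders:
#!/bin/bash
# push the CSV in sftp batch mode; with -b, sftp aborts and exits
# non-zero if any command fails (-b - reads the commands from stdin)
sftp -b - user@uat-server <<'EOF'
put /path/to/file.csv /target/directory/
EOF

# trigger the follow-up task only if sftp exited with status 0
if [ $? -eq 0 ]; then
    ./trigger_task.sh   # hypothetical follow-up task
fi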
I'm currently using Sikuli to upload a PDF file to a website server. This seems inefficient. Ideally I would like to run a shell script and get it to upload this file on a certain day/time (i.e Sunday at 5AM) without the use of Sikuli.
I'm currently running Mac OS Yosemite 10.10.1 and the FileZilla FTP Client.
Any help is greatly appreciated, thank you!
Create a bash file like this (replace all [variables] with actual values):
#!/bin/sh
cd [source directory]
ftp -n [destination host] <<END
user [user] [password]
put [source file]
quit
END
Name it something like upload_pdf_to_server.sh
Make sure it has the right permissions to be executed:
chmod +x upload_pdf_to_server.sh
Set a cron job to execute the file periodically, based on your needs, using the command crontab -e:
0 5 * * * /path/to/script/upload_pdf_to_server.sh >/dev/null 2>&1
(This one will execute the bash file every day at 5AM)
Further reading: how to set a cronjob, and a cronjob generator.
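Alternatively, if curl is available (it ships with OS X), the same upload fits in one line that can go straight into the crontab entry; the bracketed values are placeholders as above:
# upload the PDF over plain FTP; -T selects the local file to send
curl -T "[source directory]/[source file]" "ftp://[destination host]/" --user '[user]:[password]'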
I am trying to write a script to automatically upload files to an sftp server. My problem is authentication.
I know it is not possible to store a password in a bash script for sftp.
I can't use keys because the admin of the server won't allow me.
I don't want to use any extras (sshpass/expect) because I can't
guarantee they will be on the machine I'm using (the scripts are wanted so that the processes are not tied to a particular machine).
Manual entry of the password is not a problem; I just need the script to wait for the user to put the password in. At the minute, when I run the script it opens a terminal and prompts for the password, but when this is entered nothing else happens. If I enter the lines of code manually afterwards, it uploads everything correctly.
#!bin/bash/
cd /remote_directory
lcd /local_directory
put some_file.txt
After months of looking for an answer I have finally found the solution. It was in a comment on an answer in some other thread I can't even remember. Hope this can help others out there.
Your bash script should look like this and will connect to the sftp server, prompt the user for the password, and then execute the remaining commands.
#!/bin/bash
sftp user@server <<!
cd /the/remote/directory
lcd /your/local/directory
put/get some.file
!
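If you want to reuse the script for different files, the heredoc can be parameterized; a minimal sketch, where user@server and the directories are placeholders:
#!/bin/bash
# usage: ./upload.sh some.file
# the heredoc delimiter is unquoted, so $FILE expands before sftp reads it;
# sftp still prompts the user for the password interactively
FILE="$1"
sftp user@server <<!
cd /the/remote/directory
lcd /your/local/directory
put $FILE
!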
This is Srikanth from Hyderabad.
I am the Linux administrator at a corporate company. We have a Squid server, so I prepared a backup Squid server, so that when the LIVE Squid server goes down I can put the backup server into LIVE.
My Squid servers are configured with CentOS 5.5. I have prepared a script to take a backup of all configuration files in /etc/squid/ of the LIVE server to the backup server, i.e. it will copy all files from the LIVE server's /etc/squid/ to the backup server's /etc/squid/.
Here's the script saved as squidbackup.sh in the directory /opt/ with permission 755(rwxr-xr-x)
#! /bin/sh
# placeholders: fill in the backup user's credentials and the LIVE server's IP
username="<username>"
password="<password>"
host="<server-ip>"
expect -c "
spawn /usr/bin/scp -r $username@$host:/etc/squid /etc/
expect {
    \"*password:*\" {
        send \"$password\r\"
        interact
    }
    eof {
        exit
    }
}
"
Kindly note that this will be executed on the backup server, authenticating as the user mentioned in the script. I have created that user on the LIVE server and used the same in the script too.
When I execute this script manually using the below command
[root@localhost ~]# sh /opt/squidbackup.sh
everything works fine: the script downloads all the files from /etc/squid/ of the LIVE server to /etc/squid/ of the backup server.
Now the problem arises. If I set this in crontab like below, or with other timings:
50 23 * * * sh /opt/squidbackup.sh
I don't know what's wrong; it is not downloading all the files, i.e. the cron job downloads only a few files from /etc/squid/ of the LIVE server to /etc/squid/ of the backup server.
Only a few files are downloaded when cron executes the script. If I run the script manually, it downloads all files perfectly, without any errors or warnings.
If you have any more questions, please go ahead and post them.
I kindly request any available solutions. Thank you in advance.
Thanks for your interest. I have tried what you said; it shows the output below, but previously I used to get the same output mailed to the user on the Squid backup server.
Even the cron logs show the same, but I was not able to work out the exact error from the lines below.
Please note that only a few files get downloaded with cron.
spawn /usr/bin/scp -r <username>@ServerIP:/etc/squid /etc/
<username>@ServerIP's password:
Kindly check if you can suggest anything else.
Try the simple options first. Capture stdout and stderr as shown below; the log file should point to the problem.
Looking at the script, you need to specify the location of expect. That could be an issue.
50 23 * * * sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1
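For the expect location, one minimal sketch is to give cron an explicit PATH at the top of the crontab, since cron's default PATH is usually just /usr/bin:/bin (verify the directories with which expect on your system):
# in crontab -e: set PATH so expect and scp resolve under cron
PATH=/usr/local/bin:/usr/bin:/bin
50 23 * * * sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1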