How do I copy files from one server to another? I have 3 servers, and I have to copy files from one server to another by choice. I know scp is the command to use, but I just wanted to know how to write a shell script that lets me copy files from one server to another by choice. Any help is appreciated.
I would write a script that:
Has a static, defined list of three servers.
list_of_things=(one two three)
Takes as input/argument a file/file path on the local machine
file_to_copy=$1   # $1 is the first argument; $0 is the script name itself
Checks that the file exists
if [ -e "$file_to_copy" ]; then
Loops through the list of servers, and uses scp to transfer the file to each server.
for item in "${list_of_things[@]}"; do
    # scp the file to "$item" here
done
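Putting those pieces together, a minimal sketch might look like this (the server names, remote user and destination path are placeholders you would replace):

#!/bin/bash
list_of_things=(one two three)            # static list of three servers
file_to_copy=$1                           # first argument: path to the file on the local machine
if [ ! -e "$file_to_copy" ]; then         # check that the file exists
    echo "File not found: $file_to_copy" >&2
    exit 1
fi
for item in "${list_of_things[@]}"; do    # loop through the servers
    scp "$file_to_copy" "user@$item:/destination/path/"
done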
Read through a bash scripting tutorial for more guidance: https://ryanstutorials.net/bash-scripting-tutorial/
I have 60 EC2 instances which share the same folder structure and are similar to one another but not completely identical. The incorrect file was uploaded to all 60 instances, and I was wondering what would be the best way to replace that file with the correct one? The file has the same name and is placed in the same location on all the instances. I am new to using AWS in general, so any help would be much appreciated.
Assuming you don't want to use something like Ansible, have access to the servers, and want to use just bash, you could do something like:
Put all the IP addresses of your servers into a file, one on each line, like so:
IpAddresses.txt
10.20.15.1
10.20.15.44
10.20.15.65
Then create a script:
myscript.sh
#!/bin/bash
while read -r line; do
    ssh -n -i path_to_key.pem "ec2-user@$line" 'sudo rm -rf /path_to_directory && command2 && command3'
done < IpAddresses.txt
Maybe you could do something like the above to first remove the directories you don't want and then do an scp to copy the correct file in.
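For the copy step, a similar loop with scp might look like this (a sketch only; the key, local file and remote path are placeholders):

while read -r line; do
    scp -i path_to_key.pem /local/path/to/correct_file "ec2-user@$line:/path_to_directory/"
done < IpAddresses.txt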
Depends on the commands you need to correct the problem, but this is an option.
Note, I haven't tested this command exactly - so you may need to correct/test a bit.
Refs:
https://www.shellhacks.com/ssh-execute-remote-command-script-linux/
If your EC2 instances have the correct IAM permissions, you could use the Simple Systems Manager (SSM) console, using the Run Command service. Click 'Run a command', then select AWS-RunShellScript from the list of command documents. In the text box you can specify a shell command to run, and below that you can choose the set of instances you want to run the command on.
This is the recommended way to update and administer large fleets of instances such as you have.
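If you prefer to script it, the same thing can be done with the AWS CLI. This is only a sketch; the instance IDs and the command are placeholders:

aws ssm send-command \
    --document-name "AWS-RunShellScript" \
    --targets "Key=instanceids,Values=i-0123456789abcdef0,i-0fedcba9876543210" \
    --parameters 'commands=["cp /tmp/correct_file /path/to/file"]'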
I'm trying to copy a zip file located on a server using an ssh2 library.
The way I'm about to do it is to use the less command and write its output down on the client side:
less -r -L -f zipfile
But the output file is bigger than the original.
I know this is not good practice, but I have to do it this way.
So how can I handle this to get my zip file onto the client machine?
Is less a mandatory command for doing that?
You can simply use scp to achieve that by providing the user and host, and then the path the file should be copied to on the local host, as in the example below:
scp your_username@remotehost.edu:foobar.txt /some/local/directory
This question already has answers here:
scp or sftp copy multiple files with single command
I am a beginner in bash, so please be patient with me. Thank you.
I am writing a bash script. There are three arguments: server, login id, and password. I want to copy all of the files with a .c extension in the id's home directory on the server, as well as the encryption executable encryptor, to my current directory.
I tried using scp and expect, but I have other code that needs to run under #!/bin/bash, so using expect won't work. Please suggest how I could implement this. Thank you very much!
Difference from scp or sftp copy multiple files with single command:
The id and password are command-line arguments when running the script, so that I do not need to type in the password after the script starts. For example, it works as "./example.sh server id password".
I don't know the structure of the home directory on the remote server. Is there any way to select certain files and copy them to the local machine using commands in one bash script? Will grep work? How do I combine it with scp?
You may need to use public-key authentication with ssh in order to accomplish this without expect. Generate an RSA key pair and place the public key in the authorized_keys file on the remote host.
Here are the steps:
https://sureshvv.wordpress.com/2009/04/07/how-to-setup-ssh-so-that-manual-password-entry-is-not-needed/
Here is the full background:
https://www.digitalocean.com/community/tutorials/how-to-configure-ssh-key-based-authentication-on-a-linux-server
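Once key-based authentication is in place, the copy itself is a couple of scp calls. A minimal sketch, using the script arguments from the question (server and id) and assuming the .c files and encryptor live directly in the remote home directory:

#!/bin/bash
server=$1
id=$2
ssh-copy-id "$id@$server"            # one-time step: installs your public key (asks for the password once)
scp "$id@$server:~/*.c" .            # copy all .c files from the remote home directory
scp "$id@$server:~/encryptor" .      # copy the encryption executable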
There are multiple folders with subfolders and image files on the FTP server. The -R option is disabled. I need to dump the recursive directory listing, with the path names, to a text file. The logic I have so far is: traverse into each folder, check whether the entry name contains a '.' to decide whether it is a file or a folder; if it is a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to write a function to traverse each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
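A hedged sketch of that approach (the host, credentials and mount point are placeholders):

mkdir -p /mnt/ftp
curlftpfs ftp://uName:password@1.1.1.1 /mnt/ftp   # mount the FTP server as a local filesystem
find /mnt/ftp > ftp_listing.txt                   # recursive listing, with full path names
fusermount -u /mnt/ftp                            # unmount when finished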
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.
I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same, except for a slight difference, like:
server a                   server b
miabc/v11_0/a/b/c/*.c      miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h         miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers, except for the 11_0 and 75_0 parts, and the directory structure inside different modules differs.
How can I FTP all the files in all modules to the corresponding module on the second server (b) using any scripting language such as awk, Perl, shell, or ksh, over FTP?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once, I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write this script. I don't know if it is efficient or elegant, but you might find one or another idea in it.
hth / Rene
Note that you need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP.
PHP provides good FTP functions; using PHP you can easily FTP your file. But before that, check your FTP settings on the IIS server or in FileZilla.
I have used the following code for sending files over FTP; this is in PHP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to ".$FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
// ftp_fput(connection, remote file name, open local file handle, transfer mode)
ftp_fput($conn_id, $from, $files, $mode); // this is the function to put files on the FTP server
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
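A rough sketch of the lftp part, run from server a (credentials and paths are assumptions, not tested); lftp's mirror -R uploads a local tree into a differently named remote tree, so the v11_0/v75_0 rename is handled by the target argument:

lftp -u user,password serverb <<'EOF'
mirror -R miabc/v11_0 miabc/v75_0    # reverse mirror: push the local v11_0 tree into the remote v75_0 tree
bye
EOF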
If the file system supports symlinking or hardlinking, I would use a simple wget to mirror the FTP server. On one of them, when you're wgetting, just hack the directory v11_0 to point to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing. (this should place them all in /project/servera/miabc/v11_0)
server b:
go to /project/serverb
create /project/serverb/miabc/75_0 as a symlink to /project/servera/v11_0:
ln -s /project/servera/v11_0 /project/serverb/miabc/75_0
wget server b; the link will be followed, so when wget tries to cwd into 75_0 it will find itself in /project/servera/v11_0.
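The mirroring step itself might look like this (credentials and paths are placeholders):

cd /project/servera
wget -m ftp://user:password@servera/miabc/    # -m (mirror) fetches the tree recursively, preserving the layout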
Don't make the project harder than it needs to be: read the docs on wget, and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
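For example, a single hedged rsync call over SSH, run from server a, could push each module's tree into the renamed directory on server b (the user and paths are assumptions):

rsync -avz miabc/v11_0/ user@serverb:miabc/v75_0/   # -a preserves the tree, -z compresses in transit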
I suggest you log in to either of the servers first and go to the appropriate path, miabc/v75_0/a/b/c/. From there you need to sftp to the other server:
sftp user@servername
Go to the appropriate (remote) path from which the files need to be transferred.
Then run the command mget *.
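Put together, the session might look like this (a sketch; it assumes you start on server b in the destination directory, and note that OpenSSH's sftp uses get with wildcards where mget is not available):

cd miabc/v75_0/a/b/c          # destination directory on server b
sftp user@servera
sftp> cd miabc/v11_0/a/b/c    # source directory on server a
sftp> mget *                  # or: get *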