Transfer files between servers without downloading and uploading - macOS

I need to get one (huge) file from one server to another, and I have a slow Internet connection. I tried using Transmit, the FTP program, but I believe it downloads the file and then uploads it to the other server.
So, is there a way to move it directly from one server to the other, either with an FTP client or the Mac terminal, without having to download and upload the file?

If you have shell access to one of the servers, simply log in to that server using telnet or ssh. Start a simple FTP client in the shell and log in to the other server. Use a basic FTP command (put or get) to copy the file. Usually, though, sysadmins are likely to make shell access difficult.
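A minimal sketch of that session (hostnames, credentials and the filename are placeholders, and the target server must accept plain FTP): after ssh-ing into server A, you would drive its command-line ftp client to push the file straight to server B. This script only prints the generated ftp session; the commented line shows how you would actually feed it to ftp.

```shell
#!/bin/sh
# Sketch of the session described above. build_ftp_session prints the
# command sequence for pushing FILE to the other server as USER.
# All values below are placeholders - adjust to your servers.
build_ftp_session() {
    user=$1 pass=$2 file=$3
    printf 'user %s %s\nbinary\nput %s\nbye\n' "$user" "$pass" "$file"
}

# On server A you would run something like:
#   build_ftp_session bob secret huge.zip | ftp -n serverB.example.com
# Here we just print the generated session:
build_ftp_session bob secret huge.zip
```

The -n flag stops ftp from attempting auto-login, so the scripted `user` command supplies the credentials instead.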
If you have no shell access, but you do have a webserver with PHP, then the easiest is to write a simple PHP program to do the job. Upload it and trigger it from a browser. Here's one I wrote:
<?php
// qdftp.php - Quick & Dirty FTP
// Place this script in a web accessible
// folder alongside the file you want to send,
// then invoke it from a browser.
//===============================
$server = "123.123.123.123"; //target server address or domain name
$user = "username"; //username on target server
$pass = "password"; //password on target server
$file = "myfile.zip"; //source file
//================================
$sessid = ftp_connect($server); //connect
$login_ok = $sessid && ftp_login($sessid, $user, $pass); //login
if (!$sessid || !$login_ok) {
    echo "failed to connect: check hostname, username & password";
    exit; //failed? bail!
}
$xfer = ftp_put($sessid, $file, $file, FTP_BINARY); //transfer
echo "file transfer " . ($xfer ? "succeeded" : "failed");
ftp_close($sessid);
?>
Then trigger it from your browser
http://mysourceserver.com/qdftp.php
Last thing: delete qdftp.php when you're done - it's got your username and password!

The FTP protocol only supports third-party transfers like this as an optional extension (FXP), which most servers disable, so an FTP client usually can't do it. You might try:
$ scp host1:file host2:file2
By default, OpenSSH's scp copies directly between the two remote hosts, which requires host1 to be able to authenticate to host2. With the -3 flag it instead routes everything through your local machine, which is what you're trying to avoid; roughly this:
$ (ssh host1 "cat <file") | (ssh host2 "cat >file")
If the direct remote-to-remote copy doesn't work, you can also log into host1 and push from there:
$ ssh host1 "scp file host2:"

Related

Laravel SSH put files to remote server

I am trying to upload a file from my Laravel project to a remote server.
$fileName = base_path().'/Storage/app/public/uploads/'.$file;
SSH::into('production')->put($fileName, './');
The result is a blank screen with no errors or anything and the file is not on the remote server. I know my ssh config is correct (keys and username/host stuff) because this works fine:
SSH::into('production')->run('ls', function($line)
{
echo $line.PHP_EOL;
});
What am I missing? What can I do to see any verbose logging of the SSH call?

change ftp to sftp in a shell script

I have this shell script which transfers CSV files to another server using the FTP service, and I need to change this service to SFTP. Can anyone help me?
ftp -inv >$FTP_LOG_FILE <<EOF
open $FTP_HOST
user $FTP_USERNAME $FTP_PASSWORD
lcd $REPORT_LOCAL_SOURCE
cd $DESTINATION_DIRECTORY
mput *$FILE_TYPE
exit
EOF
Can you use public key authentication? That makes it pretty easy - no password required.
Also, personal preference - ftp gives you no way to reasonably interact with the file transfers and react to misbehavior. Try scp.
Assuming automatic public-key authentication and the same vars you used above -
scp $REPORT_LOCAL_SOURCE/*$FILE_TYPE $FTP_USERNAME@$FTP_HOST:$DESTINATION_DIRECTORY/
or, with shorter names...
if scp ldir/*$ext $me@$host:$dir/
then echo "No errors"
else echo "There were errors"
fi
Generally, try to avoid all-caps variable names; by convention those are reserved for environment variables.
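If you need to keep the sftp protocol itself rather than switching to scp, sftp's batch mode accepts a scripted session much like the ftp heredoc above. A sketch reusing the same variable names (the values are placeholders, and key-based auth is assumed, since batch mode cannot prompt for a password); it prints the generated session rather than running it:

```shell
#!/bin/sh
# Sketch: sftp batch-mode equivalent of the ftp heredoc above.
# Values are placeholders - adjust to your setup.
FTP_USERNAME=user
FTP_HOST=host.example.com
REPORT_LOCAL_SOURCE=/tmp/reports
DESTINATION_DIRECTORY=/incoming
FILE_TYPE=.csv

# Generate the batch commands sftp would execute.
batch_commands() {
    printf 'lcd %s\ncd %s\nput *%s\nbye\n' \
        "$REPORT_LOCAL_SOURCE" "$DESTINATION_DIRECTORY" "$FILE_TYPE"
}

# To actually transfer:
#   batch_commands | sftp -b - "$FTP_USERNAME@$FTP_HOST"
# Here we just show the generated session:
batch_commands
```

With -b - sftp reads the batch commands from stdin and exits non-zero if any of them fail, which gives the error handling plain ftp lacks.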

UNIX - FTP Using Parameterized Values

I have been stuck on this problem for a few days now and I really need help. My goal is to FTP a certain file to a bridge server. But before I can FTP, I need to enter some login credentials first. I want the login part to be automated, which is why I created a separate parameter file. That parameter file has the login details.
So when I run the script, first it will create a txt file. Then the text file will be passed to the bridge server. The script will also pass the login details from the parameter file to access the bridge server, and finally a successful FTP. Is there any way to do this?
FTPFILE="File to be ftped"
Let's say the parameter file has the details in the format
HostName username password.
Read the file contents using a loop, however you like; I am using a while loop here:
while read hostname username password
do
HOST=${hostname}
LOGIN=${username}
PASS=${password} # don't use PWD - the shell already uses that name for the current directory
done < parameterfile
Write the details - hostname, login, password - to the $HOME/.netrc file:
echo "machine ${HOST} login ${LOGIN} password ${PASS}" > $HOME/.netrc
echo "macdef init" >> $HOME/.netrc
echo "put ${FTPFILE} " >> $HOME/.netrc
echo "bye" >> $HOME/.netrc
echo >> $HOME/.netrc
chmod 600 $HOME/.netrc # ftp skips a .netrc containing a password if other users can read it
FTP statement (ftp first looks for a .netrc file in $HOME to initiate the login, and the macdef init macro runs automatically after login. If the file is not found, the username and password will be prompted for):
ftp -i $HOST
This code will do the job:
#!/bin/sh
FTP_USERNAME=username
FTP_PASSWORD=password
FTP_SERVER=server_domain
touch /directory/textfile.txt
run_access_server()
{
lftp <<STOP
#automatically access the server
open -u $FTP_USERNAME,$FTP_PASSWORD $FTP_SERVER
#changing directory on the server
cd /directory/on/server
lcd /from/where/you/fetch/
#upload the file using put
put textfile.txt
bye
STOP
}
run_access_server
Tell me how it works out for you. Regards

Why am I getting connection refused?

I am transferring files over sftp. Neither my bash script nor my Perl script is able to connect.
I am able to connect to the remote server with psftp and copy files, but the idea is to be able to automate.
Following advice on SO posts I have installed ssh and sshd on my local machine. However on the remote server there is no .ssh directory, so I could not copy the id_dsa.pub file there.
My bash script below
#!/bin/bash
USERNAME="rema"
HOSTS="aa.bb.ccc.ddd"
PASSWORD="rema"
SCRIPT="cd data; pwd; cp gateway_data* /home/meteo/AWS_Data/Data"
ssh -l ${USERNAME} ${HOSTNAME} ${PASSWORD} ${SCRIPT}
Correct me if this isn't the right way to copy all the gateway_data files to the local PC, but that is not the current problem.
The script keeps asking for a password, and on the third attempt stops with (publickey,password):
rema@meteo's password:
Permission denied, please try again.
rema@meteo's password:
Permission denied, please try again.
rema@meteo's password:
Permission denied (publickey,password).
My perl script
#!/usr/bin/perl -w
use Net::FTP;
use strict;
use warnings;
use POSIX qw{strftime};
use File::Path;
use File::Copy;
my $debug = 0;
my $User = 'rema';
my $Pass = 'rema';
my $host = 'aa.bb.ccc.ddd';
my $remote_basedir = '/';
my $local_basedir = '../Data';
my $remote_datadir ='data';
my $yr = strftime "%Y", localtime();
my $hr = strftime "%H", localtime();
my $dat = strftime "%Y%m%d", localtime();
my $localdir= "$local_basedir/$dat$hr";
if(! -d $localdir) {
print "mkpath $localdir\n" if $debug;
mkpath ($localdir) or die "mkpath '$localdir' failed: $! ($^E)";
}
# Setup to do the ftp
print "Connecting to $host ...\n";
my $ftp = Net::FTP->new($host) or die "Error connecting to $host: $!";
$ftp->setcwd($remote_datadir) or die "unable to change cwd: " . $ftp->error;
# retrieve data
print "Copying data\n";
$ftp->cwd($remote_datadir);
print "Retrieving files from $remote_datadir to $localdir\n";
my @files = $ftp->ls;
foreach my $file (@files) {
next if -d $file;
next unless $file =~ /^gateway_data/;
print $file;
print "Getting $file\n" if $debug;
$ftp->get($file) or warn "Failed '$file': $! ($^E)";
}
$ftp->close;
print "copying ends\n";
exit 0;
This gives "Connection refused at ./AWS_ftptransf_rema.pl line 44."
A copy of this script is put on the remote machine to send files to the local machine. That script does not give any error. It actually lists the files to be copied but does nothing. Here is a code snippet from this third script, after making connection
opendir(DIR, './');
my @files = readdir(DIR);
foreach my $file (@files) {
next if -d $file;
next unless $file =~ /^gateway_data/;
$ftp->put($file) or warn "Failed '$file': $! ($^E)";
# $ftp->send($file) or warn "Failed '$file': $! ($^E)";
}
Both put and send do the same thing, i.e. nothing.
Help will be appreciated.
Connection refused isn't a Perl error message; it's an OS error message. It means, literally, that there was nothing listening on the port you tried to connect to.
I think the core of your problem might be that you're trying to ssh with one script and FTP with the other. That's just not going to work.
Can I suggest rather than scripting it you might want to take a look at rsync which is a system designed for synchronising directories between two different systems?
Other than that:
Your first problem is that ssh will not accept inline passwords. You NEED to get your ssh public/private key auth sorted to do that.
Connection refused: that's thrown by the underlying kernel. One possible reason is that a firewall is blocking incoming connections; the default input policy may be set to drop. As ssh uses port 22, you need to explicitly open the port using
ufw allow 22/tcp
(ufw is the Uncomplicated Firewall.)
Even if the firewall allows traffic destined for a particular port, you still need a service running on that port.
For instance, say you have opened port 7777 using ufw allow 7777/tcp and then probe it with
echo > /dev/tcp/127.0.0.1/7777
you get "connection refused" from bash. It literally cannot connect to that port on your own loopback interface.
However, echo > /dev/tcp/www.google.com/80 does work, because there is a service running behind port 80.
In the case above, even though you told the firewall to accept incoming connections, there is no service running on port 7777.
So what you could do is run a service such as netcat; in my case I wrote a Python program to accept connections on port 7777. I have also tried it in C; all you need is socket programming. In your case that could be a Perl script.
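The refusal is easy to reproduce locally. A small sketch using bash's /dev/tcp pseudo-device (bash-specific, not plain sh; port 59999 is an arbitrary port assumed to have no listener):

```shell
#!/bin/bash
# Probe a TCP port the same way the answer above does, using bash's
# built-in /dev/tcp pseudo-device. Returns 0 if something accepts
# the connection, non-zero on "connection refused".
port_open() {
    (echo > "/dev/tcp/$1/$2") 2>/dev/null
}

if port_open 127.0.0.1 59999; then
    echo "port 59999: something is listening"
else
    echo "port 59999: connection refused (no listener)"
fi
```

Start any listener on the port (for example with netcat) and the same probe succeeds, which is a quick way to separate firewall problems from missing services.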

Specify password to sftp in a Bash script [duplicate]

This question already has answers here:
How to run the sftp command with a password from Bash script?
(12 answers)
Closed 7 years ago.
I am trying to write a script to back up a file over SFTP. The problem is, it requires a password, and I see no way to manually specify a password to SFTP. I've heard about requiring no password by using public keys, but that requires being able to ssh into the remote server and modify some configuration files, which I cannot do.
Currently my solution is to use cURL, but that is insecure (uses normal FTP). I also looked at the .netrc file, but that seems to be for FTP instead of SFTP. How do I manually specify a password for sftp?
Lftp allows specifying passwords for both ftp and sftp and does not require public keys at all. Your sh sync script may look like this:
#!/bin/sh
# Define folders
THEFOLDER='/mnt/my/folder'
# List files
THEFILES=`ls -p $THEFOLDER | grep -v "/"`
for file in $THEFILES
do
echo "Processing $file"
lftp -u login,password -e "put $THEFOLDER/$file;quit" theftp/sub/folder
done
cURL can support sftp, as documented by the manual:
USING PASSWORDS
FTP
To ftp files using name+passwd, include them in the URL like:
curl ftp://name:passwd@machine.domain:port/full/path/to/file
or specify them with the -u flag like
curl -u name:passwd ftp://machine.domain:port/full/path/to/file
FTPS
It is just like for FTP, but you may also want to specify and use
SSL-specific options for certificates etc.
Note that using FTPS:// as prefix is the "implicit" way as described in the
standards while the recommended "explicit" way is done by using FTP:// and
the --ftp-ssl option.
SFTP / SCP
This is similar to FTP, but you can specify a private key to use instead of
a password. Note that the private key may itself be protected by a password
that is unrelated to the login password of the remote system. If you
provide a private key file you must also provide a public key file.
You might also want to consider using python (the paramiko module), as it can quickly be called from the shell.
Install the Module
pip install paramiko
Example SFTP Upload Script
import paramiko
server = 'sftp.example.com'  # hostname of the remote server (placeholder)
username = 'my_username'
password = 'my_password'
transport = paramiko.Transport((server, 22))
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)
local_filename = '/tmp/filename'
remote_filename = 'MyFiles/temp.txt'
sftp.put(local_filename, remote_filename)
sftp.close()
transport.close()
Bash program to wait for sftp to ask for a password then send it along:
#!/bin/bash
expect -c "
spawn sftp username#your_host
expect \"assword\"
send \"your_password_here\r\"
interact "
Put that in a file called sftp_autologin.sh. The \r sends a carriage return to sftp to execute the command. I don't include the 'p' in password because on some systems it's uppercase, on others lowercase. expect spawns the sftp command, waits for the string 'assword' to appear, then sends the password and ends.
To get this to work:
Install expect, I'm using 5.44.1.15
Make sure you can sftp to your box in interactive mode and supply a password.
Make sure this bash script has executable permissions.
Then run it:
chmod +x sftp_autologin.sh
./sftp_autologin.sh
It should drop you into the sftp commandline without prompting you for a password.
Is it insecure?
It's about the most insecure command you can run. It exposes the password to the command-line history, to anyone else who can read 'ps' output, and basically defeats the entire purpose of passwords altogether.
But hey, what's another log on the fraud fire; it's only about 250 billion dollars in victim losses per year. Let's go for 500 billion.
This automatically runs some commands with the sftp shell and exits automatically when done:
#!/bin/bash
expect -c "
spawn sftp myuser@myserver.com
expect \"assword\"
send \"yourpassword\r\"
expect \"sftp\"
send \"get your_directory/yourfilename.txt\r\"
expect \"sftp\"
send \"exit\r\"
interact "
In order to use public keys you do not need to modify any "configuration files". You merely need to leave a copy of your public key in a place where ssh knows to look (normally ~/.ssh/authorized_keys). You can do this with sftp. If you haven't established any authorized_keys file on the server, you can simply put your id_rsa.pub file in its place.
You can't specify a password to ssh / scp or sftp from the command line. The only way to connect without prompting for a password is to use public key authentication.
You say that you can't ssh to the server to modify configuration files but if you can sftp to the server you can probably upload your public key.
Your public key just has to go under the .ssh directory in your home directory.
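That key upload can itself be done with a scripted sftp session. A sketch (assuming your local public key is at ~/.ssh/id_rsa.pub, the server's sftp session starts in your home directory, and there is no existing authorized_keys file you would overwrite); it prints the batch session rather than running it:

```shell
#!/bin/sh
# Sketch: sftp batch session that installs a local public key on the
# server, enabling passwordless logins afterwards. The leading '-'
# on mkdir tells sftp batch mode to ignore an "already exists" error.
install_key_session() {
    pubkey=${1:-$HOME/.ssh/id_rsa.pub}
    printf -- '-mkdir .ssh\nput %s .ssh/authorized_keys\nchmod 600 .ssh/authorized_keys\nbye\n' \
        "$pubkey"
}

# To run for real:  install_key_session | sftp -b - user@server
# Here we just print the session for an example key path:
install_key_session /tmp/id_rsa.pub
```

After this succeeds once (entering your password interactively), subsequent ssh, scp, and sftp connections can authenticate with the key instead.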
