Send files to other directories using ftp

I am new to FTP configuration. What I am trying to do is as follows:
I am running a shell script on my localhost that downloads some files to my machine. I want the downloaded files to be stored in a temporary directory first, and then transferred to a location (another directory) that I specify. I feel this mechanism is achievable with FTP and will be helpful when I host this on a domain, but I have not found resources from which I can teach myself how to set it up.

OK, having visited many sites, here are some resources you might find handy:
For configuring vsftpd, here's a manual on how to install, configure and use it.
To receive many files recursively via FTP, you can use wget (adapted from this site):
cd /tmp/ftptransfer
wget --mirror --ftp-user=foo --ftp-password=bar ftp://ftp.originsite.com/path/to/folder
To send many files recursively, many people find the only way of doing so is tar-and-send; the only problem is that the files will stay tarred until you go to the other machine (remotely, or via ssh) and extract them manually. There is an alternative, not using FTP but ssh and pipes, which gets the files extracted on the target machine:
tar -cf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir; tar -xf -"
Explained:
tar: the application that creates tar archives
-c: create an archive
-f -: the file name is "stdout"
/tmp/ftptransfer: include this folder and all subdirectories in the tar
|: pipe to the next program (connect stdout to stdin)
ssh: the Secure Shell program
geek@targetsite: username @ machine name you want to connect to
"...": command to send to the remote host
cd target_dir: change to the output directory
tar -xf -: extract the archive received on "stdin"
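If bandwidth matters, the same pipe can compress in transit; this is just a minor variation on the command above (not in the original answer), adding gzip via tar's -z flag:
tar -czf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir; tar -xzf -"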
For configuring SSH on Ubuntu, have a look here.
If you need more help, don't be afraid to ask! :)

Related

Converting FTP "ls -ltr" to SFTP

I was migrating FTP functionality to SFTP. In a shell script that does FTP, I saw some lines that I cannot convert to SFTP. I searched but couldn't find an answer.
With ftp, I can get remote file names to local log file like this:
ftp>
cd remote_folder
ls -ltr local_file.tmp
With sftp (the OpenSSH client) this doesn't work. It says "file could not be found" for the ls command, and it also says the -t and -r options are invalid for the ls command. How can I do the same thing with sftp?
Thank you.
We do not know what "ftp" client you were using, so it's hard to tell what the ls -ltr was doing. My guess is that your "ftp" client was transparently passing the switches to the FTP server, not processing them locally. Again, we do not know what FTP server you were using, but I know of only one FTP server that supports -ltr, and that is ProFTPD. In ProFTPD, -ltr means a long directory listing sorted in reverse order by timestamp. The OpenSSH sftp client supports the same switches for the ls command with the very same meaning. If it is not working for you, you are probably using a very old version of OpenSSH; the -tr switches seem to be supported since OpenSSH 3.6 (2003), and -l even longer.
OpenSSH sftp does not support writing the listing to a file, but you can redirect the whole sftp output to a file.
Something like this:
sftp -b script.txt user@example.com > local_file.tmp
With script.txt containing:
cd remote_folder
ls -ltr
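If you'd rather not keep a separate batch file, the OpenSSH sftp client can also read the batch from standard input with -b - (a small variation; the host name is the same placeholder):
sftp -b - user@example.com > local_file.tmp <<EOF
cd remote_folder
ls -ltr
EOF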

How to automatically download all newer files from a remote FTP folder in a shell script?

For example, I have two servers: 1. Server A and 2. Server B.
Server A has a directory called /testdir with some files. I need a shell script that will run on Server B to download (via FTP) the files from Server A's /testdir. This download should happen automatically whenever a new file is added to Server A's /testdir, and old files should be ignored.
Consider using lftp's incremental transfer (mirror). As an alternative, wget has similar mirroring functionality:
With wget:
wget --mirror -nH ftp://serverA/testdir
With lftp:
lftp
open ftp://serverA/
mirror /testdir .
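For the "automatically whenever a new file is added" part, here is a minimal sketch that could be run from cron; the credentials and paths are placeholders, and --only-newer makes lftp skip files that have not changed:
#!/bin/sh
# Mirror only new or newer files from Server A into the local directory
lftp -u user,password -e 'mirror --only-newer /testdir /local/testdir; bye' ftp://serverA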

Copy folder from an http source to a local directory

I am trying to copy a folder from an http source using the following statement:
FileUtils.cp_r 'http://else.repository.labs/static/lit/MDMInternalTools/', 'c:\Users\Public\Desktop\'
However I get this error:
EINVAL: Invalid argument - https://else.repository.labs/static/lit/MDMInternalTools/
If that's a folder on a server that you have access to via ssh, then you can use scp to copy individual files, or a folder with all subfolders/files using the -r option. The command will be something like
scp -r username@else.repository.labs:/path/to/rails/public/static/lit/MDMInternalTools c:\Users\Public\Desktop
This is assuming you can use scp. It looks like you're on Windows, so you'll need a decent command-line shell where you can install ssh.
https://en.wikipedia.org/wiki/Secure_copy
You can check whether the server supports WebDAV, or has FTP or SSH access.
Otherwise your only choice may be to use wget to make a local mirror:
wget -mk http://else.repository.labs/static/lit/MDMInternalTools/
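If you want the folder to land directly under the target directory instead of inside a hostname/path hierarchy, wget's -nH, --cut-dirs and -P options can flatten the layout; a sketch, where the cut depth of 2 matches the two leading path components static/lit:
wget -mk -nH --cut-dirs=2 -P "c:\Users\Public\Desktop" http://else.repository.labs/static/lit/MDMInternalTools/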

How do I copy a folder from remote to local using scp?

How do I copy a folder from remote to local host using scp?
I use ssh to log in to my server.
Then, I would like to copy the remote folder foo to local /home/user/Desktop.
How do I achieve this?
scp -r user@your.server.example.com:/path/to/foo /home/user/Desktop/
By not including the trailing '/' at the end of foo, you will copy the directory itself (including contents), rather than only the contents of the directory.
From man scp (See online manual)
-r Recursively copy entire directories
To use the full power of scp, you need to go through the following steps:
Set up public key authentication
Create SSH aliases
Then, for example, if you have this ~/.ssh/config:
Host test
    User testuser
    HostName test-site.example
    Port 22022

Host prod
    User produser
    HostName production-site.example
    Port 22022
you'll save yourself from password entry and simplify scp syntax like this:
scp -r prod:/path/foo /home/user/Desktop # copy to local
scp -r prod:/path/foo test:/tmp # copy from remote prod to remote test
Moreover, you will be able to use remote path completion:
scp test:/var/log/ # press tab twice
Display all 151 possibilities? (y or n)
To enable remote bash completion, you need a bash shell on both <source> and <target> hosts, and a properly working bash-completion package. For more information see these related questions:
How to enable autocompletion for remote paths when using scp?
SCP filename tab completion
To copy everything from a local location to a remote location (upload):
scp -r /path/from/local username@hostname:/path/to/remote
To copy everything from a remote location to a local location (download):
scp -r username@hostname:/path/from/remote /path/to/local
Custom port, where xxxx is the custom port number:
scp -r -P xxxx username@hostname:/path/from/remote /path/to/local
Copy into the current directory from remote to local:
scp -r username@hostname:/path/from/remote .
Help:
-r Recursively copy all directories and files
Always use the full path from /; get the full path with pwd
scp will replace all existing files
hostname can be a hostname or an IP address
if a custom port is needed (other than port 22), use -P PortNumber
. (dot) means the current working directory, so the files are downloaded/copied from the server and pasted right here
Note: Sometimes a custom port will not work because the port is not allowed through the firewall, so make sure the custom port is allowed for incoming and outgoing connections
What I always use is:
scp -r username@IP:/path/to/server/source/folder/ .
. (dot) means the current folder, so it copies from the server and pastes right here.
IP can be an IP address like 125.55.41.31, or a host like ns1.mysite.example.
It may be better to first compress the directory on the remote server:
tar -czPf backup.tar.gz /path/to/catalog
Then download it from the remote server:
scp user@your.server.example.com:/path/to/backup.tar.gz .
Finally, extract the files:
tar -xzvf backup.tar.gz
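The intermediate archive can also be skipped entirely by piping tar over ssh, along the same lines (the paths are the same placeholders):
ssh user@your.server.example.com "tar czf - -C /path/to catalog" | tar xzvf -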
A typical scenario:
scp -r -P port username@ip:/path-to-folder .
explained with a sample:
scp -r -P 27000 abc@10.70.12.12:/tmp/hotel_dump .
where:
port = 27000
username = "abc", the remote server username
path-to-folder = /tmp/hotel_dump
. = the current local directory
And if you have a huge number of files to download from the remote location, and you don't care much about security, you can try changing the scp cipher to something faster like blowfish (old SSH implementations defaulted to slow ciphers such as Triple-DES; note that recent OpenSSH releases have removed blowfish entirely, and their defaults are already fast).
This can reduce file copying time drastically.
scp -c blowfish -r user@your.server.example.com:/path/to/foo /home/user/Desktop/
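On a current OpenSSH build you can list which ciphers are actually available and pick a fast one explicitly; a sketch, assuming one of the AES-GCM ciphers is supported:
ssh -Q cipher   # list the ciphers your OpenSSH build supports
scp -c aes128-gcm@openssh.com -r user@your.server.example.com:/path/to/foo /home/user/Desktop/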
Go to Files in your Unity toolbar.
Press Ctrl + L and type here_goes_your_user_name@192.168.10.123
192.168.10.123 is the host that you want to connect to.
In case you run into "Too many authentication failures", specify the exact SSH key that you have added to your server's SSH daemon:
scp -r -i /path/to/local/key user@remote.tld:/path/to/folder /your/local/target/dir
The question was how to copy a folder from remote to local with the scp command.
$ scp -r userRemote@remoteIp:/path/remoteDir /path/localDir
But here is another way to do it, with sftp. SSH File Transfer Protocol (also Secure File Transfer Protocol, or SFTP) is a network protocol that provides file access, file transfer, and file management over any reliable data stream (Wikipedia).
$ sftp user_remote@remote_ip
sftp> cd /path/to/remoteDir
sftp> get -r remoteDir
Fetching /path/to/remoteDir to localDir 100% 398 0.4KB/s 00:00
For help with sftp commands, just type help or ?.
I don't know why, but I had to put the local folder (.) before the remote server path to make it work:
scp -r . root@888.888.888.888:/usr/share/nginx/www/example.org/
For Windows, we used this command:
pscp -r -P 22 username@IP:/path/to/Downloads ./
The premise of the question is incorrect: the idea is that, once logged in via ssh, you could move files from the logged-in machine back to the client. However, scp is not aware of, nor can it use, the existing ssh connection; it makes its own connections. So the simple solution is to open a new terminal window on the local workstation and run scp there to transfer files from the remote server to the local machine, e.g.
scp -i key user@remote:/remote-dir/remote-file /local-dir/local-file

Ftp command to remove a bunch of files

I can download files using wget "ftp://user:pass@host/prefix*", but I cannot remove the downloaded files from the FTP server. Is there an easy way to do this in a bash script?
As WhoSayln and Skilldrick said, you should use ftp to download files and to remove files from the server (if you have permission to).
But in your question you say "I cannot remove downloaded files from FTP". Do you want to remove the local files from your computer (the ones you just downloaded from the ftp server), or the files on the remote server?
If it's local, then a simple rm -f file will do it :p
But if it's remote, and this is running in a script (a typical batch job), then try something like:
jyzuz@dev:/jean> ftp -n -i remoteserver.com << EOF
> user $username $password
> cd /remote/directory/
> delete filename.txt
> bye
> EOF
More or less? =P
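Since the question is about a whole bunch of files matching prefix*, the classic ftp client's mdelete does the glob expansion for you (the -i flag above already disables per-file prompting); a sketch in script form, with the same placeholder credentials:
ftp -n -i remoteserver.com << EOF
user $username $password
cd /remote/directory/
mdelete prefix*
bye
EOF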
If you need to script operations on an FTP server, I would point you to lftp.
Main website
Tutorial
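For this particular task, a one-liner sketch with lftp (credentials and paths are placeholders; mrm is lftp's glob-expanding rm):
lftp -u user,pass -e 'cd /remote/directory; mrm prefix*; bye' remoteserver.com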
You want to use ftp for that.
wget is not the command you are looking for; you can use the ftp command instead. Here is extensive documentation on it:
http://linux.about.com/od/commands/l/blcmdl1_ftp.htm
