I'm trying to set up a backup script that uploads a folder recursively to a remote FTP folder (SFTP is not supported, only FTP).
I tested curlftpfs; it mounts, but it just creates empty files.
Is there any custom script you have tested and know to be working?
I have already searched the internet.
In the past I used the good old ncftp (see https://linux.die.net/man/1/ncftp), which comes with ncftpput, and that has a -R (recursive) option.
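A minimal sketch of a recursive upload with ncftpput, assuming a server ftp.example.com, credentials myuser/mypass, a remote target folder /backups and a local folder /home/me/data (all of these names are placeholders):

ncftpput -R -u myuser -p mypass ftp.example.com /backups /home/me/data

The -R flag makes ncftpput descend into subdirectories of the local folder and recreate them on the server.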
I am copying files from Linux to a Windows share through FTP.
Once the copy is done, I move the files (manually) to Isilon storage, which is shared on a network path.
Now I have created a batch file which copies from the FTP download folder to the network path.
So how can I start the batch file after the FTP download finishes? How can I automate this completely?
From Linux:
ftp -n ip
user "user" "pwd"
put app.tar.gz
Once that is done, I want to move it to the network shared path.
Just use the copy command in a batch file, after ftp.exe finishes:
ftp.exe -s:download.txt
copy c:\downloadtarget\myfile.txt \\server\share\target\myfile.txt
You can even run the copy from the FTP script (download.txt), if you need to for some reason. Use the ! (escape to shell) command:
get /remote/path/myfile.txt c:\downloadtarget\myfile.txt
! copy c:\downloadtarget\myfile.txt \\server\share\target\myfile.txt
But why don't you download the file directly to the shared folder?
get /remote/path/myfile.txt \\server\share\target\myfile.txt
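For reference, a complete download.txt for that direct-to-share variant might look like the sketch below; the host name and credentials are placeholders, and when ftp.exe runs with -s: it reads the lines after open as the answers to its username and password prompts:

open ftp.example.com
myuser
mypass
binary
get /remote/path/myfile.txt \\server\share\target\myfile.txt
bye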
I'm a newbie to FTP. I want to create a folder if it doesn't exist already.
I know there is an option "mkdir -p foldername", but doing this in FTP creates a folder named "-p".
I'm trying to transfer files from one remote server to another via FTP and create folders on the receiving server if they are not already present.
One solution would be to always attempt to create the folder - and then ignore any errors. Of course, after creating the folder you need to cd into it - if that gives an error then you've got bigger problems.
The -p option of the shell's mkdir is there to create intermediate directories and to ensure no error is raised if the directory already exists. FTP's mkdir doesn't go through the shell: it takes its argument literally as the directory name, which is why you end up with a folder called "-p".
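A minimal sketch of that "always try mkdir, then cd" approach from a shell script, with a hypothetical host ftp.example.com, credentials myuser/mypass and file report.txt; the client reports an error from mkdir if the folder already exists, but it simply keeps reading the remaining commands:

ftp -in <<EOF
open ftp.example.com
user myuser mypass
mkdir incoming
cd incoming
put report.txt
bye
EOF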
I'm using a (cheap, branded) local media station as an FTP server, and I'm using FileZilla to transfer files to it.
When I try to move or rename a file located on the media station, I get:
Command: RNFR [filename]
Response: 503 Command not understood.
I don't know whether this is because of an old or broken FTP server implementation (the device is more than 5 years old and I don't think any updates are available).
Is there an alternative way to perform FTP rename or move commands?
If you have telnet or SSH access to the machine, you could do the renaming there. If not, you might try the FTP SITE command with "mv from-name to-name", but I doubt that the server will support this if it does not even support the standard FTP way of renaming files.
Apart from that, the only alternative is probably to download the file, remove it on the server, and upload it again under a different name.
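If you want to experiment with the SITE route from a command-line client, the quote command sends a raw command to the server; the file names below are placeholders, and whether the server accepts any SITE-based move at all is entirely up to its implementation:

ftp> quote SITE mv /media/old-name.avi /media/new-name.avi

A 500- or 502-style reply means the server simply does not implement it.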
There are multiple folders with subfolders and image files on the FTP server, and the -R (recursive listing) option is disabled. I need to dump a recursive directory listing, with the path names, into a text file. The logic I have so far is: traverse each folder, check whether an entry name contains a '.' to decide whether it is a file or a folder, and if it is a folder, go in, check for subfolders or files, and list them. Since I cannot use -R, I have to use a function that traverses each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
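A minimal sketch of that approach, assuming a server ftp.example.com, credentials myuser/mypass, and an empty mount point /mnt/ftp (all placeholders):

mkdir -p /mnt/ftp
curlftpfs ftp://myuser:mypass@ftp.example.com /mnt/ftp
find /mnt/ftp > listing.txt    # dump every path, files and folders alike
fusermount -u /mnt/ftp         # unmount when done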
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.
Using FTP commands, I want to upload a large file once and then copy that file to many directories on the remote FTP server. All the copy commands seem to relate to copying from local to remote or the other way around.
Is there an FTP command to copy remote to remote?
Are you trying to move the file? If yes, you can do it with the rename command; as for copying, I guess you still have to do the get/put round trip via the local machine.
The move command should be something like this:
rename /oldpath/file2move.txt /newpath/file2move.txt
Basically, you just rename the path part in front of the name of the file you wish to move.
As far as I know, there is no such command in the FTP protocol. There are extensions to the SFTP protocol that can do this (and, if you have SSH access, you can simply issue cp commands), but SFTP is not FTP.
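If you do have SSH access to the server, the server-side copy reduces to a single remote cp; the host name and paths here are placeholders:

ssh myuser@ftp.example.com 'cp /upload/bigfile.tar.gz /www/releases/'

Staying within FTP itself, the only workaround is to keep a local copy of the file and upload it once per target directory.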