How to use GREP in FTP client where Destination Machine is Unix based?

I want to find the files containing a specific string among all files, but I cannot initiate a telnet session. I can only initiate an FTP session and need to find files/strings in various folders and subfolders. Kindly help.

grep is not part of the standard FTP command set, so I'm afraid you'll have to make do with ls. As far as I know, it doesn't support searching through subfolders either. See this thread for another client.
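One hedged workaround, if you can install tools on your own machine, is to mirror the remote tree locally and run grep there. A minimal sketch, assuming wget and grep are available locally; the host, credentials and path below are placeholders, not values from the question:

#!/bin/sh
# Placeholders -- replace with your own server, credentials and folder.
HOST='ftp.example.com'
USER='myuser'
PASS='mypassword'
REMOTE_DIR='/some/folder'

# Mirror the remote tree (including subfolders) into ./mirror via FTP.
wget -q -r -nH -P mirror "ftp://$USER:$PASS@$HOST$REMOTE_DIR/"

# Search the downloaded copy; -r recurses, -l prints matching file names.
grep -rl 'specific string' mirror/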

Related

How to use mget to ftp files containing specific string?

I am using mget to retrieve files from a remote server to a local directory on Windows.
lcd C:\E920_1\autopkg\saveE1logafterDir\serverlog
mget /slot/ems2576/appmgr/jdedwards/e920/6210/log/jde_*.log
Now I wish to add an additional step: out of this list, retrieve only the files which contain the word "PACKAGE BUILD" inside them.
How do I accomplish it?
It's not possible. The FTP protocol has no API for finding files by their contents.
See also Search Within Files On Remote FTP Site.
So any implementation you use will have to download all the log files and search their contents locally.
In a batch file, you can use findstr command for that:
Batch file to search a keyword in all files of a directory
You may have a different way of accessing the server's files. For example, if you have (SSH) shell access, you can search the files directly on the server. But that's a completely different topic.
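For completeness, here is a rough sketch of that download-then-search idea in a Unix shell (the same logic translates to mget plus findstr in a Windows batch file). The host and credentials are placeholders; the remote path is the one from the question:

#!/bin/sh
# Placeholders -- replace with your own server and credentials.
HOST='ftp.example.com'
USER='myuser'
PASS='mypassword'

mkdir -p logs

# Download every matching log file non-interactively.
ftp -in <<EOF
open $HOST
user $USER $PASS
lcd logs
mget /slot/ems2576/appmgr/jdedwards/e920/6210/log/jde_*.log
bye
EOF

# List only the files that actually contain the phrase.
grep -l 'PACKAGE BUILD' logs/jde_*.log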

How to list the sub directories in a Windows FTP server?

I'm trying to list all the directories and subdirectories on a Windows server from the Unix ftp command. I tried the dir -r command, but it only displays the directories in the current folder. The dir /s command doesn't display anything. I don't have utilities like winexe either. Any idea would greatly help me. Thanks in advance.
There's no command to list directories recursively in the common *nix ftp command-line client.
Some FTP servers (like ProFTPD) support switches to the LIST command (and similar commands). But that's non-standard behavior that has no backing in the FTP specification/RFC.
You didn't specify which Windows FTP server you are using. Assuming IIS: IIS does not support any switches at all, which is actually the correct behavior; it is the client's task to do the recursion. But again, the common *nix ftp client does not support that.
Similar question:
Get a whole FTP directory listings recursively in one call possible to reduce time
Try dir -R. dir /s is a Windows command, and dir -r reverses the output order.
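If installing a different client is an option, lftp can do the recursion on the client side. A minimal sketch, assuming lftp is available; the host and credentials below are placeholders:

#!/bin/sh
# Assumes the lftp client is installed; host and credentials are placeholders.
HOST='ftp.example.com'
USER='myuser'
PASS='mypassword'

# lftp's "find" command walks the remote tree from the client side,
# so it works even when the server ignores switches to LIST.
lftp -u "$USER,$PASS" "$HOST" -e "find /; quit" > listing.txt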

503 RNFR command not understood

I'm using a (cheap branded) local media station as an FTP server, and I'm using FileZilla to transfer files to it.
When I try to move or rename a file located on the media station, I'm getting
Command: RNFR [filename]
Response: 503 Command not understood.
I don't know whether this is because of an old or corrupted FTP version (it's a device older than 5 years and I think there are no updates available).
Is there an alternative to perform FTP rename or move commands?
If you have telnet or SSH access to the machine, you could do the renaming there. If not, you might try the FTP SITE command with "mv from-name to-name", but I doubt the server will support this if it does not even support the standard FTP way of renaming files.
Apart from that, the only alternative is probably to download the file, remove it on the server, and upload it again under a different name.
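A rough sketch of that last-resort workaround with the plain ftp client; the host, credentials and file names below are placeholders (the SITE variant is included as a comment, but as noted above it may well be rejected):

#!/bin/sh
# Placeholders -- adjust host, credentials and file names to your setup.
HOST='192.168.1.50'
USER='myuser'
PASS='mypassword'

# Emulate a rename: download the file, delete it remotely, re-upload under the new name.
ftp -in <<EOF
open $HOST
user $USER $PASS
binary
get oldname.avi
delete oldname.avi
put oldname.avi newname.avi
bye
EOF

# Alternatively, some servers accept a site-specific move command, e.g.:
#   quote SITE mv oldname.avi newname.avi
# but support for this is far from guaranteed.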

Bash script for recursive directory listing on FTP server without -R

There are multiple folders with subfolders and image files on the FTP server, and -R is disabled. I need to dump a recursive directory listing, with path names, into a text file. The logic I have so far: traverse into each folder, check whether an entry name contains a '.' to decide whether it is a file or a folder; if it is a folder, go in, check for subfolders or files, and list them. Since I cannot use -R, I have to write a function that traverses each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.
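A minimal sketch of the curlftpfs route, assuming curlftpfs and FUSE are installed; the password and mount point are placeholders, while the host and user name are taken from the question:

#!/bin/sh
# Assumes curlftpfs and FUSE are installed; password and mount point are placeholders.
HOST='1.1.1.1'
USER='uName'
PASS='password'
MNT="$HOME/ftpmount"

mkdir -p "$MNT"

# Mount the FTP server as a local filesystem ...
curlftpfs "ftp://$USER:$PASS@$HOST/" "$MNT"

# ... then a plain find produces the recursive listing with full paths.
find "$MNT" > listing.txt

# Unmount when done.
fusermount -u "$MNT"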

FTP backup script using hard links

Usually I use rsync-based backups.
But now I have to write a backup script from a Windows server to Linux.
So there is no rsync, only FTP.
I like the idea of using hard links to save disk space and incremental backups to minimize traffic.
Is there any similar backup script for FTP instead of rsync?
UPDATE:
I need to back up a Windows server through FTP. The backup script runs on the Linux backup server.
SOLUTION:
I found this useful script to back up through FTP with hard links and incremental backups.
Note for Ubuntu users: there is no md5 command in Ubuntu. Use md5sum instead.
# filehash1="$(md5 -q "$curfile"".gz")"
# filehash2="$(md5 -q "$mysqltmpfile")"
filehash1="$(md5sum "$curfile"".gz" | awk '{ print $1 }')"
filehash2="$(md5sum "$mysqltmpfile" | awk '{ print $1 }')"
Edit, since the setup was not clear enough for me from the original question.
Based on the update to the question, the situation is that you need to pull the data onto the backup server from the Windows system via FTP. In this case you could adapt the script you found yourself (see comment) or use a similar idea:
Use cp -lr to clone the previous backup with hard links.
Use lftp's mirror command to overwrite this copy with anything that got updated on the remote system.
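A rough sketch of that pull-style approach, assuming lftp is installed and an earlier backup already exists; the host, credentials and paths are placeholders:

#!/bin/sh
# Placeholders -- adjust server, credentials and paths to your own setup.
HOST='windows-server.example.com'
USER='backupuser'
PASS='password'
PREV='/backups/previous'            # last completed backup (must already exist)
CUR="/backups/$(date +%Y-%m-%d)"    # today's backup

# 1. Clone the previous backup with hard links; this costs almost no disk space.
mkdir -p "$CUR"
cp -lr "$PREV/." "$CUR"

# 2. Let lftp's mirror command overwrite only what changed on the Windows server.
lftp -u "$USER,$PASS" "$HOST" -e "mirror --only-newer / $CUR; quit"

# Rotating the "previous" pointer (e.g. a symlink to the newest directory) is left out here.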
But I initially assumed that you needed to push the data from the Windows system to the backup server, i.e. that the FTP server is on the backup system. That case cannot be handled this way (original answer follows):
Since FTP has no notion of links at all, any transfer will only result in new or overwritten files. The only way would be to use the SITE command to issue site-specific commands and deal with hard links that way. But site-specific commands are usually heavily restricted, so that you can do something like change permissions, but not anything with hard links.
And even if you could handle hard links with SITE, you would have to implement the logic which decides when to use such links. With rsync this logic is built into the rsync server and executed on the server side. With FTP you have to build all the logic on the client side, which means you would have to download a file to compare it with a local file, and then decide whether you need to upload the new file or whether a hard link to an existing file could be used.
