List all directories and files recursively from FTP site - bash

I am looking to spider an FTP directory, with all of its subdirectories and files, and write the information to a file.
I tried using lftp, but I noticed the site does not support lftp> ls -1R > rammb.txt
so now I am trying to figure out the best route. I would like the information sent to the file to include each entry's date.
Previously, what I tried was lftp> find -d 10 > rammb.txt
but it did not provide the dates of the files. Any suggestions?

Use find --ls; it outputs full file information, including dates. The option has been available since lftp version 4.4.12.
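For example, a minimal sketch (the host and path here are placeholders):
# dump a recursive long listing of the remote tree to a local file, then exit
lftp -e 'find --ls > rammb.txt; bye' ftp://ftp.example.com/some/dir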

Related

How to have WGET search for a file within all subdirectories without listing each subdirectory?

There are some files that I want to download using WGET in a bash script. Each image is in one of many available subdirectories. I'm able to use WGET to sift through each subdirectory by listing them out by name. However, there are so many subdirectories that I'd like to find a way to ask WGET to search all subdirectories without listing them out (20150125_028445, 20150125_028450, and 20150126_028454 are examples, but there are more). How do I do this? Here is my current code:
declare -a image=("lor_0299176871_0x636_sci.fit" "lor_0299176907_0x636_sci.fit" "lor_0299176943_0x636_sci.fit")
address='https://pds-smallbodies.astro.umd.edu/holdings/nh-p-lorri-3-pluto-v2.0/data/'
possibilities=( 20150125_028445 20150125_028450 20150126_028454 )
for file in "${image[@]}"
do
    for possibility in "${possibilities[@]}"
    do
        # stop probing candidate directories once one of them yields the file
        wget "$address$possibility/$file" && break
    done
done
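A hedged alternative, since that address is plain HTTPS (so FTP-style globbing does not apply): let wget recurse from the data directory and keep only files matching an accept pattern, so no subdirectory has to be named. This assumes the server exposes HTML directory indexes, and the pattern is a guess based on the file names above:
# recurse, don't ascend to the parent, don't recreate directories locally,
# and accept only the matching .fit files
wget -r -np -nd -A 'lor_*_0x636_sci.fit' "$address"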

Bash script for recursive directory listing on FTP server without -R

There are multiple folders with subfolders and image files on the FTP server, and -R is disabled. I need to dump the recursive directory listing, with path names, into a text file. The logic I have so far: traverse into each folder, check each entry's name for a '.' to decide whether it is a file or a folder; if it is a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to write a function that traverses each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
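A minimal sketch of the curlftpfs route (mount point, host, and credentials are placeholders; assumes curlftpfs and FUSE are installed):
mkdir -p /tmp/ftpmount
curlftpfs ftp://uName:password@1.1.1.1 /tmp/ftpmount
# find -ls prints each path with its full long-listing details
find /tmp/ftpmount -ls > listing.txt
fusermount -u /tmp/ftpmount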
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.

Bash: How to recursively ftp a certain file type under multiple directories

I am trying to recursively get files of a certain type from an FTP server using Bash, but what I am having trouble with is retrieving them from multiple directories. Many of these directory names share a matching string. The client should look for all directories containing that string and get files of a certain type from each. Each directory can have many files with the same extension.
What I have tried to do is use wget recursively.
wget -r "ftp://anonymous@$HOST/$PATH/$DIRSTRING*/*.$FILEEXT"
This gives me an error message saying the $PATH/$DIRSTRING*/ file or directory could not be found.
I know wget supports globbing. But, is it possible to have two wildcards? If not, is there another way of going about this problem?
Best Regards
wget is not really suited for this kind of FTP usage... but lftp is very good at mirroring FTP site data, and it even supports globbing! :)
For your example:
lftp -e "mirror -I '$DIRSTRING*/*$FILEEXT' /$RPATH mirrorSite" ftp://anonymous#$HOST
See the mirror command in man lftp.

DOS ftp listing to local file

I'm trying to find a way to see if a file exists on an FTP site via DOS. I tried a get command on the file, hoping that if it didn't exist it wouldn't be downloaded to my local directory. However, it seems that it still is, but as an empty file. That doesn't work for me, because the file I'm looking for is just an empty trigger file, so I can't tell the difference.
I would like to dump an ls listing of the FTP directory to a text file on my local drive, and so I try
ls > listing.txt
It creates the listing.txt file locally but it's always empty even though there are files on the ftp site.
What are my options with this?
I have used dir > listing.txt and ls > listing.txt and every time listing.txt is empty even though there are files in the directories I'm running those commands on.
Sorry if I didn't make this clear, but I'm trying to get the listing for an automated process and not simply for my visual when manually doing this.
Unless you're on FreeDOS, you're probably not using DOS. Perhaps you're using ftp.exe in the Windows console? If that's the case, don't use a normal file redirect. Instead, use the syntax the standard Windows ftp client provides for ls:
ls [RemoteDirectory] [LocalFile]
So you can run ls . listing.txt to get a list of files in the current remote directory. The listing.txt file will appear in your user directory, e.g. c:\Users\user.
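Since the listing is for an automated process, a hedged sketch of a fully scripted session (host and credentials are placeholders):
rem build a command file, then feed it to ftp.exe non-interactively
(
echo open ftp.example.com
echo user myUser myPassword
echo ls . listing.txt
echo bye
) > ftpcmds.txt
ftp -n -s:ftpcmds.txt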

WGET only the file names in an FTP directory

I am attempting to create a dynamic list of files available in an FTP directory. I assume wget can help with this, but I'm not really sure how... so my question is:
What is the syntax for retrieving file names from an ftp directory using wget?
Just execute
wget --no-remove-listing ftp://myftpserver/ftpdirectory/
This will generate two files: .listing (this is what you are looking for) and index.html, which is the HTML version of the listing file.
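If you only want the bare file names, a hedged follow-up (assumes the server returns Unix-style long listings; it will break on names containing spaces):
# the name is the last field of each long-listing line; skip the "total" line and . / ..
awk 'NF>=9 && $NF!="." && $NF!=".." {print $NF}' .listing > filenames.txt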
