WGET only the file names in an FTP directory - ftp

I am attempting to create a dynamic list of the files available in an FTP directory. I assume wget can help with this, but I'm not really sure how, so my question is:
What is the syntax for retrieving file names from an FTP directory using wget?

Just execute
wget --no-remove-listing ftp://myftpserver/ftpdirectory/
This will generate two files: .listing (this is what you are looking for) and index.html, which is the HTML version of the listing file.
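If you just want a plain list of names out of .listing, something like this works (a minimal sketch: it assumes a Unix-style listing where the file name is the last whitespace-separated field, so names containing spaces would need more careful parsing; the host and directory are placeholders):
wget --no-remove-listing ftp://myftpserver/ftpdirectory/
awk '{ sub(/\r$/, "") } !/^total/ && $NF != "." && $NF != ".." { print $NF }' .listing > filenames.txt
The sub() strips the carriage return that FTP listings often carry, and the pattern skips any "total" line plus the . and .. entries.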

Related

How to download multiple files in multiple sub-directories using curl?

I am downloading multiple files using curl. The base URL for all the files is the same, like
https://mydata.gov/daily/2017
The data in these directories are further grouped by date and file type, so the first file that I need has this path:
https://mydata.gov/daily/2017/001/17d/Roger001.gz
The second one is
https://mydata.gov/daily/2017/002/17d/Roger002.gz
I need to download up to the data for the last day of 2017, which is
https://mydata.gov/daily/2017/365/17d/Roger365.gz
How can I use curl or any other similar tool to download all the files to a single local folder, preferably keeping the original file names?
In a bash terminal, run:
for f in {001..365}; do curl "https://mydata.gov/daily/2017/$f/17d/Roger$f.gz" -o "/your-directory/Roger$f.gz"; done
Replace /your-directory with the directory in which you want to save the files.
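If some days happen to be missing on the server, a slightly more defensive version of the same loop (a sketch, using the same placeholder paths) skips failures instead of saving HTML error pages as .gz files:
for f in {001..365}; do
  url="https://mydata.gov/daily/2017/$f/17d/Roger$f.gz"
  # --fail makes curl exit non-zero on HTTP errors instead of writing the error body
  curl --fail --silent --show-error -o "/your-directory/Roger$f.gz" "$url" || echo "skipped $url" >&2
done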

List all directories and files recursively from FTP site

I am looking to spider an FTP directory and all of its sub-directories and files, and write the information to a file.
I tried using lftp, but I noticed the site does not support lftp> ls -1R > rammb.txt
so now I am trying to figure out the best route. I would also like the file dates included in the information written to the file.
Previously, what I tried was lftp> find -d 10 > rammb.txt
but it did not provide the dates of the files. Any suggestions?
Use find --ls; it will output full file information, including dates. The option is available since lftp version 4.4.12.
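To run that non-interactively from a shell, something like this should work (a sketch; the host is a placeholder, and -u user,pass can be added if the site does not allow anonymous login):
lftp -e "find --ls > rammb.txt; bye" ftp://ftp.example.com
The redirection happens inside lftp, so the listing lands in a local rammb.txt.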

How to download all files from hidden directory

I have to download all log files from a virtual directory within a site. Access to the virtual directory itself is forbidden, but the files inside it are accessible.
I have manually entered the file names to download:
dir="Mar"
for ((i=1;i<100;i++)); do
wget http://sz.dsyn.com/2014/$dir/log_$i.txt
done
The problem is that the script is not generic; most of the time I need to find out how many files there are and tweak the for loop. Is there a way to get wget to fetch all the files without my having to specify the exact count?
Note:
If I use the browser to view http://sz.dsyn.com/2014/$dir, it is 403 Forbidden. I can't pull all the files via a browser tool/extension.
First of all, check this similar question. If that is not what you are looking for, you need to generate a file of URLs and feed it to wget, e.g.
wget --input-file=http://sz.dsyn.com/2014/$dir/filelist.txt
wget will have the same problem your browser has: it cannot read the directory listing. Just pull files until your first failure, then quit.
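A minimal sketch of that "pull until your first failure" idea, reusing the loop from the question (wget exits non-zero when the server returns an error such as 404, which ends the while loop):
dir="Mar"
i=1
# keep fetching log_1.txt, log_2.txt, ... until the first missing file makes wget fail
while wget -q "http://sz.dsyn.com/2014/$dir/log_$i.txt"; do
  i=$((i + 1))
done
echo "stopped after $((i - 1)) files"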

How to download a file with wget that starts with a word and it has a specific extension?

I'm trying to write a bash script and I need to download certain files with wget,
like libfat-nds-1.0.11.tar.bz2, but over time the version of this file may change, so I would like to download a file that starts with libfat-nds and ends in .tar.bz2. Is this possible with wget?
Using only wget, it can be achieved by specifying the filename with wildcards in the list of accepted patterns:
wget -r -np -nd --accept='libfat-nds-*.tar.bz2' <url-of-the-directory>
The problem is that HTTP doesn't support wildcard downloads. But if there is a content listing enabled on the server, or you have an index.html containing the available file names, you could download that, extract the file name you need, and then download the file with wget.
Something in this order:
Download the index with curl
Use grep and/or sed to extract the exact file name
Download the file with wget (or curl)
If you pipe the commands, you can do it in one line.
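A sketch of that pipeline (the listing URL is a placeholder, and the grep pattern assumes the index page mentions the tarball names in plain text):
base="http://example.com/downloads/"   # placeholder for the real listing URL
file=$(curl -s "$base" | grep -o 'libfat-nds-[0-9][0-9.]*\.tar\.bz2' | sort -V | tail -n 1)   # newest version wins
wget "$base$file"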

Batch download zips from ftp

Good evening.
Can you help me with a batch file?
I have an FTP server whose root directory contains a few randomly named zip archives.
I need to download those archives to the local directory D:\temp.
I know how to do it via ftp.exe, but only for one file whose name I know:
file: save.bat
ftp -s:1.txt
file: 1.txt
open myftp.com
123login
321pass
prompt
binary
hash
get file.zip D:\test\12.zip
bye
Can you tell me how to download all the *.zip archives from the FTP server in a loop?
Thanks!
You can use the mget command to download multiple files through ftp. This also supports wildcards.
You can simply use:
mget *.zip
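Applied to the 1.txt from the question, that would look roughly like this (a sketch; prompt switches off the per-file confirmation that mget would otherwise ask for, and lcd sets the local download directory):
open myftp.com
123login
321pass
prompt
binary
hash
lcd D:\temp
mget *.zip
bye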
