Download file with different local name using wget - shell

Hi, I have a text file where download links are given like this:
http://www.example.com/10.10.11/abc.jpg
http://www.example.com/10.10.12/abc.jpg
http://www.example.com/10.10.13/abc.jpg
Here 10.10.* is the date of the image.
I need to download all the images using wget, where each image's name will be its corresponding date (e.g. 10.10.11.jpg).
P.S. I tried using:
wget -i download.txt
So, any solution?
Thanks in advance

You can instruct Wget to create subdirectories based on the URL, and then do the renaming after the download has finished.
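A minimal sketch of that approach, assuming download.txt holds the URLs from the question (-x, i.e. --force-directories, recreates each URL's path locally):
wget -x -i download.txt
# files land in www.example.com/<date>/abc.jpg; rename each by its date directory
for f in www.example.com/*/abc.jpg; do
    d=${f%/abc.jpg}        # strip the file name, leaving www.example.com/<date>
    mv "$f" "${d##*/}.jpg" # keep only the date component as the new name
done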

I'd suggest a batch script that downloads the files one by one using the -O option, plus a bit of sed/awk magic to get the names right.
But be careful: with the -O option, you have to call wget on a per-file basis.

This should do the trick.
#!/bin/sh
# read each URL, pull out the date directory, and save the file as <date>.jpg
while read -r url; do
    urldir=${url%/*}      # strip the file name: http://www.example.com/10.10.11
    dir=${urldir##*/}     # keep only the last path component: 10.10.11
    wget -O "$dir.jpg" "$url"
done < download.txt

This might work for you:
sed '\|/\([^/]*\)/[^/]*\1[^/.]*.jpg|!d' download.txt | wget -i -
Explanation:
Filter download.txt down to only the URLs you need (those whose file name repeats the date directory) and pass them on to wget via -i -.

I have developed a script that does just this, called bulkGetter. It's super easy to use: you just need an input file with all the links you want to download, and the "-rb" option (refer to the link).

Related

Considering a specific name for the downloaded file

I download a .tar.gz file with wget using this command:
wget hello.tar.gz
This is part of a long script. Sometimes an error occurs when downloading this file, and when it is downloaded a second time the name of the downloaded file changes to something like this:
hello.tar.gz.2
the third time:
hello.tar.gz.3
How can I make sure that, whatever the downloaded file's name is, it gets changed to hello.tar.gz?
In other words, I don't want the name of the downloaded file to be anything other than hello.tar.gz.
wget hello.tar.gz -O <fileName>
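For example (a minimal sketch with a hypothetical URL; substitute the real download location):
wget -O hello.tar.gz http://example.com/path/hello.tar.gz
With -O, wget always writes to the given name, overwriting any existing copy instead of appending .1, .2, and so on.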
wget has built-in options like -r and -p that change its default behavior.
So just try the following:
wget -p <url>
wget -r <url>
Since you've now noticed the incremental naming, discard any repeated files and rely on the following as the initial condition:
wget hello.tar.gz
mv hello.tar.gz.2 hello.tar.gz
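If older numbered copies are already lying around, a one-line cleanup sketch (the glob assumes wget's default .1, .2 suffix scheme):
rm -f hello.tar.gz.[0-9]*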

unix unzip utility: is there a way to give the extracted folder a different name than the zip file name?

I can do the command:
unzip some-zip.zip
and it will produce a some-zip folder.
I don't want the default folder name; I want to create my own. Nor do I want to do a mv afterwards.
I don't see a command-line option to handle this. Can I accomplish it easily with redirection (if there is indeed no command-line option)? If so, will that work efficiently for a fairly large zip file (52 MB)?
Thanks
unzip file.zip -d destination_folder
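For the question's example that would be (my-folder is just an illustrative name):
unzip some-zip.zip -d my-folder
unzip creates my-folder if it doesn't exist and extracts straight into it, so no mv is needed, and the option should add no noticeable overhead even for a large archive.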

How to download a file with wget that starts with a word and has a specific extension?

I'm trying to write a bash script and I need to download certain files with wget,
like libfat-nds-1.0.11.tar.bz2. But the version of this file may change over time, so I would like to download a file that starts with libfat-nds and ends in .tar.bz2. Is this possible with wget?
Using only wget, this can be achieved by specifying the file name with wildcards in the list of accepted extensions:
wget -r -np -nd --accept='libfat-nds-*.tar.bz2' <url>
The problem is that HTTP doesn't support wildcard downloads. But if directory listing is enabled on the server, or you have an index.html containing the available file names, you can download that, extract the file name you need, and then download the file with wget. Something in this order:
Download the index with curl
Use grep and/or sed to extract the exact file name
Download the file with wget (or curl)
If you pipe the commands you can do it on one line.
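A minimal sketch of that pipeline, assuming the files are listed at a hypothetical http://example.com/downloads/ index:
# fetch the listing, extract the newest matching file name, then download it
file=$(curl -s http://example.com/downloads/ | grep -o 'libfat-nds-[0-9.]*\.tar\.bz2' | sort -V | tail -n 1)
wget "http://example.com/downloads/$file"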

How can I FTP many files I have listed in a TXT?

I have a list of files inside a TXT file that I need to upload to my FTP. Is there any Windows Bat file or Linux shell script that will process it?
cat ftp_filelist | xargs --max-lines=1 ncftpput -u user -p pass ftp_host /remote/path
Here user, pass, ftp_host and /remote/path are placeholders; --max-lines=1 makes xargs invoke ncftpput once per listed file.
You can use the wput command.
The syntax is somewhat like this:
wput -i [name of the file.txt]
Go through this link:
http://wput.sourceforge.net/wput.1.html
It works for Linux. With this, it will upload all the URLs given in the text file to your FTP server one by one.
You may want to check out Jason Faulkner's batch script, which he wrote about here.

Bash command to copy images from a remote URL

I'm using the Mac terminal.
I want to copy images from a remote URL, http://media.pragprog.com/titles/rails4/code/depot_b/public/images/, to a local directory.
What's the command to do that?
Thanks
You can use curl. Note that curl's URL globbing supports bracket ranges and brace lists rather than *, so for example (with hypothetical numbered file names):
curl -O "http://media.pragprog.com/titles/rails4/code/depot_b/public/images/image[1-20].jpg"
Alternatively, you may want just all the images from a website. wget can do this with its recursive option:
$ wget -r -A jpeg,jpg,bmp,png,gif,tiff,xpm,ico http://www.website.com/
This downloads only the comma-delimited extensions, recursively, starting at the site index. It works like a web spider, so if an image is not referenced anywhere on the site it will be missed.
wget will work, assuming the server has directory listing:
wget -m http://media.pragprog.com/titles/rails4/code/depot_b/public/images
You can do this with Wget or cURL. cURL ships with OS X, but Wget doesn't, so you may need to install it with MacPorts or something similar.
