Wget Folder in Bash

I'm trying to use wget in bash to download a folder from my FTP host, but it recreates the whole remote directory tree for the folder I'm downloading. For example, when I use this script to download my folder:
wget -r "ftp://$USER:$PASS@example.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
It creates a folder called "example.com", then within that one it makes "subdomains", and so on. I just want it to download the $theme_root folder into the directory I cd'd into (/theme_builder/themes). Sorry if I'm not explaining this well, but thanks in advance!

wget -nH --cut-dirs=5 -r url
-nH drops the "example.com" host directory, and --cut-dirs strips that many leading path components. Your URL has five parent directories (subdomains/cydia/httpdocs/theme/themes), so 5 should leave just $theme_root; if I counted wrong, change the number.
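Put together, a minimal sketch (assuming the same host, credentials, and target directory as above):
#!/usr/bin/env bash
# Fetch only $theme_root into /theme_builder/themes, without the
# example.com/subdomains/... scaffolding around it.
cd /theme_builder/themes || exit 1
wget -r -nH --cut-dirs=5 "ftp://$USER:$PASS@example.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"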

Related

how to download a folder from bash?

I want to download a complete folder from a Jenkins project containing many folders.
I tried with wget, and it works for a single file:
wget http://jenkinsmedia:XXXX/job/Lib//280/artifact/tool.rpm
but in the same place as tool.rpm there is a folder, Test.
I want to download the whole folder. Is that possible, or will I have to pick the files one by one?
Try using wget -r http://jenkinsmedia:XXXX/job/Lib//280/artifact/
This will create a folder named jenkinsmedia; inside it there will be a folder job, and so on.
Use wget -nH --cut-dirs=4 -r http://jenkinsmedia:XXXX/job/Lib//280/artifact/ if you just want the folder with the documents.
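If only the Test folder is wanted, the same flags can be pointed at that subdirectory. A sketch, assuming Jenkins serves a browsable index there; --no-parent stops wget from climbing back up, and the cut count may need adjusting if the double slash in the URL throws it off:
wget -r -nH --cut-dirs=4 --no-parent http://jenkinsmedia:XXXX/job/Lib//280/artifact/Test/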

bash script wget download files by date

I'm new to the world of bash scripting. Hoping to seek some help here.
Been messing about with the 'wget' command and found that it is quite neat! At the moment, it gets all contents from an HTTPS site, including all directories, and saves them all accordingly. Here's the command:
wget -r -nH --cut-dirs=1 -R index.html -P /home/snoiniM/data/in/ https://www.someWebSite.com/folder/level2 --user=someUserName --password=P#ssword
/home/snoiniM/data/in/folder/level2/level2-2013-07-01.zip saved
/home/snoiniM/data/in/folder/level2/level2-2013-07-02.zip saved
/home/snoiniM/data/in/folder/level2/level2-2013-07-03.zip saved
/home/snoiniM/data/in/folder/level3/level3-2013-07-01.zip saved
/home/snoiniM/data/in/folder/level3/level3-2013-07-02.zip saved
/home/snoiniM/data/in/folder/level3/level3-2013-07-03.zip saved
That is fine for all intents and purposes. But what if I really just want a specific date from all of its directories? E.g. just levelx-2013-07-03.zip from every dir within folder, with everything saved to one directory locally (e.g. all the *.zip files end up in ...folder/).
Does anyone know how to do this?
I found that dropping --cut-dirs=1 and using www.someWebsite.com/folder/ as the URL is sufficient.
With that in mind, I also added the -nd option, which means no directories: "Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files will get saved to the current directory, without clobbering."
That leaves one more part: how do I write a bash script that gets yesterday's date and passes it to the wget command as a parameter?
E.g.
wget -r -nH -nd -R index.html -A "*$yesterday.zip" -P /home/snoiniM/data/in/ https://www.someWebSite.com/folder/ --user=someUserName --password=P#ssword
Just the snippet you are looking for:
yesterday=$(date --date="@$(($(date +%s)-86400))" +%Y-%m-%d)
And there is no need for the * before $yesterday; -A treats a pattern without wildcards as a suffix.
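Put together, a minimal sketch of the whole script (assuming GNU date and the same site and credentials as above):
#!/usr/bin/env bash
# Yesterday as YYYY-MM-DD, via seconds-since-the-epoch arithmetic
# (on GNU systems, date -d yesterday +%Y-%m-%d works too):
yesterday=$(date --date="@$(($(date +%s)-86400))" +%Y-%m-%d)
# -nd flattens everything into -P's directory; -A keeps only files
# whose names end in "$yesterday.zip":
wget -r -nH -nd -R index.html -A "$yesterday.zip" -P /home/snoiniM/data/in/ \
     --user=someUserName --password='P#ssword' \
     https://www.someWebSite.com/folder/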

Wget Not Downloading Every Folder

I have a bash script running a wget command to get a directory:
wget -r -nH --cut-dirs=5 "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
And what it's supposed to do is download a folder structure that looks like this:
$theme_root/Library/Themes/$theme_name.theme/Icons
For some reason, it won't download any folder that's inside the $theme_name.theme folder. There's also a UIImages folder in there that's not showing up, although files in that folder are being downloaded. Does anyone notice anything that I might have done wrong? Thanks in advance!
EDIT
if you add --level=inf it works perfectly!
Wget's default recursion depth is 5 directories, per the wget manual. If the files you are trying to get sit deeper than that below your starting point, it will not go down there. You can pass a larger --level option or, as in your edit, --level=inf.
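For reference, the fixed command would look like this (a sketch using the same credentials and path as above):
wget -r --level=inf -nH --cut-dirs=5 "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"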

wget on Windows command line

Basically I'm trying to download images from a website using the following command (SwiftIRC is an easy example to use):
wget.exe -r -l1 -A.png --no-parent www.swiftirc.net/index.php
This command works fine, however one of the ways I am trying to do it isn't working.
When I fire up an elevated command prompt, it defaults to windows\system32.
If I use the following two commands, everything works fine:
cd c:\users\tom\downloads\
wget.exe -r -l1 etc. etc.
The images are saved in the folder www.swiftirc.net in my downloads folder.
However if I try to do this in one line like this:
c:\users\tom\downloads\wget.exe -r -l1 etc. etc.
The response from wget on the cmd is exactly the same, but the images are not saved on my hard disk.
Does anyone know what I'm doing wrong?
Try adding c:\users\tom\downloads\ to PATH, or put wget.exe into your windows\system32 folder.
I believe it's because Windows doesn't let ordinary users write files to the disk root: when you run "c:\users\tom\downloads\wget.exe" you have C:\ as the working directory, so the files should be saved there, but the default permissions don't allow it.
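Alternatively, wget's standard -P (--directory-prefix) option tells it where to save, regardless of the working directory; a sketch using the same paths as above:
c:\users\tom\downloads\wget.exe -r -l1 -A.png --no-parent -P c:\users\tom\downloads www.swiftirc.net/index.php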

Downloading the contents of a web directory?

If I have example.com/dir and dir is basically a folder in the example.com server, how can I download the contents of the folder to my hard drive?
Is this a web server you're downloading from over the net? Then (with shell access) you might try:
$ wget --wait 2 -rkc --no-parent http://example.com/dir
Works with ftp, too.
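For readability, the same command with the short flags spelled out (-r, -k, and -c are --recursive, --convert-links, and --continue):
$ wget --wait=2 --recursive --convert-links --continue --no-parent http://example.com/dir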
