Wget Not Downloading Every Folder - ftp

Hey, I have a bash script running a wget command to get a directory:
wget -r -nH --cut-dirs=5 "ftp://$USER:$PASS@stumpyinc.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
And what it's supposed to do is download a folder structure that looks like this:
$theme_root/Library/Themes/$theme_name.theme/Icons
For some reason, it won't download any folder that's inside the $theme_name.theme folder. There's also a UIImages folder in there that's not showing up, although files that are in that folder are being downloaded. Does anyone notice anything that I might have done wrong? Thanks in advance!
EDIT
If you add --level=inf it works perfectly!

Wget's default recursive retrieval depth is 5 directories, as per the wget manual. If the files you are trying to get sit deeper than that below your starting point, wget will not descend to them. You can pass a larger --level option, or, as in your edit, --level=inf.
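For example (a sketch; the host and path here are placeholders, not the real ones from the question):

# Depth limit raised from the default of 5 to 10:
wget -r --level=10 -nH --cut-dirs=5 "ftp://$USER:$PASS@example.com/path/to/themes/$theme_root"
# Or lift the limit entirely:
wget -r --level=inf -nH --cut-dirs=5 "ftp://$USER:$PASS@example.com/path/to/themes/$theme_root"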

Related

Can't see files with Symlink

I need my client to be able to see the files in the directory they are allowed on. So I soft-linked the directory they are allowed on, but they can't see the files inside even though they have the right permissions (rwx).
ex:
/home/user1/project1.link/(couple of files)**
/clients/client_shamwow/project1/(couple of files)
**: Can't see the files.
This is the line I used:
ln -s /clients/client_shamwow/projet_prod /home/user1/projet_prod
Is there something wrong with what I'm doing that prevents them from seeing the files in project_prod, or should I use something else?
Your command doesn't match your example, but I assume you mean /home/user1/project1.link is a soft (symbolic) link, and when you run ls it lists just that name, rather than the contents of the directory the link points to. If that's the case, add the -L option to your ls command.
ls -lL /home/user1/project1.link
The man page says:
-L, --dereference
when showing file information for a symbolic link, show information
for the file the link references rather than for the link itself
Another way is simply to append /. to the end of your command, as in
ls -l /home/user1/project1.link/.
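To see the difference, here is a quick sketch using throwaway paths under /tmp:

mkdir -p /tmp/target && touch /tmp/target/file1 /tmp/target/file2
ln -s /tmp/target /tmp/mylink
ls -l /tmp/mylink      # long listing shows the link itself: mylink -> /tmp/target
ls -lL /tmp/mylink     # -L dereferences the link, so this lists file1 and file2
ls -l /tmp/mylink/.    # appending /. also lists the directory's contents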
If that doesn't answer your question, I think you need to be more clear, and perhaps clean up the inconsistencies in your question. Even show some real output and the commands you ran.
Solved. No idea what happened. I just recreated the link the exact same way I did before, and now I am able to see AND modify the files as user1, without him being able to go anywhere other than what is in the folder project_prod. Thx for your time :)

how to download a folder from bash?

I want to download a complete folder from a Jenkins project containing many folders.
I tried with wget, and it works, but only for one file:
wget http://jenkinsmedia:XXXX/job/Lib//280/artifact/tool.rpm
But in the same place as tool.rpm there is a folder, Test.
I want to download the whole folder, is it possible? Or will I have to pick files one by one?
Try using wget -r http://jenkinsmedia:XXXX/job/Lib//280/artifact/
This will create a folder; inside it there will be a folder job, and so on.
Use wget -nH --cut-dirs=4 -r http://jenkinsmedia:XXXX/job/Lib//280/artifact/ if you just want the folder with the documents.
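If the Jenkins page is served as a plain directory listing, a variant like this (a sketch reusing the URL from the question) also keeps wget from climbing above artifact/ and from saving the generated index.html pages:

wget -r -np -R index.html -nH --cut-dirs=4 http://jenkinsmedia:XXXX/job/Lib//280/artifact/
# -np (--no-parent) never ascends above artifact/
# -R index.html discards the listing pages wget fetches to find links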

bash script wget download files by date

I'm new to the world of bash scripting. Hoping to get some help here.
Been messing about with the wget command and found that it is quite neat! At the moment, it gets all contents from an https site, including all directories, and saves them all accordingly. Here's the command:
wget -r -nH --cut-dirs=1 -R index.html -P /home/snoiniM/data/in/ https://www.someWebSite.com/folder/level2 --user=someUserName --password=P#ssword
/home/snoiniM/data/in/folder/level2/level2-2013-07-01.zip saved
/home/snoiniM/data/in/folder/level2/level2-2013-07-02.zip saved
/home/snoiniM/data/in/folder/level2/level2-2013-07-03.zip saved
/home/snoiniM/data/in/folder/level3/level3-2013-07-01.zip saved
/home/snoiniM/data/in/folder/level3/level3-2013-07-02.zip saved
/home/snoiniM/data/in/folder/level3/level3-2013-07-03.zip saved
That is fine for all intents and purposes. But what if I really just want a specific date from all of its directories? E.g. just levelx-2013-07-03.zip from all dirs within folder, saving everything to one local directory (e.g. all *.zip end up in ...folder/).
Does anyone know how to do this?
I found that dropping --cut-dirs=1 and using www.someWebsite.com/folder/ as the URL is sufficient.
Also, with that in mind, I added the -nd option. This means "no directories": "Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files will get saved to the current directory, without clobbering."
That leaves one more part: how to write a bash script that gets yesterday's date and passes it to the wget command as a parameter?
E.g.
wget -r -nH -nd -R index.html -A *$yesterday.zip -P /home/snoiniM/data/in/ https://www.someWebSite.com/folder/ --user=someUserName --password=P#ssword
Just the snippet you are looking for:
yesterday=$(date --date="@$(($(date +%s)-86400))" +%Y-%m-%d)
And there's no need for the * before $yesterday; just treat it as a suffix.
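Putting the pieces together, a minimal sketch of the whole script (host, credentials, and paths are the placeholders from the question):

#!/bin/bash
# Yesterday's date as YYYY-MM-DD (GNU date; @ means seconds since the epoch)
yesterday=$(date --date="@$(($(date +%s)-86400))" +%Y-%m-%d)
# -nd flattens everything into the -P directory; -A keeps only matching zips
wget -r -nH -nd -R index.html -A "$yesterday.zip" -P /home/snoiniM/data/in/ \
    --user=someUserName --password='P#ssword' https://www.someWebSite.com/folder/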

Wget Folder in Bash

I'm trying to use wget in bash to get a folder from my FTP host, but when I download the files it creates new folders for every directory on the path to the folder I'm downloading. For example, when I use this script to download my folder:
wget -r "ftp://$USER:$PASS#example.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
It creates a folder called "example.com", then within that one it makes "subdomains", and so on. I just want it to download the $theme_root folder into the folder I cd'd into (/theme_builder/themes). Sorry if I'm not explaining this well, but thanks in advance!
wget -nH --cut-dirs=4 -r url
I hope I counted right... if not, change the 4 to another number.
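For what it's worth, counting the components in the asker's URL suggests 5 (and if the count is off, adjust the number, as the answer says):

# Components after the host: subdomains(1) cydia(2) httpdocs(3) theme(4) themes(5)
wget -nH --cut-dirs=5 -r "ftp://$USER:$PASS@example.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"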

wget on Windows command line

Basically I'm trying to download images from a website using the following command (SwiftIRC is an easy example to use):
wget.exe -r -l1 -A.png --no-parent www.swiftirc.net/index.php
This command works fine; however, one of the ways I'm trying to run it isn't working.
When I fire up an elevated command prompt, it defaults to windows\system32.
If I use the following two commands, everything works fine:
cd c:\users\tom\downloads\
wget.exe -r -l1 etc. etc.
The images are saved in the folder www.swiftirc.net in my downloads folder.
However, if I try to do it in one line, like this:
c:\users\tom\downloads\wget.exe -r -l1 etc. etc.
The response from wget in the cmd window is exactly the same, but the images are not saved to my hard disk.
Does anyone know what I'm doing wrong?
Try adding c:\users\tom\downloads\ to PATH, or put wget.exe into your windows\system32 folder.
I believe it's because Windows doesn't allow users to write files to the disk root. When you run "c:\users\tom\downloads\wget.exe", you have C:\ as the working directory, so the files should be saved there, but that isn't allowed by the default policies.
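Another workaround (a sketch reusing the command from the question): pass an explicit download prefix with wget's -P option, so it no longer matters what the working directory is:

c:\users\tom\downloads\wget.exe -r -l1 -A.png --no-parent -P c:\users\tom\downloads www.swiftirc.net/index.php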