Downloading the contents of a web directory? - ftp

If I have example.com/dir, and dir is basically a folder on the example.com server, how can I download the contents of that folder to my hard drive?

Is this a web server that you are downloading from over the net? Then (with shell access) you might try:
$ wget --wait 2 -rkc --no-parent http://example.com/dir
Works with ftp, too.
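For reference, a rough breakdown of what those flags do (example.com is just a placeholder host):
$ wget --wait 2 -r -k -c --no-parent http://example.com/dir
# --wait 2     pause two seconds between requests, to go easy on the server
# -r           recurse into subdirectories / linked pages
# -k           convert links in downloaded pages so they work locally
# -c           continue partially downloaded files if the run is interrupted
# --no-parent  never ascend above /dir, so only its contents are fetched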

Related

wget+ftp: How to download remote directory without remote tree

I'm trying to download a single directory from a remote ftp server. I'm using this command
wget -nH -r -N -l inf --ask-password ftp://ftp.server.com/some/remote/dir/xyz -P dirName
I'd like the remote xyz directory to be copied and named dirName. There is now a local directory called dirName, but it contains some/remote/dir/xyz, which is not what I wanted.
After a careful reading of the man page, I found the --cut-dirs option, which cuts parent directories out of the local path. That does what I need!
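For anyone else landing here, a sketch of how those options combine for this case (names and paths are the ones from the question; the right --cut-dirs count depends on how many remote path components sit above the files you want):
wget -nH -r -N -l inf --ask-password --cut-dirs=4 -P dirName ftp://ftp.server.com/some/remote/dir/xyz
# -nH           don't create the ftp.server.com host directory
# --cut-dirs=4  strip some/remote/dir/xyz, so the files land directly under dirName
# with --cut-dirs=3 you would instead get dirName/xyz/...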

Ftp script upload entire directory

I'm trying to set up a backup script to upload a folder recursively to a remote FTP folder (SFTP is not supported, only FTP).
I tested curlftpfs; it mounts, but it just creates empty files.
Is there any custom script you have tested that works?
I have already searched the internet.
In the past I used the good old ncftp (see https://linux.die.net/man/1/ncftp), which comes with an ncftpput command that has a -R (recursive) option.
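A minimal sketch of what that looks like (host, credentials, and paths are placeholders, not from the question):
ncftpput -R -v -u myuser -p 'mypassword' ftp.example.com /remote/backup ./folder-to-upload
# -R  upload the local directory recursively
# -v  show progress
# the remote target directory comes first, then the local directory to send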

how to download a folder from bash?

I want to download a complete folder from a Jenkins project containing many folders.
I tried with wget, and it works for only one file:
wget http://jenkinsmedia:XXXX/job/Lib//280/artifact/tool.rpm
But in the same place as tool.rpm, there is a folder called Test.
I want to download the whole folder. Is that possible, or will I have to pick the files one by one?
Try using wget -r http://jenkinsmedia:XXXX/job/Lib//280/artifact/
This will create a folder; inside it there will be a folder job, and so on.
Use wget -nH --cut-dirs=4 -r http://jenkinsmedia:XXXX/job/Lib//280/artifact/ if you just want the folder with the documents.

Wget Folder in Bash

I'm trying to use wget in bash to get a folder from my ftp host, but when I download the files it recreates the whole remote directory structure around the folder I'm downloading. For example, when I use this script to download my folder:
wget -r "ftp://$USER:$PASS#example.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
It creates a folder called "example.com" then within that one it makes "subdomains" and so on. I just want it to download the $theme_root folder that I'm downloading into the folder that I used cd to get into (/theme_builder/themes). Sorry if I'm not explaining this well but thanks in advance!
wget -nH --cut-dirs=4 -r url
I hope I counted right... if not, change the 4 to another number.
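If you would rather count than guess: each path component between the hostname and the folder you want counts as one directory to cut. A sketch using the URL from the question above, assuming the goal is to end up with just $theme_root in the current directory:
wget -nH --cut-dirs=5 -r "ftp://$USER:$PASS@example.com/subdomains/cydia/httpdocs/theme/themes/$theme_root"
# -nH           don't create the example.com host directory
# --cut-dirs=5  strip subdomains/cydia/httpdocs/theme/themes
# result: $theme_root is created directly in the directory you cd'd into
# use --cut-dirs=6 instead if you want the contents of $theme_root dumped into the current directory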

wget on Windows command line

Basically I'm trying to download images from a website using the following command (SwiftIRC is an easy example to use):
wget.exe -r -l1 -A.png --no-parent www.swiftirc.net/index.php
This command works fine; however, one of the ways I am trying to run it isn't working.
When I fire up an elevated command prompt, it defaults to windows\system32.
If I use the following two commands, everything works fine:
cd c:\users\tom\downloads
wget.exe -r -l1 etc. etc.
The images are saved in the folder www.swiftirc.net in my downloads folder.
However if I try to do this in one line like this:
c:\users\tom\downloads\wget.exe -r -l1 etc. etc.
The response from wget on the cmd is exactly the same, but the images are not saved on my hard disk.
Does anyone know what I'm doing wrong?
Try adding c:\users\tom\downloads\ to PATH or put wget.exe into your windows/system32 folder.
I believe it's because Windows doesn't allow users to write files to the disk root. When you run "c:\users\tom\downloads\wget.exe", you have C:\ as the working directory, so the files should be saved there, but writing there is not allowed by the default permissions.
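Another way around it, whatever the working directory ends up being, is wget's -P (--directory-prefix) option, which tells it where to save files. A sketch along those lines, using the paths from the question:
c:\users\tom\downloads\wget.exe -r -l1 -A.png --no-parent -P c:\users\tom\downloads www.swiftirc.net/index.php
rem -P forces the download tree (www.swiftirc.net\...) to be created under downloads
rem alternatively: cd /d c:\users\tom\downloads && wget.exe -r -l1 -A.png --no-parent www.swiftirc.net/index.php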
