I have a website hosted on a server ***.***.**.**
I would like to download a folder, public_html which contains my website files.
How can I download the folder with all its subdirectories intact and in order?
I'm using:
binary
cd public_html
lcd D:\websiteFiles
mget */*
but the problem is that this dumps all the contents of public_html and its sub-folders into websiteFiles without preserving the sub-folder structure.
How can I maintain the structure of public_html and all its sub-folders on download?
I want to do this from the Windows 7 command line, without assistance from FTP tools like FireFTP, FileZilla....
One alternative would be to use wget to do this. The following command should work as long as you have wget installed on your system:
wget --mirror -p --convert-links -P . http://example.com/public_html
You might have to add a Referer header to your request:
--header "Referer: exampe.com"
Another way would be to use rsync:
rsync -avz -e ssh <username>@<server>:/public_html/ /www/
You can download wget here: https://www.gnu.org/software/wget/
The scp command can also be used to achieve this:
scp -r <username>@<server>:/public_html /www/
You could install Cygwin in order to use rsync or scp.
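If you would rather stay in the plain Windows command prompt without installing Cygwin, PuTTY's pscp client is another option, assuming the server also accepts SSH logins. A rough sketch, with <username> and <server> as placeholders and pscp.exe on your PATH:
pscp -r <username>@<server>:/public_html D:\websiteFiles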
This is what worked:
wget -m ftp://username:password@ip.of.old.host
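Note that by default wget puts the FTP mirror inside a directory named after the host. If you only want the tree under public_html, something along these lines should work; the -nH flag drops that host directory, and the /public_html path and the -P target directory are assumptions about your server layout and local drive:
wget -m -nH -P D:\websiteFiles ftp://username:password@ip.of.old.host/public_html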
...
Related
I need to download folders from Amazon Web Services (AWS) EC2. How can I do that using the wget command? I am on Ubuntu and have installed aws-cli.
You can use wget only if the folder is web-accessible; in that case you must know the path to the folder (http://domain.com/path/to-folder).
If it is not web-accessible, you can use scp or rsync to get the content, e.g.:
rsync -e 'ssh -i /path/to/key.pem' -av user@host.tv:~/from/ /local/path/to/
Make sure 'from' is really the correct path.
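If you prefer scp over rsync, roughly the same transfer should be possible like this (reusing the key, user, host and paths from the rsync example above, which are of course placeholders):
scp -i /path/to/key.pem -r user@host.tv:~/from/ /local/path/to/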
Also, you may want to set up SFTP and use that as well.
I have a script that needs to upload all the files stored in a certain directory, but every time I run it I get this error:
curl: (9) Server denied you to change to the given directory
#!/bin/sh
for file in /export/test/*
do
curl -T ${file} ftp://192.168.10.10/${file} --user tester:psswd
done
I checked the vsftpd config: I have read/write permissions, and when I run the command manually it works.
For example, when I run this command, everything is OK:
curl -T /export/test/testing.txt ftp://192.168.10.10/export/status/testing.txt --user tester:psswd
Has anyone else had this problem?
I have no idea how to solve it; I've tried everything.
By the way: my FTP root folder is /var/www/stats, and I need to overwrite files in the subfolder /var/www/stats/export/test.
FIXED
My bad: the file variable already contains the full path with a leading slash, and I had added one more slash after the server address, producing a double slash in the URL.
So the final version is this:
#!/bin/sh
for file in /export/test/*
do
curl -T ${file} ftp://192.168.10.10${file} --user tester:psswd
done
It works. Done.
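One extra note, in case someone hits a similar error because the target subfolders do not yet exist on the server: curl has a --ftp-create-dirs option that creates missing remote directories. A variant of the same loop (same paths and credentials as above, adjust for your setup) would be:
#!/bin/sh
for file in /export/test/*
do
curl -T "${file}" --ftp-create-dirs ftp://192.168.10.10"${file}" --user tester:psswd
done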
I am trying to copy a folder from an http source using the following statement:
FileUtils.cp_r 'http://else.repository.labs/static/lit/MDMInternalTools/', 'c:\Users\Public\Desktop\'
However I get this error:
EINVAL: Invalid argument - https://else.repository.labs/static/lit/MDMInternalTools/
If that's a folder on a server that you have access to via ssh, then you can use scp to copy individual files, or a folder with all subfolders/files, using the -r option. The command will be something like:
scp -r username@else.repository.labs:/path/to/rails/public/static/lit/MDMInternalTools c:\Users\Public\Desktop
This assumes you can use scp. It looks like you're on Windows, so you'll need a decent command-line shell where you can install ssh.
https://en.wikipedia.org/wiki/Secure_copy
You can check whether the server supports WebDAV, or offers FTP or SSH access.
Otherwise your only choice could be to use wget to get a local mirror:
wget -mk http://else.repository.labs/static/lit/MDMInternalTools/
I am new to FTP configuration. What I am trying to do is as follows:
I am running a shell script on my localhost that downloads some files to my machine. Now I want the downloaded files to be stored in a temporary directory first, and then transferred to another directory that I specify. I think this can be achieved with FTP, which will also be helpful when I host this on a domain, but I can't find resources from which to teach myself how to set it up.
OK, having visited many sites, here are some resources you might find handy:
For configuring vsftpd, here's a manual on how to install, configure and use it.
To receive many files recursively via FTP, you can use wget (extracted from this site):
cd /tmp/ftptransfer
wget --mirror --ftp-user=foo --ftp-password=bar ftp://ftp.originsite.com/path/to/folder
When it comes to sending many files recursively, many people find the only way of doing so is to tar-and-send; the only problem is that the files will remain tarred until you go to the other machine (remotely or via ssh) to extract them manually. There is an alternative, not using FTP, but using ssh and pipes, which lets you have the files extracted on the target machine:
tar -cf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir; tar -xf -"
Explained:
tar is the application to make tar files
-c: create file
-f -: file name is "stdout"
/tmp/ftptransfer: include this folder and all subdirectories in the tar
|: Make a pipe to the next program (connect stdout to stdin)
ssh: Secure Shell program
geek@targetsite: username @ machine name you want to connect to
"..." command to send to the remote host
cd target_dir: changes the dir of output
tar -xf -: extracts the file received by "stdin"
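The same trick works in the other direction if you want to pull a directory from the remote machine instead of pushing one to it. A rough sketch (geek@targetsite and the directory names are the same placeholders as above):
ssh geek@targetsite "tar -cf - target_dir" | tar -xf - -C /tmp/ftptransfer
Here tar runs on the remote side and writes the archive to stdout, and the local tar extracts it into /tmp/ftptransfer (-C changes into that directory before extracting).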
For configuring SSH on Ubuntu, have a look here.
If you need more help, don't be afraid to ask! :)
I can download files using wget "ftp://user:pass@host/prefix*", but I cannot remove downloaded files from FTP. Any easy solution to do this in a bash script?
As WhoSayln and Skilldrick said, you should use ftp to download files, and to remove files from the server (if you have permission to).
But in your question you're saying "I cannot remove downloaded files from FTP". Do you want to remove the local files from your computer (the ones you just downloaded from the FTP server), or the files on the remote server?
If it's local, then a simple rm -f file will do it :p
But if it's remote, and this is running in a script (a typical batch job), then try something like:
jyzuz@dev:/jean> ftp -n -i remoteserver.com << EOF
> user $username $password
> cd /remote/directory/
> delete filename.txt
> bye
> EOF
More or less? =P
If you need to script some operations on an FTP server, I would point you to lftp.
Main website
Tutorial
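For the original question (download files matching a prefix and then remove them from the server), a rough lftp one-liner might look like this; user, pass, host and the prefix pattern are placeholders, and I believe mget -E deletes each remote file after a successful transfer, but check the lftp man page before running it against real data:
lftp -u user,pass -e "mget -E 'prefix*'; bye" host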
You want to use ftp for that.
wget is not the command you are looking for. You can use the ftp command instead. Here is extensive documentation about it:
http://linux.about.com/od/commands/l/blcmdl1_ftp.htm