Wget - How do I download some files from an FTP server?

I am trying to download some files from a directory on an FTP server. I would like to use the "wget" command, but I can't get them.
ftp server URL: ftp://192.168.0.10
ftp user name: GL840
ftp password: no password
folder name in ftp: SD1/181004/
I am using the following command to download all the files in the folder SD1/181004 on the FTP server.
The command is "wget -r -nd ftp://GL840:@192.168.0.10/SD1/181004/ -P /root/wang/powerdata/"
However, the following message is displayed and files are not downloaded.
Could you please tell me how to modify my command to download files?

I have tried accessing files on an FTP server before. I usually use the following command and it works for me. You can give it a try:
wget --user='GL840' --password='' ftp://192.168.0.10/SD1/181004/
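If that logs in successfully, you can combine it with the recursive options from your original command. A sketch, reusing the paths from your question (adjust as needed):
wget -r -nd -P /root/wang/powerdata/ --user='GL840' --password='' ftp://192.168.0.10/SD1/181004/
Putting the credentials in --user/--password instead of the URL avoids the user:password@host quoting issues, and -nd keeps wget from recreating the SD1/181004 directory structure under /root/wang/powerdata/.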

Related

PSFTP - File Transfer Not Happening

I want a file to transfer automatically from one server to another. For that I have installed PuTTY.
Batch File: Transfer.Bat
cd "C:\Program Files\PuTTY"
psftp Username@RemoteServer.com -pw password -b MovementFileTransferScript.txt -be
Text File: MovementFileTransferScript.txt
open RemoteServer.com
put D:/Backup/FULL_DB_BACKUP-DontDelete.txt R:\DB\Backup
quit
In the above text file, D:/ belongs to the source server and R:/ belongs to the destination server.
When I run the batch file I get the error below.
Using username "Username".
Remote working directory is /
psftp: already connected
local: unable to open D:/Backup/FULL_DB_BACKUP-DontDelete.txt
I have checked the permissions of the folders and everything looks good. Can someone let me know what the issue is here?

Copy folder from an http source to a local directory

I am trying to copy a folder from an http source using the following statement:
FileUtils.cp_r 'http://else.repository.labs/static/lit/MDMInternalTools/', 'c:\Users\Public\Desktop\'
However I get this error:
EINVAL: Invalid argument - https://else.repository.labs/static/lit/MDMInternalTools/
If that's a folder on a server that you have access to via ssh, then you can use scp to copy individual files, or a folder with all subfolders/files, using the -r option. The command will be something like
scp -r username@else.repository.labs:/path/to/rails/public/static/lit/MDMInternalTools c:\Users\Public\Desktop
This is assuming you can use scp. It looks like you're on Windows, so you'll need a decent command-line shell where you can install ssh.
https://en.wikipedia.org/wiki/Secure_copy
You can check whether the server supports WebDAV, FTP, or SSH access.
Otherwise your only choice could be to use wget to get a local mirror:
wget -mk http://else.repository.labs/static/lit/MDMInternalTools/
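If you only want that folder rather than a mirror of the whole site, a sketch with a couple of extra wget options (-np so wget doesn't ascend to parent directories, -nH to drop the hostname directory, -P to set the download directory; the Windows path is just the one from the question):
wget -mk -np -nH -P "c:\Users\Public\Desktop" http://else.repository.labs/static/lit/MDMInternalTools/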

Automating ftp downloads with shell script

I want to download a bunch of .txt.gz files over FTP. I've written this shell script. How do I get all the files on the server without specifying each file?
Some code..
#!/bin/bash
ftp -i -n <<Here
open ftplink.com
user Username password
bin
get XXX_xxxx_mp.txt.gz
get XXX_xxxx_mp.txt.gz
close
quit
Here
Use wget instead:
wget ftp://user:pass@example.com/dir/*_mp.txt.gz
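The * is handled by wget's own FTP globbing, so if your shell complains about it or expands it against local files, quote the URL. A minimal sketch using the hostname and credentials from the script above (the path component is a placeholder):
wget 'ftp://Username:password@ftplink.com/path/to/dir/*_mp.txt.gz'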

Send files to other directories using ftp

I am new to FTP configuration. What I am trying to do is as follows:
I am running a shell script on my localhost and downloading some files to my machine. Now I want the downloaded files to be stored in a temporary directory first and then transferred to another directory that I specify. I feel this can be achieved with FTP, and it will be helpful when I host this on a domain, but I am not finding resources from which I can teach myself how to set this up.
OK, having visited many sites, here are some resources you might find handy:
For configuring vsftpd, here's a manual on how to install, configure and use it.
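For orientation, a minimal sketch of the kind of vsftpd.conf settings involved (these are standard vsftpd option names; whether you want each of them depends on your setup, so treat this as an illustration, not a recommended configuration):
# /etc/vsftpd.conf (excerpt)
# no anonymous logins, local system users only
anonymous_enable=NO
local_enable=YES
# allow uploads / FTP write commands
write_enable=YES
# keep users inside their home directory
chroot_local_user=YES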
To receive many files recursively via FTP, you can use wget (extracted from this site):
cd /tmp/ftptransfer
wget --mirror --user=foo --password=bar ftp://ftp.originsite.com/path/to/folder
For sending many files recursively, many people find the only way of doing so is tar-and-send; the only problem is that the files will remain tarred until you go to the other machine (remotely or via ssh) and extract them manually. There is an alternative, not using FTP but using ssh and pipes, which lets you have the files extracted on the target machine (a compressed variant is sketched after the explanation below):
tar -cf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir; tar -xf -"
Explained:
tar is the application to make tar files
-c: create file
-f -: file name is "stdout"
/tmp/ftptransfer: include this folder and all subdirectories in the tar
|: Make a pipe to the next program (connect stdout to stdin)
ssh: Secure Shell program
geek@targetsite: username @ machine name where you want to connect to
"..." command to send to the remote host
cd target_dir: changes the dir of output
tar -xf -: extracts the file received by "stdin"
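As a variant of the command above, you can also compress the stream on its way through the pipe; -z simply gzips on one side and gunzips on the other (same placeholders as above):
tar -czf - /tmp/ftptransfer | ssh geek@targetsite "cd target_dir && tar -xzf -"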
For configuring SSH on Ubuntu, have a look here.
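On Ubuntu that usually amounts to installing the openssh-server package:
sudo apt-get install openssh-server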
If you need more help, don't be afraid to ask! :)

How would I construct a terminal command to download a folder with wget from a Media Temple (gs) server?

I'm trying to download a folder using wget in the Terminal (I'm using a Mac, if that matters) because my FTP client sucks and keeps timing out. It doesn't stay connected for long. So I was wondering if I could use wget to connect via the FTP protocol to the server to download the directory in question. I have searched around on the internet for this and have attempted to write the command, but it keeps failing. So assuming the following:
ftp username is: serveradmin#mydomain.ca
ftp host is: ftp.s12345.gridserver.com
ftp password is: somepassword
I have tried to write the command in the following ways:
wget -r ftp://serveradmin@mydomain.ca:somepassword@s12345.gridserver.com/path/to/desired/folder/
wget -r ftp://serveradmin:somepassword#s12345.gridserver.com/path/to/desired/folder/
When I try the first way I get this error:
Bad port number.
When I try the second way I get a little further but I get this error:
Resolving s12345.gridserver.com... 71.46.226.79
Connecting to s12345.gridserver.com|71.46.226.79|:21... connected.
Logging in as serveradmin ...
Login incorrect.
What could I be doing wrong?
Use scp on the Mac instead; it will probably work much better.
scp -r user@mediatemplehost.net:/folder/path /local/path
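One wrinkle here: if your SSH username itself contains an @ (like serveradmin@mydomain.ca in the question), the user@host form gets ambiguous, so it can be easier to pass the username as an ssh option instead. A sketch with the host from the question (the remote path is just the placeholder you used):
scp -r -o User='serveradmin@mydomain.ca' s12345.gridserver.com:/path/to/desired/folder/ /local/path
For the wget attempts, that same @ in the username is likely what tripped them up; inside a URL it would have to be percent-encoded as %40.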
