How can I FTP many files I have listed in a TXT? - bash

I have a list of files inside a TXT file that I need to upload to my FTP server. Is there any Windows batch file or Linux shell script that will process it?

cat ftp_filelist | xargs --max-lines=1 ncftpput -u user -p pass ftp_host /remote/path
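To preview what that pipeline will run before touching the server, you can prepend echo so xargs prints each ncftpput invocation instead of executing it (the file names below are hypothetical):

```shell
# Dry run: show the exact commands xargs would execute, one file per line
printf '%s\n' a.txt b.txt |
  xargs --max-lines=1 echo ncftpput -u user -p pass ftp_host /remote/path
```

Remove the echo once the output looks right.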

You can use the wput command.
The syntax is something like this:
wput -i [name of the file.txt]
Go through this link
http://wput.sourceforge.net/wput.1.html
It works on Linux. It will upload all the URLs listed in the text file to your FTP server one by one.

You may want to check out Jason Faulkner's batch script, which he wrote about here.
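If neither ncftpput nor wput is available, lftp (where installed) can do the same job. A minimal sketch that first builds the batch of put commands from the list, so you can inspect it before anything is sent; the host, credentials, and remote path are placeholders:

```shell
#!/bin/sh
# Build one "put" command per file listed in ftp_filelist, writing them
# to cmds.txt. The actual transfer line is left commented out on purpose.
while IFS= read -r f; do
  printf 'put -O /remote/path "%s"\n' "$f"
done < ftp_filelist > cmds.txt
# Inspect cmds.txt, then uncomment to run the batch:
# lftp -u user,pass ftp_host < cmds.txt
```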

Related

What would be the best way to download a list of files from a server?

I have a list of SWF files that I want to download to my PC. Is there a quick way to do this from my server, using wget or something similar? Each file is listed on a new line:
http://super.xx-cdn.com/730_silvxxxen/v/v.swf
http://super.xx-cdn.com/730_sixxxxheen/73xxxxversheen.swf
http://super.xx-cdn.com/730_rxxxd/v/v.swf
There are thousands of lines.
If you use SSH (over PuTTY) to access your server, you could easily use WinSCP,
or you could also use pscp from the PuTTY suite.
If you do not have PuTTY installed, get it and set up an SSH connection to your server.
Another easy way to download them is to use an FTP client and download them over FTP.
You can use a simple sh script, if I understand your question correctly:
#!/bin/sh
# Read urls.txt line by line and fetch each URL; quote "$line" so
# URLs containing special characters are passed to wget intact
while IFS= read -r line
do
  wget "$line"
done < "urls.txt"

Download file with different local name using wget

Hi, I have a text file where download links are given like this:
http://www.example.com/10.10.11/abc.jpg
http://www.example.com/10.10.12/abc.jpg
http://www.example.com/10.10.13/abc.jpg
Here 10.10.* is the date of the image.
I need to download all the images using wget, where each image name will be the corresponding date (e.g. 10.10.11.jpg).
P.S. I tried using:
wget -i download.txt
So, any solution?
Thanks in advance
You can instruct wget to create subdirectories based on the URL and then rename the files after the download has finished.
I'd suggest a batch script that downloads the files one by one using the -O option, plus a bit of sed/awk magic to get the names right.
But be careful: with the -O option, you have to call wget once per file.
This should do the trick.
#!/bin/sh
# For each URL, use the name of its parent directory (the date)
# as the local file name
while read -r url; do
  urldir=${url%/*}     # strip the file name
  dir=${urldir##*/}    # keep only the last path component, e.g. 10.10.11
  wget -O "$dir.jpg" "$url"
done < download.txt
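The two parameter expansions in that loop can be checked in isolation; for the example URLs above, they extract the date component like this:

```shell
#!/bin/sh
# ${url%/*} removes the shortest trailing "/..." match (the file name);
# ${urldir##*/} removes the longest leading ".../" match (all but the last segment)
url="http://www.example.com/10.10.11/abc.jpg"
urldir=${url%/*}     # http://www.example.com/10.10.11
dir=${urldir##*/}    # 10.10.11
echo "$dir.jpg"      # prints 10.10.11.jpg
```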
This might work for you:
sed '\|/\([^/]*\)/[^/]*\1[^/.]*.jpg|!d' download.txt | wget -i -
Explanation:
Filter the download.txt file to contain only the files you require, then pass them on to wget.
I have developed a script that does just this: bulkGetter. Super easy to use; you just need an input file with all the links you want to download, and use the "-rb" option (refer to the link).

Rename multiple files using ftp

I have a set of files in my FTP folder. I have access to FTP mode only. I want to rename the files with extension .txt to .done.
Ex:
1.txt, 2.txt, 3.txt
to
1.done, 2.done, 3.done
Only the rename command works on this FTP server. I am expecting something like
rename *.txt *.done
to rename them all in a single command.
In short: You can't.
FTP is very basic and does not support mass renaming. You can either write a small script for it, or download some helper software, such as the one here.
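A small script in that spirit: it turns the list of .txt names from the question into the per-file rename commands that the plain ftp client understands, one line each, which you could then feed to the client with its -s option:

```shell
#!/bin/sh
# Emit one "rename" command per file; ${f%.txt} strips the old extension
printf '%s\n' 1.txt 2.txt 3.txt |
while IFS= read -r f; do
  printf 'rename %s %s\n' "$f" "${f%.txt}.done"
done
```

In practice you would replace the hard-coded names with a directory listing obtained from the server.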
Hello to all,
Even if the question is quite old, I think my suggestion could be useful for others.
I found a great and easy solution combining curlftpfs, "a FTP filesystem based on cURL and FUSE" as they define it, with the rename multi-rename tool for Linux and Unix.
I tested it on Linux Mint 17 (and I think it should work on other Debian-based distributions).
Install curlftpfs:
sudo apt-get install curlftpfs
Create the mount folder:
sudo mkdir /mnt/ftp_remote_root
Mount the remote FTP on the folder:
sudo curlftpfs -o allow_other -o user="USERWITH#CHARACTERTOO:PASSWORDTOACCESSUSER" ftp://my_ftp_server.com /mnt/ftp_remote_root/
Jump into the desired remote FTP folder:
cd /mnt/ftp_remote_root/path/to/folder
Rename the files as you need (-v shows the new names, -n shows the affected files without renaming anything; omit them to actually rename):
sudo rename -v -n 's/match.regexp/replace.regexp/' *.file.to.change
It could take a few seconds because it works over the network.
I think it is really powerful and easy to use.
Let me know if you find any problems.
Bye
Lorenzo
Try something like this; the following example moves/renames files on the FTP server:
for f in $(lftp -u 'username,password' -e 'set ssl:verify-certificate no; ls /TEST/src/*.csv; quit' ftp.acme.com | awk '{print $9;}'); do
  lftp -u 'username,password' -e "set ssl:verify-certificate no; mv /TEST/src/$f /TEST/dst/$f; quit" ftp.acme.com
done
Note: use .netrc to store the username and password.
Use the following command:
ren *.txt *.done

cron job to update a file from ftp

I tried something like:
wget ftp://username:password@ftp.somedomain.com/bla/blabla.txt -x /home/weptile/public_html/bla/blabla.txt
Apparently -x writes the output :) I thought it was overwriting the file I need.
So what I'm trying to do is daily updates of blabla.txt in this specific subdirectory from an external FTP file. I want to get the file from the FTP site and overwrite the old file on my server.
Use wget -N to overwrite existing files.
If you get stuck on stuff like this, try man wget or, heck, even Google.
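Putting it together, a crontab entry along these lines would refresh the file every night; the schedule, path, and credentials here are placeholders, not values from the original post:

```shell
# m h dom mon dow  command -- run daily at 03:00
# -N re-downloads only if the remote copy is newer; -P sets the target directory
0 3 * * * wget -N -P /home/weptile/public_html/bla ftp://username:password@ftp.somedomain.com/bla/blabla.txt
```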

FTP using Batch file

I want to automate the task of uploading a file to the FTP site "uploads.google.com".
How can I achieve this?
Please see here for an example.
One of the examples is as follows:
Prepare a file (say ftp_cmd.txt) with all the FTP commands to upload the files to your specific site,
as below:
binary
cd
mput file.*
bye
Now create a batch file with the following content:
ftp -i -v -s:<command file> <ftp host>
e.g.: ftp -i -v -s:ftp_cmd.txt uploads.google.com
Now, when you execute this batch file, it will put all files matching file.* into the specified directory.
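The same command file can also be generated from a Unix shell with a here-document, which is handy when the remote directory changes per run; the directory name below is a placeholder:

```shell
#!/bin/sh
# Write the ftp command file, then run the client non-interactively.
remote_dir=/incoming
cat > ftp_cmd.txt <<EOF
binary
cd $remote_dir
mput file.*
bye
EOF
# ftp -i -v -s:ftp_cmd.txt uploads.google.com   # -s: is the Windows flag;
# on Unix, use: ftp -inv uploads.google.com < ftp_cmd.txt
```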
