How can I ftp multiple files?

I have two unix servers between which I need to FTP some files.
The directory structure is almost the same on both, with a slight difference, for example:
server a                  server b
miabc/v11_0/a/b/c/*.c     miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h        miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers except for the v11_0 vs. v75_0 part, and the directory structure differs from one module to another.
How can I FTP all the files in every module into the corresponding module on server b, using any scripting language such as awk, Perl, shell, or ksh?

I'd say if you want to go with Perl, you should use Net::FTP.
Once I needed a script that diffs a directory/file structure on an FTP
server against a corresponding directory/file structure on a local hard disk,
which led me to write this script. I don't know whether it is efficient or elegant, but you might find one or another
idea in it.
hth / Rene

Note that you need to use the correct path of the directory to which you want to send the files.
You can write a small script in PHP.
PHP provides good FTP functions, so you can easily FTP your files. Before that, check the FTP settings of your IIS server or FileZilla.
I have used the following PHP code for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to " . $FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
$fp = fopen($local_file, 'rb');                     // handle to the local file you want to send
ftp_fput($conn_id, $remote_file, $fp, FTP_BINARY);  // this is the function that puts the file on the FTP server
fclose($fp);
This code is just for reference and the variables are placeholders; go through the PHP manual before using it.

I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
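In practice, lftp's built-in mirror command may already do most of the recursive work on its own, so Expect might not even be needed. Here is a minimal sketch (not the exact approach above) that reverse-mirrors one module from server a to server b; the user name, password, and paths are placeholders.
# run on server a; mirror -R uploads the local tree to the remote server
lftp -u ftpuser,secret -e "mirror -R /home/me/miabc/v11_0 /miabc/v75_0; quit" ftp://serverb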

If the file system supports symlinking or hardlinking, I would use a simple wget to mirror the FTP servers. When you are wgetting one of them, just hack the directory so that v11_0 points to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing (this should place everything in /project/servera/miabc/v11_0).
server b:
go to /project/serverb
create /project/serverb/miabc/75_0 as a symlink to server a's v11_0 directory:
ln -s /project/servera/miabc/v11_0 /project/serverb/miabc/75_0
wget server b; when wget tries to cwd into 75_0 it will follow the link and find itself in /project/servera/miabc/v11_0.
Don't make the project harder than it needs to be: read the docs on wget and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link instead if your file system supports it.
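Put together, the steps above come down to a few shell commands. This is only a sketch: the host names are placeholders, and it assumes wget's recursive FTP retrieval writes through the symlink.
# mirror server a locally
mkdir -p /project/servera && cd /project/servera
wget -r -nH ftp://servera/miabc/      # ends up in /project/servera/miabc/v11_0/...
# make server b's 75_0 land in the same tree
mkdir -p /project/serverb/miabc && cd /project/serverb
ln -s /project/servera/miabc/v11_0 /project/serverb/miabc/75_0
wget -r -nH ftp://serverb/miabc/      # writes into 75_0, which follows the symlink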

It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
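For example, assuming SSH access to server b is available (the user name and paths below are placeholders), something like this would copy one module:
# push one module from server a to the corresponding path on server b
rsync -av /path/on/a/miabc/v11_0/ user@serverb:/path/on/b/miabc/v75_0/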

I suggest you log in to either server first and go to the appropriate path, e.g. miabc/v75_0/a/b/c/. From there you need to sftp to the other server:
sftp user@servername
Go to the appropriate path from which the files need to be transferred,
then run the command mget *
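The same idea can be scripted with sftp's batch mode, assuming key-based (non-interactive) authentication is set up; the host, user, and paths here are placeholders, and get * does the same job as mget *:
# run on server b, pulling the files from server a
cd /local/miabc/v75_0/a/b/c
sftp -b - user@servera <<'EOF'
cd miabc/v11_0/a/b/c
get *
EOF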

Related

Reading and redirecting a zip file with less

I'm trying to copy a zip file located on a server via an ssh2 library.
The way I'm about to do it is to use the less command and write its output down on the client side:
less -r -L -f zipfile
but the output file ends up bigger than the original.
I know this is not good practice, but I have to do it this way.
So how can I handle this so that I end up with my zip file on the client machine?
Is less a mandatory command for doing that?
You can simply use scp to achieve that, by providing the user and host and then the local directory where you want the file copied to, as in the example below:
scp your_username@remotehost.edu:foobar.txt /some/local/directory

How can I send an HTTPS request from a file?

Let's assume I have a file request.txt that looks like:
GET / HTTP/1.0
Some_header: value
text=blah
I tried:
cat request.txt | openssl s_client -connect server.com:443
Unfortunately it didn't work and I need to manually copy & paste the file contents instead. How can I do it within a script?
cat is not well suited to downloading remote files; it's best used for files local to the system running the script. To download a remote file there are other commands that handle this better.
If your environment has wget installed, you can download the file by URL (see the wget manual for examples). That would look like:
wget https://server.com/request.txt
If your environment has curl installed, you can download the file by URL (see the curl manual for examples). That would look like:
curl -O https://server.com/request.txt
Please note that if you want to store the response in a variable for further modification you can do this as well with a bit more work.
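For instance, assuming curl is available:
# capture the response body in a shell variable for later processing
response=$(curl -s https://server.com/request.txt)
echo "$response" | head -n 1    # e.g. inspect the first line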
Also worth noting: if you really must use cat to fetch a remote file, it is possible, but it typically requires ssh, and I'm not a fan of that method since it needs ssh access to a file that is already publicly available over HTTP/S. I can't think of a practical reason to go about it this way, but for the sake of completeness I wanted to mention that it can be done, though it probably shouldn't be.
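As a rough illustration of that last point, assuming you have ssh access to the host (the user and remote path are placeholders):
# stream the remote file over ssh and write it locally; cat runs on the remote side
ssh user@server.com 'cat /path/to/request.txt' > request.txt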

Bash script for recursive directory listing on FTP server without -R

There are multiple folders with subfolders and image files on the FTP server, and -R is disabled. I need to dump a recursive directory listing, with path names, into a text file. The logic I have so far is: traverse into each folder, check whether an entry name contains a '.' to decide whether it is a file or a folder, and if it is a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to write a function that traverses each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
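For example (the mount point is a placeholder; the host and user come from the script above, and the password would need to be supplied):
# mount the FTP server as a local file system, then let find do the recursion
mkdir -p /mnt/ftp
curlftpfs ftp://uName:password@1.1.1.1 /mnt/ftp
find /mnt/ftp -type f > listing.txt    # recursive listing with full paths
fusermount -u /mnt/ftp                 # unmount when done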
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.

FTP backup script with hard links

Usually I use rsync-based backups.
But now I have to back up a Windows server to Linux,
so there is no rsync, only FTP.
I like the idea of using hard links to save disk space, and incremental backups to minimize traffic.
Is there any similar backup script that works over FTP instead of rsync?
UPDATE:
I need to back up a Windows server through FTP. The backup script runs on a Linux backup server.
SOLUTION:
I found this useful script for backing up through FTP with hard links and incremental support.
Note for Ubuntu users: there is no md5 command in Ubuntu. Use md5sum instead.
# filehash1="$(md5 -q "$curfile"".gz")"
# filehash2="$(md5 -q "$mysqltmpfile")"
filehash1="$(md5sum "$curfile"".gz" | awk '{ print $1 }')"
filehash2="$(md5sum "$mysqltmpfile" | awk '{ print $1 }')"
Edit, since the setup was not clear to me from the original question.
Based on the update to the question, the situation is that you need to pull the data onto the backup server from the Windows system via FTP. In this case you could adapt the script you found yourself (see comment), or use a similar idea, sketched below:
Use cp -lr to clone the previous backup with hard links.
Use lftp's mirror command to overwrite this copy with anything that got updated on the remote system.
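A rough sketch of that two-step idea, assuming lftp is installed; the host, credentials, and backup directories are placeholders:
# clone yesterday's backup using hard links (no extra disk space for unchanged files)
cp -lr /backup/2015-06-01 /backup/2015-06-02
# let lftp pull only the files that changed on the Windows server
lftp -u backupuser,secret -e "mirror --only-newer / /backup/2015-06-02; quit" ftp://windows-server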
But I assumed initially that you needed to push the data from the Windows system to the backup server, i.e. that the FTP server is on the backup system. That case cannot be handled this way (original answer follows):
Since FTP has no notion of links at all, any transfer will only result in new or overwritten files. The only way would be to use the SITE command to issue site-specific commands and deal with hard links that way. But site-specific commands are usually heavily restricted, so that you can do something like change permissions but not do anything with hard links.
And even if you could create hard links with SITE, you would have to implement the logic that decides when to use such links. With rsync this logic is built into the rsync server and executed on the server side. With FTP you have to build all the logic on the client side, which means you would have to download a file, compare it with a local file, and then decide whether you need to upload the new file or whether a hard link to an existing file could be used.

Bash: How to recursively ftp a certain file type under multiple directories

I am trying to recursively get files of a certain type from an FTP server using Bash. The trouble is that I need to retrieve them from multiple directories. Many of these directory names contain a matching string; the client should find all directories matching that string and get files of a certain type from each. Each directory can have many files with that extension.
What I have tried is to use wget recursively:
wget -r "ftp://anonymous:@$HOST/$PATH/$DIRSTRING*/*.$FILEEXT"
This gives me an error message saying the $PATH/$DIRSTRING*/ file or directory could not be found.
I know wget supports globbing, but is it possible to have two wildcards? If not, is there another way of going about this problem?
Best Regards
wget is not really suited to this kind of FTP usage, but lftp is very good at mirroring FTP site data, and it even supports globbing! :)
For your example:
lftp -e "mirror -I '$DIRSTRING*/*$FILEEXT' /$RPATH mirrorSite" ftp://anonymous@$HOST
see man lftp / mirror command
