Compare FTP files with MD5 - ftp

I want to use a batch script to upload files to FTP with the mput command. But I do not want to upload a file that has already been uploaded before, so I need the MD5 of each file to compare the new ones with the existing ones.
What commands do I need?

First, make sure your remote server supports checksum calculation at all. Many do not. There is not even a standard FTP command to calculate the checksum of a remote file; there have been many proposals, and there are many proprietary solutions.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
Some of the commands that can be used to calculate an MD5 checksum are XMD5, MD5, and HASH.
You can test that with WinSCP, which supports all of the commands mentioned above. Test its checksum calculation function or the checksum scripting command. If they work, enable logging and check what command and what syntax WinSCP uses against your server.
Then you can execute that command in your command-line FTP client. You didn't tell us what client you are using. In the common Windows or *nix command-line ftp client, you can use the quote command to execute an arbitrary FTP protocol command, like:
quote MD5 filename
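If the server does answer one of those commands, a rough sketch for the stock *nix command-line client looks like this (ftp.example.com, the credentials and XMD5 are placeholders/assumptions; your server may want MD5 or HASH instead):
ftp -n ftp.example.com <<'EOF'
user myuser mypassword
quote XMD5 index.html
bye
EOF
On Windows, put the same commands (without the here-document) in a text file and run ftp -s:commands.txt.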
If you are on Windows, you can use WinSCP scripting. As in the GUI, WinSCP will figure out for you what command to use. You simply use the checksum command:
checksum md5 index.html
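For unattended use, the same check can be scripted from the command line; a minimal sketch, with host, credentials and file name as placeholders:
winscp.com /ini=nul /command "open ftp://user:password@ftp.example.com/" "checksum md5 index.html" "exit"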
(I'm the author of WinSCP)
Though note that you are better off using SHA-1; MD5 is not to be trusted anymore.

Related

Transfer file in Secure Shell

I use Secure Shell as a Chrome extension (https://chrome.google.com/webstore/detail/secure-shell/pnhechapfaindjhompbnflcldabbghjo?hl=da)
Now I have finished some programming and I need the file on my computer.
How can I transfer the file to my computer? Can I send it by email or something?
I have tried yanking all the lines in vim, but I still don't get it copied to my Windows clipboard.
One entertaining (and moderately ridiculous) approach would be sprunge.
Run this on the remote machine:
cat myFile | curl -F 'sprunge=<-' http://sprunge.us
And then visit the URL it prints on your local machine. :D
I presume that you are using Windows and trying to download your file from a Linux-like OS.
You can use MobaXterm, which comes with file transfer features.
http://mobaxterm.mobatek.net
On a CLI you can use scp to download and upload; a sketch follows below.
Another option is FileZilla, using the SFTP protocol.
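For the scp route, a minimal sketch run on the Windows machine (assuming a local OpenSSH client; host and paths are placeholders):
scp user@remote.example.com:/home/user/myfile C:\Users\me\Downloads\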

How can I send an HTTPS request from a file?

Let's assume I have a file request.txt that looks like:
GET / HTTP/1.0
Some_header: value
text=blah
I tried:
cat request.txt | openssl s_client -connect server.com:443
Unfortunately it didn't work and I need to manually copy & paste the file contents. How can I do it within a script?
cat is not ideally suited to downloading remote files; it is best used for files local to the file system running the script. To download a remote file, there are other commands that handle this better.
If your environment has wget installed, you can download the file by URL. That would look like:
wget https://server.com/request.txt
If your environment has curl installed, you can download the file by URL. That would look like:
curl -O https://server.com/request.txt
Note that if you want to store the response in a variable for further processing, you can do that as well with a bit more work.
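For example, a minimal sketch of capturing the body in a shell variable (same placeholder URL as above):
response=$(curl -s https://server.com/request.txt)
printf '%s\n' "$response"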
Also worth noting: if you really must use cat to fetch a remote file, it is possible, but it may require ssh, and I'm not a fan of that method, as it requires ssh access to a file that is already publicly available over HTTP/S. I can't think of a practical reason to go about it this way, but for the sake of completeness I wanted to mention that it can be done, though it probably shouldn't be.

MIME type check via SFTP connection

I want to list images by SFTP and save this list, so another script may further process it.
Unfortunately, there are also many other files there, so I need to identify which ones are images. I am filtering out everything with the wrong file extension, but I would like to go a step further and also check the content of the file.
Downloading everything to check it with file --mime-type on the local machine is too slow. Is there a way to check the MIME type of a file on the remote SFTP server before downloading it?
We found a way: download only the first 64 bytes. It is a lot faster than downloading the whole file, but still enough to see whether it looks like an image:
curl "sftp://sftp.example.com/path/to/file.png" -u login:pass -o img.tmp -r 0-64
file --mime-type img.tmp
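To produce the list the question asks for, the same idea can go into a small loop; a hedged sketch, with host, credentials and file names as placeholders:
# fetch only the first bytes of each candidate and keep those whose content looks like an image
for f in photo1.png photo2.jpg unknown.dat; do
  curl -s "sftp://sftp.example.com/path/to/$f" -u login:pass -o head.tmp -r 0-64
  if file --mime-type -b head.tmp | grep -q '^image/'; then
    echo "$f" >> image_list.txt
  fi
done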
MIME type is supported by SFTP version 6 and newer only.
Most SFTP clients and servers, including the most widespread one, OpenSSH, support SFTP version 3 only.
Even the servers that I know of that support SFTP version 6, like Bitvise or ProFTPD mod_sftp, do not support the "MIME type" attribute.
So while in theory it is possible to determine the MIME type of remote files over SFTP, in practice you won't be able to do it.
You can run any command remotely using ssh:
ssh <destination> '<command_to_run>'
In this case that would be something like:
ssh <remote_machine_name_or_ip_address> 'file --mime-type ./*'
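Building on that, a minimal sketch that builds the image list remotely in one go (assuming shell access on the SFTP host; user, host and path are placeholders):
ssh user@sftp.example.com 'file --mime-type /remote/path/*' | grep ': image/' | cut -d: -f1 > image_list.txt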

Windows Command Line FTP to deploy website

Trying to set up a post-build script on my CI server to push changes to our web server by FTP. In as few lines as possible, how can I push a folder of files to my web server using Windows FTP? For example, the deployment folder is:
c:\deployment\*.*
How can I recursively push all the files to replace those on the web server?
I'm open to using cmd or PowerShell (MS Windows only).
Thanks
Windows' built-in command-line FTP client doesn't have recursion built in. The easiest way is to use a different FTP client; NcFTP will do what you're looking for. See the manual page for ncftpput. The syntax is basically as follows:
cd c:\deployment
ncftpput -u user -p pass -R ftp.ftpserver.com /path/on/ftp/server .\*
Or if your web server also runs an ssh service, then rsync would be even better.
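A hedged rsync sketch, assuming an rsync client such as cwRsync on the build machine and SSH access on the web server (paths and names are placeholders):
# -a preserves permissions/timestamps, -z compresses, --delete removes files that no longer exist locally
rsync -avz --delete /cygdrive/c/deployment/ deploy@webserver.example.com:/var/www/site/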
Fsync is good; I have been using it for a long time. It pushes only what has changed, recurses of course, can exclude files, and tracks client-side (much faster) what has changed. Its only big drawback: no SFTP.

How can I ftp multiple files?

I have two Unix servers between which I need to FTP some files.
The directory structure is almost the same, except for a slight difference, like:
server a                  server b
miabc/v11_0/a/b/c/*.c     miabc/v75_0/a/b/c/
miabc/v11_0/xy/*.h        miabc/v11_0/xy/
There are many modules:
miabc
mfabc
The directory structure inside them is the same on both servers except for the 11_0 and 75_0 parts, and the directory structure differs between modules.
How can I FTP all the files in every module into the corresponding module on server b using any scripting language, such as awk, Perl, shell, or ksh, over FTP?
I'd say if you want to go with Perl, you have to use Net::FTP.
Once I needed a script that diffs a directory/file structure on an FTP server against a corresponding directory/file structure on a local hard disk, which led me to write a script for it. I don't know if it is efficient or elegant, but you might find one or another idea in it.
hth / Rene
You need to use the correct path of the directory where you want to send the files.
You can create a small script with PHP.
PHP provides good FTP functions, so you can easily FTP your files with it. But before that, check the FTP settings of your IIS server or FileZilla.
I have used the following PHP code for sending files over FTP:
$conn_id = ftp_connect($FTP_HOST) or die("Couldn't connect to " . $FTP_HOST);
$login_result = ftp_login($conn_id, $FTP_USER, $FTP_PW);
ftp_fput($conn_id, $remote_file, $file_handle, $mode); // this is the function that puts a file on the FTP server; $file_handle is an open local file handle
This code is just for reference; go through the PHP manual before using it.
I'd use a combination of Expect, lftp and a recursive function to walk the directory structure.
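A hedged sketch of the lftp part, which already recurses on its own via mirror -R (host, credentials and paths are placeholders):
# mirror -R uploads the local tree to the remote directory recursively
lftp -u user,password ftp.serverb.example.com -e "mirror -R /local/miabc/v11_0 /miabc/v75_0; quit"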
If the file system supports symlinking or hardlinking, I would use a simple wget to mirror the FTP server. On one of them, when you're wgetting, just hack the directory v11_0 to point to 75_0; wget won't know the difference.
server a:
go to /project/servera
wget the whole thing. (this should place them all in /project/servera/miabc/v11_0)
server b:
go to /project/serverb
create a directory /project/serverb/miabc/75_0 and link it to /project/servera/v11_0:
ln -s /project/serverb/miabc/75_0 /project/servera/v11_0
wget server b; when wget tries to cwd into 75_0, the link is followed and it finds itself in /project/servera/v11_0
Don't make the project harder than it needs to be: read the docs on wget, and ln. If wget doesn't follow symbolic links, file a bug report, and use a hard link if your FS supports it.
It sounds like you really want rsync instead. I'd try to avoid any programming in solving this problem.
I suggest you log in to either server first and go to the appropriate path, e.g. miabc/v75_0/a/b/c/. From there, sftp to the other server:
sftp user@servername
Go to the appropriate path whose files need to be transferred and run the command:
mget *
