Generating an MD5 for flute software - shell

Hi, I have a piece of software called MAD FLUTE. It sends files from a server to several clients using multicast, and it uses FEC. The problem I have is that when I want to send a list of files in the form of an FDT_TSI.xml, it adds the MD5 of each file so the receiver can check that the files are correct.
Since I'm building the FDT file with a script, I haven't found a way to generate the MD5 in the form the software uses, e.g.
Content-MD5="c+kuA8jR7esYd1k1PZgVJw==", even if I use openssl md5 to generate it.
What can I do?
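That value is 24 Base64 characters, i.e. the length of a Base64-encoded 16-byte MD5 digest, so MAD FLUTE appears to follow the usual Content-MD5 convention (Base64 of the raw binary digest, per RFC 1864) rather than the hex string openssl prints by default. A minimal shell sketch, assuming that convention (the file name is just an example):

# Hex digest (what "openssl md5" prints) - not what the FDT expects
openssl dgst -md5 myfile.bin
# Base64 of the raw 16-byte digest - matches Content-MD5="...=="
openssl dgst -md5 -binary myfile.bin | base64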

Related

How to download a CSV from an HTTPS URL to a file using Pentaho Data Integration - Spoon (Kettle)?

Googling suggests this question has been asked, and partially (and poorly) answered, a number of times, mostly for older versions.
Question: How can I download a CSV to a local file, with the constraints below? I'm designing in Spoon.
URL: Will always be the same: https://example.com/data/my.csv. The website prepares the CSV and provides it back to the web client as a file download after about 4-5 seconds. In a browser this means it is downloaded as a .csv file, not displayed.
Authentication: The website does not require authentication for access. The data isn't sensitive.
Local file path: The downloaded CSV will overwrite the existing CSV, e.g. d:\data\my.csv. That is, I can set this on a timer and have it download the newest CSV every hour or so.
Proxy: It is quite likely I will need to traverse a network proxy, e.g. badproxy.mynetwork.internal:8080, and that proxy requires a username and password. It would be far better if I could set this password in a single location so anything created in the future can reference it. I'm not really sure how to approach this either.
The rest of my process focuses on handling the content of the CSV, and already works fine.
The processes I've found on Google use the HTTP Client step, though it's not particularly straightforward how this translates into a file being saved locally in a known location.
Thanks for any pointers.
PDI v9.0.0.0-423
The HTTP Client step needs to be triggered. Use a Row Generator step generating e.g. one empty row, and link it with a hop to the HTTP Client step.
For your solution, try this:
Data Grid -> HTTP Client -> CSV File Input -> Text File Output (with a .csv extension)
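If it helps to see the underlying HTTP call outside of Spoon, the same fetch can be reproduced with curl and run from a Shell job entry; this is a sketch of that swapped-in approach, with the URL, proxy, and credentials taken as placeholders from the question:

# Fetch the CSV through the authenticated proxy and overwrite the local copy.
curl --fail --silent --show-error \
  --proxy "http://badproxy.mynetwork.internal:8080" \
  --proxy-user "proxyuser:proxypass" \
  --output "d:/data/my.csv" \
  "https://example.com/data/my.csv"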

Calculate file checksum in FTP server using Apache FtpClient

I am using the FtpClient of Apache Commons Net to upload videos to an FTP server. To check whether a file has really been transferred successfully, I want to calculate the checksum of the remote file, but unfortunately I found there is no related API I could use.
My question is: is there a need to calculate a file checksum on the FTP server at all? If the answer is yes, how do I get the checksum in FtpClient?
If the answer is no, how does FtpClient know that the file has really been transferred successfully and completely?
With FTP, I'd recommend verifying the upload, if possible.
The problem is that there's no widespread standard API for calculating a checksum over FTP.
There have been many proposals for a checksum-calculation command for FTP. None have been accepted yet.
The latest proposal is:
https://datatracker.ietf.org/doc/html/draft-bryan-ftpext-hash-02
As a consequence, different FTP servers support different checksum commands, with different syntaxes: HASH, XSHA1, XSHA256, XSHA512, XMD5, MD5, and XCRC, to name some. You need to check what your FTP server supports, if it supports anything at all.
You can test this with WinSCP, which supports all the previously mentioned commands. Test its checksum calculation function or its checksum scripting command. If they work, enable logging and check what command and syntax WinSCP uses against your server.
> 2015-04-28 09:19:16.558 XSHA1 /test/file.dat
< 2015-04-28 09:19:22.778 213 a98faefdb2c36ca352a2d9b01668aec6b641cf4b
Then execute the command using Apache Commons Net sendCommand method:
import org.apache.commons.net.ftp.FTPReply;

// Send the server-specific checksum command and collect the raw reply lines
if (FTPReply.isPositiveCompletion(ftpClient.sendCommand("XSHA1", "filename")))
{
    String[] reply = ftpClient.getReplyStrings();
}
(I'm the author of WinSCP)
If your server does not support any of the checksum commands, you do not have many options:
- Download the file back and check it locally (see the sketch below).
- When using encryption (TLS/SSL), the chances of the file being corrupted during transfer are significantly lower. The receiving party (the server in this case) would otherwise fail to decrypt the data. So if you are sure that the file transfer completed (no decryption errors, and the size of the uploaded file is the same as the size of the original local file), you can be pretty sure that the uploaded file is correct.
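A minimal sketch of the download-back check from a shell, assuming the server is reachable with plain FTP (the host, credentials, and file names are hypothetical):

# Upload, download back to a temporary copy, and compare local MD5s.
curl -T video.mp4 "ftp://user:pass@ftp.example.com/incoming/video.mp4"
curl -o /tmp/video.check "ftp://user:pass@ftp.example.com/incoming/video.mp4"
if [ "$(md5sum < video.mp4)" = "$(md5sum < /tmp/video.check)" ]; then
  echo "upload verified"
else
  echo "checksum mismatch" >&2
fi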
Just an addition on how I implemented this. When dealing with standard FTP servers without any additional modules loaded for checksum checking, all I did was create a list of MD5 hashes of each file in an SFV-style file. Say it's called uploads.sfv (in the same format an SFV generator would produce). This allows you to do further checksum checks.
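A minimal sketch of building such a manifest before upload; note this uses the md5sum line format rather than true SFV/CRC32, matching the MD5 hashes this answer describes (the file names are hypothetical):

# Write an MD5 line for every file in the upload set, then ship the
# manifest alongside the files so the other side can re-check them.
md5sum *.mp4 > uploads.sfv
md5sum -c uploads.sfv   # re-verify files against the manifest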
Examples of server-side checksum-checking support:
PZS-ng for cuftpd, glftpd
mod_digest for ProFTPD
Of course, as @MartinPrikryl highlighted, none of these are standardized.
That's a long shot, but if the server supports PHP, you can exploit that.
Save the following as a PHP file (say, check.php) in the same folder as your name_of_file.txt file:
<?php
echo md5_file('name_of_file.txt');
?>
Then visit the page check.php, and you should get the MD5 hash of your file.
Related questions:
FTP: copy, check integrity and delete
How to perform checksums during a SFTP file transfer for data integrity?
https://serverfault.com/q/98597/401691

How to verify if upload is finished in SFTP [duplicate]

This question already has answers here:
How to confirm SFTP file delivery?
(3 answers)
Closed 1 year ago.
I'm uploading a file over SFTP to a destination server using bash scripts.
How can I be sure that the uploaded file is complete, in case sftp does not report anything or the network connection breaks?
I see that I can get the size of the file before uploading it and then compare it with the size of the file on the server.
Perhaps you can mention other, better options?
Thank you.
I think getting the size is a good option.
What I could imagine:
Client side:
- Put the size of the file and its MD5 in a file, like ".fileinfo"
- Send the .fileinfo file to the server
- Send the (interesting) file to the server
Server side:
- Periodically check the files in a folder (with the "watch ls" command, for example)
- If a ".fileinfo" file exists, read it and check whether the size matches an existing file of the same name (without the ".fileinfo" suffix). If the size matches, run md5sum on the file and check whether that matches too. If yes, move the file into your final destination folder and delete the ".fileinfo" file. If not, reiterate.
Many sites that offer software downloads provide both the software and its checksum.
We can use the same technique to check the file we upload: upload the file together with its checksum, and on the server side compare the file's checksum with the uploaded checksum.
If the two don't match, you will know that:
- the uploaded file is corrupted, or
- the uploaded checksum is corrupted, or
- both the checksum and the file are corrupted.
Test the exit code of sftp. If it returns 0, you can be pretty sure that everything is OK (assuming you are using OpenSSH sftp). This works only when you use the -b switch (which I assume you are doing).
The SFTP protocol allows checksum calculation, but I suppose you are stuck with OpenSSH (on either or both sides), which does not support this.
To be 100% sure, you can download the file back and compare it with the original.
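A sketch of the exit-code test in batch mode (the host and file names are hypothetical; with -b, OpenSSH sftp aborts and exits non-zero as soon as a batched command fails):

# Run the upload as a batch and branch on sftp's exit code.
printf 'put video.mp4 /incoming/video.mp4\n' > upload.batch
if sftp -b upload.batch user@host; then
  echo "upload completed"
else
  echo "upload failed" >&2
fi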

What's the best way to (programmatically) determine a file's network origin?

For an application I'm writing, I want to programmatically find out which computer on the network a file came from. How can I best accomplish this?
Do I need to monitor network transactions, or is this data stored somewhere in Windows?
When a file is copied to the local system, Windows does not keep any record of where it was copied from. So unless the application that created it saved such information in the file, it will be lost.
With file auditing, file and directory operations can be tracked, but I don't think that will include the source path for file copies (just who created the file and when).
Yes, it seems you would either need to detect the file transfer by intercepting network traffic, or, if you have the ability to alter the files in some way, use public-key cryptography to sign them with a machine-specific key before they are transferred.
Create a service on either the destination computer or on the file-hosting computers that adds records to an Alternate Data Stream attached to each file, much the way Windows uses the Zone.Identifier stream for files downloaded from the internet.
You could have a background process on machine A that "tags" each file as having been tagged by machine A on such-and-such a date and time. Then when machine B downloads the file, assuming both use NTFS filesystems, it can see the tag from A. Or, if you can't run a process on the server, you can write the NTFS streams on the "client" side, using the packet-sniffing methods others have described. The bonus here is that future file copies will retain the data as long as they stay between NTFS systems.
Alternative: require that all file transfers go through a web portal (as opposed to network drag-and-drop), which gives you built-in logging. Or some other kind of file-retrieval proxy. Do you have control over procedures like this?

Encrypted FTP Storage

I guess this is kind of a programming question, because I'm going to write a program if this doesn't exist.
So I found a very cheap web host (I don't really care about the actual web hosting). They will give me a domain name and an FTP server with a ton of storage space. Anyway, I want to back up a few hundred gigs of data (mostly family photos and scans of important documents). I also want to back up any future family photos/documents. I don't care if everything on my local NAS dies in a fire; I just want the photos and important documents backed up off-site.
So I want a program that lets me select folders locally and schedules them to be backed up to the FTP server. I'm a bit of a security nut, so I'd like the files to be encrypted locally before being transferred up to the server.
I know I can do this with TrueCrypt volumes, but I don't want to transfer an entire encrypted volume blob up to the server every time I change a file in it. I could use multiple TrueCrypt volumes, but that would be a pain to manage.
Also, this must be Mac/Linux compatible, although I'll primarily be on Linux.
I basically need rsync + TrueCrypt + cron + sftp all rolled into one cryptographically secure program.
I've been searching for days with no luck. Any ideas?
MozyBackup does this, though it doesn't use FTP; it has a custom uploader.
P.S. Remember that a typical home ADSL connection only does about 1 Gb/day upstream.
Linux options:
An out-of-the-box option is probably duplicity (for example, see http://www.howtoforge.com/creating-encrypted-ftp-backups-with-duplicity-and-ftplicity-on-debian-lenny).
Otherwise, if these are basically rarely-changed archive copies of files, I would roll my own: gnupg (or dpad) per-file encryption, a file-changed script, and ftp or rsync.
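For the duplicity route, a minimal sketch (the GnuPG key ID, paths, and host here are hypothetical; duplicity encrypts locally with GnuPG before anything leaves the machine and only uploads changed data on subsequent runs):

# Password for the FTP target is read from the environment by duplicity.
export FTP_PASSWORD='secret'
# Incremental, GnuPG-encrypted backup of the photo folder to the FTP host.
duplicity --encrypt-key ABCD1234 /home/me/photos ftp://user@backuphost/photos
# Restoring works in the opposite direction.
duplicity restore ftp://user@backuphost/photos /home/me/photos-restored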
