Automatically uploading files into AWS Linux from Windows - amazon-ec2

Short Version: I need code that automatically uploads a file to an AWS Linux server.
Longer Version: There is a piece of software called MinKNOW which does DNA sequencing, and a single sequencing run takes hours to complete. As output it creates a file in FASTQ format, and the output file keeps growing while MinKNOW is sequencing. I want to write code that uploads these files to my AWS Linux server automatically every so often.
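A minimal sketch of the kind of uploader described above, assuming the EC2 instance accepts SSH key authentication and that the paramiko package is installed on the Windows machine; the host name, key path, directories and interval are all placeholder assumptions.

```python
import time
from pathlib import Path

import paramiko  # pip install paramiko

# All of these values are placeholders for illustration.
HOST = "ec2-xx-xx-xx-xx.compute-1.amazonaws.com"
USER = "ec2-user"
KEY_FILE = r"C:\keys\my-ec2-key.pem"
LOCAL_DIR = Path(r"C:\MinKNOW\output")
REMOTE_DIR = "/home/ec2-user/fastq"
INTERVAL_SECONDS = 15 * 60  # re-upload every 15 minutes


def upload_fastq_files():
    """Copy every .fastq file in LOCAL_DIR to the server over SFTP."""
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(HOST, username=USER, key_filename=KEY_FILE)
    try:
        sftp = ssh.open_sftp()
        for local_file in LOCAL_DIR.glob("*.fastq"):
            remote_path = f"{REMOTE_DIR}/{local_file.name}"
            # sftp.put overwrites the remote file, so a file that has grown
            # since the last run is simply uploaded again in full.
            sftp.put(str(local_file), remote_path)
        sftp.close()
    finally:
        ssh.close()


if __name__ == "__main__":
    while True:
        upload_fastq_files()
        time.sleep(INTERVAL_SECONDS)
```

Instead of the sleep loop, the same script could be triggered by Windows Task Scheduler; uploading to S3 with boto3 and pulling the files onto the instance from there would be another option.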

Related

How can I use Packer to build an image in Azure with a large file?

I want to build an image in Azure on Windows 2019 with a large file. The file contains the artifacts I need to create my image. All was going well getting the VM up and running until I tried to copy a 200 MB file to the host, which was slated to take 11 hours. That's when I found out about WinRM and its file-copy limitations.
I then found out about the http_directory parameter to copy a file to the host over http, but that is not available with the azure-arm builder.
I then saw that I can use SSH. The issue is I cannot start with the ssh provisioner and switch to the WinRM provisioner to run my ps1 configuration script. I read I can start with ssh, take an image and then start it up with WinRM. Not ideal.
I came across this software: "https://github.com/winfsp/sshfs-win" - but I am unaware of a way for a provisioned host to directly access the packer host it was provisioned by.
I came across winrmcp but that seems old and not likely to work if I could ever figure out how to get it going.
I am leaning towards blob storage, but that seems like a cop-out.
What is the best way to start a Windows 2019 Server, copy a 200MB file to it and run a ps1 script?
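For the blob-storage option the question is leaning towards, here is a rough sketch of the upload half using the azure-storage-blob package; the connection string, container and file names are assumptions, and the idea is that the ps1 provisioner on the Windows 2019 host downloads the blob instead of receiving 200 MB over WinRM.

```python
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

# Placeholder values; in practice the connection string would come from an
# environment variable or a Key Vault secret, not source code.
CONNECTION_STRING = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
CONTAINER = "packer-artifacts"
LOCAL_FILE = "artifacts.zip"   # the ~200 MB artifact bundle
BLOB_NAME = "artifacts.zip"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
blob = service.get_blob_client(container=CONTAINER, blob=BLOB_NAME)

with open(LOCAL_FILE, "rb") as data:
    # Uploading once before the Packer build avoids pushing 200 MB through
    # WinRM; the ps1 provisioner can download it from the container instead.
    blob.upload_blob(data, overwrite=True)

print(f"Uploaded {LOCAL_FILE} to container '{CONTAINER}' as '{BLOB_NAME}'")
```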

When uploading new files to an FTP server, how to prevent re-upload of files that were deleted on the server in the meantime

I need to automate the upload of some files from client PCs to a central server. We're building central statistics for an online gaming community, processing game replay files.
Target is my own small VPS running Ubuntu.
Upload file size is 2-3 MB.
20-40 different clients running Windows, spread around the globe.
I expect ~6 GB of wanted data to be uploaded over the course of 7 weeks (a season in our game) and 5-10x that amount of "unwanted" data.
The files are processed on the server, and after that they're not required anymore and ought to be deleted so we don't eventually run out of disk space. I also only need some of the files, but since they require very complex processing, including decryption, I can only determine which ones after the server has processed them.
My initial idea was to use a scriptable client such as WinSCP, and use some Windows scheduler entry to automate it. WinSCP documentation looks very nice. I am a bit hesitant because I see the following problems:
after deletion on the server, how do I prevent re-upload?
ease of setup for technical novices
reliability of the solution
I was thinking maybe someone has done the same before and can give some advice.
There's an article on the WinSCP site that deals with all this:
How do I transfer new/modified files only?
For advanced logic like yours, it uses a PowerShell script with the WinSCP .NET assembly.
In particular, there is a section that you will be interested in: Remembering the last timestamp. It shows how to remember the timestamp of the last uploaded file, so that the next time you transfer only newer files, even if the previously uploaded files are no longer on the server.
The example is for downloads with Session.GetFiles, but with small changes it will work for uploads with Session.PutFiles too.
It also points to another article: Remember already downloaded files so they are not downloaded again, which shows another method: store the names of already transferred files in a file and use it the next time to decide which files are new.
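The linked examples are PowerShell scripts built on the WinSCP .NET assembly; purely to illustrate the "remember the last timestamp" idea, here is the same logic sketched in Python with a placeholder upload step (the paths, the state file and the .rep extension are assumptions).

```python
import json
from pathlib import Path

# Assumed locations, for illustration only.
WATCH_DIR = Path(r"C:\game\replays")
STATE_FILE = Path(r"C:\game\upload_state.json")


def load_last_timestamp() -> float:
    """Return the modification time of the newest file uploaded so far."""
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["last_mtime"]
    return 0.0


def save_last_timestamp(mtime: float) -> None:
    STATE_FILE.write_text(json.dumps({"last_mtime": mtime}))


def upload(path: Path) -> None:
    # Placeholder: in the WinSCP solution this is Session.PutFiles;
    # any SFTP/FTP upload call would go here.
    print(f"uploading {path}")


def upload_new_files() -> None:
    last_mtime = load_last_timestamp()
    newest = last_mtime
    # Only files modified after the remembered timestamp are uploaded, so
    # files already processed and deleted on the server are not sent again.
    for path in sorted(WATCH_DIR.glob("*.rep"), key=lambda p: p.stat().st_mtime):
        mtime = path.stat().st_mtime
        if mtime > last_mtime:
            upload(path)
            newest = max(newest, mtime)
    save_last_timestamp(newest)


if __name__ == "__main__":
    upload_new_files()
```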

Macintosh: Converting RAW files to DNG using a shell script

A while back I built a simple droplet app that took raw files and moved them from a desktop to a server location and did a DB update. I've now received a request from the photo dept. to have it also convert raw files to dng format.
The app is built in Xcode but just runs an AppleScript. I've used shell scripts for updating databases and cURL for getting metadata. I don't want to, and really can't, pop open an app (like PS CS5) to do the conversion; is there a shell script out there that'll take care of this for me?
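One hedged option is to drive Adobe's free DNG Converter from a script, assuming it is installed in /Applications and that the installed version accepts the -c (compressed) and -d (output directory) switches; verify the flags against your version before relying on them. A Python sketch of that loop follows, and the same call maps directly to a one-line shell invocation that the AppleScript could run; all paths and the RAW extension are placeholders.

```python
import subprocess
from pathlib import Path

# Assumptions: Adobe DNG Converter is installed in /Applications and supports
# the -c (lossless compressed) and -d (output directory) command-line switches.
CONVERTER = "/Applications/Adobe DNG Converter.app/Contents/MacOS/Adobe DNG Converter"
SOURCE_DIR = Path("/Volumes/photos/incoming")  # placeholder
OUTPUT_DIR = Path("/Volumes/photos/dng")       # placeholder

OUTPUT_DIR.mkdir(parents=True, exist_ok=True)

# Adjust the glob pattern to your camera's RAW extension.
for raw_file in SOURCE_DIR.glob("*.CR2"):
    subprocess.run(
        [CONVERTER, "-c", "-d", str(OUTPUT_DIR), str(raw_file)],
        check=True,  # raise if the converter reports a failure
    )
    print(f"converted {raw_file.name}")
```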

Methods to transfer files from a Windows server to a Linux server

I need to periodically transfer webserver-log-like files from Windows production servers in the US to Linux servers here in India. The files are ~4 MB in size each and I get about 1 file per minute. I can tolerate about a 5-minute lag between the files being written on Windows and them being available on the Linux machines. I am a bit torn between the various options here, as I am quite inexperienced with this kind of design:
I am thinking of writing a service in C#.NET which will periodically archive, compress and send the files over to the Linux machines. These files are pretty compressible; WinRAR can turn 32 MB of them into a 1.2 MB archive, so that should solve the network transfer speed issue. But then how exactly do I transfer the files to Linux? I could mount the Linux drive on the Windows server using Samba, or should I create an FTP server, or send the file serialized as a POST request? Which one would be good? Also, I have to minimize the load on the Windows server.
Mount the Windows drive on Linux instead. I could use the mount command or I could use Samba here (what are the pros and cons of these two?). I can then write the compressing and copying part on Linux itself.
I don't trust the internet connection to be very stable, so there should be a good retry mechanism and failure protection too. What are the potential gotchas in these situations, and what other points should I be worried about?
Thanks,
Hari
RAR is bad. Stick to 7zip or bzip2. Transfer it using ssh, probably with rsync since it can be link-failure-tolerant.
WinSCP can help you transfer files from Windows to Linux in batch with a script. Then configure Windows Task Scheduler to run the script periodically.
I learnt from this post step by step: https://techglimpse.com/batch-script-automate-file-transfer-winscp/
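The answers point at 7zip/bzip2 plus rsync-over-SSH or a scheduled WinSCP script; purely as an illustration of the compress-then-copy-with-retry idea, here is a Python sketch using bz2 and paramiko, where the host, key, paths and retry policy are placeholder assumptions.

```python
import bz2
import shutil
import time
from pathlib import Path

import paramiko  # pip install paramiko

# Placeholder connection details and paths.
HOST, USER, KEY_FILE = "linux-server.example.in", "loguser", r"C:\keys\id_rsa"
LOG_DIR = Path(r"C:\logs\pending")
REMOTE_DIR = "/data/incoming"
MAX_RETRIES = 5


def compress(path: Path) -> Path:
    """bzip2-compress a log file next to the original (the answers suggest
    bzip2/7zip rather than RAR)."""
    compressed = path.with_name(path.name + ".bz2")
    with open(path, "rb") as src, bz2.open(compressed, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return compressed


def upload_with_retry(path: Path) -> None:
    """Upload one file over SFTP, retrying when the link drops."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            ssh = paramiko.SSHClient()
            ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            ssh.connect(HOST, username=USER, key_filename=KEY_FILE)
            try:
                sftp = ssh.open_sftp()
                sftp.put(str(path), f"{REMOTE_DIR}/{path.name}")
                sftp.close()
                return
            finally:
                ssh.close()
        except (OSError, paramiko.SSHException):
            if attempt == MAX_RETRIES:
                raise
            time.sleep(30 * attempt)  # simple backoff before retrying


if __name__ == "__main__":
    for log_file in LOG_DIR.glob("*.log"):
        upload_with_retry(compress(log_file))
```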

Are there any FTP programs which can automatically send the contents of a folder to a remote server?

Are there any FTP programs which can automatically copy (or rather 'move') the contents of a folder to a remote server? I have of course googled this, but only really found one or two ancient products which look really clunky and unmaintained. I was wondering if there's a way to do this from the command line, or any better solution to the base problem.
In more detail, new files get written to a folder every few hours. These new files need to be FTP'd elsewhere and then deleted. Mirroring or synchronisation systems are probably out of the picture, as we need to delete the source files once they've been successfully transferred.
If it's easier, the 'solution' could pull the files off the server (rather than the server pushing them to the client). The computers will both be Windows OS.
You could use any off-the-shelf FTP program that supports the command line and schedule a task in Windows Task Scheduler to run every 10 minutes: check the folder and move any files to the FTP site.
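A rough sketch of that "check the folder, push each file, then delete it" step using Python's standard ftplib; the host, credentials and folders are placeholders, and Windows Task Scheduler could run the script every 10 minutes.

```python
import ftplib
from pathlib import Path

# Placeholder connection details and folders.
FTP_HOST = "ftp.example.com"
FTP_USER = "uploader"
FTP_PASS = "secret"
WATCH_DIR = Path(r"C:\outgoing")
REMOTE_DIR = "incoming"


def push_and_delete() -> None:
    """Upload every file in WATCH_DIR, deleting each one only after its
    transfer completes, so nothing is lost if the connection drops."""
    with ftplib.FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        ftp.cwd(REMOTE_DIR)
        for path in WATCH_DIR.iterdir():
            if not path.is_file():
                continue
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {path.name}", fh)
            path.unlink()  # remove the source file once it has been sent


if __name__ == "__main__":
    push_and_delete()
```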
In the end I used a program called FTP Auto Sync: http://ftp-auto-sync.com/
