Streaming a dynamic zip from Amazon S3 - ruby

I am looking for a way to dynamically stream download a zip of files from Amazon S3.
The application is hosted on EC2 and the files are stored on S3.
I need to give users the ability to select from a group of files, which will then get bundled up and downloaded to them.
I have heard about a few ActionScript libraries (aszip and fzip) that might make this possible, or it could be done in Ruby, or even possibly PHP.
The files do not need any compression; zip is just being used to bundle the files into one single download.

I use the Nginx Zip Module (mod_zip) to stream local files, and there is also an option to stream from remote locations. Otherwise you could use it with an S3 bucket mounted as a local filesystem via VFS (e.g. s3fs).
It supports seeking, so downloads are resumable and can be accelerated.
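For context on how the module works (this is my reading of the mod_zip docs, so double-check them): your upstream application replies with an X-Archive-Files: zip header and a plain-text manifest, one line per file, giving the CRC-32 (or "-" if unknown), the size in bytes, the URL-encoded location nginx should fetch, and the name inside the archive. nginx then fetches each location as a subrequest and streams the assembled zip. The paths below are placeholders:

1034ab38 428 /files/report.pdf docs/report.pdf
- 100339 /files/photo.jpg photos/photo.jpg

Supplying real CRC-32 values rather than "-" is what enables range requests (resumable downloads), and the locations can point at an internal nginx location that proxies to S3, which covers the remote case.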

If you can use Mono, DotNetZip will do it.
Response.Clear();
Response.BufferOutput = false;  // necessary for chunked output
String ReadmeText = "This content goes into an entry in the " +
    "zip file. Timestamp, MD5, whatever.";
string archiveName = String.Format("archive-{0}.zip",
    DateTime.Now.ToString("yyyy-MMM-dd-HHmmss"));
Response.ContentType = "application/zip";
// "attachment" prompts a download dialog instead of inline display
Response.AddHeader("content-disposition", "attachment; filename=" + archiveName);
using (ZipFile zip = new ZipFile())
{
    // an in-memory string becomes one entry; the selected files go
    // under a "files" folder inside the archive
    zip.AddEntry("Readme.txt", "", ReadmeText, Encoding.Default);
    zip.AddFiles(filesToInclude, "files");
    // Save writes straight to the response stream, so the zip is
    // streamed to the client as it is built
    zip.Save(Response.OutputStream);
}
HttpContext.Current.ApplicationInstance.CompleteRequest();
DotNetZip is open source, free to use.

Java supports streaming zips too. Take a look at the java.util.zip package. I used it to implement a pipeline consisting of FTP, UNZIP, XSLT, and CSV units. It works like a charm.
Martin
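As a minimal sketch of that approach (class and method names here are illustrative): java.util.zip writing straight to any OutputStream, such as a servlet response, with compression turned off since the zip is only being used as a bundle.

import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.zip.Deflater;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipStreamer {
    // Streams a zip of the given files to out without ever buffering
    // the whole archive in memory.
    public static void streamZip(List<Path> files, OutputStream out) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(out)) {
            zos.setLevel(Deflater.NO_COMPRESSION); // bundle only, no compression
            for (Path file : files) {
                zos.putNextEntry(new ZipEntry(file.getFileName().toString()));
                Files.copy(file, zos);  // copy this file's bytes into the entry
                zos.closeEntry();
            }
        } // closing the stream writes the zip central directory
    }
}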

Related

Flutter web: Download large files by reading a stream from the server?

There are already several articles about starting downloads from flutter web.
I link this answer as an example:
https://stackoverflow.com/a/64075629/15537341
The procedure is always similar: request something from the server, maybe convert the body bytes to base64, and then use the AnchorElement to start the download.
It works perfectly for small files. Let's say 30MB, no problem.
But the whole file has to be loaded into the browser first, and only then does the user start the download.
What to do if the file is 10GB?
Is there a way to read a stream from the server and write a stream to the user's download? Or is another way preferable, such as copying the file to a special folder that is served directly by the webserver?

How can I use named pipes to stream a GCP Cloud Storage object to an executable that wants input files?

I have a third-party executable that takes a directory path as an argument and in turn looks there for a collection of .db files. I have said collection of files stored in a Google Cloud Storage bucket and would like to stream the content of those files into some local named pipes that can be used as input to the executable.
I'm writing an application to perform the above in Go and am using the "cloud.google.com/go/storage" package to work with cloud storage objects.
As a note, I need all pipes/files to be available for reading at the time I run the executable.
What is the best way to go about this? I'm looking to essentially use the named pipes as a proxy of sorts, to make remote files look local to this executable. Is this possible?
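It should be possible. A rough sketch of the pattern (shown in Java with the google-cloud-storage client; the same shape translates directly to Go's cloud.google.com/go/storage, and the bucket, object names, and paths below are placeholders): create the FIFOs first so they all exist when the executable starts, then feed each one from its own thread, because opening a FIFO for writing blocks until the reader opens it.

import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class PipeFeeder {
    public static void main(String[] args) throws Exception {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        String bucket = "my-db-bucket";           // placeholder bucket
        String[] objects = {"one.db", "two.db"};  // placeholder objects
        Path dir = Paths.get("/tmp/dbs");
        Files.createDirectories(dir);

        for (String name : objects) {
            Path fifo = dir.resolve(name);
            // create the named pipe (POSIX systems only)
            new ProcessBuilder("mkfifo", fifo.toString()).start().waitFor();
            // each writer blocks until the executable opens its pipe,
            // so it must run on its own thread
            new Thread(() -> {
                try (OutputStream out = Files.newOutputStream(fifo)) {
                    Blob blob = storage.get(bucket, name);
                    blob.downloadTo(out); // stream object bytes into the pipe
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }).start();
        }

        // run the third-party executable against the directory of pipes
        new ProcessBuilder("/path/to/third-party-exe", dir.toString())
                .inheritIO().start().waitFor();
    }
}

One caveat: FIFOs are not seekable, so this only works if the executable reads each .db file sequentially and exactly once; if it seeks or re-opens files, you will need real temporary files instead.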

How could I watch the recorded live streams using wowza engine?

We wanted to let our clients review the live streams we made. We checked the option ‘Record all live streams’ in the Wowza Engine Manager. We know that the streams are being saved inside the Wowza content folder, but since our engine is located in an EC2 instance we could find no easy way for our clients to watch them other than downloading them through the console.
Can the manager be configured to show the videos there like it is on Wowza Streaming Cloud?
In my case I set up a webserver (Apache 2) on the same machine, listening on port 8080 (Wowza uses 80 for HLS streaming), then set a symbolic link from /var/www/html/content to {Wowza installation folder}/content. This way the users can reach the recordings at http://yourserver.com:8080/content
By default Apache will list all files in the folder; if a file is .mp4 the browser will play the video, and if it is .flv it will be downloaded.
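For reference, that setup amounts to something like this (assuming a default Wowza install under /usr/local/WowzaStreamingEngine; adjust the paths to your machine):

# in /etc/apache2/ports.conf - move Apache off port 80, which Wowza uses
Listen 8080

# expose the Wowza recordings through the Apache web root
sudo ln -s /usr/local/WowzaStreamingEngine/content /var/www/html/content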
If it's an option for you, you can move your recordings to S3. You should first mount an S3 bucket in your filesystem (s3fs), then configure the ModuleMediaWriterFileMover module to move the recorded files to the mount dir.
A better approach:
Move the files to an S3 bucket as soon as they are ready.
Wowza actually has a module for this (of course it does, everybody needs it)
https://www.wowza.com/forums/content.php?813-How-to-upload-recorded-media-to-an-Amazon-S3-bucket-(ModuleS3Upload)
So, as you do with every other module:
1- include the module's files in the lib folder
2- go to the Engine Manager UI and add the module
3- set your keys and bucket in the manager properties (example properties below)
Restart and done. It works like a charm, and no files are uploaded before they are ready.
Note: Be careful, because unless you name each stream with a timestamp like I do, Amazon will overwrite a file when one with the same name is uploaded.
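For step 3, the configuration looks roughly like this. The property names below are what I recall from the ModuleS3Upload README, so treat them as assumptions and verify against the documentation for your module version:

<!-- in Application.xml, under <Properties> (names per ModuleS3Upload docs; verify) -->
<Property>
    <Name>s3UploadAccessKey</Name>
    <Value>YOUR_ACCESS_KEY</Value>
</Property>
<Property>
    <Name>s3UploadSecretKey</Name>
    <Value>YOUR_SECRET_KEY</Value>
</Property>
<Property>
    <Name>s3UploadBucketName</Name>
    <Value>my-recordings-bucket</Value>
</Property>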

Spring API to unzip files

I know Spring has the MultipartFile component.
I am wondering if there is any API to unzip files, or to read zip files and do some processing.
I have a zip file that follows a certain format:
photos\
audio\
report.xml
When the user uploads it via the web, I wish to scan the zip file and do some processing.
Is there a solution for this issue?
I do not know of any such API in Spring itself, but you can use other APIs to zip or unzip files:
1) http://commons.apache.org/compress/
2) java.util.zip
and also see
What is a good Java library to zip/unzip files?
There are a couple of Java SE APIs for reading ZIP files:
java.util.zip.ZipInputStream - gives you a one-pass reader
java.util.zip.ZipFile - gives you a reader that allows you to read the entries and the files in any order.
You should be able to use one or the other of these, depending on the nature of your processing.
If the processing requires the images to be in actual files, you would have to create the directories and write the files yourself. In this case, it would probably be simpler to use an external command to do the ZIP extraction.
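A minimal one-pass sketch with ZipInputStream, assuming the layout from the question and a stream such as MultipartFile.getInputStream() (zip entry names conventionally use forward slashes):

import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class UploadScanner {
    // Scans an uploaded zip in one pass without extracting to disk.
    public static void scan(InputStream upload) throws IOException {
        try (ZipInputStream zis = new ZipInputStream(upload)) {
            ZipEntry entry;
            while ((entry = zis.getNextEntry()) != null) {
                if (entry.isDirectory()) {
                    continue;
                }
                String name = entry.getName();
                if (name.equals("report.xml")) {
                    // parse the report by reading from zis (don't close zis)
                } else if (name.startsWith("photos/")) {
                    // process a photo
                } else if (name.startsWith("audio/")) {
                    // process an audio file
                }
                zis.closeEntry();
            }
        }
    }
}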

Unzip the .GZ file in worker Process of Azure

Can anyone give me an idea of how to implement unzipping a file in .gz format in a worker process? If I write the unzipping code, where do I store the unzipped file (i.e. one text file)? Will it be placed somewhere in Azure, and how do I specify a path in the Windows Azure worker process, such as the current executing directory? If this approach doesn't work, do I need to create another blob to store the text file unzipped from the .gz?
-mahens
In your worker role, it is up to you how the .gz file arrives (e.g. downloaded from Azure Blob storage); once the file is available you can use GZipStream to compress or uncompress it. You can also find a code sample with Compress and Decompress functions in the GZipStream documentation.
This SO discussion shares a few tools and code to explain how you can unzip .GZ using C#:
Unzipping a .gz file using C#
When you use the decompress/compress code in a worker role, you can store the output directly in local storage (as suggested by JcFx) or use a MemoryStream and store it directly in Azure Blob storage.
The following SO article shows how you can use GZipStream to write unzipped content into a MemoryStream and then use the UploadFromStream() API to store it directly in Azure Blob storage:
How do I use GZipStream with System.IO.MemoryStream?
If you don't need to do anything with the unzipped file, storing it directly in Azure Blob storage is best; however, if you have to do something with the unzipped content, you can save it locally and then store it back to Azure Blob storage for further use.
This example, using SharpZipLib, extracts a .gzip file to a stream. From there, you could write it to Azure local storage, or to blob storage:
http://wiki.sharpdevelop.net/GZip-and-Tar-Samples.ashx
