I have many small SVG files (around 200 KB each) in Azure Blob Storage and I'm trying to download a few hundred at a time. For this I'm using something like this:
Parallel.ForEach(files, file =>
{
var stream = fileManager.DownloadFile(file);
});
MemoryStream DownloadFile(string fileName)
{
var ms = new MemoryStream();
GetFileReference(fileName).DownloadToStream(ms);
ms.Seek(0L, SeekOrigin.Begin);
return ms;
}
private static IFile GetFileReference(string fileName)
{
return Storage.GetContainerReference(ApplicationScope.ConfigManager.GetStorageConnectionString(),
ApplicationScope.ConfigManager.GetsContainerName()).GetFileReference(fileName);
}
Could you please tell me whether this is the best way to access those files, given that my main concern here is speed?
Would you suggest storing the files in some other Azure data store, and why?
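If speed is the main concern, one thing to note is that DownloadToStream blocks a thread-pool thread per file, so for a few hundred small blobs an asynchronous fan-out with a concurrency cap usually scales better. Below is a minimal sketch, under the assumption that the file reference also exposes DownloadToStreamAsync (the async counterpart in the classic storage SDK); DownloadFileAsync, DownloadAllAsync and the cap of 16 are illustrative names and values, not part of the original code.
// Requires System.Linq, System.Threading and System.Threading.Tasks.
async Task<MemoryStream> DownloadFileAsync(string fileName)
{
// Same as DownloadFile, but without blocking a thread while the bytes arrive.
// Assumes the file reference exposes DownloadToStreamAsync.
var ms = new MemoryStream();
await GetFileReference(fileName).DownloadToStreamAsync(ms);
ms.Seek(0L, SeekOrigin.Begin);
return ms;
}
async Task<MemoryStream[]> DownloadAllAsync(IEnumerable<string> files, int maxParallel = 16)
{
// Cap the number of concurrent downloads so the storage account is not flooded.
var throttle = new SemaphoreSlim(maxParallel);
var tasks = files.Select(async file =>
{
await throttle.WaitAsync();
try { return await DownloadFileAsync(file); }
finally { throttle.Release(); }
});
return await Task.WhenAll(tasks);
}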
Related
I created a Blazor Server app that allows end users to upload large Excel files that are consumed in downstream logic.
I use the standard .NET 5 InputFile component to upload the Excel file to the app. Within the app, I read the stream asynchronously, copy it into a MemoryStream, and then use ExcelDataReader to convert it into a DataSet.
The challenge I see is that the upload takes a long time, specifically when the app is deployed to Azure. To dig a bit deeper into what exactly was consuming the time, I track the progress of the stream copy operation.
The following code handles my upload:
private async Task OnInputFileChange(InputFileChangeEventArgs e)
{
this.StateHasChanged();
IReadOnlyList<IBrowserFile> selectedFiles;
selectedFiles = e.GetMultipleFiles();
foreach (var file in selectedFiles)
{
DataSet ds = new DataSet();
{
bool filesuccesfullRead = false;
var timer = new Timer(new TimerCallback(_ =>
{
if (fileTemplateData.uploadProgressInfo.percentage <= 100)
{
// Note that the following line is necessary because otherwise
// Blazor would not recognize the state change and not refresh the UI
InvokeAsync(() =>
{
StateHasChanged();
});
}
}), null, 1000, 1000);
// allowing a 100MB file at once
using (Stream stream = file.OpenReadStream(104857600))
using (MemoryStream ms = new MemoryStream())
{
fileTemplateData.uploadProgressInfo = new GlobalDataClass.CopyProgressInfo();
await ExtensionsGeneric.CopyToAsync(stream, ms, 128000, fileTemplateData.uploadProgressInfo);
System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance);
try
{
using (var reader = ExcelReaderFactory.CreateReader(ms))
{
ds = reader.AsDataSet(new ExcelDataSetConfiguration()
{
ConfigureDataTable = _ => new ExcelDataTableConfiguration()
{
UseHeaderRow = false
}
});
filesuccesfullRead = true;
}
}
catch (Exception ex)
{
Message = "Unable to read provided file(s) with exception " + ex.ToString();
}
timer.Dispose();
}
}
ds.Dispose();
ds = null;
}
fileTemplateData.fileloading = false;
this.StateHasChanged();
}
Here is the CopyToAsync function, which is the same as a regular stream copy but provides progress tracking:
public static async Task CopyToAsync(this Stream fromStream, Stream destination, int bufferSize, GlobalDataClass.CopyProgressInfo progressInfo)
{
var buffer = new byte[bufferSize];
int count;
progressInfo.TotalLengthinBytes = fromStream.Length;
while ((count = await fromStream.ReadAsync(buffer, 0, buffer.Length)) != 0)
{
progressInfo.BytesTransfered += count;
progressInfo.percentage = Math.Round((((double)progressInfo.BytesTransfered /(double) progressInfo.TotalLengthinBytes) * 100), 1);
await destination.WriteAsync(buffer, 0, count);
}
}
public class CopyProgressInfo
{
public long BytesTransfered { get; set; }
public long TotalLengthinBytes { get; set; }
public double percentage { get; set; }
public DateTime LastProgressUpdateVisualized = new DateTime();
}
Now let me put the question:
Using this code, I achieve a fair upload speed when the app is running on localhost (a 75 MB file with tonnes of data uploads in around 18 seconds). When the app is deployed to an Azure App Service plan, the same file takes more than 10 minutes to upload, which makes me feel something is seriously wrong. Using progress tracking, I was able to confirm that the time is being consumed by the CopyToAsync function and not the logic after it.
Here's what I have investigated:
I checked my internet upload speed on two separate connections, both with a stable upload bandwidth of more than 25 Mbps, so this is not the issue.
I upgraded the App Service plan to a higher tier momentarily to see if upload bandwidth was somehow linked to the Azure App Service plan tier; even increasing it to a powerful P3V2 tier made no difference.
To see if the specific datacenter where my App Service sits was offering poor upload performance from my part of the world, I checked the average upload speed using https://www.azurespeed.com/Azure/UploadLargeFile, and a 75 MB file uploaded in around 38 seconds to the Azure West Europe datacenter. So I do not see that connectivity is the problem here.
With all that is mentioned above, what could be causing such poor file upload speed when uploading a file to a deployed Blazor Server web app?
I don't see such a performance impact. I upload to Azure Blob Storage, though.
My implementation summary:
A Razor component called imageUpload.razor that contains
public async Task HandleFileSelected(InputFileChangeEventArgs e)
and calls a service like:
await hService.UploadImgToAzureAsync(imageFile.OpenReadStream(), fileName);
A service that contains the following:
public async Task UploadImgToAzureAsync(Stream fileStream, string fileName)
{
await ImageHelper.UploadImageToStorage(fileStream, fileName);
}
ImageHelper calls AzureStorage.cs
AzureStorage.cs, which handles the call to UploadFromStreamAsync
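The AzureStorage.cs part isn't shown here; a minimal sketch of what that upload path typically looks like with the classic storage SDK (Microsoft.WindowsAzure.Storage) is below. The connection string, container name and method name are placeholders, not the actual implementation.
public static async Task UploadImageToStorageAsync(Stream fileStream, string fileName)
{
// Placeholder connection string and container name.
var account = CloudStorageAccount.Parse("CONNECTION_STRING_HERE");
var container = account.CreateCloudBlobClient().GetContainerReference("images");
var blob = container.GetBlockBlobReference(fileName);
// Streams the content straight to blob storage without buffering it all in memory.
await blob.UploadFromStreamAsync(fileStream);
}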
I finally managed to improve the upload performance. Unfortunately, Blazor's built-in InputFile component doesn't seem to be designed very well for large file uploads, especially when the app has been deployed. I used Tewr's upload file component with a larger buffer size (128000), and that significantly improved performance (roughly a 3x reduction in upload time). Tewr's sample code is available here:
https://github.com/Tewr/BlazorFileReader/blob/master/src/Demo/Blazor.FileReader.Demo.Common/IndexCommon.razor
I'm using the Google Drive API's export method to retrieve a Google Doc as a PDF: https://developers.google.com/drive/v3/reference/files/export
I'm having the following problem: for documents bigger than a certain size (I don't know the exact threshold, but it happens even with relatively small files of around 1.5 MB) the API returns a 200 response code with a blank result (normally it should contain the PDF data as a byte stream).
I can successfully export the file via the Google Drive/Google Docs UI with the "File -> Download as... -> PDF" command, although it takes a bit of time.
Here is the file used for the test (1,180 KB, exported from a Google Doc); I shared it so you can access it and try the export:
https://docs.google.com/document/d/18Cz7kHfEiDLeTWHyyoOi6U4kFQDMeg0D-CCJzILMMCk/edit?usp=sharing
Here is the (Java) code I'm using to perform the operation:
@Override
public GoogleDriveDocumentContent downloadFileContentAsPDF(String executionGoogleUser, String fileId) {
GoogleDriveDocumentContent documentContent = new GoogleDriveDocumentContent();
String conversionMimeType = "application/pdf";
try {
getLogger().info("GDrive APIs - Downloading file content in PDF format ...");
InputStream gDriveFileData = getDriveService(executionGoogleUser).files()
.export(fileId, conversionMimeType)
.executeMediaAsInputStream();
getLogger().info("GDrive APIs - File content as PDF format downloaded.");
documentContent.setFileName(null);
documentContent.setMimeType(conversionMimeType);
documentContent.setData(gDriveFileData);
} catch (IOException e) {
throw new RuntimeException(e);
}
return documentContent;
}
Does anyone have the same issue and know how to solve it?
The goal is to generate a pdf from a Google Doc.
Thanks
I think you should try using the media downloader; you will have to alter it for Google Drive rather than the storage service.
{
// Create the service using the client credentials.
var storageService = new StorageService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = "APP_NAME_HERE"
});
// Get the client request object for the bucket and desired object.
var getRequest = storageService.Objects.Get("BUCKET_HERE", "OBJECT_HERE");
using (var fileStream = new System.IO.FileStream(
"FILE_PATH_HERE",
System.IO.FileMode.Create,
System.IO.FileAccess.Write))
{
// Add a handler which will be notified on progress changes.
// It will notify on each chunk download and when the
// download is completed or failed.
getRequest.MediaDownloader.ProgressChanged += Download_ProgressChanged;
getRequest.Download(fileStream);
}
}
static void Download_ProgressChanged(IDownloadProgress progress)
{
Console.WriteLine(progress.Status + " " + progress.BytesDownloaded);
}
Code ripped from here
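Adapted to the question's scenario, the same pattern against the Drive service would look roughly like the sketch below (C#, assuming the Google.Apis.Drive.v3 package; the service construction, file id and output path are placeholders, and it is worth verifying that ExportRequest exposes MediaDownloader in the client library version you use).
// Sketch: stream the PDF export in chunks with progress callbacks,
// instead of buffering the whole response in memory.
var driveService = new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = "APP_NAME_HERE"
});
var exportRequest = driveService.Files.Export("FILE_ID_HERE", "application/pdf");
exportRequest.MediaDownloader.ProgressChanged += Download_ProgressChanged;
using (var fileStream = new System.IO.FileStream(
"FILE_PATH_HERE", System.IO.FileMode.Create, System.IO.FileAccess.Write))
{
exportRequest.Download(fileStream);
}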
My requirement is to use Web API to send a zip file (consisting of a bunch of files) across the network, without writing it anywhere locally (neither on the server nor on the client disk). For zipping, I am using DotNetZip - Ionic.Zip.dll.
Code at Server:
public async Task<IHttpActionResult> GenerateZip(Dictionary<string, StringBuilder> fileList)
{
// fileList is actually a dictionary of “FileName”,”FileContent”
byte[] data;
using (ZipFile zip = new ZipFile())
{
foreach (var item in fileList.ToArray())
{
zip.AddEntry(item.Key, item.Value.ToString());
}
using (MemoryStream ms = new MemoryStream())
{
zip.Save(ms);
data = ms.ToArray();
}
}
var result = new HttpResponseMessage(HttpStatusCode.OK);
MemoryStream streams = new MemoryStream(data);
//, 0, data.Length-1, true, false);
streams.Position = 0;
//Encoding UTFEncode = new UTF8Encoding();
//string res = UTFEncode.GetString(data);
//result.Content = new StringContent(res, Encoding.UTF8, "application/zip");
result.Content = new StreamContent(streams);
result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/zip");
//result.Content.Headers.ContentLength = data.Length;
result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
result.Content.Headers.ContentDisposition.FileName = "test.zip";
return this.Ok(result);
}
The issue I am facing is that the zip file downloaded at the client end, when renamed to test.bin, has its stream contents (the byte[] data in this example) missing. (I am getting back a test.zip file. When I rename the file locally from test.zip to test.bin, I see that the file's contents are as shown below; it does not contain the Response.Content values. P.S. I have also tried the MIME type "application/octet-stream". No luck!)
Test.zip aka test.bin’s contents:
{"version":{"major":1,"minor":1,"build":-1,"revision":-1,"majorRevision":-1,"minorRevision":-1},
"content":{"headers":[{"key":"Content-Type","value":["application/zip"]},
{"key":"Content-Disposition","value":["attachment; filename=test.zip"]}]},
"statusCode":200,"reasonPhrase":"OK","headers":[],"isSuccessStatusCode":true}
Can someone please help me with how to set result.Content from a MemoryStream object? (I have seen examples using a FileStream elsewhere to set result.Content, but I want to use a MemoryStream object only!) I am highlighting this because I think the problem lies in assigning the MemoryStream to result.Content (i.e. in properly getting the stream's contents into the result.Content object).
P.S. I have also gone through Uploading/Downloading Byte Arrays with AngularJS and ASP.NET Web API (and a bunch of other links) but it did not help me much… :(
Any help is greatly appreciated. Thanks a lot in advance :)
I got my issue solved!
All I did was change the response type to HttpResponseMessage and use "return result;" in the last line rather than Ok(result) (i.e. HttpResponseMessage rather than OkNegotiatedContentResult).
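For anyone hitting the same thing, a minimal sketch of the corrected action is below; the zip-building part is kept as in the question.
public HttpResponseMessage GenerateZip(Dictionary<string, StringBuilder> fileList)
{
byte[] data;
using (ZipFile zip = new ZipFile())
{
foreach (var item in fileList)
{
zip.AddEntry(item.Key, item.Value.ToString());
}
using (MemoryStream ms = new MemoryStream())
{
zip.Save(ms);
data = ms.ToArray();
}
}
var result = new HttpResponseMessage(HttpStatusCode.OK);
result.Content = new StreamContent(new MemoryStream(data));
result.Content.Headers.ContentType = new MediaTypeHeaderValue("application/zip");
result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
{
FileName = "test.zip"
};
// Returning the HttpResponseMessage itself (rather than Ok(result)) is what makes
// the zip bytes the response body instead of a serialized HttpResponseMessage.
return result;
}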
I'm using RadControls (charts and GridView) to develop an application, and I need to export each control to an image. I have tried converting each control to bytes and sending them to a web service, then converting them back to images, but sending the byte data to the service sometimes throws an error. Is there any other way to convert each control to an image? I have tried another way, like:
Stream fileStream = File.OpenRead(@"\\HARAVEER-PC\TempImages\FlashPivot.png");
//PART 2 - use the stream to write the file output.
productChart.ExportToImage(fileStream, new Telerik.Windows.Media.Imaging.PngBitmapEncoder());
fileStream.Close();
It throws an error like "cannot access the folder TempImages". I have given sharing permissions to Everyone, but it still can't access the folder.
Any solution is much appreciated.
private BitmapImage CreateChartImages()
{
Guid photoID = System.Guid.NewGuid();
string photolocation = @"D:\Temp\" + photoID.ToString() + ".jpg";
BitmapImage bi = new BitmapImage(new Uri(photolocation, UriKind.Absolute));
using (MemoryStream ms = new MemoryStream())
{
radChart.ExportToImage(ms, new PngBitmapEncoder());
ms.Seek(0, SeekOrigin.Begin); // rewind so SetSource reads from the start of the stream
bi.SetSource(ms);
}
return bi;
}
I was trying to save multiple images into isolated storage by creating an imageFolder in isolated storage and storing all my images inside, but I get an error, so could anyone please help me solve it or suggest another way? If possible I would appreciate it if you could show me code that works. My code runs inside a button event handler. Thanks. The error is: Operation not permitted on IsolatedStorageFileStream.
My Code :
private void SaveToLocalStorage(string imageFolder, string imageFileName)
{
imageFileName = name.Text;
MessageBox.Show(imageFileName);
var isf = IsolatedStorageFile.GetUserStoreForApplication();
if (isf.DirectoryExists(imageFolder))
{
isf.CreateDirectory(imageFolder);
}
string filePath = Path.Combine(imageFolder, imageFileName);
MessageBox.Show(filePath);
using (var stream = isf.CreateFile(filePath))
{
var bmp= new WriteableBitmap(inkCanvas, inkCanvas.RenderTransform);
bmp.SaveJpeg(stream, bmp.PixelWidth, bmp.PixelHeight, 0, 100);
}
}
First off, you probably want to be creating the directory if it DOESN'T exist, not if it does:
if (!isf.DirectoryExists(imageFolder))
{
isf.CreateDirectory(imageFolder);
}
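Folding that check back into the method from the question, the save routine would look something like this (everything else is unchanged from the original code):
private void SaveToLocalStorage(string imageFolder, string imageFileName)
{
var isf = IsolatedStorageFile.GetUserStoreForApplication();
// Create the folder only when it does not exist yet.
if (!isf.DirectoryExists(imageFolder))
{
isf.CreateDirectory(imageFolder);
}
string filePath = Path.Combine(imageFolder, imageFileName);
using (var stream = isf.CreateFile(filePath))
{
var bmp = new WriteableBitmap(inkCanvas, inkCanvas.RenderTransform);
bmp.SaveJpeg(stream, bmp.PixelWidth, bmp.PixelHeight, 0, 100);
}
}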