I have created an application using a "Windows Azure cloud service" where I am uploading/downloading Azure blob files like this.
Upload Code
public ActionResult UploadImage_post(HttpPostedFileBase fileBase)
{
if (fileBase.ContentLength > 0)
{
Microsoft.WindowsAzure.StorageClient.CloudBlobContainer blobContainer =
_myBlobStorageService.GetCloudBlobContainer();
Microsoft.WindowsAzure.StorageClient.CloudBlob blob =
blobContainer.GetBlobReference(fileBase.FileName);
blob.UploadFromStream(fileBase.InputStream);
}
return RedirectToAction("UploadImage");
}
Download Code
Microsoft.WindowsAzure.StorageClient.CloudBlobContainer blobContainer =
_myBlobStorageService.GetCloudBlobContainer();
Microsoft.WindowsAzure.StorageClient.CloudBlob blob =
blobContainer.GetBlobReference(filename);
return Redirect(blob.Uri.AbsoluteUri);
Here is my view
@foreach (var item in Model)
{
    <img src="@item" width="200" height="100" />
    Download
}
The situation is, I am setting a download limit for users, so I need to get the file size before downloading so that I can check whether the "Download Limit" has been reached.
You can see here that there is a "Data Left" field. Before downloading another file, I need to check if
Downloaded File size > Data Left
and if so, the download should not proceed.
Please suggest.
You would have to check the size of the blob before deciding if there is enough bandwidth for the download.
Microsoft.WindowsAzure.StorageClient.CloudBlob blob =
    blobContainer.GetBlobReference(filename);
blob.FetchAttributes(); // populates blob.Properties without downloading the blob's content
var blobSize = blob.Properties.Length; // size in bytes
// compare this against the user's remaining quota before issuing the download redirect
Spoiler: FetchAttributes actually makes a request against the blob, so I would assume a transaction charge on every hit, not to mention the added latency of waiting the request out. I would suggest you record the file size at upload time and save it to your database or other persistent storage.
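For example, a minimal sketch of recording the size at upload time, reusing the upload action from the question (the _myDb.SaveFileSize call is a hypothetical placeholder for whatever persistence you use):
public ActionResult UploadImage_post(HttpPostedFileBase fileBase)
{
    if (fileBase.ContentLength > 0)
    {
        Microsoft.WindowsAzure.StorageClient.CloudBlobContainer blobContainer =
            _myBlobStorageService.GetCloudBlobContainer();
        Microsoft.WindowsAzure.StorageClient.CloudBlob blob =
            blobContainer.GetBlobReference(fileBase.FileName);
        blob.UploadFromStream(fileBase.InputStream);
        // the size is already known here, so no extra storage transaction is needed later
        _myDb.SaveFileSize(fileBase.FileName, fileBase.ContentLength); // hypothetical persistence call
    }
    return RedirectToAction("UploadImage");
}
That way the quota check becomes a plain database lookup instead of a round trip to blob storage.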
Related
I created a Blazor Server app that allows end users to upload large Excel files that are consumed by downstream logic.
I use the standard .NET 5 InputFile component to upload the Excel file to the app; within the app, I read the stream asynchronously, copy it into a MemoryStream, and then use ExcelDataReader to convert it into a DataSet.
The challenge I see is that the upload takes a long time, specifically when the app is deployed to Azure. To dig a bit deeper into what exactly was consuming the time, I tracked the progress of the stream copy operation.
The following code handles my upload:
private async Task OnInputFileChange(InputFileChangeEventArgs e)
{
this.StateHasChanged();
IReadOnlyList<IBrowserFile> selectedFiles;
selectedFiles = e.GetMultipleFiles();
foreach (var file in selectedFiles)
{
DataSet ds = new DataSet();
{
bool filesuccesfullRead = false;
// refresh the progress UI once per second while the copy runs
var timer = new Timer(new TimerCallback(_ =>
{
if (fileTemplateData.uploadProgressInfo.percentage <= 100)
{
// Note that the following line is necessary because otherwise
// Blazor would not recognize the state change and not refresh the UI
InvokeAsync(() =>
{
StateHasChanged();
});
}
}), null, 1000, 1000);
using (Stream stream = file.OpenReadStream(104857600)) // allowing a 100MB file at once
using (MemoryStream ms = new MemoryStream())
{
fileTemplateData.uploadProgressInfo = new GlobalDataClass.CopyProgressInfo();
await ExtensionsGeneric.CopyToAsync(stream, ms, 128000, fileTemplateData.uploadProgressInfo);
System.Text.Encoding.RegisterProvider(System.Text.CodePagesEncodingProvider.Instance);
try
{
using (var reader = ExcelReaderFactory.CreateReader(ms))
{
ds = reader.AsDataSet(new ExcelDataSetConfiguration()
{
ConfigureDataTable = _ => new ExcelDataTableConfiguration()
{
UseHeaderRow = false
}
});
filesuccesfullRead = true;
}
}
catch (Exception ex)
{
Message = "Unable to read provided file(s) with exception " + ex.ToString();
}
stream.Close();
ms.Close();
}
}
ds.Dispose();
ds = null;
}
fileTemplateData.fileloading = false;
this.StateHasChanged();
}
Here is the CopyToAsync function, which is the same as a regular stream copy but provides progress tracking:
public static async Task CopyToAsync(this Stream fromStream, Stream destination, int bufferSize, GlobalDataClass.CopyProgressInfo progressInfo)
{
var buffer = new byte[bufferSize];
int count;
progressInfo.TotalLengthinBytes = fromStream.Length;
while ((count = await fromStream.ReadAsync(buffer, 0, buffer.Length)) != 0)
{
progressInfo.BytesTransfered += count;
progressInfo.percentage = Math.Round(((double)progressInfo.BytesTransfered / (double)progressInfo.TotalLengthinBytes) * 100, 1);
await destination.WriteAsync(buffer, 0, count);
}
}
public class CopyProgressInfo
{
public long BytesTransfered { get; set; }
public long TotalLengthinBytes { get; set; }
public double percentage { get; set; }
public DateTime LastProgressUpdateVisualized = new DateTime();
}
Now let me put the question:
Using this code, I achieve a fair upload speed when the app is running on localhost (a 75 MB file with tonnes of data uploads in around 18 seconds). When the app is deployed to an Azure App Service plan, the same file takes more than 10 minutes to upload, which makes me feel something is seriously wrong. Using progress tracking, I was able to confirm that the time is being consumed by the CopyToAsync function and not the logic after it.
Here's what I have investigated:
I checked my internet upload speed on two separate connections, each with a stable upload bandwidth of more than 25 Mbps, so this is not the issue.
I upgraded the App Service plan to a higher tier momentarily to see if upload bandwidth was somehow linked to the Azure App Service plan tier; even increasing it to a powerful P3V2 tier made no difference.
To see if the specific datacenter where my App Service sits was offering poor upload performance from my part of the world, I checked the average upload speed using https://www.azurespeed.com/Azure/UploadLargeFile; a 75 MB file uploads in around 38 seconds to the Azure West Europe datacenter. So I do not see how connectivity is the problem here.
With all of the above, what could be causing such poor file upload speed when uploading a file to a deployed Blazor Server web app?
I don't see such a performance impact. I upload to Azure Blob Storage, though.
My implementation summary:
a Razor component called imageUpload.razor that contains
public async Task HandleFileSelected(InputFileChangeEventArgs e)
and calls a service like:
await hService.UploadImgToAzureAsync(imageFile.OpenReadStream(), fileName);
a service that contains the following:
public async Task UploadImgToAzureAsync(Stream fileStream, string fileName)
{
    await ImageHelper.UploadImageToStorage(fileStream, fileName);
}
ImageHelper calls AzureStorage.cs, which handles the actual UploadFromStreamAsync call.
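For reference, a minimal sketch of what such an AzureStorage.cs helper can look like with the WindowsAzure.Storage SDK (the development-storage connection string and the "images" container name are placeholders, not my real values):
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class AzureStorage
{
    public static async Task UploadImageToStorage(Stream fileStream, string fileName)
    {
        // placeholder connection string; normally read from configuration
        CloudStorageAccount account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("images");
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
        // streams straight from the browser upload to blob storage, with no MemoryStream buffering
        await blob.UploadFromStreamAsync(fileStream);
    }
}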
I finally managed to improve the upload performance. Unfortunately, Blazor's built-in InputFile component doesn't seem to be designed very well for large file uploads, especially when the app has been deployed. I used Tewr's file upload component with a larger buffer size (128,000 bytes), and that significantly improved performance (roughly a 3x reduction in upload time). Tewr's sample code is available here:
https://github.com/Tewr/BlazorFileReader/blob/master/src/Demo/Blazor.FileReader.Demo.Common/IndexCommon.razor
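A rough sketch of the pattern, based on my reading of Tewr's demo (the exact API surface may differ between BlazorFileReader versions, so treat this as an outline rather than the component's definitive usage; it also assumes services.AddFileReaderService() has been registered at startup):
@inject IFileReaderService fileReaderService

<input type="file" @ref="inputElement" />
<button @onclick="ReadFile">Read file</button>

@code {
    ElementReference inputElement;

    async Task ReadFile()
    {
        foreach (var file in await fileReaderService.CreateReference(inputElement).EnumerateFilesAsync())
        {
            using (MemoryStream ms = new MemoryStream())
            using (Stream stream = await file.OpenReadAsync())
            {
                // the larger copy buffer (128,000 bytes) is what made the difference for me
                await stream.CopyToAsync(ms, 128000);
            }
        }
    }
}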
I have Page X with a SignaturePad in my Xamarin.Forms app, and I save the signature into the database in the following way:
if (!sp.IsBlank)
{
    Stream image = await sp.GetImageStreamAsync(SignatureImageFormat.Png);
    byte[] SignatureImage;
    using (BinaryReader br = new BinaryReader(image))
    {
        br.BaseStream.Position = 0;
        // read the PNG stream into a byte array for storage
        SignatureImage = br.ReadBytes((int)image.Length);
    }
}
I save this byte array in my SQL database table. I navigate to a couple of other pages in my app and then go back to Page X, and I want to reload all the information on this page from the saved information in the database. How do I reload the signature? Do I need to save the signature as strokes or points instead? How do I save strokes or points in the database? Some sample code would be really helpful.
Thank you.
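For what it's worth, since the signature is stored as a PNG byte array, one option is to display it again in a regular Image view instead of reloading it into the pad; a minimal sketch, where signatureBytes is assumed to be the byte[] read back from the database and signatureImage is a Xamarin.Forms Image on the page:
// recreate an ImageSource from the stored PNG bytes;
// the factory creates a fresh MemoryStream because the source may be read more than once
signatureImage.Source = ImageSource.FromStream(() => new MemoryStream(signatureBytes));
If the signature needs to be editable again inside the SignaturePad control itself, saving the stroke points rather than the rendered image is the usual route, since an image cannot be converted back into strokes.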
I'm trying to understand the db.blobColumns property in the database connector. I've got essentially a massive string of 500,000 characters and I want to use db.blobColumns to upload this text. Given the inherent name "blob", I assume it expects a binary large object? If anyone has used this property before for large text files, please help! I'm at a loss with this particular situation.
Here are the docs: https://developers.google.com/cloud-search/docs/guides/database-connector#content-fields
I have tried using the db.blobColumn field with database BLOB (binary) content and it works well, extracting text from the file and doing OCR if it's an image. But yes, it also accepts text content in the form of the database's CLOB type.
I suggest you take a look at the code of the database connector here. The two main files that matter are DatabaseAccess.java and DatabaseRepository.java.
private ByteArrayContent createBlobContent(Map<String, Object> allColumnValues) {
byte[] bytes;
Object value = allColumnValues.get(columnManager.getBlobColumn());
if (value == null) {
return null;
} else if (value instanceof String) {
bytes = ((String) value).getBytes(UTF_8);
} else if (value instanceof byte[]) {
bytes = (byte[]) value;
} else {
throw new InvalidConfigurationException( // allow SDK to send dashboard notification
"Invalid Blob column type. Column: " + columnManager.getBlobColumn()
+ "; object type: " + value.getClass().getSimpleName());
}
return new ByteArrayContent(null, bytes);
}
The code snippet above, from DatabaseRepository.java, is responsible for generating the blob content (binary) that is pushed to Cloud Search. As the code shows, the content of a CLOB or BLOB column arrives at this function as either a String (converted to UTF-8 bytes) or a byte[], and is pushed as-is to Cloud Search.
Note from here:
Google Cloud Search will only index the first 10 MB of your content,
regardless of whether it is a text file or binary content.
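So a large text column should work directly. As a minimal illustration of the configuration (a sketch only: the SQL, table, and column names are hypothetical, and the property is spelled db.blobColumn here to match the connector source above, so double-check the exact spelling against the docs):
db.allRecordsSql = SELECT id, title, big_text_column FROM documents
db.uniqueKeyColumns = id
db.blobColumn = big_text_column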
I'm trying to use the Windows Azure REST API to create a virtual machine deployment. However, I have a problem when trying to specify an OS image in the following XML file:
<Deployment xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
<Name>SomeName</Name>
<DeploymentSlot>Production</DeploymentSlot>
<Label></Label>
<RoleList>
<Role i:type="PersistentVMRole">
<RoleName>SomeName</RoleName>
<OsVersion i:nil="true"/>
<RoleType>PersistentVMRole</RoleType>
<ConfigurationSets>
<ConfigurationSet i:type="WindowsProvisioningConfigurationSet">
<ConfigurationSetType>WindowsProvisioningConfiguration</ConfigurationSetType>
<ComputerName>SomeName</ComputerName>
<AdminPassword>XXXXXXXXXX</AdminPassword>
<EnableAutomaticUpdates>true</EnableAutomaticUpdates>
<ResetPasswordOnFirstLogon>false</ResetPasswordOnFirstLogon>
</ConfigurationSet>
<ConfigurationSet i:type="NetworkConfigurationSet">
<ConfigurationSetType>NetworkConfiguration</ConfigurationSetType>
<InputEndpoints>
<InputEndpoint>
<LocalPort>3389</LocalPort>
<Name>RemoteDesktop</Name>
<Protocol>tcp</Protocol>
</InputEndpoint>
</InputEndpoints>
</ConfigurationSet>
</ConfigurationSets>
<DataVirtualHardDisks/>
<Label></Label>
<OSVirtualHardDisk>
<MediaLink>¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿???????????????</MediaLink>
<SourceImageName>¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿??????????????????</SourceImageName>
</OSVirtualHardDisk>
</Role>
</RoleList>
</Deployment>
I need the MediaLink (the URI of the OS image) and the SourceImageName (the canonical name of the OS image). My question is: the web portal provides several predefined images, but I cannot determine their URIs and canonical names. Will I be forced to create my own OS image and upload it to one of the storage services under my Windows Azure account?
To get these parameters, you could perform the List OS Images Service Management API operation on your subscription.
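A minimal sketch of that call, following the same certificate-based pattern as the deployment code later in this answer (the x-ms-version value matches the one used there):
private static void ListOsImages(string subscriptionId, X509Certificate2 cert)
{
    // List OS Images operation of the Service Management API
    string uri = string.Format("https://management.core.windows.net/{0}/services/images", subscriptionId);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "GET";
    request.Headers.Add("x-ms-version", "2013-03-01");
    request.ClientCertificates.Add(cert);
    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        // the response XML contains one <OSImage> element per image, including the
        // <Name> to use as SourceImageName
        Console.WriteLine(reader.ReadToEnd());
    }
}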
UPDATE
Please discard some of my comments below (sorry about those). I was finally able to create a VM using the REST API :). Here are some of the things I learned:
The <MediaLink> element should specify the URL of the VHD off of which your VM will be created. It has to be a URL in one of the storage accounts in the same subscription as your virtual machine's cloud service. For this, you could specify a URL like https://[yourstorageaccount].blob.core.windows.net/[blobcontainer]/[filename].vhd, where [blobcontainer] is the name of the blob container where you want the API to store the VHD and [filename] is any name you want to give the VHD. What the REST API does is copy the source image specified in <SourceImageName> and save it at the URI specified in the <MediaLink> element.
Make sure that the cloud service and the storage account where your VHD will be stored are in the same data center/affinity group. Furthermore, that data center must support Virtual Machines; it turns out that not all data centers do.
The order of the XML elements is of utmost importance. Moving one element up or down will result in a 400 error.
Based on my experimentation, here's the code:
private static void CreateVirtualMachineDeployment(string subscriptionId, X509Certificate2 cert, string cloudServiceName)
{
try
{
string uri = string.Format("https://management.core.windows.net/{0}/services/hostedservices/{1}/deployments", subscriptionId, cloudServiceName);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "POST";
request.ContentType = "application/xml";
request.Headers.Add("x-ms-version", "2013-03-01");
request.ClientCertificates.Add(cert);
string requestPayload = #"<Deployment xmlns=""http://schemas.microsoft.com/windowsazure"" xmlns:i=""http://www.w3.org/2001/XMLSchema-instance"">
<Name>[SomeName]</Name>
<DeploymentSlot>Production</DeploymentSlot>
<Label>[SomeLabel]</Label>
<RoleList>
<Role i:type=""PersistentVMRole"">
<RoleName>MyTestRole</RoleName>
<OsVersion i:nil=""true""/>
<RoleType>PersistentVMRole</RoleType>
<ConfigurationSets>
<ConfigurationSet i:type=""WindowsProvisioningConfigurationSet"">
<ConfigurationSetType>WindowsProvisioningConfiguration</ConfigurationSetType>
<ComputerName>[ComputerName]</ComputerName>
<AdminPassword>[AdminPassword - Ensure it's strong Password]</AdminPassword>
<AdminUsername>[Admin Username]</AdminUsername>
<EnableAutomaticUpdates>true</EnableAutomaticUpdates>
<ResetPasswordOnFirstLogon>false</ResetPasswordOnFirstLogon>
</ConfigurationSet>
<ConfigurationSet i:type=""NetworkConfigurationSet"">
<ConfigurationSetType>NetworkConfiguration</ConfigurationSetType>
<InputEndpoints>
<InputEndpoint>
<LocalPort>3389</LocalPort>
<Name>RemoteDesktop</Name>
<Protocol>tcp</Protocol>
</InputEndpoint>
</InputEndpoints>
</ConfigurationSet>
</ConfigurationSets>
<DataVirtualHardDisks/>
<Label></Label>
<OSVirtualHardDisk>
<MediaLink>https://[storageaccount].blob.core.windows.net/vhds/fb83b3509582419d99629ce476bcb5c8__Microsoft-SQL-Server-2012SP1-Web-CY13SU04-SQL11-SP1-CU3-11.0.3350.0.vhd</MediaLink>
<SourceImageName>fb83b3509582419d99629ce476bcb5c8__Microsoft-SQL-Server-2012SP1-Web-CY13SU04-SQL11-SP1-CU3-11.0.3350.0</SourceImageName>
</OSVirtualHardDisk>
</Role>
</RoleList>
</Deployment>";
byte[] content = Encoding.UTF8.GetBytes(requestPayload);
request.ContentLength = content.Length;
using (var requestStream = request.GetRequestStream())
{
requestStream.Write(content, 0, content.Length);
}
using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
{
}
}
catch (WebException webEx)
{
using (var streamReader = new StreamReader(webEx.Response.GetResponseStream()))
{
string result = streamReader.ReadToEnd();
Console.WriteLine(result);
}
}
}
Hope this helps!
I'm not sure how to do this since there is no ADO.NET on Windows Phone. I would appreciate it if you could show me some code and a sample.
Thanks
Use the image data type; see this sample: http://erikej.blogspot.com/2009/11/how-to-save-and-retrieve-images-using.html
I would suggest two approaches:
1- If your images are not going to be very large, you can store a Base64 string of the image data as a string field in the database.
var base64String = Convert.ToBase64String(imageData); // and store this to database
var imageData = Convert.FromBase64String(imageDataString); // read image data from database
2- Otherwise, you can assign your image (or database record) a unique GUID and store the image in an IsolatedStorageFile.
using (var isf = IsolatedStorageFile.GetUserStoreForApplication())
{
using (var writer = new BinaryWriter(new IsolatedStorageFileStream(filename, System.IO.FileMode.Create, isf)))
{
writer.Write(imageData);
}
}
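Reading the image back out of isolated storage is the mirror of the write (a minimal sketch; filename is the same GUID-based name used above):
byte[] imageData;
using (var isf = IsolatedStorageFile.GetUserStoreForApplication())
using (var reader = new BinaryReader(new IsolatedStorageFileStream(filename, System.IO.FileMode.Open, isf)))
{
    // read the entire file back into a byte array
    imageData = reader.ReadBytes((int)reader.BaseStream.Length);
}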