Creating Excel files directly on blob - azure-blob-storage

I have the following function, which works fine when saving to disk. I am executing the code from an Azure Function. Is there any way to write to blob storage instead, without saving to disk?
private void ExportDataSet(DataTable ds, string destination)
{
    using (var workbook = SpreadsheetDocument.Create(destination, DocumentFormat.OpenXml.SpreadsheetDocumentType.Workbook))
    {
        var workbookPart = workbook.AddWorkbookPart();
        workbook.WorkbookPart.Workbook = new DocumentFormat.OpenXml.Spreadsheet.Workbook();
        workbook.WorkbookPart.Workbook.Sheets = new DocumentFormat.OpenXml.Spreadsheet.Sheets();
        var sheetPart = workbook.WorkbookPart.AddNewPart<WorksheetPart>();
        var sheetData = new DocumentFormat.OpenXml.Spreadsheet.SheetData();
        sheetPart.Worksheet = new DocumentFormat.OpenXml.Spreadsheet.Worksheet(sheetData);
        DocumentFormat.OpenXml.Spreadsheet.Sheets sheets = workbook.WorkbookPart.Workbook.GetFirstChild<DocumentFormat.OpenXml.Spreadsheet.Sheets>();
        string relationshipId = workbook.WorkbookPart.GetIdOfPart(sheetPart);
        uint sheetId = 1;
        if (sheets.Elements<DocumentFormat.OpenXml.Spreadsheet.Sheet>().Count() > 0)
        {
            sheetId = sheets.Elements<DocumentFormat.OpenXml.Spreadsheet.Sheet>().Select(s => s.SheetId.Value).Max() + 1;
        }
        DocumentFormat.OpenXml.Spreadsheet.Sheet sheet = new DocumentFormat.OpenXml.Spreadsheet.Sheet() { Id = relationshipId, SheetId = sheetId, Name = "Sites" };
        sheets.Append(sheet);
        DocumentFormat.OpenXml.Spreadsheet.Row headerRow = new DocumentFormat.OpenXml.Spreadsheet.Row();
        List<String> columns = new List<string>();
        foreach (System.Data.DataColumn column in ds.Columns)
        {
            columns.Add(column.ColumnName);
            DocumentFormat.OpenXml.Spreadsheet.Cell cell = new DocumentFormat.OpenXml.Spreadsheet.Cell();
            cell.DataType = DocumentFormat.OpenXml.Spreadsheet.CellValues.String;
            cell.CellValue = new DocumentFormat.OpenXml.Spreadsheet.CellValue(column.ColumnName);
            headerRow.AppendChild(cell);
        }
        sheetData.AppendChild(headerRow);
        foreach (System.Data.DataRow dsrow in ds.Rows)
        {
            DocumentFormat.OpenXml.Spreadsheet.Row newRow = new DocumentFormat.OpenXml.Spreadsheet.Row();
            foreach (String col in columns)
            {
                DocumentFormat.OpenXml.Spreadsheet.Cell cell = new DocumentFormat.OpenXml.Spreadsheet.Cell();
                cell.DataType = DocumentFormat.OpenXml.Spreadsheet.CellValues.String;
                cell.CellValue = new DocumentFormat.OpenXml.Spreadsheet.CellValue(dsrow[col].ToString());
                newRow.AppendChild(cell);
            }
            sheetData.AppendChild(newRow);
        }
    }
}
I would expect you could maybe save to a Stream?

Saving to a stream is the solution if you don't want to save to disk (in an Azure Function, you can also save to disk, e.g. under D:\home, which you can browse in Kudu).
If you choose to save to a stream, only a few changes to your code are needed, like below:
private void ExportDataSet(DataTable ds, MemoryStream memoryStream)
{
    using (var workbook = SpreadsheetDocument.Create(memoryStream, DocumentFormat.OpenXml.SpreadsheetDocumentType.Workbook))
    {
        //your code logic here
    }
    //here, the code to upload to azure blob storage
    CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("test1");
    CloudBlockBlob myblob = cloudBlobContainer.GetBlockBlobReference("myexcel.xlsx");
    //upload to blob storage; reset the stream position first, since writing
    //the workbook leaves it at the end of the data
    memoryStream.Position = 0;
    myblob.UploadFromStream(memoryStream);
    //or you can use the async method: myblob.UploadFromStreamAsync(memoryStream)
}
Note: if you're using the newer Azure Blob Storage SDK, Microsoft.Azure.Storage.Blob (version 9.4.0 or later), you can use either the UploadFromStreamAsync or the UploadFromStream method in Azure Functions v2.
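For completeness, the same upload with the current Azure.Storage.Blobs (v12) package looks roughly like this; a minimal sketch, assuming a connectionString variable and the same container and blob names as above:

// Minimal sketch (Azure.Storage.Blobs v12): upload the finished stream.
var blobClient = new BlobClient(connectionString, "test1", "myexcel.xlsx");
memoryStream.Position = 0;
await blobClient.UploadAsync(memoryStream, overwrite: true);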

Related

Azure.Storage.Blobs returns different list from WindowsAzure.Storage on creation

I have this test code which connects to Azure Blob Storage in two ways with the same credentials, once with the now-deprecated WindowsAzure.Storage package and once with the new Azure.Storage.Blobs package:
using Azure.Storage.Blobs;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using NUnit.Framework;
using System.Linq;

namespace EntityFrameworkTest
{
    public class AzureBlobStorageTests
    {
        [Test]
        public void TestStorage()
        {
            string storageAccount = "MyConnectionString";
            string containerName = "testazureblobobjectstore";
            var clientNew = new BlobServiceClient(storageAccount);
            var containerNew = clientNew.GetBlobContainerClient(containerName);
            containerNew.CreateIfNotExists();
            var blobsListNew = containerNew.GetBlobs();
            var accountOld = CloudStorageAccount.Parse(storageAccount);
            var clientOld = accountOld.CreateCloudBlobClient();
            var containerOld = clientOld.GetContainerReference(containerName);
            containerOld.CreateIfNotExistsAsync().GetAwaiter().GetResult();
            BlobContinuationToken config = new BlobContinuationToken();
            var blobsListOld = containerOld.ListBlobsSegmentedAsync(config).GetAwaiter().GetResult().Results;
            Assert.AreEqual(blobsListOld.Count(), blobsListNew.Count());
        }
    }
}
The lengths of the two lists differ:
In blobsListOld I have 2 items:
The directory
A BlockBlob with name __id__foo.
In blobsListNew I have 3 items, all are BlockBlobs with the following names:
//||!##$%^&*()_-=+[]'<>~;:`?
/||!##$%^&*()_-=+[]'<>~;:`?
__id__foo.
Can anyone explain this to me please?
I tried this in my system, testing the blob count returned both ways: connecting to Azure Blob Storage with the deprecated WindowsAzure.Storage package and with the new Azure.Storage.Blobs package. Both returned the same value.
Try with this code:
string storageAccount = "Connection String";
string containerName = "test";
var clientNew = new BlobServiceClient(storageAccount);
var containerNew = clientNew.GetBlobContainerClient(containerName);
containerNew.CreateIfNotExists();
BlobContainerClient containerClient = clientNew.GetBlobContainerClient("test");

// List all blobs in the container using the new package
await foreach (BlobItem blobItem in containerClient.GetBlobsAsync())
{
    Console.WriteLine("\t" + blobItem.Name);
}

var accountOld = CloudStorageAccount.Parse(storageAccount);
CloudBlobClient clientOld = accountOld.CreateCloudBlobClient();
CloudBlobContainer container = clientOld.GetContainerReference("test");

// List all blobs using the old package
CloudBlobDirectory dira = container.GetDirectoryReference(string.Empty);
var rootDirFolders = dira.ListBlobsSegmentedAsync(true, BlobListingDetails.Metadata, null, null, null, null).Result;
foreach (var blob in rootDirFolders.Results)
{
    Console.WriteLine(blob.Uri);
}
OUTPUT: both listings printed the same set of blobs (screenshot omitted).
So apparently I had to use containerNew.GetBlobsByHierarchy(delimiter: "/"); to get only the blobs in the root folder.
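For reference, hierarchical listing with Azure.Storage.Blobs looks roughly like this; a minimal sketch, assuming the containerNew client from the test above:

// With a "/" delimiter the new SDK groups blobs under virtual directories:
// prefixes come back as BlobHierarchyItem.Prefix, root-level blobs as .Blob.
foreach (BlobHierarchyItem item in containerNew.GetBlobsByHierarchy(delimiter: "/"))
{
    if (item.IsPrefix)
        Console.WriteLine("virtual directory: " + item.Prefix);
    else
        Console.WriteLine("blob: " + item.Blob.Name);
}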

How can I create a MetadataWorkspace using metadata loading delegates?

I followed this example, Changing schema name on runtime - Entity Framework, where I create a new EntityConnection from a MetadataWorkspace that I then use to construct a DbContext with a different schema, but I get compiler warnings saying that the RegisterItemCollection method is obsolete and to "Construct MetadataWorkspace using constructor that accepts metadata loading delegates."
How do I do that? Here is the code that works but gives the three warnings for the RegisterItemCollection calls. I'm surprised it works, since the warning says obsolete, not just deprecated.
public static EntityConnection CreateEntityConnection(string schema, string connString, string model)
{
    XmlReader[] conceptualReader = new XmlReader[]
    {
        XmlReader.Create(
            Assembly
                .GetExecutingAssembly()
                .GetManifestResourceStream(model + ".csdl")
        )
    };
    XmlReader[] mappingReader = new XmlReader[]
    {
        XmlReader.Create(
            Assembly
                .GetExecutingAssembly()
                .GetManifestResourceStream(model + ".msl")
        )
    };
    var storageReader = XmlReader.Create(
        Assembly
            .GetExecutingAssembly()
            .GetManifestResourceStream(model + ".ssdl")
    );
    //XNamespace storageNS = "http://schemas.microsoft.com/ado/2009/02/edm/ssdl"; // this would not work!!!
    XNamespace storageNS = "http://schemas.microsoft.com/ado/2009/11/edm/ssdl";
    var storageXml = XElement.Load(storageReader);
    foreach (var entitySet in storageXml.Descendants(storageNS + "EntitySet"))
    {
        var schemaAttribute = entitySet.Attributes("Schema").FirstOrDefault();
        if (schemaAttribute != null)
        {
            schemaAttribute.SetValue(schema);
        }
    }
    StoreItemCollection storageCollection =
        new StoreItemCollection(
            new XmlReader[] { storageXml.CreateReader() }
        );
    EdmItemCollection conceptualCollection = new EdmItemCollection(conceptualReader);
    StorageMappingItemCollection mappingCollection =
        new StorageMappingItemCollection(
            conceptualCollection, storageCollection, mappingReader
        );
    //var workspace2 = new MetadataWorkspace(conceptualCollection, storageCollection, mappingCollection);
    var workspace = new MetadataWorkspace();
    workspace.RegisterItemCollection(conceptualCollection);
    workspace.RegisterItemCollection(storageCollection);
    workspace.RegisterItemCollection(mappingCollection);
    var connectionData = new EntityConnectionStringBuilder(connString);
    var connection = DbProviderFactories
        .GetFactory(connectionData.Provider)
        .CreateConnection();
    connection.ConnectionString = connectionData.ProviderConnectionString;
    return new EntityConnection(workspace, connection);
}
I was able to get rid of the three warning messages. Basically it wants you to register the collections in the constructor of the MetadataWorkspace.
There are three different overloads of the MetadataWorkspace constructor; I chose the one that requires you to supply paths (an array of strings) to the workspace metadata. To do this I saved the readers to temp files and reloaded them.
This is working for me without any warnings.
public static EntityConnection CreateEntityConnection(string schema, string connString, string model)
{
    var conceptualReader = XmlReader.Create(Assembly.GetExecutingAssembly().GetManifestResourceStream(model + ".csdl"));
    var mappingReader = XmlReader.Create(Assembly.GetExecutingAssembly().GetManifestResourceStream(model + ".msl"));
    var storageReader = XmlReader.Create(Assembly.GetExecutingAssembly().GetManifestResourceStream(model + ".ssdl"));
    XNamespace storageNS = "http://schemas.microsoft.com/ado/2009/11/edm/ssdl";
    var storageXml = XElement.Load(storageReader);
    var conceptualXml = XElement.Load(conceptualReader);
    var mappingXml = XElement.Load(mappingReader);
    foreach (var entitySet in storageXml.Descendants(storageNS + "EntitySet"))
    {
        var schemaAttribute = entitySet.Attributes("Schema").FirstOrDefault();
        if (schemaAttribute != null)
        {
            schemaAttribute.SetValue(schema);
        }
    }
    storageXml.Save("temp.ssdl");
    conceptualXml.Save("temp.csdl");
    mappingXml.Save("temp.msl");
    MetadataWorkspace workspace = new MetadataWorkspace(
        new List<String>()
        {
            @"temp.csdl",
            @"temp.ssdl",
            @"temp.msl"
        },
        new List<Assembly>());
    var connectionData = new EntityConnectionStringBuilder(connString);
    var connection = DbProviderFactories.GetFactory(connectionData.Provider).CreateConnection();
    connection.ConnectionString = connectionData.ProviderConnectionString;
    return new EntityConnection(workspace, connection);
}
Not wanting to create temp files, which slows the process down, I found that an alternate answer to this is fairly simple. I replaced these lines of code -
//var workspace2 = new MetadataWorkspace(conceptualCollection, storageCollection, mappingCollection);
var workspace = new MetadataWorkspace();
workspace.RegisterItemCollection(conceptualCollection);
workspace.RegisterItemCollection(storageCollection);
workspace.RegisterItemCollection(mappingCollection);
with this one line of code -
var workspace = new MetadataWorkspace(() => conceptualCollection, () => storageCollection, () => mappingCollection);
and that works fine.
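One thing worth knowing about the delegate overload: as far as I can tell from the documentation, the factories are registered as lazy loaders, so each one is invoked only the first time the workspace needs the corresponding item collection.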

Master details batch crud with entity framework only saving single detail record

I'm building a master-details page with batch editing, that is, multiple detail records with a single master record, but only one detail record is being saved into the database. I tried to debug and found that the detail loop executes multiple times correctly, but it does not save multiple records. Here is my code for the save method:
public ActionResult CMN_VAL_FORM(HRM_CMN_VLU_MST_ViewModel model)
{
    //var ctx=new Entities1();
    var CMN_VLU_MST_OBJ = new HRM_CMN_VLU_MST();
    var CMN_VLU_DTL_OBJ = new HRM_CMN_VLU_DTL();
    //using (TransactionScope transaction = new TransactionScope())
    //{
    using (var ctx = new Entities1())
    {
        var type_code = ctx.ExecuteStoreQuery<string>("select get_pk_code('hrm_cmn_vlu_mst','CMN_VLU_TYPE_CODE') from dual").SingleOrDefault(); //A scalar function to generate the code in the format yymmdd0001
        var value_code = ctx.ExecuteStoreQuery<string>("select get_pk_code('hrm_cmn_vlu_dtl','CMN_VLU_CODE') from dual").SingleOrDefault();
        CMN_VLU_MST_OBJ.CMN_VLU_TYPE_CODE = type_code;
        CMN_VLU_MST_OBJ.CMN_VLU_REM = model.CMN_VLU_REM;
        CMN_VLU_MST_OBJ.CMN_VLU_TYPE_FOR = model.CMN_VLU_TYPE_FOR;
        CMN_VLU_MST_OBJ.CMN_VLU_TYPE_SRTNM = model.CMN_VLU_TYPE_SRTNM;
        CMN_VLU_MST_OBJ.ENTRY_DATE = DateTime.Now;
        CMN_VLU_MST_OBJ.CMN_VLU_TYPE_NAME = model.CMN_VLU_TYPE_NAME;
        foreach (var item in model.HRM_CMN_VLU_DTL)
        {
            CMN_VLU_DTL_OBJ.CMN_VLU_LEVL = item.CMN_VAL_LEVL;
            CMN_VLU_DTL_OBJ.CMN_VLU_REM = item.CMN_VLU_REM;
            CMN_VLU_DTL_OBJ.CMN_VLU_SLNO = item.CMN_VLU_SLNO;
            CMN_VLU_DTL_OBJ.CMN_VLU_TITL = item.CMN_VAL_TITL;
            CMN_VLU_DTL_OBJ.CMN_VLU_CNTN = item.CMN_VAL_CNTN;
            CMN_VLU_DTL_OBJ.CMN_VLU_CODE = value_code;
            CMN_VLU_DTL_OBJ.CMN_VLU_TYPE_CODE = type_code;
            CMN_VLU_DTL_OBJ.ENTRY_DATE = DateTime.Now;
            CMN_VLU_DTL_OBJ.MAIL_ADDR_INT = item.MAIL_ADDR_INT;
            CMN_VLU_DTL_OBJ.MAIL_ADDR_EXT = item.MAIL_ADDR_EXT;
            CMN_VLU_DTL_OBJ.MAIL_AUTO_SEND_INT = item.MAIL_AUTO_SEND_INT;
            CMN_VLU_DTL_OBJ.MAIL_AUTO_SEND_EXT = item.MAIL_AUTO_SEND_EXT;
            CMN_VLU_DTL_OBJ.ACTIVE_STATUS = item.ACTIVE_STATUS;
            CMN_VLU_MST_OBJ.HRM_CMN_VLU_DTL.Add(CMN_VLU_DTL_OBJ);
            ctx.SaveChanges();
            var temp_value_code = Int32.Parse(value_code);
            temp_value_code++;
            value_code = temp_value_code.ToString();
            ctx.HRM_CMN_VLU_MST.AddObject(CMN_VLU_MST_OBJ);
        }
        ctx.SaveChanges();
        // transaction.Complete();
        //}
    }
    return View();
}
There is no error message, but multiple detail records are not being saved. What am I doing wrong?
What I was doing wrong: I created the detail object just once and updated that same object on every pass through the foreach loop! I just moved the object declaration into the foreach loop and it works like a charm!
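In other words, the fix looks roughly like this (a minimal sketch showing just the changed part of the loop):

foreach (var item in model.HRM_CMN_VLU_DTL)
{
    // Create a fresh detail entity on every iteration; reusing a single
    // instance means EF tracks only one object, so only one row is inserted.
    var CMN_VLU_DTL_OBJ = new HRM_CMN_VLU_DTL();
    CMN_VLU_DTL_OBJ.CMN_VLU_LEVL = item.CMN_VAL_LEVL;
    // ... assign the remaining properties exactly as before ...
    CMN_VLU_MST_OBJ.HRM_CMN_VLU_DTL.Add(CMN_VLU_DTL_OBJ);
}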

How to save stream of image data to the root of the local app data folder in windows phone?

I am trying to save a stream of image data to a file. I was able to save it to the Pictures library, though.
But I want to save it to a file in the root of my application/project.
I was trying the code below, but it doesn't work.
using (MediaLibrary mediaLibrary = new MediaLibrary())
    mediaLibrary.SavePicture(@"\DefaultScreen.jpg", stream);
In this case you should use local storage (isolated storage).
Here is a simple solution to do this:
using (IsolatedStorageFile isoStore = IsolatedStorageFile.GetUserStoreForApplication())
{
    if (!isoStore.FileExists(fileName))
    {
        var sr = Application.GetResourceStream(new Uri(fileName, UriKind.Relative));
        using (var br = new BinaryReader(sr.Stream))
        {
            byte[] data = br.ReadBytes((int)sr.Stream.Length);
            string strBaseDir = string.Empty;
            const string DelimStr = "/";
            char[] delimiter = DelimStr.ToCharArray();
            string[] dirsPath = fileName.Split(delimiter);
            // Recreate the directory structure
            for (int i = 0; i < dirsPath.Length - 1; i++)
            {
                strBaseDir = Path.Combine(strBaseDir, dirsPath[i]);
                isoStore.CreateDirectory(strBaseDir);
            }
            using (BinaryWriter bw = new BinaryWriter(isoStore.CreateFile(fileName)))
            {
                bw.Write(data);
            }
        }
    }
}
Here you can find all info about data in Windows Phone:
http://msdn.microsoft.com/en-us/library/windowsphone/develop/ff402541(v=vs.105).aspx
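That said, if you already have the image data as a stream (rather than as a XAP resource), you can write it to isolated storage directly; a minimal sketch, where stream is the incoming image data and the file name is hypothetical:

using (var isoStore = IsolatedStorageFile.GetUserStoreForApplication())
using (var fileStream = isoStore.CreateFile("DefaultScreen.jpg"))
{
    // Copy the incoming image stream into the isolated storage file.
    byte[] buffer = new byte[4096];
    int read;
    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        fileStream.Write(buffer, 0, read);
    }
}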

In Windows Phone 7 how can I save a BitmapImage to local storage?

In Windows Phone 7, how can I save a BitmapImage to local storage? I need to save the image for caching and reload it if it is requested again in the next few days.
If you save the file into IsolatedStorage you can set a relative path to view it from there.
Here's a quick example saving a file that was included in the XAP (as a resource) into Isolated Storage.
using (IsolatedStorageFile isoStore = IsolatedStorageFile.GetUserStoreForApplication())
{
    if (!isoStore.FileExists(fileName))
    {
        var sr = Application.GetResourceStream(new Uri(fileName, UriKind.Relative));
        using (var br = new BinaryReader(sr.Stream))
        {
            byte[] data = br.ReadBytes((int)sr.Stream.Length);
            string strBaseDir = string.Empty;
            const string DelimStr = "/";
            char[] delimiter = DelimStr.ToCharArray();
            string[] dirsPath = fileName.Split(delimiter);
            // Recreate the directory structure
            for (int i = 0; i < dirsPath.Length - 1; i++)
            {
                strBaseDir = Path.Combine(strBaseDir, dirsPath[i]);
                isoStore.CreateDirectory(strBaseDir);
            }
            using (BinaryWriter bw = new BinaryWriter(isoStore.CreateFile(fileName)))
            {
                bw.Write(data);
            }
        }
    }
}
You may also be interested in the image caching converters created by Ben Gracewood and Peter Nowaks. They both show saving images into isolated storage and loading them from there.
Another approach I've used is to pass the stream you retrieve for the image in your xap straight into an isolated storage file. Not a lot of moving parts.
using (var isoStore = IsolatedStorageFile.GetUserStoreForApplication())
{
    var bi = new BitmapImage();
    bi.SetSource(picStreamFromXap);
    var wb = new WriteableBitmap(bi);
    using (var isoFileStream = isoStore.CreateFile("pic.jpg"))
    {
        Extensions.SaveJpeg(wb, isoFileStream, wb.PixelWidth, wb.PixelHeight, 0, 100);
    }
}
