Can a Standard logic app upload a blob to a folder? - azure-blob-storage

Currently in Consumption you can specify a new folder in a blob container when you create a new blob.
In Standard you have to use the Upload blob action, and I don't see where I can specify the folder path:

In Standard, when you choose an operation you get two options: one is Built-in and the other is Azure.
I would suggest you choose the Azure option; you will get the same list of actions as in Consumption.
There, the Azure -> Create blob (V2) action gives you the same folder path field as in Consumption.
Note: choosing the Upload blob to Azure Storage action from Built-in won't give you an option for the folder path.
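That limitation is less painful than it looks, because blob "folders" are virtual: the folder is just a prefix in the blob name, so including the path in the name should achieve the same effect. A minimal C# sketch of that idea using the Azure.Storage.Blobs SDK (the connection string, container, and file names are placeholder assumptions, not from the question):

```csharp
using System.IO;
using Azure.Storage.Blobs;

string connectionString = "<your-connection-string>"; // placeholder
var container = new BlobContainerClient(connectionString, "mycontainer"); // placeholder

// Blob storage has no real folders: the slashes in the blob name
// create the virtual path "reports/2023" on upload.
BlobClient blob = container.GetBlobClient("reports/2023/summary.txt");
using (FileStream file = File.OpenRead(@"C:\temp\summary.txt")) // placeholder local file
{
    blob.Upload(file, overwrite: true);
}
```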

Related

Azure Storage Explorer - Inadequate resource type access

I am attempting to use the Microsoft Azure Storage Explorer, attaching with a SAS URI. But I always get the error:
Inadequate resource type access. At least service-level ('s') access
is required.
Here is my SAS URI with portions obfuscated:
https://ti<...>hare.blob.core.windows.net/?sv=2018-03-28&ss=b&srt=co&sp=rwdl&se=2027-07-01T00:00:00Z&st=2019-07-01T00:00:00Z&sip=52.<...>.235&spr=https&sig=yD%2FRUD<...>U0%3D
And here is my connection string with portions obfuscated:
BlobEndpoint=https://tidi<...>are.blob.core.windows.net/;QueueEndpoint=https://tidi<...>hare.queue.core.windows.net/;FileEndpoint=https://ti<...>are.file.core.windows.net/;TableEndpoint=https://tid<...>hare.table.core.windows.net/;SharedAccessSignature=sv=2018-03-28&ss=b&srt=co&sp=rwdl&se=2027-07-01T00:00:00Z&st=2019-07-01T00:00:00Z&sip=52.<...>.235&spr=https&sig=yD%2FRU<...>YU0%3D
It seems the problem is with the construction of my URI/endpoints/connection string/etc., rather than with the permissions granted to me on the server: when I click Next, the error displays instantaneously, so I don't believe it even tried to reach the server.
What am I doing wrong? (As soon as I get this working, I'll be embedding the URI/etc. in my C# app for programmatic access.)
What you need to connect is the service-level resource type: the "srt" part of the URI.
Your URI has srt=co (container and object) but is missing the "s" (service) part. You need to create a new SAS token; it can be generated in the portal, the Azure CLI, or PowerShell.
In the portal, it is this part:
Go into the storage account and select what you need:
Allowed services (if you are looking for blobs):
Blob
Allowed resource types:
Service (make sure this one is checked)
Container
Object
Allowed permissions (to allow everything):
Read
Write
Delete
List
Add
Create
(Screenshot: example of where to look in the portal.)
If you need more info, look here:
https://learn.microsoft.com/en-us/rest/api/storageservices/create-account-sas?redirectedfrom=MSDN
If you'd like to create the SAS key with the Azure CLI, use this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-cli
If you'd like to create the SAS key with PowerShell, use this:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-user-delegation-sas-create-powershell
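For the programmatic C# use the asker mentions, here is a hedged sketch of generating an account SAS whose srt covers service, container, and object, using the Azure.Storage.Sas types (account name and key are placeholders):

```csharp
using System;
using Azure.Storage;
using Azure.Storage.Sas;

// Placeholder account name and key.
var credential = new StorageSharedKeyCredential("myaccount", "<account-key>");

var sas = new AccountSasBuilder
{
    Services = AccountSasServices.Blobs,              // ss=b
    ResourceTypes = AccountSasResourceTypes.Service   // the missing "s"
                  | AccountSasResourceTypes.Container // "c"
                  | AccountSasResourceTypes.Object,   // "o"
    ExpiresOn = DateTimeOffset.UtcNow.AddYears(1),
    Protocol = SasProtocol.Https,
};
sas.SetPermissions(AccountSasPermissions.Read | AccountSasPermissions.Write
                 | AccountSasPermissions.Delete | AccountSasPermissions.List
                 | AccountSasPermissions.Add | AccountSasPermissions.Create);

// The token's srt now comes out as "sco" instead of "co".
string token = sas.ToSasQueryParameters(credential).ToString();
```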
I had a similar issue trying to connect to the blob container using a Shared Access Signature (SAS) URL, and this worked for me:
Instead of generating the SAS URL in Azure Portal, I used Azure Storage Explorer.
Right-click the container that you want to share -> "Get Shared Access Signature".
Select the expiry time and permissions and click Create.
The generated URL should work when your client/user tries to connect to the container.
Cheers
I had the same problem and managed to get this to work by hacking the URL and changing "srt=co" to "srt=sco". It seems to need the "s".

Quick Way to Bulk Copy Azure Blobs

I have about 40,000 blobs in Azure storage that have been given the wrong file extension. They were uploaded with the filename <name>.png, and I need to correct the name to <name>.jpg. In the first instance I'd like to simply copy the originals into the same blob store but with the new file name.
azcopy would normally be my go-to for this kind of thing, but it doesn't seem to have the options I need.
How can I bulk copy and rename files in an Azure blob store?
Azure Blob Storage doesn't support renaming directly. However, you can work around this by copying the blob to a new blob with the modified name (via the StartCopy method) and then removing the original (via the Delete method). The copy can be quite fast when the source and destination are in the same storage account, since it's effectively a shallow, server-side copy.
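A minimal sketch of that copy-then-delete loop, using the newer Azure.Storage.Blobs SDK (StartCopyFromUri rather than the older StartCopy the answer names; connection string and container name are placeholders):

```csharp
using System.Linq;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

string connectionString = "<your-connection-string>"; // placeholder
var container = new BlobContainerClient(connectionString, "mycontainer"); // placeholder

// Materialize the listing first so the freshly copied .jpg blobs
// aren't picked up by the same enumeration.
var pngBlobs = container.GetBlobs()
                        .Where(b => b.Name.EndsWith(".png"))
                        .ToList();

foreach (BlobItem item in pngBlobs)
{
    string newName = item.Name.Substring(0, item.Name.Length - 4) + ".jpg";
    BlobClient source = container.GetBlobClient(item.Name);
    BlobClient target = container.GetBlobClient(newName);

    // Server-side copy; within one storage account this is near-instant.
    var copy = target.StartCopyFromUri(source.Uri);
    copy.WaitForCompletion();

    source.DeleteIfExists();
}
```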

Blob files have to be renamed manually to include the parent folder path

We are new to Windows Azure and have used Windows Azure Storage for blob objects while developing a Sitefinity application. However, the blob files uploaded to this storage by publishing to Azure from Visual Studio are uploaded with only the file names; the prefix folder name and slash are not maintained. Hence we have to rename each file manually on the Windows Azure management portal, putting the folder name and slash at the beginning of the file name, so that the pages accessing these images can show them properly; otherwise the images are not shown due to the incorrect path.
In the Sitefinity admin panel, when we upload these images/blob files on those pages, we upload them inside a folder, and we have configured Sitefinity to use Azure storage instead of the database.
Please check the attached screenshot.
Please help me solve this.
A few things I would like to mention first:
Windows Azure does not support rename functionality; renaming a blob amounts to copying the blob and then deleting the original.
The copy blob operation is asynchronous, so you must wait for the copy to finish before deleting the source blob.
Blob storage does not natively support a folder hierarchy. As you may have already discovered, you create the illusion of a folder by prepending a blob name (say logo.png) with the folder name you want (say images), separated by a slash (/), so your blob name becomes images/logo.png.
Now coming to your problem: needless to say, manually renaming the blobs would be a cumbersome exercise. I would recommend using a storage management tool for this. One such example is Azure Management Studio from Cerebrata. With that tool you can essentially create an empty folder in the container and then move the files into that folder. That, to me, would be the fastest way to achieve your objective.
If you wish to write some code to do this, here are the steps you would take:
First, list all blobs in the blob container.
Next, loop over this list.
For each blob (let's call it the source blob), take its name, prepend the folder name you want, and create a CloudBlockBlob instance for the new name.
Next, initiate a copy operation onto that new blob using StartCopyFromBlob, with your source blob as the source.
Wait for the copy operation to finish. Once it is finished, you can safely delete the source blob.
P.S. I would have written some code but unfortunately I'm stuck with something else. I might write something later on (but please don't hold your breath for that :)).
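Since the answer stops short of code, here is a hedged sketch of those steps against the classic Microsoft.WindowsAzure.Storage SDK (the generation that StartCopyFromBlob belongs to); the connection string, container name, and "images" folder are placeholders, not from the question:

```csharp
using System.Linq;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

string connectionString = "<your-connection-string>"; // placeholder
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient()
                                      .GetContainerReference("sitefinity"); // placeholder

// Step 1: list all blobs (materialized so the new copies aren't re-enumerated).
var blobs = container.ListBlobs(useFlatBlobListing: true)
                     .OfType<CloudBlockBlob>()
                     .ToList();

// Steps 2-5: loop, copy each blob under the folder prefix, wait, then delete.
foreach (CloudBlockBlob source in blobs)
{
    CloudBlockBlob target = container.GetBlockBlobReference("images/" + source.Name); // placeholder folder
    target.StartCopyFromBlob(source);

    // Copy Blob is asynchronous: poll until it leaves the Pending state.
    while (target.CopyState.Status == CopyStatus.Pending)
    {
        Thread.Sleep(500);
        target.FetchAttributes();
    }

    if (target.CopyState.Status == CopyStatus.Success)
        source.Delete();
}
```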

Unzip a .GZ file in an Azure Worker Process

Can anyone give me an idea of how to implement unzipping a .gz file in a worker role? If I write the unzipping code, where do I store the unzipped file (i.e., one text file)? Will it be placed in some location in Azure? How can I specify a path in a Windows Azure worker process, such as the current executing directory? If this approach doesn't work, do I need to create another blob to store the unzipped content of the .gz file (i.e., the .txt)?
-mahens
In your worker role, it is up to you how the .gz file arrives (e.g., downloaded from Azure Blob storage); once the file is available you can use GZipStream to compress or decompress it. The discussion linked below also includes sample Compress and Decompress functions.
This SO discussion shares a few tools and code to explain how you can unzip .GZ using C#:
Unzipping a .gz file using C#
Next, when you use the Decompress/Compress code in a worker role, you have the ability to store the result directly to local storage (as suggested by JcFx) or use a MemoryStream to write it directly to Azure Blob storage.
The following SO article shows how you can use GZipStream to write unzipped content into a MemoryStream and then use the UploadFromStream() API to store it directly to Azure Blob storage:
How do I use GZipStream with System.IO.MemoryStream?
If you don't need to do anything with the unzipped file, storing it directly to Azure Blob storage is best; however, if you have to process the unzipped content, you can save it locally as well as store it back to Azure Blob storage for further use.
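For illustration, a hedged sketch of that decompress-into-MemoryStream-then-UploadFromStream route with the classic SDK (the container and blob names are assumptions):

```csharp
using System.IO;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

string connectionString = "<your-connection-string>"; // placeholder
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobContainer container = account.CreateCloudBlobClient()
                                      .GetContainerReference("incoming"); // placeholder

CloudBlockBlob gzBlob = container.GetBlockBlobReference("data.txt.gz"); // placeholder
CloudBlockBlob txtBlob = container.GetBlockBlobReference("data.txt");

using (var compressed = new MemoryStream())
{
    gzBlob.DownloadToStream(compressed); // pull the .gz blob into memory
    compressed.Position = 0;

    using (var gzip = new GZipStream(compressed, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        gzip.CopyTo(output);              // decompress into a second stream
        output.Position = 0;
        txtBlob.UploadFromStream(output); // write the unzipped text straight back to blob storage
    }
}
```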
This example, using SharpZipLib, extracts a .gz file to a stream. From there, you could write it to Azure local storage or to blob storage:
http://wiki.sharpdevelop.net/GZip-and-Tar-Samples.ashx

WP 7 Isolated Storage

In my WP7 app, I have to store images and XML files of two types:
1: The first type of files are not updated frequently on the server, so I want to store them permanently in local storage so that whenever the app starts it can access them from local storage; when these files are updated on the server, the local copies should be updated too. I want these files not to be deleted on application termination.
2: The second type of files are those I want to save in isolated storage temporarily, e.g. the app requests an XML file from the server and I store it locally; the next time the app requests the same file, it gets it from local storage instead of the server. These files should be deleted when the application terminates.
How can I do this?
Thanks
1) Isolated Storage is designed to store data that should remain permanent (until the user uninstalls the app). There's example code on MSDN showing how to write and save a file. Any file you save (temporary or not) will therefore be kept until the user uninstalls the app or your app deletes the file.
2) For temporary data, you can use the PhoneApplicationService.State property. This data is automatically discarded after your app closes. However, there's a size limit (I believe PhoneApplicationService.State has a limit of 4 MB).
Alternatively, if the XML file is too big, you can write it to Isolated Storage. Then you can handle your page's Closing event and delete the file from Isolated Storage there using the DeleteFile method.
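A small sketch of both cases against the WP7 IsolatedStorageFile API (the file name and content are placeholders):

```csharp
using System.IO;
using System.IO.IsolatedStorage;

string xmlContent = "<data>...</data>"; // placeholder payload

// Case 1: save the file so it survives app restarts (kept until uninstall
// or until you delete it yourself).
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
using (var stream = store.CreateFile("serverData.xml"))
using (var writer = new StreamWriter(stream))
{
    writer.Write(xmlContent);
}

// Case 2: for the "temporary" files, delete them yourself on exit,
// e.g. in the page's Closing event handler.
using (var store = IsolatedStorageFile.GetUserStoreForApplication())
{
    if (store.FileExists("serverData.xml"))
        store.DeleteFile("serverData.xml");
}
```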
