Does Azure storage premium account support block blob? - azure-blob-storage

I just created a few premium Azure storage accounts with general-purpose v2, but they don't seem to support block blobs. When I try to upload from either the Portal or Data Explorer, the only option is page blob. But according to the documentation, general-purpose v2 should support all three blob types.

Premium general-purpose v2 storage accounts do not support block blobs. They only support page blobs and locally redundant storage (LRS). If you need support for all blob types (append, block, and page) in a single storage account, you would need to create a "Standard" kind of account.
You would need to create a BlockBlobStorage kind of account if you want premium performance for block blobs. Please note that this kind of account only supports block and append blobs.
You can read more about it here: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-performance-tiers.
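A compact way to keep those distinctions straight is a lookup table. This is only a sketch covering the account kinds discussed in this thread, not an exhaustive or official list:

```python
# Which blob types each storage account kind supports, per the answer above.
# Covers only the kinds discussed here; not an official or complete mapping.
BLOB_SUPPORT = {
    ("StorageV2", "Standard"):        {"block", "append", "page"},
    ("StorageV2", "Premium"):         {"page"},              # premium GPv2: page blobs only
    ("BlockBlobStorage", "Premium"):  {"block", "append"},   # premium block blob account
}

def supports_block_blobs(kind: str, tier: str) -> bool:
    """True if the given account kind/performance tier can hold block blobs."""
    return "block" in BLOB_SUPPORT.get((kind, tier), set())

print(supports_block_blobs("StorageV2", "Premium"))        # False
print(supports_block_blobs("BlockBlobStorage", "Premium")) # True
```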

Related

Azure Storage Blob Policy to enforce ClientSideEncryptionVersion.V2_0

I updated my application because of this security vulnerability.
Is it possible to set a policy in Azure Storage Blob that only blobs without encryption or with ClientSideEncryptionVersion.V2_0 can be uploaded?
Upload attempts with ClientSideEncryptionVersion.V1_0 should be blocked.
We don't have this feature today. You may leave your feedback here. All the feedback you share in these forums is monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
Client-side encryption can only be done using the SDK, because a content encryption key (CEK) is used to encrypt the data before uploading, so you would have to write your own custom logic for this scenario. Also make sure that all your applications are upgraded to the latest SDK.
Alternatively, would it be possible to use the Azure Metric Explorer to evaluate the upload attempts with the ClientSideEncryptionVersion?
No. Encryption metadata is stored with the blob, but we don’t have granular tracking for what metadata is set.
Or, alternatively, can you for example set a tag or an application version at upload time that is saved together with the blob?
This is completely dependent on the application and the features you use in storage/the SDK. There isn’t anything automatically enabled by the SDK.
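The custom logic mentioned above could start from the blob's own metadata: the SDK's client-side encryption records its details as a JSON document under the `encryptiondata` metadata key, including the protocol version. A minimal sketch, assuming that convention (the sample metadata here is illustrative):

```python
import json
from typing import Optional

def encryption_protocol(metadata: dict) -> Optional[str]:
    """Return the client-side encryption protocol version recorded in a blob's
    metadata, or None if the blob carries no client-side encryption metadata.

    Assumes the SDK convention of a JSON document under the 'encryptiondata'
    metadata key with an EncryptionAgent.Protocol field."""
    raw = metadata.get("encryptiondata")
    if raw is None:
        return None
    return json.loads(raw).get("EncryptionAgent", {}).get("Protocol")

def is_allowed(metadata: dict) -> bool:
    # Allow unencrypted blobs and v2.0; flag v1.0 for remediation.
    return encryption_protocol(metadata) != "1.0"

# Illustrative metadata shaped like what the SDK writes for a v1.0 blob.
sample = {"encryptiondata": json.dumps(
    {"EncryptionAgent": {"Protocol": "1.0", "EncryptionAlgorithm": "AES_CBC_256"}})}
print(is_allowed(sample))  # False
print(is_allowed({}))      # True
```

This can't block an upload (that enforcement doesn't exist server-side), but run over a container listing it gives you an inventory of blobs still on v1.0.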
Please let me know if you have any further queries.

How to copy existing Blob into Azure Media Services using the CLI

I'm looking for a way to copy an existing Blob from an Azure "Storage account" to an Azure "Media Services account" using the CLI or the portal.
There seems to be plenty of documentation for Windows-centric platforms, but nothing for *nix or Mac users.
Side note: why is there an assumption that I would want to do this via code (which, by the way, also seems to require a Windows platform)? I've got a large video (e.g., larger than the 200 MB limit for portal upload) and I want to load it into Media Services - why is that so difficult?
I'm not an expert on this, but from a quick glance at the portal and testing it, it seems like you can just upload your video to the (classic) Storage Account created with Azure Media Services, using any tool that can handle Azure Blob Storage, such as the cross-platform Cyberduck. After that you can simply select that video in the portal under upload content > from storage. Note that it only works with classic storage accounts.
You can upload your media to any blob storage account.
Then when you want to create a media services asset you can copy it to the Media Services blob storage and create an asset using the blob you moved.
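That copy step is a single server-side Copy Blob call, so it works the same from any platform. A minimal sketch of the underlying REST request using only the Python standard library; the account, container, and SAS values below are placeholders:

```python
import urllib.request

def build_copy_request(dest_blob_url_with_sas: str,
                       source_blob_url_with_sas: str) -> urllib.request.Request:
    """Build the Copy Blob REST request: a PUT on the destination blob URI,
    with the source passed in the x-ms-copy-source header. Both URLs are
    assumed to carry SAS tokens granting the needed permissions."""
    req = urllib.request.Request(dest_blob_url_with_sas, method="PUT")
    req.add_header("x-ms-copy-source", source_blob_url_with_sas)
    req.add_header("x-ms-version", "2021-08-06")  # a current storage API version
    return req

# Placeholder URLs; real ones carry complete SAS query strings.
req = build_copy_request(
    "https://mediasvcaccount.blob.core.windows.net/asset-123/video.mp4?sv=...",
    "https://srcaccount.blob.core.windows.net/uploads/video.mp4?sv=...",
)
# urllib.request.urlopen(req) would start the copy; the service replies 202
# Accepted and performs the transfer asynchronously on the server side.
print(req.get_method())  # PUT
```

Because the copy runs inside Azure, the large video never transits your own machine.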
Unfortunately, at the moment Azure Media Services is not supported via the Azure CLI.
But there is PowerShell support; see this article.
You can also write some code and do everything using the SDK.
There are SDKs for .NET, Java, Python, Node.js, PHP and more...
To be visible in the Azure Media Services storage explorer UI, your files need a valid extension, as answered here: Azure not displaying blobs within container in Storage Account
Hope this helps,
Julien

Create Azure VM from image created under different subscription

Is there a way to take a backup of a VM (an image captured following the sysprep method) and then make that image in the gallery visible to someone under their subscription? For example, I create a VM and archive it off to the image gallery, then my colleague comes along and wants to create a VM from that image (the colleague cannot be a co-administrator on my subscription).
Alternatively, is there just a way to move the VM to a different subscription without archiving it off and recreating it?
I have found the following, but it is a bit convoluted and requires the purchase of third-party software:
http://gauravmantri.com/2012/07/04/how-to-move-windows-azure-virtual-machines-from-one-subscription-to-another/
Many thanks
Richard Clarke
I don't think you can use images created in one subscription to create VMs in another subscription. You would need to copy those images into your subscription. Since images are nothing but Page Blobs in your storage account, you would need to copy them into a storage account in your target subscription, create images off of them and then deploy VMs. I'm not aware of any other way around it.
Regarding your comment that it requires the purchase of third-party software, that's not really true. You don't have to buy 3rd-party software. The main thing is to move your VHDs (which are page blobs) from one subscription to another. Do take a look at Step 1 - Copy Blobs in that blog post. It has a link to a console application with source code that you can use to copy blobs across. I used Cloud Storage Studio to explore my blob storage. You can use any other storage explorer to check the contents of blob storage (including the Windows Azure portal). Cerebrata recently released a free blob storage explorer which you may want to check out: http://www.cerebrata.com/labs/azure-explorer.
Unfortunately that's the only way that I know of to create Azure VMs from images created under different subscriptions.
This is now possible using Azure Shared Image Galleries
Edit:
@Shanky: the subscription ID is part of the resource ID for the image version. It will look like this:
/subscriptions/$subscriptionId/resourceGroups/$resourceGroup/providers/Microsoft.Compute/galleries/$galleryName/images/$galleryImageDefinition/versions/$imageVersion
You pass that as the ImageReference.id when creating a VM. It's similar to creating a VM from a marketplace image but you just pass the ImageReference ID instead of the publisher/sku/etc.
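Assembling that ID by hand is error-prone, so a small helper can be handy. A sketch, with placeholder values:

```python
def gallery_image_version_id(subscription_id: str, resource_group: str,
                             gallery: str, image_definition: str,
                             version: str) -> str:
    """Assemble the resource ID passed as ImageReference.id when creating a
    VM from a Shared Image Gallery image version in another subscription."""
    return ("/subscriptions/{}/resourceGroups/{}/providers/"
            "Microsoft.Compute/galleries/{}/images/{}/versions/{}").format(
        subscription_id, resource_group, gallery, image_definition, version)

# Placeholder subscription/group/gallery names for illustration.
print(gallery_image_version_id(
    "00000000-0000-0000-0000-000000000000", "images-rg",
    "myGallery", "ubuntu-base", "1.0.0"))
```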

How safe is storing my role service package in Azure blob storage?

Azure Management Portal allows deploying a service from a service package earlier uploaded to Azure blob storage. This looks very convenient but kind of paranoid - what if some third party accesses the blob storage and retrieves the executables comprising my role?
How safe is storing role service package in Azure blob storage? What are better alternatives if any?
There are a few attack vectors to get to blob storage and you are in control of all of them, so it is up to you to secure the access. Specifically:
Securing your primary and secondary secret keys to the storage account. Loss of these keys would compromise the storage account. By default, all access to blob storage must be authenticated.
Securing any and all management certificates (private key) for the subscription. A management certificate holder can always get the storage keys for all storage accounts in the sub, so this is a total compromise.
Securing the container with the package. If you mark the container public, folks can get it without a storage key.
Removing any Signed Identifiers or making sure you are not unwittingly allowing access through a poorly crafted SAS signature.
That's it. Unless there is an actual security issue with blob storage service (that we currently don't know about), those are the only ways to get access. If you secure it, it is pretty safe and I don't think there is a better alternative to store a package in Windows Azure.
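For the last point, the stored access policies on a container can be audited by parsing the Get Container ACL response, which returns a SignedIdentifiers XML document. A sketch with an illustrative policy ID:

```python
import xml.etree.ElementTree as ET

def signed_identifier_ids(acl_xml: str) -> list:
    """Extract stored access policy IDs from a Get Container ACL response.
    Any ID listed here can be referenced by a SAS token, so unexpected
    entries are worth removing."""
    root = ET.fromstring(acl_xml)
    return [el.findtext("Id") for el in root.iter("SignedIdentifier")]

# Illustrative ACL response body with one (suspicious) stored access policy.
sample = """<SignedIdentifiers>
  <SignedIdentifier>
    <Id>legacy-full-access</Id>
    <AccessPolicy><Permission>rwdl</Permission></AccessPolicy>
  </SignedIdentifier>
</SignedIdentifiers>"""
print(signed_identifier_ids(sample))  # ['legacy-full-access']
```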
One last thing: the package you upload by default is actually encrypted. Even if someone downloaded it, the only thing that can decrypt it is the fabric controller. I think you have other issues you should worry more about.

What are viable ways to develop an Azure app on multiple machines

The scenario is that I am rebuilding an application that is presently SQL and classic ASP. However, I want to update this a bit to leverage Azure Tables. I know that the Azure SDK has the Dev Fabric storage available, and I guess it's an option to have that installed on all of my machines.
But I'm wondering if there is a less 'invasive' way to mimic Azure Tables. Do object DBs or document DBs provide a reasonable facsimile that could be used for early prototyping? Or is making the move from them to the Azure SDK tables just more headache than it's worth?
In my opinion you should skip the fake Azure tables completely. Even the MS development storage is not an exact match to how things will actually run in the cloud. You get 1M transactions for $1, 1GB of storage for $0.15 and $0.15 per GB in/out of the data centre. If you're just prototyping, live dangerously and spend $10.
If you're new to working with Azure tables and you try to use a development storage or some other proxy you'll save yourself that much money in time spent reworking your code to work against the real thing.
If you're just using tables and not queues or blobs, $10 will go a long way.
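At those (now historical) prices, the back-of-the-envelope math is simple; a sketch:

```python
# Rough prototyping-cost estimate using the (dated) prices quoted above:
# $1 per 1M transactions, $0.15 per GB stored, $0.15 per GB transferred.
def monthly_cost(transactions: int, stored_gb: float, transfer_gb: float) -> float:
    return (transactions / 1_000_000 * 1.00
            + stored_gb * 0.15
            + transfer_gb * 0.15)

# e.g. a heavy prototyping month: 2M transactions, 5 GB stored, 10 GB moved.
print(round(monthly_cost(2_000_000, 5, 10), 2))  # 4.25
```

Even an aggressive month of table experiments stays in single-digit dollars, which is the answer's point.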
Each Azure "project" (which is like an Azure account) is initially limited to 5 hosted storage accounts. Let's say you have one storage account for your actual content, and one for diagnostics. Now let's say you want a dev and QA storage account for each of those, respectively, so you don't damage production data. You've now run out of your storage accounts (in the above scenario, you'd need 6 hosted accounts). You'd need to call Microsoft and ask for this limit to be increased...
If, instead, you used the Dev Fabric for your tables during development / local testing, you'll free up Azure storage accounts (the ones used for dev - you'd still want QA storage accounts to be in Azure, not Dev Fabric).
One scenario where you can't rely on Dev Fabric for storage: performance/load testing.
