Primary and secondary location URI in a StorageUri must point to the same resource (Windows Azure) - azure-blob-storage

I am just trying to connect to my Blob Storage with the .NET SDK from an MVC application, and here is my code:
public static CloudBlobClient CreateClient(UnitOfWork uow)
{
    CloudStorageList credentials;
    CloudBlobClient client;
    credentials = uow.RepositoryFor<CloudStorageList>().GetAll(filter: xx => !xx.IsDeleted).FirstOrDefault();
    var storageCredentials = new StorageCredentials(credentials.Name, credentials.PrimaryAccessKey);
    var storage = new CloudStorageAccount(storageCredentials, true);
    client = storage.CreateCloudBlobClient();
    return client;
}
But I am facing the error when I reach this line:
var storage = new CloudStorageAccount(storageCredentials, true);
I have mentioned the error in the subject, i.e. "primary and secondary location URI in a StorageUri must point to the same resource".
Any help would be appreciated.
Regards,

var storageCredential = new StorageCredentials(*, **);
* is the account name: the name of your storage account.
** is the key value: the access key shown in the portal (see the image).
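In other words, a minimal sketch of what this describes, assuming the same classic WindowsAzure.Storage SDK as the question (the account name and key below are placeholders):
// Pass only the storage account name (not a URL) as the first argument; passing a full
// endpoint URI here can trigger the "primary and secondary location URI in a StorageUri
// must point to the same resource" error. "mystorageaccount" and the key are placeholders.
var storageCredentials = new StorageCredentials("mystorageaccount", "<primary-access-key>");
var storageAccount = new CloudStorageAccount(storageCredentials, true); // true = use HTTPS
var blobClient = storageAccount.CreateCloudBlobClient();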

Related

Getting an error while retrieving a blob using user assigned managed identity

We have C# code which is used to retrieve a blob from a storage account. The authentication is done using a user-assigned managed identity. This worked until December, but now we are getting a weird error as follows.
ManagedIdentityCredential authentication unavailable. The requested identity has not been assigned to this resource.
Status: 400 (Bad Request)
Content:
{"error":"invalid_request","error_description":"Identity not found"}
The managed identity has Storage Blob Data Contributor access on the storage account.
Attaching the code for reference:
public static async Task<string> GetBlobAsync()
{
    string storageName = "storage account name";
    Uri blobUri = new Uri("blob uri");
    // User-assigned managed identity: the identity's client ID is passed explicitly
    TokenCredential cred = new ManagedIdentityCredential("client id");
    var blobClient = new BlobClient(blobUri, cred, null);
    try
    {
        var downloadInfo = await blobClient.DownloadAsync();
        using (TextReader reader = new StreamReader(downloadInfo.Value.Content))
        {
            string metadataBlob = await reader.ReadToEndAsync();
            return metadataBlob;
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
        Console.WriteLine("");
        return null;
    }
}
P.S.: The three environment variables (app ID, app secret, and tenant ID) are correct.
I have been stuck here for almost a month. Nothing works.
This document demonstrates how to use managed identity to access App Configuration from App Service, but you can replace App Service with any other Azure service that supports managed identity. https://learn.microsoft.com/en-us/azure/azure-app-configuration/howto-integrate-azure-managed-service-identity
Here are a few things I'd like to call out:
Make sure the managed identity is enabled in the Azure service where your application runs.
When you are using a system-assigned managed identity, you don't need to provide the client ID. You only need to provide the client ID when you use a user-assigned managed identity (see the sketch below).
Make sure the managed identity is granted either the App Configuration Data Reader or App Configuration Data Owner role in the access control of your App Configuration instance.
Wait for at least 15 minutes after the role assignment for the permission to propagate.
Managed identity can ONLY work when your code is running in the Azure service. It will NOT work when running locally.
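As a minimal sketch of that distinction, assuming the same Azure.Identity and Azure.Storage.Blobs packages the question already uses (the blob URI and client ID are placeholders):
using Azure.Core;
using Azure.Identity;
using Azure.Storage.Blobs;

// Placeholder URI; substitute your own blob URI.
Uri blobUri = new Uri("https://<account>.blob.core.windows.net/<container>/<blob>");

// System-assigned managed identity: no client ID needed.
TokenCredential systemAssigned = new ManagedIdentityCredential();

// User-assigned managed identity: pass the client ID of that identity (placeholder below).
TokenCredential userAssigned = new ManagedIdentityCredential("<client-id>");

// Either credential can be handed to BlobClient.
var blobClient = new BlobClient(blobUri, userAssigned);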
Try this
Uri blobUri = new Uri("blob uri");
var cred = new DefaultAzureCredential(
    new DefaultAzureCredentialOptions { ManagedIdentityClientId = "your client id" });
var blobClient = new BlobClient(blobUri, cred, null);
ref: https://learn.microsoft.com/pt-br/dotnet/api/azure.identity.defaultazurecredential?view=azure-dotnet
Option 2 (worked for me)
Create a managed identity and add it to the App Service.
Assign the RBAC role "Storage Blob Data Contributor" on your storage resource.
Add the key AZURE_CLIENT_ID (the client ID of the identity that was created) to the App Service environment settings.
Code to access the blob
(you don't need to specify the client ID in the code because it will use the AZURE_CLIENT_ID configured in the App Service):
app.MapGet("/read", async () =>
{
    Uri blobUri = new Uri("https://xxxx.blob.core.windows.net/texts/text.txt");
    var cred = new DefaultAzureCredential();
    var blobClient = new BlobClient(blobUri, cred, null);
    var downloadInfo = await blobClient.DownloadAsync();
    using (TextReader reader = new StreamReader(downloadInfo.Value.Content))
    {
        string metadataBlob = await reader.ReadToEndAsync();
        return metadataBlob;
    }
});
Result (screenshot)

Upload to Azure Blob from Xamarin.Forms PCL

I'm trying to upload an image's stream to Azure Blob from a Xamarin.Forms PCL app using WindowsAzure.Storage 7.0.2-preview. For some reason, the StorageCredentials doesn't recognize the AccountName of the SAS token.
var credentials = new StorageCredentials("https://<ACCOUNT-NAME>.blob.core.windows.net/...");
CloudStorageAccount Account = new CloudStorageAccount(credentials, true);
And then uploading it like this:
public async Task<string> WriteFile(string containerName, string fileName, System.IO.Stream stream, string contentType = "")
{
    var container = GetBlobClient().GetContainerReference(containerName);
    var fileBase = container.GetBlockBlobReference(fileName);
    await fileBase.UploadFromStreamAsync(stream);
    if (!string.IsNullOrEmpty(contentType))
    {
        fileBase.Properties.ContentType = contentType;
        await fileBase.SetPropertiesAsync();
    }
    return fileBase.Uri.ToString();
}
How can I resolve my problem? Is there a better way of uploading to Azure Storage?
Thank you!
The StorageCredentials constructor that takes in a SAS token expects just the query part of the SAS token, not the full URI string. If you have the full URI to a resource, including the SAS token, the most common thing to do is to use the constructor for that object directly. For example, if you have the URI:
string myBlobUri = @"https://myaccount.blob.core.windows.net/sascontainer/sasblob.txt?sv=2015-04-05&st=2015-04-29T22%3A18%3A26Z&se=2015-04-30T02%3A23%3A26Z&sr=b&sp=rw&sip=168.1.5.60-168.1.5.70&spr=https&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D";
you can just create a blob object as such:
CloudBlockBlob fileBase = new CloudBlockBlob(new Uri(myBlobUri));
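Uploading then works the same way as in your WriteFile method; a minimal sketch (imageStream is a placeholder for the stream you already have):
// Sketch: the SAS in the URI (sp=rw) authorizes the upload; no separate credentials needed.
CloudBlockBlob fileBase = new CloudBlockBlob(new Uri(myBlobUri));
await fileBase.UploadFromStreamAsync(imageStream);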
If you do want to use the StorageCredentials and CloudStorageAccount classes for some reason (maybe you have an AccountSAS, for example), here is one way to make that work:
string sasToken = @"sv=2015-04-05&st=2015-04-29T22%3A18%3A26Z&se=2015-04-30T02%3A23%3A26Z&sr=b&sp=rw&sip=168.1.5.60-168.1.5.70&spr=https&sig=Z%2FRHIX5Xcg0Mq2rqI3OlWTjEg2tYkboXr1P9ZUXDtkk%3D";
StorageCredentials credentials = new StorageCredentials(sasToken);
CloudStorageAccount account = new CloudStorageAccount(credentials, "myaccount", "core.windows.net", true);
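With that account, the rest of your code can stay as it is; a minimal sketch (container and blob names are taken from the sample URI above and are placeholders):
// Sketch: drive the same upload path as the question's WriteFile method off the SAS-based account.
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("sascontainer");
CloudBlockBlob blob = container.GetBlockBlobReference("sasblob.txt");
await blob.UploadFromStreamAsync(stream);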

Where to use SAS Token in Xamarin.Android

I'm making an Android app that will connect to an Azure Storage Account to save information in a table. When I run the app in the simulator and press the button that opens the page that connects to the database, I get an exception stating "Shared Key is not supported using the PCL. Please use a SAS token."
So I followed the steps to generate a SAS token but I'm not sure what to do with the string. Can anyone suggest where I should place the string?
namespace UndergroundSports
{
    [Activity]
    public class austinBowlingSignUpPage : Activity
    {
        protected override async void OnCreate (Bundle savedInstanceState)
        {
            base.OnCreate (savedInstanceState);
            SetContentView (Resource.Layout.austinBowlingSignUpPage);
            EditText austinBowlingFullNameEntry = FindViewById<EditText> (Resource.Id.austinBowlingFullNameEntry);
            EditText austinBowlingEmailEntry = FindViewById<EditText> (Resource.Id.austinBowlingEmailEntry);
            Button austinBowlingSubmitButton = FindViewById<Button> (Resource.Id.austinBowlingSignUpButton);
            string sas = "https://undergroundathletes.blob.core.windows.net/underground-container?sv=2015-04-05&sr=c&sig=Gcgc28K%2B\nc6uQk9pkHRAotshR7zEU%3D&se=2016-04-20T18%3A13%3A31Z&sp=rwdl";
            string connectionString =
                "DefaultEndpointsProtocol=http;" +
                "AccountName=My_Account_Name;" +
                "AccountKey=My_Account_Key";
            CloudStorageAccount storageaccount = CloudStorageAccount.Parse (connectionString);
            CloudTableClient tableClient = storageaccount.CreateCloudTableClient ();
            CloudTable austinBowlingAthletes = tableClient.GetTableReference ("austinBowlingAthletesTable");
            await austinBowlingAthletes.CreateIfNotExistsAsync();
            austinBowlingSubmitButton.Click += async (sender, e) => {
                austinBowlingAthlete austinBowlingAthlete1 = new austinBowlingAthlete();
                austinBowlingAthlete1.fullname = austinBowlingFullNameEntry.ToString();
                austinBowlingAthlete1.email = austinBowlingEmailEntry.ToString();
                TableOperation insertOperation = TableOperation.Insert(austinBowlingAthlete1);
                await austinBowlingAthletes.ExecuteAsync(insertOperation);
            };
        }
    }
}
You need to create a StorageCredentials object from the SAS token, use the credentials to create a CloudStorageAccount, and call CreateCloudTableClient on the storage account to get a CloudTableClient:
// tableStorageUri is left undefined in the original answer; the endpoint below is an
// assumption for illustration, built from the account name in the question's blob URL.
StorageUri tableStorageUri = new StorageUri(new Uri("https://undergroundathletes.table.core.windows.net"));
StorageCredentials creds = new StorageCredentials(sas);
CloudStorageAccount storageAccount = new CloudStorageAccount(creds, null, null, tableStorageUri, null);
CloudTableClient tableClient = storageAccount.CreateCloudTableClient();
CloudTable austinBowlingAthletes = tableClient.GetTableReference("austinBowlingAthletesTable");
Everything else stays the same.
Check this example: Azure Storage sample for Xamarin.

Configuring Development Storage Account in Server Explorer

I have changed the ports that Azure Storage Emulator runs on from 10000,10001,10002 to 10003,10004,10005 from the config file at "C:\Program Files (x86)\Microsoft SDKs\Azure\Storage Emulator\WAStorageEmulator.exe.config"
Now when I try to access Development Storage from Server Explorer in Visual Studio 2013, it fails to access the updated ports. I tried to manually add external storage and specify the endpoints to reflect the updated ports with the following default storage account information:
DefaultEndpointsProtocol=http
AccountName=devstoreaccount1
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==
BlobEndpoint=http://127.0.0.1:10003/devstoreaccount1
QueueEndpoint=http://127.0.0.1:10004/devstoreaccount1
TableEndpoint=http://127.0.0.1:10005/devstoreaccount1
but that still does not allow it to connect. I also tried the same endpoints but without the storage account suffix. It even reverts the ports to 10000,10001,10002 when I refresh the External Storage. I assume it is reading from some config somewhere but I cannot seem to google any answer as to where this is being read from.
So how can I configure Server Explorer to reflect the updated ports?
The ports are hard-coded into the CloudStorageAccount class, so no, you can't modify them:
private static CloudStorageAccount GetDevelopmentStorageAccount(Uri proxyUri)
{
    UriBuilder uriBuilder = proxyUri != (Uri)null ? new UriBuilder(proxyUri.Scheme, proxyUri.Host) : new UriBuilder("http", "127.0.0.1");
    uriBuilder.Path = "devstoreaccount1";
    uriBuilder.Port = 10000;
    Uri uri1 = uriBuilder.Uri;
    uriBuilder.Port = 10001;
    Uri uri2 = uriBuilder.Uri;
    uriBuilder.Port = 10002;
    Uri uri3 = uriBuilder.Uri;
    uriBuilder.Path = "devstoreaccount1-secondary";
    uriBuilder.Port = 10000;
    Uri uri4 = uriBuilder.Uri;
    uriBuilder.Port = 10001;
    Uri uri5 = uriBuilder.Uri;
    uriBuilder.Port = 10002;
    Uri uri6 = uriBuilder.Uri;
    CloudStorageAccount cloudStorageAccount = new CloudStorageAccount(new StorageCredentials("devstoreaccount1", "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="), new StorageUri(uri1, uri4), new StorageUri(uri2, uri5), new StorageUri(uri3, uri6), (StorageUri)null);
    cloudStorageAccount.Settings = (IDictionary<string, string>)new Dictionary<string, string>();
    cloudStorageAccount.Settings.Add("UseDevelopmentStorage", "true");
    if (proxyUri != (Uri)null)
        cloudStorageAccount.Settings.Add("DevelopmentStorageProxyUri", proxyUri.ToString());
    cloudStorageAccount.IsDevStoreAccount = true;
    return cloudStorageAccount;
}
Unfortunately, there is no support for changing the Azure Storage Emulator ports.
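That limitation applies to Server Explorer and the SDK's built-in development-storage shortcut; your own code can still reach the remapped emulator by parsing an explicit connection string. A minimal sketch using the endpoints from the question, assuming the emulator really is listening on ports 10003-10005:
// Sketch: explicit emulator endpoints on the remapped ports, using the well-known
// devstoreaccount1 name and key from the question.
string connectionString =
    "DefaultEndpointsProtocol=http;" +
    "AccountName=devstoreaccount1;" +
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;" +
    "BlobEndpoint=http://127.0.0.1:10003/devstoreaccount1;" +
    "QueueEndpoint=http://127.0.0.1:10004/devstoreaccount1;" +
    "TableEndpoint=http://127.0.0.1:10005/devstoreaccount1";
CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();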

Is it possible to protect Azure connection strings that are referenced with CloudConfigurationManager?

I've read the MSDN blog posts on protecting sensitive data in web.config by encrypting the contents and setting up a certificate on Azure so they can be read back.
However, there is top-secret data in my 'service configuration' .cscfg files in the Visual Studio Azure Deployment project. We store connection strings and other sensitive data here so that the test system, also on Azure, can be directed to equivalent test back-end services.
This data is accessed with CloudConfigurationManager (e.g. .GetSetting("AwsSecretKey")) rather than WebConfigurationManager as discussed in the blog post.
Is it possible to protect this data in a similar way? It's important that we have different AWS and SQL connection strings in test and production, and that the production keys are hidden from me and the rest of the dev staff.
YES, we do this with an X509 cert uploaded in the deployment configuration. However, the settings are only as secure as your policy/procedures for protecting the private key! Here is the code we use in an Azure role to decrypt a value in the ServiceConfiguration:
/// <summary>Wrapper that will wrap all of our config based settings.</summary>
public static class GetSettings
{
    private static object _locker = new object();

    /// <summary>Locked dictionary that caches our settings as we look them up. Read access is ok but write access should be limited to only within a lock.</summary>
    private static Dictionary<string, string> _settingValues = new Dictionary<string, string>();

    /// <summary>Look up a given setting, first from the locally cached values, then from the environment settings, then from app settings. This handles caching those values in a static dictionary.</summary>
    /// <param name="settingsKey"></param>
    /// <returns></returns>
    public static string Lookup(string settingsKey, bool decrypt = false)
    {
        // have we loaded the setting value?
        if (!_settingValues.ContainsKey(settingsKey))
        {
            // lock our locker, no one else can get a lock on this now
            lock (_locker)
            {
                // now that we're alone, check again to see if someone else loaded the setting after we initially checked it
                // if no one has loaded it yet, still, we know we're the only one that's going to load it because we have a lock
                // and they will check again before they load the value
                if (!_settingValues.ContainsKey(settingsKey))
                {
                    var lookedUpValue = "";
                    // lookedUpValue = RoleEnvironment.IsAvailable ? RoleEnvironment.GetConfigurationSettingValue(settingsKey) : ConfigurationManager.AppSettings[settingsKey];
                    // CloudConfigurationManager.GetSetting added in 1.7 - if in Role, get from ServiceConfig else get from web config.
                    lookedUpValue = CloudConfigurationManager.GetSetting(settingsKey);
                    if (decrypt)
                        lookedUpValue = Decrypt(lookedUpValue);
                    _settingValues[settingsKey] = lookedUpValue;
                }
            }
        }
        return _settingValues[settingsKey];
    }

    private static string Decrypt(string setting)
    {
        var thumb = Lookup("DTSettings.CertificateThumbprint");
        X509Store store = null;
        try
        {
            store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
            store.Open(OpenFlags.ReadOnly);
            var cert = store.Certificates.Cast<X509Certificate2>().Single(xc => xc.Thumbprint == thumb);
            var rsaProvider = (RSACryptoServiceProvider)cert.PrivateKey;
            return Encoding.ASCII.GetString(rsaProvider.Decrypt(Convert.FromBase64String(setting), false));
        }
        finally
        {
            if (store != null)
                store.Close();
        }
    }
}
You can then leverage RoleEnvironment.IsAvailable to decrypt values only in the emulator or deployed environment, thereby running the web role in local IIS using an unencrypted app setting with key="MyConnectionString" for local debugging (without the emulator):
ContextConnectionString = GetSettings.Lookup("MyConnectionString", decrypt: RoleEnvironment.IsAvailable);
Then, to complete the example, we created a simple WinForms app with the following code to encrypt/decrypt the value with the given cert. Our production team maintains access to the production cert and encrypts the necessary values using the WinForms app. They then provide the DEV team with the encrypted value. You can find a full working copy of the solution here. Here's the main code for the WinForms app:
private void btnEncrypt_Click(object sender, EventArgs e)
{
    var thumb = tbThumbprint.Text.Trim();
    var valueToEncrypt = Encoding.ASCII.GetBytes(tbValue.Text.Trim());
    var store = new X509Store(StoreName.My, rbLocalmachine.Checked ? StoreLocation.LocalMachine : StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    var cert = store.Certificates.Cast<X509Certificate2>().Single(xc => xc.Thumbprint == thumb);
    var rsaProvider = (RSACryptoServiceProvider)cert.PublicKey.Key;
    var cypher = rsaProvider.Encrypt(valueToEncrypt, false);
    tbEncryptedValue.Text = Convert.ToBase64String(cypher);
    store.Close();
    btnCopy.Enabled = true;
}

private void btnDecrypt_Click(object sender, EventArgs e)
{
    var thumb = tbThumbprint.Text.Trim();
    var valueToDecrypt = tbEncryptedValue.Text.Trim();
    var store = new X509Store(StoreName.My, rbLocalmachine.Checked ? StoreLocation.LocalMachine : StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly);
    var cert = store.Certificates.Cast<X509Certificate2>().Single(xc => xc.Thumbprint == thumb);
    var rsaProvider = (RSACryptoServiceProvider)cert.PrivateKey;
    tbDecryptedValue.Text = Encoding.ASCII.GetString(rsaProvider.Decrypt(Convert.FromBase64String(valueToDecrypt), false));
}

private void btnCopy_Click(object sender, EventArgs e)
{
    Clipboard.SetText(tbEncryptedValue.Text);
}
