How can you use Managed Identity authentication and the Blob connector?

The Blob API connector only seems to have an option to enter a SAS key, not to use Managed Identity.
I can make "raw HTTP" requests instead (e.g. access-to-blob-storage-using-managed-identity-in-logic-apps), but then I lose all the Blob actions, e.g. "List Blobs".
So is there any way to keep the nice features of the Blob actions and still use Managed Identity?

I tested this on my side and was able to use the Blob actions with Managed Identity. You can follow these steps:
1. Go to your Logic App > Identity > turn on the system-assigned Managed Identity.
2. Go to your Storage account > Access control (IAM) > Add role assignment, and assign your Logic App the Storage Blob Data Contributor role.
3. Then you can list blobs in the Logic App using the List Blobs action in the designer; the run output shows the blob listing.
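Outside of the designer, the same Managed Identity flow can be sketched with the Azure SDK. In this hypothetical Python example, DefaultAzureCredential resolves to the system-assigned identity when the code runs on Azure; the account and container names are placeholders:

```python
# A minimal sketch (not the Logic App itself): list blobs with a
# Managed Identity instead of a SAS key. Assumes the identity holds a
# data-plane role such as Storage Blob Data Contributor; the account
# and container names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

credential = DefaultAzureCredential()  # resolves to the managed identity on Azure
service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",
    credential=credential,
)

container = service.get_container_client("<container>")
for blob in container.list_blobs():
    print(blob.name, blob.size)
```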

Related

Azure Blob Storage file access

Is there a configuration in Azure Blob Storage that lets you link to a single file (or to a specific 'folder' in the Azure portal interface), but redirects the viewer to a login screen if they're not already signed in?
I am not terribly familiar with Azure Blob Storage yet, but I see an option for 'anonymous' access, which isn't what I want (I want viewers to be logged in and have the proper permissions for that container), and I see an option for SAS, which isn't what I want either, because it grants access to anyone who has the link and is time-boxed.
https://learn.microsoft.com/en-us/answers/questions/435869/require-login-when-accessing-blob-storage-url.html
This link appears to be asking the same question, and the response says something about 'role-based authentication'. I get the concept of adding roles to users and using those for authorization, but even as the owner of the blob container I can't seem to just link to myservice.blob.core.windows.net/container/myfile.jpg and download it without appending a SAS key.
Nor is there a way to link to myservice.blob.core.windows.net/container/myfolder and have it authenticate the user and then take them into that 'directory' in the UI.
If the access level of the container is set to public (anonymous), we can directly open the blob URI in the browser to access the blobs.
If the access level of the container is set to private, opening the blob URI in the browser doesn't redirect the user to a login screen. Instead, it returns a ResourceNotFound error.
Even if the proper role is assigned in the role assignments for the blob storage, we still cannot access the blob URI from the browser without appending a SAS token, because opening the blob URI directly in the browser doesn't trigger the OAuth flow.
Although it is not possible to open the blob URI in a browser and download the files directly, there are other ways to accomplish this.
We can use the Azure CLI, PowerShell, or the REST API to access the blob data as authenticated users.
If you want to access the blob data from a browser, you can use a Function App: enable authentication on the Function App, and authenticated users can then access the blob data via the Function App.
Reference: Access a blob file via URI over a web browser using new AAD based access control - Stack Overflow
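As a concrete companion to the CLI/PowerShell/REST options above, here is a sketch in Python that downloads a private blob as an authenticated AAD user (DefaultAzureCredential can, for example, pick up the az login user locally); the account, container, and blob names are taken from the question:

```python
# Sketch: download a private blob as an authenticated AAD user,
# instead of opening the blob URI in a browser (which skips OAuth).
# Requires a data-plane role such as Storage Blob Data Reader.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

blob = BlobClient(
    account_url="https://myservice.blob.core.windows.net",
    container_name="container",
    blob_name="myfile.jpg",
    credential=DefaultAzureCredential(),  # e.g. the 'az login' user locally
)

with open("myfile.jpg", "wb") as f:
    f.write(blob.download_blob().readall())
```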

Where to store the BigCommerce access token and other data

I am using Laravel to build an app for BigCommerce. I am able to get the access token, but I need to store it for future requests. What is the best way to store the app data for BigCommerce?
I've got this working by creating a DB schema in Laravel with tables like stores, users, and app_settings.
Whenever a user installs the app, I store the access token and other information, such as the store hash, in the stores table, and the user details in the users table.
Whenever the app is loaded, I can get the store and user information via the verified signed request payload. Using this, I can configure my app settings for the user and store those in the app_settings table.
So when I create a webhook for the store, I can get the store hash from the response as the producer key and then look up the store's access token by store hash in the stores table, as sketched below.
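Framework aside, the core of this answer is a table keyed by store hash. Below is a minimal, framework-agnostic Python/SQLite sketch of that pattern; the schema and function names are illustrative, not the actual Laravel code:

```python
# Illustrative sketch of the stores table: persist the access token
# keyed by store hash at install time, then look it up when a webhook
# arrives carrying the store hash. Not the actual Laravel schema.
import sqlite3

conn = sqlite3.connect("app.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS stores ("
    "  store_hash TEXT PRIMARY KEY,"
    "  access_token TEXT NOT NULL)"
)

def save_install(store_hash: str, access_token: str) -> None:
    conn.execute(
        "INSERT OR REPLACE INTO stores (store_hash, access_token) VALUES (?, ?)",
        (store_hash, access_token),
    )
    conn.commit()

def token_for(store_hash: str) -> str | None:
    row = conn.execute(
        "SELECT access_token FROM stores WHERE store_hash = ?", (store_hash,)
    ).fetchone()
    return row[0] if row else None
```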
If you're git-ignoring your config.json or .env files, you could store these there. However, after speaking with one of our Developer Advocates, I wanted to pass along some best-practice advice. :) You may want to consider using a secrets manager for option #1 in your decision here, a secrets manager being a tool to safely store these variables, like Secrets in GitHub or Key Vault in Azure.
Also, this resource may be helpful to review for your use case: https://www.codementor.io/#ccornutt/keeping-credentials-secure-in-php-kvcbrk55z
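For the .env route specifically, a minimal sketch of reading the token from the environment at runtime (BC_ACCESS_TOKEN is a hypothetical variable name):

```python
# Minimal sketch: read the access token from the environment so it
# never lives in source control. BC_ACCESS_TOKEN is a hypothetical name.
import os

access_token = os.environ["BC_ACCESS_TOKEN"]  # raises KeyError if unset
```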

How to view Blob Storage logs in Azure App Insights?

I just uploaded a blob to an Azure Blob Storage container. Almost immediately, there was an entry in the .log file when viewed using Azure Storage Explorer.
How do I view these logs (or similar) using Application Insights?
The only App Insights table that seems to have any data is the AzureActivity table, but it only shows the List Storage Account Keys activity, not the actual filename/size/etc. of the uploaded blob.
StorageBlobLogs is empty (this is the most reasonable-sounding table for the data to be in).
AzureMetrics is also empty.
Your requirement can be achieved, but you may need to do some custom filtering.
First, these entries will not appear in the query tables by default.
You need to configure the storage account's diagnostic settings to send its logs to a Log Analytics workspace.
Also, the operation names will be a little different: for example, 'create blob' appears as 'PutBlob'. This is because the basic interactions are implemented through the REST API, so you need to find the REST API request corresponding to each behavior; that gives you the name of the corresponding operation.
The official documentation covers this, so you can check it there.
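Once diagnostic settings are routing logs to the workspace, the StorageBlobLogs table can be queried, for example with the azure-monitor-query package. A hedged sketch, with a placeholder workspace ID:

```python
# Sketch: query StorageBlobLogs for recent blob uploads. Note the
# REST-level operation name is "PutBlob", not "create blob".
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
StorageBlobLogs
| where OperationName == "PutBlob"
| project TimeGenerated, Uri, StatusText
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",
    query=query,
    timespan=timedelta(hours=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```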

How can I fix the "invalid character" error when creating a project in the Azure Form Recognizer labeling tool?

I'm trying to label data using the Azure Form Recognizer labeling tool. My steps:
1. Create an Azure account and a Form Recognizer resource.
2. Download the Docker image.
3. Run the local web site.
4. Create a project, filling in a project name using a plain English string.
5. Get the "SAS token to blob storage" by opening "Get Shared Access Signature" on my Azure Storage account, selecting all permissions, and pasting that string into the "Azure blob storage / SAS URI" field in the tool.
6. Provide my endpoint and key (the endpoint URL is copied from the Azure portal "quick start" page).
7. Save.
Result: I cannot create a new project due to an "invalid character" error.
"invalid character" can be caused by using the blob storage SAS token rather than the blob container SAS token. Try creating and using a SAS token to the blob container in the "Azure blob storage / SAS URI" field.
It seems I should have picked the SAS token to the blob container instead of the blob storage. These two terms are quite similar to each other, and they are next to each other in the Azure Storage Explorer UI. I re-created the project with the correct steps and it worked so far.
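For reference, a container-level SAS (as opposed to the account-level one) can also be generated programmatically; a sketch using the azure-storage-blob package, with placeholders for the account details:

```python
# Sketch: generate a *container* SAS, which is what the labeling tool
# expects in the "Azure blob storage / SAS URI" field.
from datetime import datetime, timedelta, timezone
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

sas = generate_container_sas(
    account_name="<account>",
    container_name="<container>",
    account_key="<account-key>",
    permission=ContainerSasPermissions(read=True, write=True, delete=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)
sas_uri = f"https://<account>.blob.core.windows.net/<container>?{sas}"
print(sas_uri)
```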

Authenticating a Dynamics CRM plugin to access Web API 2 methods

I have written a plugin for Dynamics CRM. This plugin accesses a few Web API 2 methods deployed in the Azure cloud (via HTTPS). The plugin is triggered when contact data in the CRM changes. Many CRM account holders will update the contact data.
I am going to hard-code a 'secret key' (a one-time generated GUID) in the plugin and send this key every time I access the Web API methods. I'll validate this GUID in the Web API methods to prevent unauthorized access.
I would prefer not to store the secret key (GUID) in the source code.
Questions
What are my alternatives if I do not want to 'hard code' the secret key?
What are the security flaws in this approach?
Note
In general, all my Web APIs are authenticated by a custom authentication web api filter, but the Web APIs that are accessed from the plugin are not part of the custom authentication.
The CRM version is 2013.
As the previous answer states, the first option is to store your information in a custom configuration entity that you can retrieve from your plugin. Those records are protected by the CRM security model, so if your plugin runs in the calling user's context, you will need to either make sure the users have privileges to read that information (not really a good idea) or change the plugin to execute under an admin user context.
Another option is to use the Secure/Unsecure Configuration:
These are two (string) parameters that you can configure on the plugin step and then read from the plugin. I would say the secure configuration fits your requirement, but give it a look. You can also easily find examples of how to implement it.
The third and last option I can think of is to create an XML web resource and read it from the plugin. Again, you will need to make sure that the user context the plugin runs under has access to it.
I don't think this approach will ever be secure.
It's possible to extract the plugin assembly from CRM. Someone could then disassemble the assembly and find the GUID, so effectively your password is stored in plain text.
At the very least, you could store the user name/password/secret key in a CRM record. The CRM record can then be protected with CRM security.
You are probably better off implementing the authentication 'normally'.
