How can I fix the "invalid character" error when creating a project in the Azure Form Recognizer labeling tool?

I'm trying to label data using the Azure Form Recognizer labeling tool. These are my steps:

1. Create an Azure account and a Form Recognizer resource.
2. Download the Docker image.
3. Run the local web site.
4. Create a project and fill in a project name using a plain English string.
5. Get the "SAS token to blob storage" by opening "Get Shared Access Signature" on my Azure Storage Account, selecting all permissions, and pasting that string into the "Azure blob storage / SAS URI*" field in the tool.
6. Provide my endpoint and key (the endpoint URL is copied from the Azure Portal "quick start" page).
7. Save.

Result: I cannot create a new project due to an "invalid character" error.

"invalid character" can be caused by using the blob storage SAS token rather than the blob container SAS token. Try creating and using a SAS token to the blob container in the "Azure blob storage / SAS URI" field.

It seems I should have picked the SAS token for the blob container instead of the one for the blob storage account. The two terms are quite similar and sit right next to each other in the Azure Storage Explorer UI. I re-created the project with the correct steps and it has worked so far.
Two screenshots from Azure Storage Explorer are attached for comparison; hope this helps.
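For completeness, the container-scoped SAS can also be generated in code instead of through Storage Explorer. Below is a minimal Python sketch using the azure-storage-blob package; the account name, key, and container name are placeholders you would replace with your own values.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Placeholder values -- substitute your own storage account details.
account_name = "mystorageaccount"
account_key = "<storage-account-key>"
container_name = "labeling-data"

# Generate a SAS scoped to the *container* (not the whole storage account),
# which is what the labeling tool expects in its SAS URI field.
sas_token = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, write=True, list=True, delete=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=24),
)

# Paste the resulting URI into the "Azure blob storage / SAS URI" field.
sas_uri = f"https://{account_name}.blob.core.windows.net/{container_name}?{sas_token}"
print(sas_uri)
```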

Related

Best approach to download files from Azure Blob Storage using file URLs

What is the best approach for the following requirement: I have files stored in Azure Blob Storage with a Private endpoint. I need to show to the user, a table with a column containing these file URLs in the browser. When a user clicks on the link, it should either open the file in a new tab or download the file.
I could show the URLs and download files using a Power BI report when the Blob Storage has public access, but in my scenario it's a Private endpoint. I tried appending the SAS token to the URL, and that also worked, but then the SAS token is visible in the browser, which is not allowed in my case. So this also does not work for my scenario.
Can we achieve this using Power BI or Power Apps or any other tools/api?
Could you please suggest the steps?
Thanks
I tried in my environment and got the results below:
If you need to download a blob through the browser from a private container, you need the Blob URL + SAS token, because the SAS token is what grants read/write access to the blob.
If you are using a public container, you can download with the Blob URL alone through the browser.
There is no option to call your Blob URL while hiding the SAS token.
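To make the "Blob URL + SAS token" idea concrete, here is a minimal Python sketch using the azure-storage-blob package that builds such a URL for a single blob; the account, container, and blob names are placeholders.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Placeholder values for illustration only.
account_name = "mystorageaccount"
account_key = "<storage-account-key>"
container_name = "reports"
blob_name = "invoice-2023-01.pdf"

# A short-lived, read-only SAS scoped to a single blob.
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
)

# Blob URL + SAS token: this full URL is downloadable from a browser.
url = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{blob_name}?{sas_token}"
)
print(url)
```

Anyone holding this full URL can download the blob until the SAS expires, which is exactly why it needs to be kept out of the browser in this scenario.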
As a workaround, if you need to hide the SAS token from the user and still access the blob, you can make use of the Power Automate connectors here:
I have connected to my Azure blob storage with these Power Automate connectors. 1) I first connected to my Blob URL endpoint and created a SAS URI with this connector.
The SAS URI is created successfully by the connector, and I saved the SAS token inside a Compose variable so that it stays hidden.
Later I call an HTTP action using the SAS URI to get the image content as the HTTP body in the output.
When I decode the HTTP body content, I get my blob image successfully.
To convert base64 to an image, you can use this link:
If you want your users to access this blob, you can further convert the base64-encoded string into an image and save the file in OneDrive or SharePoint for your users to access; you can refer to this link by eric-cheng.
If you want to email your users the blob file, that can also be done later using the Outlook, Gmail, etc. connectors.
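For the base64-to-image step, a rough Python equivalent of decoding the returned body into a file could look like the sketch below; the variable name and output path are assumptions for illustration.

```python
import base64

# Assume `http_body_b64` holds the base64-encoded content returned by the
# HTTP action (how you obtain it depends on your flow or backend).
http_body_b64 = "<base64 string from the HTTP response body>"

# Decode the base64 string back into raw image bytes.
image_bytes = base64.b64decode(http_body_b64)

# Write the bytes out as an image file the user can open.
with open("blob-image.jpg", "wb") as f:
    f.write(image_bytes)
```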

Azure Blob Storage file access

Is there a configuration in Azure Blob storage that lets you link to a single file (or one that lets you link to a specific 'folder' in the Azure portal interface), but redirects the viewer into a login screen if they're not already signed in?
I am not terribly familiar with Azure Blob storage yet, but I see an option for 'anonymous' access, which isn't what I want (I want them to need to be logged in and have the proper permissions for that container), and I see an option for SAS (which isn't what I want either, because it grants access to anyone who has the link, and is time-boxed).
https://learn.microsoft.com/en-us/answers/questions/435869/require-login-when-accessing-blob-storage-url.html
This link appears to be asking the same question, and the response says something about 'role-based authentication'. I get the concept of adding roles to users and using those for authorization, but even as the owner of the blob container I can't seem to just link to myservice.blob.core.windows.net/container/myfile.jpg and download it without appending a SAS key.
Nor is there a way to link to myservice.blob.core.windows.net/container/myfolder and have it authenticate the user and then take them into that 'directory' in the UI.
If the access level of the container is set to public (anonymous), we can open the Blob URI directly in the browser to access the blobs.
If the access level of the container is set to private, opening the Blob URI in the browser doesn't redirect the user to a login screen. Instead, it will give a ResourceNotFound error.
Even if the proper role is assigned in the Role Assignments for the blob storage, we would still not be able to access the Blob URI from the browser without appending the SAS token, because opening the direct Blob URI in the browser doesn't trigger the OAuth flow.
Even though it is not possible to open the blob URI from the browser and download the files directly, there are other ways to accomplish this.
We can use the Azure CLI, PowerShell, or the REST API to access the blob data as an authenticated user.
If you want to access the blob data from the browser, we can use a Function App: enable authentication on the Function App, and then authenticated users can access the blob data via the Function App.
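As a sketch of the authenticated (non-SAS) route, the following minimal Python example uses azure-identity and azure-storage-blob; it assumes the signed-in identity has been granted a Storage Blob Data role on the container, and the account, container, and file names are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# DefaultAzureCredential picks up the signed-in identity (Azure CLI login,
# managed identity, environment variables, etc.) and handles the OAuth flow
# that a plain browser request to the blob URI never triggers.
credential = DefaultAzureCredential()

blob = BlobClient(
    account_url="https://myservice.blob.core.windows.net",
    container_name="container",
    blob_name="myfile.jpg",
    credential=credential,
)

# Download the blob as the authenticated user -- no SAS token appended.
with open("myfile.jpg", "wb") as f:
    f.write(blob.download_blob().readall())
```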
Reference: azure - Access a blob file via URI over a web browser using new AAD based access control - Stack Overflow

How to view Blob Storage logs in Azure App Insights?

I just uploaded a blob to an Azure Blob Storage Container. Almost immediately, there was an entry in the .log file when viewed using Azure Storage Explorer:
How do I view these logs (or similar) using Application Insights?
The only App Insights table that seems to have any data is the AzureActivity table, but it only shows the List Storage Account Keys activity, not the actual filename/size/etc of the uploaded blob.
StorageBlobLogs is empty (this is the most reasonable-sounding table for where the data would be):
AzureMetrics is also empty:
Your requirement can be achieved, but you may need to do some custom filtering.
First, the log analysis of storage will not capture these things on its own.
You need to send the logs to a Log Analytics workspace (via a diagnostic setting on the blob service):
Also, the operation names will be a little different; for example, 'create blob' shows up as 'PutBlob'. (This is because the basic interactive behaviors are implemented through the REST API, so you need to find the REST API request corresponding to each behavior; that request name is what appears in the logs.)
This is the official doc; you can give it a check.
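Once the diagnostic setting is streaming to the workspace, the StorageBlobLogs table can also be queried programmatically. The following is a rough Python sketch using the azure-monitor-query package; the workspace ID is a placeholder and the projected columns are only a suggestion.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder ID of the Log Analytics workspace that receives the
# storage diagnostic logs.
workspace_id = "<log-analytics-workspace-id>"

client = LogsQueryClient(DefaultAzureCredential())

# Note the REST-style operation name: an uploaded blob shows up as
# "PutBlob", not "create blob".
query = """
StorageBlobLogs
| where OperationName == "PutBlob"
| project TimeGenerated, Uri, StatusText, ObjectKey
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id=workspace_id,
    query=query,
    timespan=timedelta(hours=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```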

How can you use Managed Identity authentication and the Blob connector?

The blob API Connector only seems to have an option to enter a SAS key, not to use Managed Identity.
I can do "raw HTTP" requests, e.g. access-to-blob-storage-using-managed-identity-in-logic-apps, but then I lose all the Blob actions, e.g. "List Blobs".
So is there any way to have the nice features of the Blob actions, and use Managed Identity?
So is there any way to have the nice features of the Blob actions, and use Managed Identity?
I tested on my side and can use the Blob actions with Managed Identity. You can refer to the following steps.
1. Go to your logic app > Identity > turn on System assigned Managed Identity.
2. Go to your Storage account > Access control (IAM) > Add role assignment and assign your logic app the Storage Blob Data Contributor role, as below:
3. Then you can list blobs in the logic app. Use the following designer:
And the output is as below:
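Outside the Logic Apps designer, the same managed-identity pattern can be exercised with the SDK. As a rough illustration, the minimal Python sketch below lists blobs using a managed identity; the account and container names are placeholders, and it assumes the identity holds a Storage Blob Data role on the storage account.

```python
from azure.identity import ManagedIdentityCredential
from azure.storage.blob import BlobServiceClient

# When this runs inside an Azure resource with a system-assigned managed
# identity (and that identity has a Storage Blob Data role on the account),
# no SAS key or connection string is needed.
credential = ManagedIdentityCredential()

service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.core.windows.net",
    credential=credential,
)

# Rough equivalent of the connector's "List Blobs" action.
container = service.get_container_client("mycontainer")
for blob in container.list_blobs():
    print(blob.name)
```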

message.Attachments.ContentURL differences between Bot Emulator and production

In my bot (using the C# version of BotBuilder), I expect to receive attached files from users.
I am accepting them by processing the message.Attachments object.
When running in the Emulator, for an attached file I get a ContentURL like this:
http://localhost:9000/content/8a684db8?file=IMG-20160503-WA0002.jpg
From here I parse the URL to get the filename and store the file in a local Azure storage blob.
When I deploy my bot to Azure and upload files from Telegram (the only chat app I have connected so far), the same file is posted with a URL like this:
https://bcattachmentsprod.blob.core.windows.net/635994216000000000/3DOUR10S0J2IL4
From here I am losing the filename; inspecting the other message.Attachments properties, it does not seem to be there. Making a WebRequest to this URL in the hope of finding the filename somewhere in the headers also does not yield results.
Is this the intended behavior for the posted URL when deployed to Azure, and if so, how can I get to the filename that the user attached?
I do rather like how the ContentURL is formed when running under the Emulator.
I'm not familiar with BotBuilder, but the blob name is always part of the URL that serves as the address for that blob.
The storage emulator uses well-known endpoints. For Blob storage, the URL will always have the following form, where "mycontainer" is the name of your container and "myblob" is the name of your blob. So, as you can see, the blob name is part of the URL path.
http://127.0.0.1:10000/devstoreaccount1/mycontainer/myblob
The emulator uses this URL format because it is addressing a resource on your local computer.
In your Azure storage account in the cloud, the URL will look like this, where "myaccount" is the name of your storage account, "mycontainer" is the name of your container, and "myblob" is the name of your blob:
https://myaccount.blob.core.windows.net/mycontainer/myblob
In your example above, "file=IMG-20160503-WA0002.jpg" is not part of a Blob storage URL. Perhaps this is appended by BotBuilder?
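To make the difference concrete, the blob name can be read straight from the URL path, while the original filename in the Emulator example only appears as a query-string parameter. A small Python sketch using the two URLs quoted above:

```python
from urllib.parse import parse_qs, urlparse

emulator_url = "http://localhost:9000/content/8a684db8?file=IMG-20160503-WA0002.jpg"
cloud_url = "https://bcattachmentsprod.blob.core.windows.net/635994216000000000/3DOUR10S0J2IL4"

# In the Emulator example, the original filename only survives as a
# query-string parameter (possibly appended by the framework, as noted above).
print(parse_qs(urlparse(emulator_url).query).get("file"))  # ['IMG-20160503-WA0002.jpg']

# In the cloud URL, the last path segment is the blob name that was chosen
# when the attachment was stored, which is not the user's original filename.
print(urlparse(cloud_url).path.rsplit("/", 1)[-1])  # 3DOUR10S0J2IL4
```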
Please see these topics for more information:
https://msdn.microsoft.com/en-us/library/azure/dd135715.aspx
https://azure.microsoft.com/en-us/documentation/articles/storage-use-emulator/
