How to view Blob Storage logs in Azure App Insights?

I just uploaded a blob to an Azure Blob Storage container. Almost immediately, an entry appeared in the .log file when viewed using Azure Storage Explorer.
How do I view these logs (or similar) using Application Insights?
The only App Insights table that seems to have any data is the AzureActivity table, but it only shows the List Storage Account Keys activity, not the actual file name, size, etc. of the uploaded blob.
StorageBlobLogs is empty (this sounds like the most reasonable table for the data to land in).
AzureMetrics is also empty.

Your requirement can be achieved, but you may need to do some custom filtering.
First, the storage account will not log these operations anywhere queryable by default.
You need to send the resource logs to a Log Analytics workspace by adding a diagnostic setting on the storage account.
Note also that the operation names will be a little different: 'create blob', for example, shows up as 'PutBlob'. This is because the basic interactive behaviors are all implemented through the REST API, so you need to find the REST API request corresponding to each behavior; that request's name is the name the operation is logged under.
This is covered in the official doc, which you can check.
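Once the logs are flowing, you can query the StorageBlobLogs table in the workspace for the upload. Below is a minimal sketch using the azure-monitor-query Java SDK; the workspace ID is a placeholder, and the projected columns are just a reasonable starting point:

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.monitor.query.LogsQueryClient;
import com.azure.monitor.query.LogsQueryClientBuilder;
import com.azure.monitor.query.models.LogsQueryResult;
import com.azure.monitor.query.models.LogsTableRow;
import com.azure.monitor.query.models.QueryTimeInterval;

public class BlobUploadLogs {
    public static void main(String[] args) {
        LogsQueryClient client = new LogsQueryClientBuilder()
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();

        // "PutBlob" is the REST operation behind "create blob" in the portal.
        String kql = "StorageBlobLogs"
                + " | where OperationName == \"PutBlob\""
                + " | project TimeGenerated, Uri, StatusText, RequestBodySize";

        // Placeholder workspace ID; use your Log Analytics workspace's ID.
        LogsQueryResult result = client.queryWorkspace(
                "<workspace-id>", kql, QueryTimeInterval.LAST_DAY);

        for (LogsTableRow row : result.getTable().getRows()) {
            System.out.println(row.getRow());
        }
    }
}
```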

Related

Best approach to download files from Azure Blob Storage using file URLs

What is the best approach for the following requirement: I have files stored in Azure Blob Storage with a Private endpoint. I need to show to the user, a table with a column containing these file URLs in the browser. When a user clicks on the link, it should either open the file in a new tab or download the file.
I could show the URLs and download the files using a Power BI report when the Blob Storage had public access. But in my scenario, it's a Private endpoint. I tried appending the SAS token to the URL, and that also worked, but then the SAS token is visible in the browser, which is not allowed in my case. So this also does not work for my scenario.
Can we achieve this using Power BI or Power Apps or any other tools/api?
Could you please suggest the steps?
Thanks
I tried this in my environment and got the results below.
If you need to download a blob through the browser from a private container, you need the blob URL plus a SAS token, because the SAS token is what grants read/write access to the blob.
If you are using a public container, you can download with the blob URL alone through the browser.
There is no option to call your blob URL while hiding the SAS token.
As a workaround, if you need to hide the SAS token from the user and still access the blob, you can make use of the Power Automate connectors:
I connected to my Azure Blob Storage with these Power Automate connectors: I first connected to my blob URL endpoint, and then created a SAS URI with this connector.
The SAS URI is created successfully by the connector, and I saved the SAS token inside a Compose action so that it stays hidden.
Later I call an HTTP action against the SAS URL and get the image content as the HTTP body in the output.
When I decode the HTTP body content, I get my blob image successfully.
To convert the Base64 string to an image, you can use this link.
If you want your users to access this blob, you can further convert the Base64-encoded string into an image and save the file to OneDrive or SharePoint for your users to access; you can refer to this link by eric-cheng.
If you want to email your users the blob file, that can also be done later using the Outlook, Gmail, etc. connectors.
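For the decoding step above, turning the Base64 HTTP body back into an image file is a one-liner in most languages; here is a minimal Java sketch, with the output file name chosen arbitrarily:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class DecodeBlobImage {
    public static void main(String[] args) throws Exception {
        // The Base64 string captured from the HTTP body in the flow output.
        String b64 = args[0];

        // Decode and write the bytes back out as an image file.
        byte[] imageBytes = Base64.getDecoder().decode(b64);
        Files.write(Path.of("blob-image.png"), imageBytes);
    }
}
```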

How to ensure a user sees only the images pertaining to him from private Azure Blob Storage?

Some of the users' images are stored in an Azure blob that is not publicly accessible. In our scenario we upload the users' images to a private blob container, and they later need to be shown on the client side (Angular). Moreover, a user should only be able to see the images related to him, not the images of other users.
We can generate the list of image URLs on the server side, but when these are passed to the client side to render, they naturally fail because the blob is not public.
Now, since all the users who would access the application are internal to the organization, I believe authorization to access the images can be achieved with AAD/SAS. At the same time, however, I fail to understand how I would ensure that user X is not able to read the images of user Y.
Regarding the issue, you can use the following suggestions to implement it.
Store the user information and his image information (such as the image's Azure blob container name and blob name) in a database.
When the user wants to access his images, query the database with the user information to get the image records.
After getting the image information, use the SDK to create a SAS token, then return the image's blob URL with the SAS token to the client. Regarding how to create a SAS token, see the sketch below.
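As an illustration of the last step, here is a minimal sketch using the azure-storage-blob Java SDK. The connection-string environment variable is an assumption, and in practice the container and blob names would come from the database lookup in the previous step:

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;
import com.azure.storage.blob.sas.BlobSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;
import java.time.OffsetDateTime;

public class ImageSas {
    // Returns a time-limited, read-only URL for one blob belonging to the user.
    public static String readUrlFor(String containerName, String blobName) {
        BlobClient blob = new BlobClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName(containerName)
                .blobName(blobName)
                .buildClient();

        // Short-lived, read-only SAS: a token issued for user X's blob is
        // useless for reading user Y's blobs.
        BlobServiceSasSignatureValues sas = new BlobServiceSasSignatureValues(
                OffsetDateTime.now().plusMinutes(15),
                new BlobSasPermission().setReadPermission(true));

        return blob.getBlobUrl() + "?" + blob.generateSas(sas);
    }
}
```

Because the server only issues SAS tokens for blobs that the database maps to the requesting user, user X never receives a working URL for user Y's images.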

How can I fix the "invalid character" error when creating a project in the Azure Form Recognizer labeling tool

I'm trying to label data using the Azure Form Recognizer labeling tool.
create Azure account and Form Recognizer resource
download docker image
run local web site
create a project, filling in a project name as a plain English string.
I get the "SAS token to blob storage" by opening "Get Shared Access Signature" on my Azure Storage Account, selecting all permissions, and then pasting that string into the "Azure blob storage / SAS URI*" field in the tool.
provide my endpoint and key (the endpoint URL is copied from the Azure Portal "quick start" page)
save.
Result:
cannot create a new project due to an "invalid character" error.
"invalid character" can be caused by using the blob storage SAS token rather than the blob container SAS token. Try creating and using a SAS token to the blob container in the "Azure blob storage / SAS URI" field.
It seems I should have picked the SAS token to the blob container instead of the blob storage account. The two terms are quite similar, and they sit next to each other in the Azure Storage Explorer UI. I re-created the project with the correct steps and it has worked so far.
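If you would rather create the container-level SAS in code than through Storage Explorer, here is a sketch with the azure-storage-blob Java SDK (the connection string, container name, permissions, and lifetime are all assumptions). The key point is that the SAS is generated from the container client, not from a single blob or the whole account:

```java
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobContainerClientBuilder;
import com.azure.storage.blob.sas.BlobContainerSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;
import java.time.OffsetDateTime;

public class ContainerSas {
    public static void main(String[] args) {
        BlobContainerClient container = new BlobContainerClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("labeling-data")
                .buildClient();

        // Container-level permissions for the labeling tool.
        BlobContainerSasPermission perms = new BlobContainerSasPermission()
                .setReadPermission(true)
                .setWritePermission(true)
                .setListPermission(true);

        BlobServiceSasSignatureValues sas = new BlobServiceSasSignatureValues(
                OffsetDateTime.now().plusHours(8), perms);

        // Container URL + SAS is what goes in the "Azure blob storage / SAS URI" field.
        System.out.println(container.getBlobContainerUrl() + "?" + container.generateSas(sas));
    }
}
```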

gsutil signed URL, hide file and bucket name

So I can successfully generate a temporary signed URL on Google Cloud Storage, with an expiry time, etc.
However, the signed URL still has the bucket name and file name clearly visible.
Now I understand that once the download occurs, the file name will be visible, as we have downloaded that file. But it would be nice to obscure the bucket and file name in the URL.
Is this possible? The documentation does not give any clues, and a Google search session has not really turned up any results that help.
I don't think there's a way. Bucket naming best practices basically state that bucket and object names are "public", that you should avoid using sensitive information as part of those names, and advise you to use random names if you are concerned about name guessing/enumeration.
A possible workaround would be to proxy the "get" for the Cloud Storage objects using Cloud Functions or App Engine, so the app retrieves the objects from Cloud Storage and then sends them to the client (sketched below).
This is more costly and would require writing more code.
I can think of another possible workaround, which consists of protecting your signed URL in code (such as PHP) so that users cannot see what the URL is. Nevertheless, taking into account that you want to avoid any displayed data in the network activity when downloading, you should test this workaround first to see if it works as intended.
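For the first (proxy) workaround, here is a minimal sketch as a Java HTTP Cloud Function. The opaque-id-to-object mapping is hypothetical and hard-coded for illustration; a real app would keep it in a datastore and add authentication:

```java
import com.google.cloud.functions.HttpFunction;
import com.google.cloud.functions.HttpRequest;
import com.google.cloud.functions.HttpResponse;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.util.Map;

public class BlobProxy implements HttpFunction {
    private static final Storage storage = StorageOptions.getDefaultInstance().getService();

    // Opaque ids exposed to clients; bucket and object names stay server-side.
    private static final Map<String, BlobId> FILES =
            Map.of("doc-1", BlobId.of("my-private-bucket", "reports/q3.pdf"));

    @Override
    public void service(HttpRequest request, HttpResponse response) throws Exception {
        String id = request.getFirstQueryParameter("id").orElse("");
        BlobId target = FILES.get(id);
        if (target == null) {
            response.setStatusCode(404);
            return;
        }
        // Fetch the object server-side and stream it back; the client only
        // ever sees the function URL and the opaque id.
        byte[] content = storage.readAllBytes(target);
        response.setContentType("application/octet-stream");
        response.getOutputStream().write(content);
    }
}
```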

Spring MVC: store data on the server without using a database

I am working on a chatbot and need to save the context of the previous conversation so that it can be sent with the next message. I am now integrating it with Facebook, and Facebook doesn't send context, so I need to store this context somewhere on a server. My client doesn't want to use a DB. I tried sessions, but technically I don't have a UI (Facebook is the UI). Next I tried Ehcache, but I was not able to retrieve data from previous webhook calls. Please let me know if there is a method to store data and retrieve it later without using a DB.
What you describe is not really a cache use case, from what I can tell: you do not want entries to disappear (eviction), and they do not go stale (expiration).
If that is correct, you will need to use an appropriate in-memory data structure to store that information.
Being more specific would require a bit more information about your system, the volume of data (per entry and maximum entries at once), etc.
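For example, if the Facebook sender ID is the natural key, a singleton bean holding a ConcurrentHashMap survives across webhook calls for as long as the JVM runs. A minimal Spring sketch (class and method names are illustrative):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import org.springframework.stereotype.Component;

// Singleton bean: one instance per application context, shared by all webhook calls.
@Component
public class ConversationContextStore {
    private final ConcurrentMap<String, String> contexts = new ConcurrentHashMap<>();

    // Save the context produced while handling the current message.
    public void save(String senderId, String context) {
        contexts.put(senderId, context);
    }

    // Fetch the context left by the previous webhook call, if any.
    public String load(String senderId) {
        return contexts.get(senderId);
    }
}
```

The trade-off of skipping the DB is that the map is empty after a restart and is not shared across multiple server instances.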
