Azure Blob Storage file access - azure-blob-storage

Is there a configuration in Azure Blob storage that lets you link to a single file (or one that lets you link to a specific 'folder' in the Azure portal interface), but redirects the viewer into a login screen if they're not already signed in?
I am not terribly familiar with Azure Blob Storage yet, but I see an option for 'anonymous' access, which isn't what I want (I want them to need to be logged in and have the proper permissions for that container), and I see an option for SAS (which isn't what I want either, because it grants access to anyone who has the link and is time-boxed).
https://learn.microsoft.com/en-us/answers/questions/435869/require-login-when-accessing-blob-storage-url.html
This link appears to be asking the same question, and the response says something about 'role-based authentication' - I get the concept of adding roles to users, and using those as the authorization, but even as the owner of the blob container I can't seem to just link to myservice.blob.core.windows.net/container/myfile.jpg and download it without appending a SAS key.
Nor do I see a way to link to myservice.blob.core.windows.net/container/myfolder and have it authenticate them and then take them into that 'directory' in the UI.

If the access level of the container is set to public (anonymous), you can open the blob URI directly in the browser and access the blobs.
If the access level of the container is set to private, opening the blob URI in the browser does not redirect the user to a login screen. Instead, it returns a ResourceNotFound error.
Even if the proper role is assigned in the role assignments for the blob storage, you still cannot access the blob URI from the browser without appending a SAS token, because opening the blob URI directly in the browser does not trigger the OAuth flow.
Although it is not possible to open the blob URI in the browser and download the files that way, there are other ways to accomplish this.
You can use the Azure CLI, PowerShell, or the REST API to access blob data as an authenticated user.
If you want to access the blob data from a browser, you can use a function app. Enable authentication on the function app; authenticated users can then access the blob data via the function app.
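For completeness, the same Azure AD-based access also works from the storage SDKs. A minimal sketch with the .NET SDK, assuming the signed-in identity holds a data-plane role such as Storage Blob Data Reader (the URI reuses the placeholder names from the question):

using System;
using Azure.Identity;
using Azure.Storage.Blobs;

// Placeholder URI; replace with your own account, container, and blob names.
var blobClient = new BlobClient(
    new Uri("https://myservice.blob.core.windows.net/container/myfile.jpg"),
    new DefaultAzureCredential());   // acquires an Azure AD token for the signed-in identity

// Succeeds only if the caller holds a data-plane role such as Storage Blob Data Reader.
blobClient.DownloadTo("myfile.jpg");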
Reference : azure - Access a blob file via URI over a web browser using new AAD based access control - Stack Overflow

Related

Best approach to download files from Azure Blob Storage using file URLs

What is the best approach for the following requirement: I have files stored in Azure Blob Storage with a Private endpoint. I need to show to the user, a table with a column containing these file URLs in the browser. When a user clicks on the link, it should either open the file in a new tab or download the file.
I could show the URLs and download the files using a Power BI report when the Blob Storage had public access. But in my scenario it's a private endpoint. I tried appending the SAS token to the URL, and that also worked, but in this case the SAS token is visible in the browser, which is not allowed in my case. So this also does not work for my scenario.
Can we achieve this using Power BI or Power Apps or any other tools/api?
Could you please suggest the steps?
Thanks
I tried this in my environment and got the results below:
If you need to download a blob through the browser from a private container, you need the blob URL plus a SAS token, because the SAS token is what grants read (and write) access to the blob.
If you are using a public container, you can download with the blob URL alone through the browser.
There is no option to call your blob URL while hiding the SAS token.
As a workaround, if you need to hide the SAS token from the user and still access the blob, you can make use of the Power Automate Azure Blob Storage connector:
I connected to my Azure Blob Storage with these Power Automate connectors: I first connected to my blob endpoint and then created a SAS URI with the connector.
The SAS URI is created successfully by the connector, and I saved the SAS token inside a Compose action so that it stays hidden.
Later I use an HTTP action to call that SAS URL and get the image content as the HTTP body in the output.
When I decode the HTTP body content, I get my blob image successfully.
To convert base64 to an image, you can use this link:
If you want your users to access this blob, you can further convert this base64-encoded string into an image and save the file in OneDrive or SharePoint for your users to access; you can refer to this link by eric-cheng.
If you want to email the blob file to your users, that can also be done later by using the Outlook, Gmail, etc. connectors.
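If you are building this outside Power Automate (for example behind your own API), the same "keep the secrets on the server" idea can be sketched with the .NET SDK; the connection string, container, and blob names below are placeholders:

using System;
using System.IO;
using Azure.Storage.Blobs;

// Placeholder configuration; in practice read the connection string from app settings.
string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING");
var blobClient = new BlobClient(connectionString, "mycontainer", "report.pdf");

// Download server-side; no SAS token or account key ever reaches the browser.
using var stream = new MemoryStream();
blobClient.DownloadTo(stream);
string base64Content = Convert.ToBase64String(stream.ToArray());
// Return base64Content (or the raw bytes with the proper Content-Type) to the client.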

Restrict permissions when a client creates a SAS token for a blob storage

I have a blob storage where I drop files for an external partner to list and read. I thought a SAS token would be a perfect way for the external partner to access the container and read the file(s).
So I created a SAS token and realized that if I don't want to create new SAS tokens every 10 minutes and send them to the partner, I need to set the expiry date of the token far into the future. That is not good if the SAS token is leaked, and on the day the token expires the solution will stop working.
To fix that, I could let the client create a SAS token themselves by giving them an access key and account name to use with the StorageSharedKeyCredential class. That works great, maybe too great, since it's now the client that decides what permissions the SAS token should have. So the client could now upload files, create containers, etc.
So my question is: is there any way to restrict what permissions the SAS token has when the client creates it, so that our external partner can only read/list files in a specific container that I have decided?
Best Regards
Magnus
Regarding the issue, I think you want to know how to create a service SAS token. If so, please refer to the following code.
using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Replace the {...} placeholders with your storage account name, key, and container name.
var containerClient = new BlobContainerClient(
    new Uri("https://{account_name}.blob.core.windows.net/{container_name}"),
    new StorageSharedKeyCredential("{account_name}", "{account_key}"));

var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerClient.Name,
    Resource = "c",                                   // "c" = container-level SAS
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
// Combine the flags in one call; a second SetPermissions call would overwrite the first.
sasBuilder.SetPermissions(BlobContainerSasPermissions.Read | BlobContainerSasPermissions.List);

Uri sasUri = containerClient.GenerateSasUri(sasBuilder);
To give permissions on a specific container, you can do the following:
Find your container, select Access Policy under the Settings blade, and click Add Policy. Select the permissions you want to give this specific container. Note that the public access level is also set at the container level. You could refer to the thread which discussed a similar issue.
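For reference, a stored access policy can also be created from code. A rough sketch with the .NET SDK (the policy name, validity window, and placeholder account values are assumptions):

using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

// Placeholder account and container values.
string accountName = "mystorageaccount";
string accountKey = "<account-key>";
var containerClient = new BlobContainerClient(
    new Uri($"https://{accountName}.blob.core.windows.net/partnerfiles"),
    new StorageSharedKeyCredential(accountName, accountKey));

// Stored access policy limited to read + list.
var identifier = new BlobSignedIdentifier
{
    Id = "partner-read-list",
    AccessPolicy = new BlobAccessPolicy
    {
        PolicyStartsOn = DateTimeOffset.UtcNow,
        PolicyExpiresOn = DateTimeOffset.UtcNow.AddMonths(1),
        Permissions = "rl"   // read + list
    }
};
containerClient.SetAccessPolicy(PublicAccessType.None, new[] { identifier });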
You can also look at how RBAC works on Azure Storage.
Only roles explicitly defined for data access permit a security principal to access blob or queue data. Roles such as Owner, Contributor, and Storage Account Contributor permit a security principal to manage a storage account, but do not provide access to the blob or queue data within that account.
You can grant the right to create a user delegation key separately from the right to the data.
The Get User Delegation Key operation (https://learn.microsoft.com/en-us/rest/api/storageservices/get-user-delegation-key) is performed at the account level, so you must grant this permission with something like the Storage Blob Delegator built-in role at the scope of the storage account.
You can then grant just the data permissions the user should have, using one of these three built-in roles at the scope of the blob container:
Storage Blob Data Contributor
Storage Blob Data Owner
Storage Blob Data Reader
The user delegation SAS can then be generated to grant a subset of the user's permissions for a limited time, and it can be granted for an entire blob container or for individual blobs.
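A rough sketch of that flow with the .NET SDK, assuming the caller has Storage Blob Delegator at account scope and Storage Blob Data Reader on the container (account and container names are placeholders):

using System;
using Azure.Identity;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using Azure.Storage.Sas;

// Placeholder names.
string accountName = "mystorageaccount";
string containerName = "partnerfiles";

// Authenticate with Azure AD instead of the account key.
var serviceClient = new BlobServiceClient(
    new Uri($"https://{accountName}.blob.core.windows.net"),
    new DefaultAzureCredential());

// Requires the right to create a user delegation key (e.g. Storage Blob Delegator).
UserDelegationKey key = serviceClient
    .GetUserDelegationKey(DateTimeOffset.UtcNow, DateTimeOffset.UtcNow.AddHours(1))
    .Value;

// Read + list only, scoped to the container, valid for one hour.
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerName,
    Resource = "c",
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
sasBuilder.SetPermissions(BlobContainerSasPermissions.Read | BlobContainerSasPermissions.List);

// Sign with the user delegation key; the result is limited by the caller's own RBAC rights.
string sasToken = sasBuilder.ToSasQueryParameters(key, accountName).ToString();
var sasUri = new Uri($"https://{accountName}.blob.core.windows.net/{containerName}?{sasToken}");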
For more details you may check this thread.
And you have to use VNet rules in the storage firewall, or trusted access to storage, to restrict access for clients in the same region.
You may check these links:
https://learn.microsoft.com/en-us/azure/storage/common/storage-sas-overview
https://learn.microsoft.com/en-us/rest/api/storageservices/create-service-sas#permissions-for-a-blob

How to upload/download files directly from Google Drive to the browser without using server bandwidth?

I want to make a web app where the only cost to me is serving the webpage to the user, and all their data is saved to their Google Drive so I don't have to pay for storage or bandwidth.
Is this possible using Google Drive?
I can't see how:
If I want to save something directly from the browser, it needs my application's API key, and I can't put that in the HTML as it is non-secure.
If I try to do anything where the webpage calls my server, the file will have to pass through my server to get to Google.
If I want to save something directly from the browser, it needs my application's API key, and I can't put that in the HTML as it is non-secure.
You need client credentials; an API key will only give you access to public data and won't give you the ability to write anything. Web credentials, if configured properly, are bound to the domain they are intended to be used for, and therefore they are considered secure.
If I try to do anything where the webpage calls my server, the file will have to pass through my server to get to Google.
Well, this is true considering that your server is running the code; there is no way to route the file directly from the client to Google if your server handles it. The exception is if you do this with JavaScript, in which case the code runs client-side in the user's browser.

How to ensure a user sees only the images pertaining to him from private Azure Blob Storage?

Some of the users' images are stored in Azure Blob Storage, which is not publicly accessible. In our scenario we upload the images (users' images) to a private blob container, and they later need to be shown on the client side (Angular). Moreover, a user should only be able to see the images related to him and not the images of other users.
We can generate the list of image URLs on the server side, but when this is passed to the client side to render, it would naturally fail since the blob is not public.
Now, since all the users who would access the application are internal to the organization, I believe authorization to access the images can be achieved with AAD/SAS. However, at the same time, I fail to understand how I would ensure or apply security so that user X is not able to read the images of user Y.
Regarding the issue, you can use the following suggestions to implement it.
Store the user information and his image information (such as the image's Azure blob container name and blob name) in the database.
When the user wants to access his images, query the database with the user information to get those images.
After getting the image information, use the SDK to create a SAS token, then return the image's blob URL with the SAS token to the client. Regarding how to create a SAS token, please refer to here.
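A minimal sketch of that last step with the .NET SDK; the account name/key and the container/blob names that would come from the database lookup are placeholders:

using System;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

// Placeholders; in practice these come from configuration and the per-user database lookup.
string accountName = "mystorageaccount";
string accountKey = "<account-key>";
string containerName = "userimages";
string blobName = "user-x/photo1.jpg";

var blobClient = new BlobClient(
    new Uri($"https://{accountName}.blob.core.windows.net/{containerName}/{blobName}"),
    new StorageSharedKeyCredential(accountName, accountKey));

// Short-lived, read-only SAS scoped to this single blob.
var sasBuilder = new BlobSasBuilder
{
    BlobContainerName = containerName,
    BlobName = blobName,
    Resource = "b",   // "b" = blob-level SAS
    ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(15)
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);

// Return this URL to the Angular client to render the image.
Uri imageUrl = blobClient.GenerateSasUri(sasBuilder);

Because the URL is generated only after the server has checked which images belong to the signed-in user, user X never receives a link to user Y's images, and each link expires quickly.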

How to access a single public Google Sheet without OAuth2

I'm trying to log some data into a public sheet (accessible in incognito mode). I'm trying to avoid OAuth2 as it seems I can only grant access to all my sheets, and I'd like to avoid that. There are likely multiple users of the software, and my Google account contains some proprietary data.
I'm using Python3.
I just needed to create a "Service Account" (under Credentials in the console) and share my sheet with the email address in the JSON file containing the credentials.
Now I don't have to rely on OAuth2 or grant access to all my sheets.
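For illustration, a rough sketch of that service-account flow using the Google Sheets client library for .NET (the same pattern exists in the Python client the question mentions; the key file name, spreadsheet ID, and range are placeholders):

using System;
using System.Collections.Generic;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Sheets.v4;
using Google.Apis.Sheets.v4.Data;

// Authenticate as the service account whose address the sheet was shared with.
GoogleCredential credential = GoogleCredential
    .FromFile("service-account.json")                 // placeholder key file
    .CreateScoped(SheetsService.Scope.Spreadsheets);

var service = new SheetsService(new BaseClientService.Initializer
{
    HttpClientInitializer = credential,
    ApplicationName = "sheet-logger"
});

// Append one row of log data to the shared sheet (placeholder spreadsheet ID and range).
var body = new ValueRange
{
    Values = new List<IList<object>> { new List<object> { DateTime.UtcNow.ToString("o"), "some value" } }
};
var request = service.Spreadsheets.Values.Append(body, "<spreadsheet-id>", "Sheet1!A:B");
request.ValueInputOption = SpreadsheetsResource.ValuesResource.AppendRequest.ValueInputOptionEnum.USERENTERED;
request.Execute();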
