How can I restrict users from downloading blobs to their machines?

I gave a few of my colleagues Contributor access on one of my Azure storage accounts.
The idea is to let them access (read and list) the data in the blob container, but I want to restrict them from downloading the data.
I tried the below:
Using a SAS key with read and list permissions still allows them to download the blobs (using Storage Explorer).
Giving them just Reader access plus the "Storage Blob Data Reader" role did not stop them from downloading the data.
Changing the blob access tier to "Archive" is not a solution that suits us.
I tried creating a custom role, but I am failing to find the exact allow and disallow permissions.
I saw a similar question before, but it hasn't been answered yet: Restrict from downloading file on Azure Blob
Can you please help?

If a user has read permission on a blob (whether through a SAS token or an Azure AD role), they will be able to download the blob.
To prevent users from downloading a blob, remove their read permissions on it. For example, if you are using a SAS token, grant only the List permission. The users will then be able to see the list of blobs but will not be able to download them.
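A SAS token's capabilities are encoded in its "sp" (signed permissions) field and enforced by an HMAC-SHA256 signature over the token's parameters, so a list-only token simply omits "r". Below is a minimal stdlib sketch of that signing scheme, with a deliberately simplified string-to-sign (the real service-SAS format has additional fields such as the signed identifier, IP range, protocol, and response headers); for production use, the azure-storage-blob SDK's generate_container_sas helper builds the full token for you. The account name, container, and key here are placeholders:

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_sas(account_key_b64: str, string_to_sign: str) -> str:
    """HMAC-SHA256 the string-to-sign with the base64-decoded account key."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

def list_only_container_sas(account: str, container: str, account_key_b64: str,
                            start: str, expiry: str,
                            version: str = "2021-08-06") -> str:
    # "l" = list only; with no "r", a GET on an individual blob is refused.
    permissions = "l"
    canonical = f"/blob/{account}/{container}"
    # Simplified string-to-sign: the documented service-SAS format
    # includes more fields (identifier, sip, spr, sr, rsc* headers, ...).
    string_to_sign = "\n".join([permissions, start, expiry, canonical, version])
    sig = sign_sas(account_key_b64, string_to_sign)
    query = {"sv": version, "st": start, "se": expiry,
             "sp": permissions, "sig": sig}
    return urllib.parse.urlencode(query)

token = list_only_container_sas(
    "mystorageaccount", "mycontainer",
    base64.b64encode(b"fake-account-key").decode(),  # placeholder key
    "2024-01-01T00:00:00Z", "2024-01-02T00:00:00Z",
)
print(token)
```

The key point for this question is the "sp=l" pair: the server recomputes the signature over exactly these fields, so a client cannot add "r" to the token it was handed without invalidating "sig".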

Related

Experiments with azure blob storage

Good day, I'm new to Azure Blob Storage and don't know what kinds of experiments can be done with it. I have searched about it, but it still isn't clear to me. I'll be really grateful if you can tell me about easy experiments that can be done with Azure Blob Storage.
Blob storage stores unstructured data such as text, binary data, documents or media files.
I hope you mean the samples, or the operations that can be performed with Azure Blob Storage, when you say experiments.
You can perform operations such as uploading, downloading, and listing files, either programmatically or through the UI.
You can perform the above operations using languages such as .NET, Java, Python, JS, etc.; you can find links for these in the documentation: Azure Storage samples
But to access Azure Storage, you'll need an Azure subscription or free trial account. All access to Azure Storage takes place through a storage account. For this quickstart, create a storage account using the Azure portal.
Refer to Create a storage account
You can also perform the above operations directly in the portal. This sample blog can give you quick insights into performing them through the portal.
A number of solutions exist for migrating existing data to Blob storage, such as AzCopy and Azure Data Factory.
Introduction to Blob (object) storage - Azure Storage | Microsoft Docs
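The upload/download/list operations mentioned above are all thin wrappers over the Blob REST API, so a simple first experiment is to call it directly: listing a container's blobs is a single GET with "restype=container&comp=list" that returns XML. A minimal stdlib sketch, assuming you have a container URL and a SAS token (the live call needs a real account, so the parser is exercised below on a sample of the documented response shape):

```python
import urllib.request
import xml.etree.ElementTree as ET

def list_blobs_url(container_url: str, sas_token: str) -> str:
    """Build the List Blobs request URL for a container."""
    return f"{container_url}?restype=container&comp=list&{sas_token}"

def parse_blob_list(xml_body: str) -> list:
    """Extract blob names from a List Blobs XML response."""
    root = ET.fromstring(xml_body)
    return [b.findtext("Name") for b in root.iter("Blob")]

def list_blobs(container_url: str, sas_token: str) -> list:
    """Issue the request against a live account (requires real credentials)."""
    with urllib.request.urlopen(list_blobs_url(container_url, sas_token)) as resp:
        return parse_blob_list(resp.read().decode("utf-8"))

# Sample response following the List Blobs documentation's shape:
sample = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults><Blobs>
  <Blob><Name>media/clip1.mp4</Name></Blob>
  <Blob><Name>media/clip2.mp4</Name></Blob>
</Blobs></EnumerationResults>"""
print(parse_blob_list(sample))  # ['media/clip1.mp4', 'media/clip2.mp4']
```

Once this works, uploading (PUT with the "x-ms-blob-type" header) and downloading (plain GET on the blob URL) follow the same pattern, which is exactly what the SDKs do under the hood.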

Snowflake error when reading from Azure stage using SAS token

A Snowflake stage for Azure Blob is giving an error when we try to copy the file into a Snowflake table.
Error: Failed to access remote file: access denied. Please check your credentials
We are able to list the files but unable to copy them.
This issue occurred when reading this file from a third-party source system (Azure Blob).
We are able to reproduce the same issue in our environment by removing read access when generating the SAS token; however, the third-party source team is generating the token with read access and the error still appears.
The third-party team whitelisted Snowflake subnet ranges to prevent misuse of the SAS token.
Regards,
Srinivas
Error: Failed to access remote file: access denied. Please check your credentials
It looks like the SAS token permissions are not sufficient. Try adding full permissions and try again.
You can also try this answer (it may be helpful):
Cannot copy data from Snowflake into Azure Blob
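Since listing works but copying fails, the symptom matches a token whose "sp" field lacks read: Snowflake's COPY has to both list the stage path and read the individual blobs. A quick sanity check before handing the token to the stage is to parse the "sp" field out of the SAS query string; a small stdlib helper (permission letters follow the service-SAS convention: r = read, l = list, w = write, and so on):

```python
import urllib.parse

# Snowflake COPY must read blobs and list the stage path.
REQUIRED = {"r", "l"}

def missing_sas_permissions(sas_token: str) -> set:
    """Return which required permission letters the token's sp field lacks."""
    params = urllib.parse.parse_qs(sas_token.lstrip("?"))
    granted = set("".join(params.get("sp", [""])))
    return REQUIRED - granted

print(missing_sas_permissions("sv=2021-08-06&sp=l&sig=abc"))   # {'r'}
print(missing_sas_permissions("sv=2021-08-06&sp=rl&sig=abc"))  # set()
```

If the token the third-party team sends really does contain "r" in "sp", then the IP whitelist is the next suspect: a "sip" range on the token, or a storage firewall rule, that does not cover the Snowflake egress addresses produces the same "access denied" message.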

When I have Azure do a CloudBlockBlob.StartCopyAsync(), is there a way to have it verify a checksum?

When I initiate an async copy of a block blob to another storage account using StartCopyAsync, is Azure doing any kind of integrity check for me, or if not, is there a way to have it do so?
I found that I can set the Properties.ContentMD5 property and have the integrity verified when uploading blobs. Is it also verifying during a copy operation?
I searched through the docs and found no mention of an integrity check during an async copy specifically. I found a couple references to AzCopy making integrity checks, and it also has the /CheckMD5 option, which is essentially what I'd like Azure to do after the blob copy.
As far as I know, the Azure Blob SDK is a wrapper around the Azure Blob REST API.
So the SDK's StartCopyAsync method sends a Copy Blob operation (REST API) to the Azure server side, telling the server to perform the copy.
According to the Copy Blob operation article, "when a blob is copied, the following system properties are copied to the destination blob with the same values".
That list includes the "Content-MD5" property.
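So the Content-MD5 property is carried over verbatim by the service-side copy, but it is only as trustworthy as the hash stored on the source blob. If you want an end-to-end check comparable to AzCopy's /CheckMD5, you can recompute the hash of the destination content after the copy completes and compare it with the stored property, which Blob storage keeps as a base64-encoded MD5 digest. A stdlib sketch of just the comparison (fetching the bytes and the property is left to whichever SDK you use):

```python
import base64
import hashlib

def content_md5_b64(data: bytes) -> str:
    """MD5 digest encoded the way the Content-MD5 blob property stores it."""
    return base64.b64encode(hashlib.md5(data).digest()).decode("ascii")

def verify_copy(dest_bytes: bytes, stored_content_md5: str) -> bool:
    """True when the destination bytes match the stored Content-MD5 property."""
    return content_md5_b64(dest_bytes) == stored_content_md5

payload = b"hello blob"
print(verify_copy(payload, content_md5_b64(payload)))       # True
print(verify_copy(b"corrupted", content_md5_b64(payload)))  # False
```

Note this still trusts the stored property; for full confidence, set Content-MD5 yourself at upload time (as the question describes) and verify against that known value after the copy.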

How to get a file after uploading it to Azure Storage

I uploaded files, mostly media files, to Azure File Storage, and I can see them in Azure Storage Explorer as well. But when I view a file as an anonymous user, I am not able to view it. I tried checking the permission settings as well, but to no avail.
Any help would be welcomed :)
Azure Files supports Shared Access Signatures (SAS). A SAS is a key that you compute with the storage account key and that grants access to a particular URL. Here is an example (the storage account name is obfuscated here):
https://mystorageaccount.file.core.windows.net/sampleshare/2.png?sv=2015-04-05&sr=f&si=sampleread&sig=Zq%2BfflhhbAU4CkCuz9q%2BnUFEM%2Fsg2PbXe3L4MeCC9Bo%3D&sip=0.0.0.0-255.255.255.255
You have sample code on how to create a SAS with Azure Files at https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/, section "Generate a shared access signature for a file or file share".
You can also do it interactively with a number of tools. For instance, CloudXPlorer has this feature.
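The query string on that URL is the SAS itself, and each field has a meaning: "sv" is the service version, "sr=f" marks a single-file resource, "si" references a stored access policy defined on the share, "sig" is the HMAC signature, and "sip" is the allowed IP range. Pulling the example apart with the stdlib makes the structure visible:

```python
import urllib.parse

url = ("https://mystorageaccount.file.core.windows.net/sampleshare/2.png"
       "?sv=2015-04-05&sr=f&si=sampleread"
       "&sig=Zq%2BfflhhbAU4CkCuz9q%2BnUFEM%2Fsg2PbXe3L4MeCC9Bo%3D"
       "&sip=0.0.0.0-255.255.255.255")

parts = urllib.parse.urlsplit(url)
sas = {k: v[0] for k, v in urllib.parse.parse_qs(parts.query).items()}
print(sas["sr"])  # 'f' -> token scoped to a single file
print(sas["si"])  # 'sampleread' -> stored access policy on the share
```

Anonymous access fails without this query string because, unlike blob containers, Azure file shares have no public-access level: every unauthenticated request must carry a valid SAS.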

Using Amazon S3 in place of an SFTP Server

I need to set up a repository where multiple people can drop off Excel and CSV files. I need a secure environment with access control, so that customers logging on to drop off their own data can't see another customer's data. So if person A logs on to drop off a Word document, they can't see person B's Excel sheet. I have an AWS account and would prefer to use S3 for this. I originally planned to set up an SFTP server on an EC2 instance; however, after doing some research, I feel that using S3 would be more scalable and safer. However, I've never used S3 before, nor have I seen it in a production environment. So my question really comes down to this: does S3 provide a user interface that allows multiple people to drop files off, similar to that of an FTP server? And can I set up access control so people can't see other people's data?
Here are the developer resources for S3
https://aws.amazon.com/developertools/Amazon-S3
Here are some pre-built widgets
http://codecanyon.net/search?utf8=%E2%9C%93&term=s3+bucket
Let us know your angle, as we can offer other ideas once we know more about your requirements.
Yes, it does. You can control access to your resources using IAM users and roles.
http://aws.amazon.com/iam/
You can grant privileges to parts of an S3 bucket depending on the user or role; for example:
mybucket/user1
mybucket/user2
mybucket/development
could all have different permissions.
Hope this helps.
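The usual way to express that per-user layout without writing one policy per person is a single IAM policy that uses the ${aws:username} policy variable as the key prefix, so person A's credentials can only list and touch mybucket/personA/. A sketch of such a policy built as a Python dict (the bucket name is a placeholder; in practice you would attach the resulting JSON to an IAM group that all the drop-off users share):

```python
import json

def per_user_prefix_policy(bucket: str) -> str:
    """IAM policy confining each user to s3://<bucket>/<their-username>/."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {   # let users list only their own prefix
                "Effect": "Allow",
                "Action": "s3:ListBucket",
                "Resource": f"arn:aws:s3:::{bucket}",
                "Condition": {"StringLike": {"s3:prefix": ["${aws:username}/*"]}},
            },
            {   # read/write objects under their own prefix only
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"arn:aws:s3:::{bucket}/${{aws:username}}/*",
            },
        ],
    }
    return json.dumps(policy, indent=2)

print(per_user_prefix_policy("mybucket"))
```

With this in place there is no explicit deny to maintain: anything outside a user's own prefix is simply never allowed, which covers the "person A can't see person B's files" requirement.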
