ADF needs access to blob storage that is part of a virtual network

I have created an Azure Data Factory that needs access to a blob storage account which is part of a virtual network.
While creating the linked service in ADF, the connection fails because access to the blob storage is restricted with the 'Enabled from selected virtual networks and IP addresses' option.
How can I configure the Azure blob settings, or is there another way to give ADF access to the blob storage, following best practices and providing secure connectivity?

You can navigate to your storage account >> Networking >> Resource instances and add the resource type and instance name to allow access from specific resource instances.
NOTE: Make sure the instance belongs to the same tenant as your storage account and has the roles assigned that it needs to perform operations on blobs.
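For reference, a rough sketch of adding such a resource instance rule with the Python management SDK (azure-mgmt-storage); the subscription, resource group, account, factory and tenant values below are placeholders, not taken from the question:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    ResourceAccessRule,
    StorageAccountUpdateParameters,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
ACCOUNT_NAME = "<storage-account>"

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Resource ID of the data factory that should be allowed through the storage firewall
adf_resource_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.DataFactory/factories/<factory-name>"
)

# Fetch the existing rule set so the VNet/IP rules are preserved, then append the instance rule
account = client.storage_accounts.get_properties(RESOURCE_GROUP, ACCOUNT_NAME)
rules = account.network_rule_set
rules.resource_access_rules = (rules.resource_access_rules or []) + [
    ResourceAccessRule(
        tenant_id="<tenant-id>",  # must be the same tenant as the storage account
        resource_id=adf_resource_id,
    )
]

client.storage_accounts.update(
    RESOURCE_GROUP,
    ACCOUNT_NAME,
    StorageAccountUpdateParameters(network_rule_set=rules),
)
```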
REFERENCES:
Configure Azure Storage firewalls and virtual networks.
Azure built-in roles for blobs

Related

Access Azure Blob Files also by Azure Public IPs

I would like to access a blob, e.g. https://mystorage.blob.core.windows.net/container/file.dat, also by using a public IP from Azure in my embedded application. For example, https://1.2.3.4/container/file.dat should return the same file.
I have checked the Azure Load Balancer, Application Gateway, App Proxy, Azure CDN and Azure Front Door services but could not find a solution.
EDIT:
Azure Storage has a Private Link option. I tried it today with no success. Basically, I am trying to link my public IP to Azure Storage blob access.
This is the error you get:
First, make sure the blob can be accessed publicly.
If you don't have a requirement to restrict networking, make sure the storage account firewall allows access from all networks.
Then set the container's public access level so that anonymous read access is allowed.
Then it should work without a problem.
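If helpful, a hedged sketch of that last step with azure-storage-blob (the connection string and container name are placeholders):

```python
from azure.storage.blob import ContainerClient, PublicAccess

container = ContainerClient.from_connection_string(
    "<connection-string>", container_name="container"
)

# Allow anonymous read access to the blobs in this container
container.set_container_access_policy(
    signed_identifiers={}, public_access=PublicAccess.Blob
)
```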

access issue while connecting to azure data lake gen 2 from databricks

I am getting the access issue below while trying to connect from Databricks to a Gen2 data lake using a service principal and OAuth 2.0.
Steps performed (reference article):
1. Created a new service principal.
2. Provided the necessary access to this service principal on the Azure storage account IAM with the Contributor role.
3. Enabled firewalls and private endpoint connections on Databricks and the storage account.
StatusCode=403
StatusDescription=This request is not authorized to perform this operation using this permission.
ErrorCode=AuthorizationPermissionMismatch
ErrorMessage=This request is not authorized to perform this operation using this permission.
However, when I try connecting via access keys it works well without any issue. Now I suspect that step #3 above is the reason for this access issue. If so, do I need to grant any additional access to make it succeed? Any thoughts?
When performing the steps in the "Assign the application to a role" section, make sure to assign the Storage Blob Data Contributor role to the service principal.
Repro: I granted the Owner role to the service principal and tried to run dbutils.fs.ls("mnt/azure/"), which returned the same error message as above.
Solution: Assign the Storage Blob Data Contributor role to the service principal.
After assigning the Storage Blob Data Contributor role to the service principal, I was finally able to get the output without any error message.
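For reference, a minimal sketch of the OAuth mount used in this kind of repro (Databricks notebook, Python); the application ID, tenant ID, secret scope/key, container and account names are placeholders:

```python
# dbutils is available implicitly in a Databricks notebook
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope-name>", key="<service-credential-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/azure",
    extra_configs=configs,
)

# Succeeds once the Storage Blob Data Contributor assignment has propagated
dbutils.fs.ls("/mnt/azure/")
```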
For more details, refer to "Tutorial: Azure Data Lake Storage Gen2, Azure Databricks & Spark".

Azure Blob: How to grant a mobile app limited access to user-specific data

I have a mobile app (Xamarin Android & iOS) that connects to a website (ASP.NET MVC). Some of the content for the mobile app (files & images) comes from an Azure Blob store that currently has public read access enabled.
I am building an authentication module for the app (OAuth, with username/password). Is it possible to somehow build authentication into my Azure Blob account as well, so that a user would only have access to their specific files? I know that I could use the website as an intermediary (i.e. the user authenticates and connects to the website, the website connects to Azure, retrieves the data and returns it to the app), but this adds an extra step of lag compared to connecting to Azure Blob storage directly.
I see that Azure Blob storage supports shared access signature (SAS) tokens. Is it possible to generate a SAS token just for the subset of files relevant to a given user? I imagine the workflow would be:
mobile app authenticates to the website API
website generates and returns a SAS token for blob access
mobile app connects to Azure Blob storage directly using the SAS token.
Would that even be a good idea? Any other suggestions?
From what I understand of your scenario, you could use either Azure AD or SAS for authentication/authorization to Blob storage. The key will be to organize your users' data by container, so that you can restrict access to that container. This type of design will align best with how authorization is handled in Azure Storage today.
So for example, you would create a container for user1's data, another container for user2's data, and so on.
If you are already using Azure AD to authenticate and authorize your users for your application, then you may be able to simply assign an RBAC role that is scoped to the user's container for each user. For example, you can assign the Storage Blob Data Contributor role to user1 for container1, then do the same for user2 on container2. See Use the Azure portal to assign an Azure role for access to blob and queue data for information about how to do this in the Azure portal; you can also use PowerShell or Azure CLI.
Note that an RBAC role cannot be scoped to an individual blob, but only at the container level or above.
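As an illustration only, a sketch of such a container-scoped role assignment with azure-mgmt-authorization; the subscription, resource group, account, container and user object ID are placeholders, and the GUID is the built-in Storage Blob Data Contributor role definition:

```python
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

SUBSCRIPTION_ID = "<subscription-id>"
auth_client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Scope the assignment to a single container
scope = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
    "/blobServices/default/containers/container1"
)

# Built-in role definition ID for Storage Blob Data Contributor
role_definition_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

auth_client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # role assignment name must be a new GUID
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<user1-object-id>",  # Azure AD object ID of user1
        principal_type="User",
    ),
)
```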
If you determine that you need to use SAS, you can create a SAS for each user that is restricted to their container. If your users are already authenticating/authorizing to your application with Azure AD, then you probably don't need to use SAS. The SAS would be useful in the case where you need to grant access to a user that is not otherwise authenticated.
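And a minimal sketch of the SAS route on the website side with azure-storage-blob; the account name/key and the per-user container naming convention are assumptions:

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import ContainerSasPermissions, generate_container_sas

ACCOUNT_NAME = "<storage-account>"
ACCOUNT_KEY = "<account-key>"


def container_sas_url_for(user_id: str) -> str:
    """Return a short-lived, read-only SAS URL scoped to this user's container."""
    container_name = f"user-{user_id}"  # one container per user, as described above
    sas_token = generate_container_sas(
        account_name=ACCOUNT_NAME,
        container_name=container_name,
        account_key=ACCOUNT_KEY,
        permission=ContainerSasPermissions(read=True, list=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),  # keep tokens short-lived
    )
    return f"https://{ACCOUNT_NAME}.blob.core.windows.net/{container_name}?{sas_token}"
```

The mobile app would then call the blob endpoint directly with the returned URL, so only the website ever holds the account key.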

Is it possible to prevent the leakage of the original data of the database even if it is hacked?

We want to build a web application and deploy it on AWS.
EC2: Laravel
RDS: MySQL
I will use Laravel's encrypter to encrypt the data in the database. Even if RDS gets hacked, the data is encrypted and the hacker can't read the contents. But if EC2 gets hacked, the hacker can get the database credentials and the encryption key from the source code and decrypt the encrypted data from the database.
My boss (and maybe the client) thinks that this is not enough because the database contains sensitive information about users. He wants to prevent the leakage of the original database data even if the web server (EC2) gets hacked. Is it possible?
If not, I think we should focus on making the web server more difficult to hack:
Set a security group to limit SSH access by IP address.
Or any other measures?
Here are a few safety measures you can take to reduce your blast radius.
Move your credentials for the RDS database off the instance and into a credential store (see the sketch after this list), such as:
AWS Secrets Manager
HashiCorp Vault
Rotate your database credentials frequently, and use IAM roles for your EC2 applications and not IAM users.
Keep your EC2 and RDS instances within private subnets, and add an ELB in front of the EC2 instance so that public traffic can only reach that device.
Configure security groups to be scoped to only what they need, and limit inbound access to your AWS VPC to a VPN or Direct Connect connection.
Restrict who can do what in your AWS account: if a user does not need to perform certain actions for their role, remove those permissions. This prevents accidental actions on a service the user should not be using.
AWS also describes a large number of actions you can take in the security pillar of the Well-Architected Framework, so make sure to give it a read.
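To illustrate the credential-store point from the list above, a hedged sketch with boto3 (a Laravel app would use the AWS SDK for PHP equivalently; the secret name and JSON shape are assumptions):

```python
import json

import boto3


def get_db_credentials(secret_id: str = "prod/myapp/mysql") -> dict:
    # AWS credentials and region come from the environment,
    # ideally from the EC2 instance's IAM role rather than stored keys
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=secret_id)
    # Secrets Manager stores the secret as a JSON string,
    # e.g. {"username": "...", "password": "..."}
    return json.loads(response["SecretString"])


creds = get_db_credentials()
# creds["username"] / creds["password"] are then used to open the MySQL connection,
# so nothing sensitive has to live in the source code on the instance.
```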

How to access Azure blob storage via Flex Application?

I am trying to access Blob Storage on Azure from my Flex application. I am doing this via an HTTPService, using the URL given by Azure Blob Storage. However, my storage has private, restricted access and I can only update the storage by using the key (provided by Azure).
Since my application needs to write to this storage, do I somehow need to pass in the key via my HTTPService?
Does anyone have any idea how I can do this?
Regards
Aparna
