Is Managed Identity used for more than authentication? - azure-databricks

To access ADLS Gen2 containers, we mount them in Azure Databricks with the help of a service principal (client ID, tenant ID, client secret).
Can we use a user-assigned managed identity to perform the mounting instead of a service principal in Azure Databricks? Or, to put it simply: how do we use a user-assigned managed identity to access ADLS containers?
I'm trying to avoid the service principal and use a user-assigned managed identity instead, but I'm unable to perform the mount, and I can't find any documentation on accessing ADLS containers this way.
Can anyone suggest a solution for using a user-assigned managed identity to access an ADLS container?
Please let me know if this is not the right way to use a managed identity.
Thanks.

As per the official documentation, only the following services can use managed identities to access other services:
API Management
Application Gateway
Azure App Configuration
Azure App Services
Azure Arc enabled Kubernetes
Azure Arc enabled servers
Azure Automanage
Azure Automation
Azure Batch
Azure Blueprints
Azure Cache for Redis
Azure Container Instance
Azure Container Registry
Azure Cognitive Services
Azure Data Box
Azure Data Explorer
Azure Data Factory
Azure Data Lake Storage Gen1
Azure Data Share
Azure DevTest Labs
Azure Digital Twins
Azure Event Grid
Azure Image Builder
Azure Import/Export
Azure IoT Hub
Azure Kubernetes Service (AKS)
Azure Logic Apps
Azure Log Analytics cluster
Azure Machine Learning Services
Azure Managed Disk
Azure Media services
Azure Monitor
Azure Policy
Azure Purview
Azure Resource Mover
Azure Site Recovery
Azure Search
Azure Service Fabric
Azure SignalR Service
Azure Spring Cloud
Azure SQL
Azure SQL Managed Instance
Azure Stack Edge
Azure Static Web Apps
Azure Stream Analytics
Azure Synapse
Azure Virtual Machines
Azure Web PubSub Service
Unfortunately, Azure Databricks is not on this list and cannot use a managed identity to access other services, so mounting ADLS with a user-assigned managed identity is not supported. But you can raise a feature request here.
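Until that changes, mounting with a service principal remains the documented path. For reference, a minimal sketch of that mount as a Python notebook cell (dbutils is available implicitly in Databricks notebooks; all IDs, names, and secret-scope entries below are placeholders):

```python
# Mount an ADLS Gen2 container with a service principal via OAuth 2.0.
# <client-id>, <tenant-id>, <scope>, <secret-key>, <container>,
# <storage-account>, and <mount-name> are placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<client-id>",
    # Keep the secret in a Databricks secret scope, not inline in the notebook.
    "fs.azure.account.oauth2.client.secret":
        dbutils.secrets.get(scope="<scope>", key="<secret-key>"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://<container>@<storage-account>.dfs.core.windows.net/",
    mount_point="/mnt/<mount-name>",
    extra_configs=configs,
)
```

Pulling the client secret from a secret scope rather than pasting it inline keeps the credential out of notebook source and revision history.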

Related

Offline object storage solution compatible with the Azure Blob API

I am looking for an offline storage solution compatible with the Azure Blob Storage API, suitable for a production environment; an example of what I mean is MinIO for AWS S3.
The Microsoft Azure Storage Emulator is for testing purposes only, so I cannot use it in production.
MinIO can be used as object storage in Microsoft Azure. MinIO is an open-source object storage solution.
Yes, MinIO can serve as an offline object storage solution for Azure Blob Storage by running MinIO on an Azure virtual machine or as a managed service.
It can be configured to store its data in an Azure Blob Storage container.
This lets MinIO access the data in Azure Blob Storage while exposing an S3-compatible API over it, and applications can interact with the MinIO instance as if it were an S3 bucket, allowing offline processing of data stored in Azure Blob Storage.
MinIO Features
It can front Azure Blob Storage as described above and can be used in production environments.
It offers high performance, scalability, and compatibility with cloud-native tools and technologies.
MinIO supports multi-cloud, on-premises, and hybrid-cloud deployments, making it a flexible choice for offline storage.
Steps to set up MinIO
Install MinIO on a server, virtual machine, or cloud platform such as AWS, GCP, Azure, or DigitalOcean.
Start MinIO with a unique endpoint and access/secret keys to access your data.
Create a bucket in MinIO to store your data.
Upload data to MinIO using the MinIO client, the S3 API, or the MinIO browser (see the sketch below).
Access your data stored in MinIO through the MinIO client, the S3 API, or the MinIO browser.
MinIO provides features like versioning, lifecycle policies, access controls, and more to manage your data.
Use MinIO's built-in monitoring and management tools to monitor the performance and health of your MinIO instance.
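As a concrete illustration of the bucket and upload steps above, here is a minimal sketch using the minio Python client; the endpoint, keys, bucket name, and file path are all illustrative placeholders:

```python
from minio import Minio

# Connect to a self-hosted MinIO endpoint; host and credentials are
# placeholders for your own deployment.
client = Minio(
    "minio.example.internal:9000",
    access_key="<access-key>",
    secret_key="<secret-key>",
    secure=True,
)

# Create a bucket if it does not already exist.
if not client.bucket_exists("backups"):
    client.make_bucket("backups")

# Upload a local file; any S3-compatible client (boto3, the AWS CLI, mc)
# could do the same against this endpoint.
client.fput_object("backups", "report.csv", "/tmp/report.csv")
```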
References taken from
MinIO Client SDK for .NET
MinIO Multi-Cloud Object Storage

How to set up Azure Blob Storage security so that it can be accessed from a web app

I am creating an Azure integration SDK that lets users interact with Azure services via exposed APIs. This SDK (a jar file built with Spring Boot) could be part of any application within the organization.
One part of this SDK exposes Blob Storage. I have set up the SDK so far, but testing has been done against a blob storage account that is reachable on a public URL and authorized with an account name and key (a sketch of that style of access follows below).
What's the best way to configure Blob Storage security if
the apps are hosted inside the Azure ecosystem?
the apps are hosted outside the Azure ecosystem?
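For reference, the shared-key access described above looks roughly like this with the azure-storage-blob Python package (the question's SDK is Java/Spring Boot, but the pattern is the same; account name, key, and container are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Shared-key (account name + key) access, as used in the question's testing.
# The account name, key, and container below are illustrative placeholders.
service = BlobServiceClient(
    account_url="https://<account-name>.blob.core.windows.net",
    credential="<account-key>",  # the storage account access key
)

container = service.get_container_client("sdk-test")
container.upload_blob("hello.txt", b"hello from the SDK")
```

For apps hosted inside Azure, a managed identity token credential can usually replace the account key, so no secret ships with the jar; outside Azure, a scoped SAS token or a service principal is the usual substitute for handing out the full account key.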

Connect to Oracle OnPremises using Azure Data Factory - via S2sVPN

Is it possible to connect to on-premises Oracle DB using Azure Data Factory via a Site to Site VPN? Or am I obliged to use a Self-Hosted Integration Runtime?
Thanks
A site-to-site VPN alone does not remove this requirement: if your data store is located inside an on-premises network, an Azure virtual network, or an Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.
If your data store is a managed cloud data service, you can use the Azure Integration Runtime. If the access is restricted to IPs that are approved in the firewall rules, you can add Azure Integration Runtime IPs to the allow list.
You can also use the managed virtual network integration runtime feature in Azure Data Factory to access the on-premises network without installing and configuring a self-hosted integration runtime.
For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.
The integration runtime provides a built-in Oracle driver. Therefore, you don't need to manually install a driver when you copy data from and to Oracle.
Refer - https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory
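If you do go the self-hosted IR route, the Oracle linked service simply points its connectVia reference at that runtime. A minimal sketch of the definition, built in Python here for illustration (the IR name, host, and credential reference are placeholders; in practice this JSON is authored in the ADF UI or deployed via ARM):

```python
import json

# Illustrative ADF Oracle linked service payload. "SelfHostedIR-OnPrem",
# the host, and the user are placeholders, and the password would normally
# be referenced from Azure Key Vault rather than written inline.
oracle_linked_service = {
    "name": "OnPremOracle",
    "properties": {
        "type": "Oracle",
        "typeProperties": {
            "connectionString": (
                "Host=oracle01.corp.local;Port=1521;Sid=ORCL;"
                "User Id=adf_user;Password=<from-key-vault>"
            )
        },
        # Routes traffic through the self-hosted IR instead of the
        # default Azure integration runtime.
        "connectVia": {
            "referenceName": "SelfHostedIR-OnPrem",
            "type": "IntegrationRuntimeReference",
        },
    },
}

print(json.dumps(oracle_linked_service, indent=2))
```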

What could slow down a web application after migrating to an Azure cloud service?

We have migrated an existing web application to an Azure cloud service, using an Azure database and the Azure Redis cache provider.
We visit the same page on the Azure platform and on a local machine:
On the Azure platform it takes 2 seconds to execute the page.
On the local machine it takes only 500 ms.
Both environments use the same Azure database and the same Azure Redis instance.
The Azure database, Azure Redis, and cloud service are all located in West Europe.
We use a Large cloud service instance (4 cores, 8 GB memory).
We also executed the page on the cloud service machine over Remote Desktop, to prove that it's not due to the network, and it is still very slow (a measurement sketch follows below).
Does anybody have experience with this? Why is it so slow to execute the same page in Azure, using the same database and the same cache provider?
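One way to make the "not the network" check quantitative is to time round trips to the cache from each environment. A minimal sketch with the redis-py package (the application stack may differ, but the measurement is the same; the cache host and access key are placeholders):

```python
import time
import redis  # redis-py client; <cache-name> and <access-key> are placeholders

# Port 6380 is the TLS endpoint of Azure Cache for Redis.
r = redis.StrictRedis(
    host="<cache-name>.redis.cache.windows.net",
    port=6380,
    password="<access-key>",
    ssl=True,
)

# Warm up, then average 100 GET round trips; run this from both the
# cloud service instance and the local machine and compare.
r.set("probe-key", "x")
start = time.perf_counter()
for _ in range(100):
    r.get("probe-key")
avg_ms = (time.perf_counter() - start) / 100 * 1000
print(f"average GET round trip: {avg_ms:.2f} ms")
```

If the per-call latency is similar in both environments, the gap is more likely in page execution (for example, many sequential cache or database calls) than in the network path.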

How do I secure an OData service running on Windows Azure

I am hosting an OData service on the Windows Azure platform. How do I authenticate a client/user within the Windows Azure platform before they can access the OData service?
Thanks,
On the WCF Data Services team's blog there is a huge entry, split into 8 parts, where they detail everything about authentication:
http://blogs.msdn.com/b/astoriateam/archive/2010/05/10/odata-and-authentication-part-1.aspx
