IBM MQ Client using Azure Key Vault - spring-boot

We are considering moving our workloads to Azure. As applications move to Azure, they will need to continue to communicate with on-premises workloads using IBM MQ (for the foreseeable future).
I did see this: Storing and retrieving a JKS from Azure Key Vault, but we do not want to package the JKS with the application and would like to replace that functionality with the Azure Key Vault service.
Has anyone tried using Key Vault as the key store and trust store, or can anyone share some guidance on implementing this?
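One way to approach this (a minimal sketch, not a verified recipe) is to store the JKS as a base64-encoded secret in Key Vault and rebuild the key store and trust store in memory at startup. The vault URL, secret name and password handling below are illustrative assumptions.

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;

import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import java.io.ByteArrayInputStream;
import java.security.KeyStore;
import java.util.Base64;

public class KeyVaultSslContextFactory {

    /**
     * Builds an SSLContext from a JKS that is stored as a base64-encoded
     * secret in Azure Key Vault (vault URL and secret name are illustrative).
     */
    public static SSLContext fromKeyVault(String vaultUrl,
                                          String keystoreSecretName,
                                          char[] keystorePassword) throws Exception {
        SecretClient secretClient = new SecretClientBuilder()
                .vaultUrl(vaultUrl)
                // Resolves to a managed identity in Azure, or env vars / CLI login locally.
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();

        // Assumption: the JKS bytes were stored base64-encoded in a single secret.
        byte[] jksBytes = Base64.getDecoder()
                .decode(secretClient.getSecret(keystoreSecretName).getValue());

        KeyStore keyStore = KeyStore.getInstance("JKS");
        keyStore.load(new ByteArrayInputStream(jksBytes), keystorePassword);

        KeyManagerFactory kmf = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(keyStore, keystorePassword);

        // Here the same store doubles as the trust store; a second secret could hold a separate one.
        TrustManagerFactory tmf = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(keyStore);

        SSLContext sslContext = SSLContext.getInstance("TLSv1.2");
        sslContext.init(kmf.getKeyManagers(), tmf.getTrustManagers(), null);
        return sslContext;
    }
}
```

The resulting SSLContext (or its socket factory) can then be applied to the MQ connection factory, for example via MQConnectionFactory.setSSLSocketFactory, instead of pointing javax.net.ssl.keyStore at a file packaged with the application.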

Related

How to set up Azure Blob Storage security so that it can be accessed from a web app

I am creating an Azure Integration SDK which would allow users to interact with Azure services via exposed APIs. This SDK (a jar file built with Spring Boot) could be part of any application within the organization.
One part of this SDK exposes Blob Storage. I have set up the SDK so far, but testing has been done against a Blob Storage account that is accessible on a public URL and authorized using an account name and key.
What's the best way to configure Blob Storage security if:
The apps are hosted in the Azure ecosystem.
The apps are hosted outside of the Azure ecosystem.
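A common pattern for both cases (offered here as a sketch under assumptions, not as the definitive setup) is to drop the account key entirely and authenticate with Azure AD through DefaultAzureCredential: inside Azure it resolves to the hosting service's managed identity, and outside Azure it can fall back to a service principal supplied via the AZURE_CLIENT_ID / AZURE_TENANT_ID / AZURE_CLIENT_SECRET environment variables. Account and container names below are placeholders.

```java
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

public class BlobClientFactory {

    /**
     * Builds a BlobServiceClient without account keys.
     * Inside Azure, DefaultAzureCredential picks up the hosting service's managed
     * identity; outside Azure it can use a service principal from environment
     * variables or a developer's Azure CLI login.
     */
    public static BlobServiceClient create(String accountName) {
        return new BlobServiceClientBuilder()
                .endpoint("https://" + accountName + ".blob.core.windows.net")
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();
    }

    public static void main(String[] args) {
        // Account and container names are placeholders.
        BlobServiceClient service = create("mystorageaccount");
        BlobContainerClient container = service.getBlobContainerClient("my-container");
        container.listBlobs().forEach(item -> System.out.println(item.getName()));
    }
}
```

In either case the identity still needs an RBAC role such as Storage Blob Data Reader or Contributor on the storage account, and network access can additionally be restricted with firewall rules or private endpoints.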

Migrating Oracle Secure Global Desktop to Azure

We have to migrate the Oracle Secure Global Desktop application to the Azure cloud.
My questions are as follows:
Is it possible to migrate the above application to the Azure cloud?
If we plan to migrate it, what do I have to take care of?
Regards,
Abhi
Yes, you can connect Oracle Secure Global Desktop with the Azure cloud, as Oracle has partnered with Microsoft to provide private connectivity by enabling VPN access.
For private access from your data center to Oracle Cloud, use either Oracle Cloud Infrastructure FastConnect or IPSec VPN. Similarly, for private traffic from your data center to Microsoft Azure, use ExpressRoute or VPN.
For cross-cloud networking between Oracle Cloud and Microsoft Azure, set up a connection between a FastConnect circuit in Oracle Cloud and an ExpressRoute circuit in Microsoft Azure.
You can connect through a Microsoft Azure Virtual Network and ExpressRoute.
Here is the document with complete information.

Connect to on-premises Oracle using Azure Data Factory via Site-to-Site VPN

Is it possible to connect to an on-premises Oracle DB using Azure Data Factory via a Site-to-Site VPN? Or am I obliged to use a Self-Hosted Integration Runtime?
Thanks
If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.
If your data store is a managed cloud data service, you can use the Azure Integration Runtime. If the access is restricted to IPs that are approved in the firewall rules, you can add Azure Integration Runtime IPs to the allow list.
You can also use the managed virtual network integration runtime feature in Azure Data Factory to access the on-premises network without installing and configuring a self-hosted integration runtime.
For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.
The integration runtime provides a built-in Oracle driver. Therefore, you don't need to manually install a driver when you copy data from and to Oracle.
Refer - https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory

Is Managed Identity used for more than authentication?

For accessing ADLS Gen2 containers we perform mounting in Azure Databricks with the help of a service principal (client ID, tenant ID, client secret).
Can we use a user-assigned managed identity to perform the mounting instead of a service principal in Azure Databricks? Or, to put it simply, how do we use a user-assigned managed identity to access ADLS containers?
I'm trying to avoid the use of a service principal and use a user-assigned managed identity instead, but I am not able to perform the mounting, and I am not finding the right documentation for accessing the ADLS containers.
Can anyone let me know how to use a user-assigned managed identity to access an ADLS container?
Please let me know if this is not the right approach to using a managed identity.
Thanks.
As per official documentation, only the following services can use managed identities to access other services.
API Management
Application Gateway
Azure App Configuration
Azure App Services
Azure Arc enabled Kubernetes
Azure Arc enabled servers
Azure Automanage
Azure Automation
Azure Batch
Azure Blueprints
Azure Cache for Redis
Azure Container Instance
Azure Container Registry
Azure Cognitive Services
Azure Data Box
Azure Data Explorer
Azure Data Factory
Azure Data Lake Storage Gen1
Azure Data Share
Azure DevTest Labs
Azure Digital Twins
Azure Event Grid
Azure Image Builder
Azure Import/Export
Azure IoT Hub
Azure Kubernetes Service (AKS)
Azure Logic Apps
Azure Log Analytics cluster
Azure Machine Learning Services
Azure Managed Disk
Azure Media services
Azure Monitor
Azure Policy
Azure Purview
Azure Resource Mover
Azure Site Recovery
Azure Search
Azure Service Fabric
Azure SignalR Service
Azure Spring Cloud
Azure SQL
Azure SQL Managed Instance
Azure Stack Edge
Azure Static Web Apps
Azure Stream Analytics
Azure Synapse
Azure Virtual Machines
Azure Web PubSub Service
Unfortunately, Azure Databricks cannot use a managed identity to access other services, but you can raise a feature request here.

Is there any Quarkus extension or support for AWS Secrets Manager?

We want to migrate our Java applications to Quarkus. We are using AWS Secrets Manager.
Is there any AWS Secrets Manager extension or solution in Quarkus to read credentials from AWS Secrets Manager?
There isn't currently.
See https://github.com/quarkusio/quarkus/tree/main/extensions and https://github.com/quarkusio/quarkus/tree/main/extensions/amazon-services for the list of Amazon-related extensions.
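Until a dedicated extension exists, the plain AWS SDK for Java v2 can still be called from a Quarkus application to read a secret. A minimal sketch is below; the region and secret name are illustrative, and native-image builds may need extra configuration.

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.secretsmanager.SecretsManagerClient;
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueRequest;
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueResponse;

public class SecretsReader {

    /** Reads a secret string from AWS Secrets Manager using the default credential chain. */
    public static String readSecret(String secretId) {
        try (SecretsManagerClient client = SecretsManagerClient.builder()
                .region(Region.EU_WEST_1) // region is illustrative
                .build()) {
            GetSecretValueResponse response = client.getSecretValue(
                    GetSecretValueRequest.builder().secretId(secretId).build());
            return response.secretString();
        }
    }

    public static void main(String[] args) {
        // Secret name is a placeholder.
        System.out.println(readSecret("my-app/db-credentials"));
    }
}
```

The value could then be exposed to the rest of the application, for example through a custom MicroProfile ConfigSource or a startup bean.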
