For Azure OpenAI, encryption at rest is implemented with Microsoft-managed keys --> https://learn.microsoft.com/en-us/azure/cognitive-services/openai/encrypt-data-at-rest.
Does encryption in transit also apply to the training and validation data sets?
An example: I want to train on data held in Azure Blob Storage. I know I can configure my storage account to accept only TLS 1.2 connections. However, I don't appear to have any option for configuring how the training service retrieves the data. Is my data passed over TLS 1.2 between the blob and the training infrastructure?
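For the inbound half of this, the storage account itself can at least be pinned down. Here is a minimal sketch using the azure-mgmt-storage Python SDK; the subscription ID, resource group, and account names are placeholders. Note this only constrains what the account will accept, which is exactly the asker's point: it does not configure how the training side fetches the data.

```python
# Sketch: restrict a storage account to TLS 1.2 and HTTPS-only traffic.
# Subscription, resource group, and account names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.storage_accounts.update(
    "my-resource-group",          # placeholder resource group
    "mytrainingdata",             # placeholder storage account
    StorageAccountUpdateParameters(
        minimum_tls_version="TLS1_2",     # reject connections below TLS 1.2
        enable_https_traffic_only=True,   # refuse plain-HTTP requests
    ),
)
```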
Is it possible to connect to an on-premises Oracle DB using Azure Data Factory via a Site-to-Site VPN? Or am I obliged to use a self-hosted integration runtime?
Thanks
If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.
If your data store is a managed cloud data service, you can use the Azure Integration Runtime. If the access is restricted to IPs that are approved in the firewall rules, you can add Azure Integration Runtime IPs to the allow list.
You can also use the managed virtual network integration runtime feature in Azure Data Factory to access the on-premises network without installing and configuring a self-hosted integration runtime.
For more information about the network security mechanisms and options supported by Data Factory, see Data access strategies.
The integration runtime provides a built-in Oracle driver. Therefore, you don't need to manually install a driver when you copy data from and to Oracle.
Refer to https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory
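As an illustration of the self-hosted route, here is a hedged sketch using the azure-mgmt-datafactory Python SDK to define an Oracle linked service that routes through a self-hosted integration runtime. The subscription, factory, runtime, and connection-string values are all placeholders, not values from this thread.

```python
# Sketch: bind an Oracle linked service to a self-hosted integration runtime.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeReference,
    LinkedServiceResource,
    OracleLinkedService,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

oracle_ls = OracleLinkedService(
    # Placeholder connection string; in practice keep the password in Key Vault.
    connection_string="Host=onprem-db;Port=1521;ServiceName=orcl;User Id=adf;Password=<secret>",
    # Route traffic through the self-hosted IR installed inside the on-prem network.
    connect_via=IntegrationRuntimeReference(reference_name="MySelfHostedIR"),
)

client.linked_services.create_or_update(
    "my-resource-group",   # placeholder resource group
    "my-data-factory",     # placeholder factory name
    "OracleOnPrem",        # linked service name
    LinkedServiceResource(properties=oracle_ls),
)
```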
We are considering moving our workloads to Azure. As applications move to Azure they will need to continue to communicate with on-premises workloads using IBM MQ (for some foreseeable future).
I did see this: Storing and retrieving a JKS from Azure Key Vault, but we do not want to package the JKS with the application and would like to replace that functionality with the Azure Key Vault service.
Has anyone tried using Key Vault as the key store and trust store, or can anyone share some guidance on implementing this?
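For illustration, one common pattern (a sketch under assumptions, not a confirmed IBM MQ integration) is to store the JKS base64-encoded as an ordinary Key Vault secret and have the application pull it at startup. Shown here in Python with azure-keyvault-secrets; the vault URL and the secret name "mq-truststore-jks" are hypothetical.

```python
# Sketch: fetch a base64-encoded JKS stored as a Key Vault secret and
# materialize it as a local file for the MQ/JSSE client to load.
import base64

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://my-vault.vault.azure.net",  # placeholder vault
    credential=DefaultAzureCredential(),
)

secret = client.get_secret("mq-truststore-jks")    # hypothetical secret name
with open("/tmp/truststore.jks", "wb") as f:
    f.write(base64.b64decode(secret.value))
# Then point the MQ client at /tmp/truststore.jks as its trust store.
```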
Our current API uses a sessionID for authentication. We plan to use Azure API Management to manage our web API. However, Azure API Management has its own authentication. How can we link the two together so that our customers can use the same logon information?
Conversations about authentication and identity in Azure API Management can get tricky because there can be three different identities and then there are the different contexts of runtime requests vs management requests. So, to be sure I'm answering the right question, let me try and get some terms defined.
The three identities:
API Provider: This is the Azure user who has created an API Management instance.
API Consumer: This is a developer who is writing some client software to consume the API.
End User: The user of the application written by the API Consumer; this is who actually initiates runtime requests to the API.
I am assuming that you are the API Provider. What I'm not sure about is whether your customers are the API Consumers or the End Users.
Azure API Management provides identity services for API Consumers. Consumers can either manually create a username/password account or use some social identity provider to create an account. They then can get a subscription key that will allow Azure API Management to associate requests to the API Consumer.
I think you are asking if you can connect the sessionID, which I am guessing you use to identify End Users, to a subscription key used to identify API Consumers. If that is correct, then the answer is no (except for the scenario described below), because we need to identify the API Consumer key before any policies are run to ensure we run the correct policies.
You can change your API Consumer subscription key. So, if you only have a small number of customers/End Users, you could create an API Consumer account for each End User. However, you would only be able to map a sessionID to an API Consumer subscription key if the sessionID were a constant value. I'm presuming, based on the name, that the value changes at each login.
Although Azure API Management provides identity services for API Consumers, it does not provide full identity management for End Users. We leave that to external partners like Azure AD, Thinktecture Identity Server, and Auth0. I'm assuming that your existing system is already using some kind of identity provider to generate the sessionID. What you can do with Azure API Management is validate that sessionID using policies in the API Management gateway. To do that, we would need to know more about the format of the sessionID.
Sorry for the long post but it is a confusing topic and I wanted to be as clear as possible.
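To make the two runtime identifiers concrete, here is a hypothetical client call in Python. The Ocp-Apim-Subscription-Key header is the real header API Management uses to identify the API Consumer; the Authorization header sketches where the End User's session value would travel for a gateway policy to validate (the scheme, URL, key, and token are all placeholders, since the sessionID format here isn't known).

```python
# Illustration: one request carrying both identities discussed above.
import requests

resp = requests.get(
    "https://my-apim.azure-api.net/orders",   # placeholder gateway URL
    headers={
        # Identifies the API Consumer to the API Management gateway.
        "Ocp-Apim-Subscription-Key": "<consumer-subscription-key>",
        # Hypothetical scheme carrying the End User's session value.
        "Authorization": "Session <end-user-session-id>",
    },
)
print(resp.status_code)
```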
Is HTTP or HTTPS (DefaultEndpointsProtocol in the storage connection string) recommended for blob requests within the Azure data centre? That is, between Azure role instances and blob storage.
I can see that I must use HTTPS for diagnostics (Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString; see Incorrect DiagnosticsConnectionString Setting heading here).
I suppose that HTTP will be faster, although I have no measurements to say whether it is material.
There is no official Microsoft guidance on using HTTP or HTTPS with Azure Storage.
It will be faster to use HTTP because the TLS handshake does not take place, and security within the data center should be less of a concern. With HTTP/2 protocol support on Azure, the performance difference will diminish; the Azure team is working on it.
On the other hand, HTTPS is secure and secure by default is a good practice.
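For reference, the choice is made in the connection string itself. A minimal azure-storage-blob sketch with placeholder credentials:

```python
# Sketch: the scheme used for blob requests is set by DefaultEndpointsProtocol.
from azure.storage.blob import BlobServiceClient

conn_str = (
    "DefaultEndpointsProtocol=https;"  # or http, trading handshake cost for transport security
    "AccountName=<account>;"           # placeholder account name
    "AccountKey=<key>;"                # placeholder account key
    "EndpointSuffix=core.windows.net"
)
service = BlobServiceClient.from_connection_string(conn_str)
```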