Azure service for hosting zip file securely - spring-boot

I am running a Spring Boot app which internally uses a secure bundle zip file of about 13 KB. I want to host the file on a remote server, securely and encrypted. The infrastructure I am on is Azure. Which Azure service can I use to host my zip file securely? Can I use Azure Key Vault to access my zip file?

You can store the zip file in Azure Blob Storage. Azure Storage uses server-side encryption (SSE) to automatically encrypt your data when it is persisted to the cloud. You can configure your storage account to accept requests over secure connections only by setting the "Secure transfer required" property on the storage account.
You could then restrict access to the blob via a Shared Access Signature (SAS).
If you wanted to be extra secret squirrel, you could even enable CMK (Customer Managed Keys) for the encryption to make doubly sure nobody is looking at your secret sauce.
https://learn.microsoft.com/en-us/azure/storage/common/storage-require-secure-transfer
https://learn.microsoft.com/en-us/azure/storage/common/customer-managed-keys-configure-key-vault?tabs=portal
https://learn.microsoft.com/en-us/azure/storage/blobs/security-recommendations
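On the Spring Boot side, a minimal sketch of fetching the bundle with the azure-storage-blob SDK (v12). The endpoint, container, and blob names are placeholders, and the SAS token is assumed to arrive through configuration rather than being hard-coded:
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;
import java.io.ByteArrayOutputStream;

public class BundleLoader {
    // Sketch only: names are placeholders; supply the SAS token from configuration.
    public byte[] downloadBundle(String sasToken) {
        BlobClient blob = new BlobClientBuilder()
                .endpoint("https://<storage-account>.blob.core.windows.net") // HTTPS only
                .sasToken(sasToken) // scoped, expiring access instead of the account key
                .containerName("<container>")
                .blobName("secure-bundle.zip")
                .buildClient();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        blob.downloadStream(out); // at ~13 KB, buffering in memory is fine
        return out.toByteArray();
    }
}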

You could theoretically host your zip file in Key Vault if you serialize it to text (for example, base64-encode it). But I think that's a bad idea.
The right service is Azure Storage File Share.
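For completeness, if you did take the Key Vault route: a 13 KB zip base64-encodes to roughly 18 KB, which fits under Key Vault's secret size limit of about 25 KB. A minimal sketch with the azure-security-keyvault-secrets SDK, where the vault URL and secret name are placeholders:
import com.azure.identity.DefaultAzureCredentialBuilder;
import com.azure.security.keyvault.secrets.SecretClient;
import com.azure.security.keyvault.secrets.SecretClientBuilder;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class KeyVaultZip {
    public static void main(String[] args) throws Exception {
        SecretClient client = new SecretClientBuilder()
                .vaultUrl("https://<your-vault>.vault.azure.net") // placeholder vault name
                .credential(new DefaultAzureCredentialBuilder().build())
                .buildClient();

        // Store: base64-encode the zip so it can live in a text secret.
        byte[] zip = Files.readAllBytes(Path.of("secure-bundle.zip"));
        client.setSecret("secure-bundle", Base64.getEncoder().encodeToString(zip));

        // Retrieve: decode the secret back into the original bytes.
        byte[] restored = Base64.getDecoder().decode(
                client.getSecret("secure-bundle").getValue());
        Files.write(Path.of("restored-bundle.zip"), restored);
    }
}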

Related

Databricks Azure Blob Storage access

I am trying to access files stored in Azure blob storage and have followed the documentation linked below:
https://docs.databricks.com/external-data/azure-storage.html
I was successful in mounting the Azure Blob Storage on DBFS, but it seems that method is not recommended anymore. So, I tried to set up direct access using a URI with SAS authentication.
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", "<token>")
Now when I try to access any file using:
spark.read.load("abfss://<container-name>#<storage-account-name>.dfs.core.windows.net/<path-to-data>")
I get the following error:
Operation failed: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.", 403, HEAD,
I am able to mount the storage account using the same SAS token, but this direct access is not working.
What needs to be changed for this to work?
If you are using blob storage (rather than ADLS Gen2), then you have to use wasbs and not abfss. I tried the same code as yours with my SAS token against my blob storage and got the same error.
spark.conf.set("fs.azure.account.auth.type.<storage_account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage_account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage_account>.dfs.core.windows.net", "<token>")
df = spark.read.load("abfss://<container>@<storage_account>.dfs.core.windows.net/input/sample1.csv")
When I used the following modified code, I was able to successfully read the data.
spark.conf.set("fs.azure.account.auth.type.<storage_account>.blob.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage_account>.blob.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage_account>.blob.core.windows.net", "<token>")
df = spark.read.format("csv").load("wasbs://<container>@<storage_account>.blob.core.windows.net/input/sample1.csv")
UPDATE:
To access files in Azure Blob Storage when the firewall allows access only from selected networks, you need to deploy the Databricks workspace into a virtual network (VNet injection).
Then add that same virtual network to the storage account's firewall as well.
I also enabled a service endpoint for Azure Storage and the required subnet delegation on the Databricks subnets.
Now when I run the same code again with the file path wasbs://<container>@<storage_account>.blob.core.windows.net/<path>, the file is read successfully.

How to Store Certificate Files Securely in Spring Cloud Config Server?

I have a .pfx file that I use for communicating with a web service. In the development environment I load it from the classpath like this:
application.yml
my-config:
  certificate: classpath:/certificate/dev/mycertificate.pfx
Service.java
SSLContext sslContext = SSLContext.getInstance(SSL_CONTEXT_PROTOCOL);
KeyManagerFactory keyManagerFactory = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
// A .pfx file is a PKCS12 keystore, so use the PKCS12 type.
KeyStore keystore = KeyStore.getInstance("PKCS12");
Resource certificateResource = myConfig.getCertificate();
// Load the keystore and close the same stream that was read.
try (InputStream certStream = certificateResource.getInputStream()) {
    keystore.load(certStream, myConfig.getCertPassword().toCharArray());
}
keyManagerFactory.init(keystore, myConfig.getCertPassword().toCharArray());
sslContext.init(keyManagerFactory.getKeyManagers(), null, null);
requestContext.put(SSL_SOCKET_FACTORY, sslContext.getSocketFactory());
This works fine in the development environment. The problem is, I do not want to push the certificate to the Git repo. Also, I cannot put the file on the server itself because we use Pivotal Application Service for hosting the app. So is there any way I can securely store the certificate file in the config server or anywhere else?
Thanks.
You could put the cert into Spring Cloud Config Server. If you are using Spring Cloud Services for VMware Tanzu, you can follow these instructions and store the value in CredHub through SCS.
Alternatively, you could store encrypted values in a Git backend and SCS will decrypt them for you. See instructions here. You could also store things in Vault, but Vault is not provided by the SCS for VMware Tanzu tile. You'd have to run your own Vault server. Instructions for using Vault. Both of these options, I feel, are a bit more work than using SCS's support for CredHub.
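For the encrypted-Git option, the convention is to commit property values with a {cipher} prefix, which the config server decrypts before serving them to clients. A hypothetical sketch of a Git-backed application.yml (the property name and ciphertext are placeholders; the ciphertext comes from the config server's /encrypt endpoint):
my-config:
  cert-password: '{cipher}<output-of-the-/encrypt-endpoint>'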
If you are trying to use only OSS Spring Cloud Config, you can do that too, but it's more work, more than I can cover here. That said, all three of these options are available there as well:
CredHub backend w/SCS.
Git + encrypted properties.
Vault backend w/SCS.
Vault and CredHub both have certificate types specifically for storing certificates. I do not believe SCS exposes these options, so you would just be storing the text representation of your certificate.
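Whichever backend holds it, if the certificate ends up as a base64 text property, rehydrating it at runtime is straightforward. A minimal sketch, where the property and method names are hypothetical:
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.security.KeyStore;
import java.util.Base64;

public class CertificateLoader {
    // certificateBase64 would be injected from config,
    // e.g. @Value("${my-config.certificate-base64}") -- a hypothetical property.
    public KeyStore load(String certificateBase64, char[] password) throws Exception {
        byte[] pfxBytes = Base64.getDecoder().decode(certificateBase64);
        KeyStore keystore = KeyStore.getInstance("PKCS12"); // .pfx is PKCS12
        try (InputStream in = new ByteArrayInputStream(pfxBytes)) {
            keystore.load(in, password);
        }
        return keystore;
    }
}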
All of these options assume that you want to use Spring Cloud Config server. If you wanted an option not tied to Spring, you could use the CredHub Service Broker tile. This allows you to store items in CredHub and then present them as bound services. With it, you could create a bound service that represents your certificate, bind that to the apps that require it, and then fetch your certificate from VCAP_SERVICES like any other bound service.
The downside of this approach is that VCAP_SERVICES is an environment variable, so it's storing text only and there are limits to how much information can be stored.
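For illustration, a sketch of pulling a bound certificate out of VCAP_SERVICES with Jackson; the credhub service label and the certificate credential key are assumptions about how the binding is laid out:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class VcapCertificate {
    public static String readCertificate() throws Exception {
        // VCAP_SERVICES holds the JSON for all bound services.
        String vcap = System.getenv("VCAP_SERVICES");
        JsonNode root = new ObjectMapper().readTree(vcap);
        // Assumed layout: first bound "credhub" service exposing a "certificate" credential.
        return root.at("/credhub/0/credentials/certificate").asText();
    }
}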

Azure blob not deletable via CDN

I have created an Azure Akamai CDN with an endpoint connected to a storage account (block blob only). I generated a SAS token with full permissions, and it is used to access a specific container through the CDN endpoint. The SAS token grants the delete permission, and the blobs are deletable through the storage account address (mystorage.blob.core.windows.net) but not through the CDN endpoint (my-cdn.azureedge.net).
Is there a way to delete a blob via CDN endpoint?
Thanks,
Dan
I found that this is a Storage Explorer (and AzCopy) issue. Deleting works via the REST API and the SDK.
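As a sketch of the SDK path, assuming (per the answer above) that the call succeeds through the CDN endpoint; all names are placeholders:
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;

public class CdnBlobDelete {
    public static void main(String[] args) {
        BlobClient blob = new BlobClientBuilder()
                .endpoint("https://my-cdn.azureedge.net") // CDN endpoint fronting the storage account
                .sasToken("<sas-token-with-delete-permission>")
                .containerName("<container>")
                .blobName("<blob-name>")
                .buildClient();
        blob.delete(); // sends the delete request through the endpoint above
    }
}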

VB6: How to connect to a network shared folder without providing credentials

I want to connect to a network share path '\\domainname\folder-name' using a domain account, without passing credentials, from my VB6 code.
My legacy VB6 application service (running on server A) currently accesses a shared folder (on server B) using local account credentials (stored in an encrypted .ini file). The service runs as 'LocalSystem'.
The application uses the 'WNetUseConnection' API to connect to the shared folder.
To improve security, the local account needs to be replaced by a domain account, and the password policy should be CyberArk dynamic passwords.
These credentials can't be stored in the .ini file anymore. The idea I am working on is to run the service as the domain account rather than 'LocalSystem'. My thought is that if I run the service as the domain account and give that account the relevant permissions on the shared folder, the share should be accessible to the service without providing credentials.
I need help understanding which API I should use.
The API you'll need for this is WNetAddConnection (or its successor, WNetAddConnection2). Pass vbNullString for both the user name and the password, and the connection is made under the security context the service is running as, i.e. your domain account.
See this example.

How to access Azure blob storage via Flex Application?

I am trying to access Blob Storage on Azure via my Flex application. I am doing this via an HTTPService, using the URL given by Azure Blob Storage. However, my storage has private, restricted access, and I can only update the storage by using the key (provided by Azure).
Since my application needs to write to this storage, do I somehow need to pass in the key via my HTTPService?
Does anyone have any idea how I can do this?
Regards
Aparna
