Heroku HDrive & MuleSoft Integration

I want to store a backup file via MuleSoft in a Heroku HDrive bucket.
I've configured a private S3 bucket, but I'm not able to connect from MuleSoft (through the S3 Connector) nor via the S3 Browser client. The access error from MuleSoft is attached.
What can I do?
Thanks in advance.
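One way to narrow this down (my own sketch, not from the thread) is to test the bucket credentials outside MuleSoft with a bare AWS SDK for Java client; the environment variable names below are hypothetical stand-ins for whatever the HDrive add-on exposes:

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class BucketCheck {
    public static void main(String[] args) {
        // Hypothetical env var names; substitute the credentials the add-on provides.
        BasicAWSCredentials creds = new BasicAWSCredentials(
                System.getenv("BUCKET_ACCESS_KEY_ID"),
                System.getenv("BUCKET_SECRET_ACCESS_KEY"));

        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(creds))
                .withRegion(System.getenv("BUCKET_REGION")) // e.g. "us-east-1"
                .build();

        // A successful listing means the keys and bucket policy are fine,
        // which would point the problem back at the connector configuration.
        s3.listObjectsV2(System.getenv("BUCKET_NAME")).getObjectSummaries()
                .forEach(o -> System.out.println(o.getKey()));
    }
}
```

If this listing fails with the same access error, the problem is in the credentials or bucket policy rather than in MuleSoft.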

Related

Azure service for hosting zip file securely

I am running a Spring Boot app which internally uses a secure bundle zip file of size 13 KB. I want to host the file on a remote server, securely and encrypted. The infrastructure I am on is Azure. Which Azure service can I use to host my zip file securely? Can I use Azure Key Vault to access my zip file?
You can store the zip file in Azure Blob Storage. Azure Storage uses server-side encryption (SSE) to automatically encrypt your data when it is persisted to the cloud. You can configure your storage account to accept requests from secure connections only by setting the "Secure transfer required" property for the storage account.
You can then restrict access to the storage blob via a Shared Access Signature (SAS).
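For illustration, a minimal sketch with the azure-storage-blob Java SDK (the container and blob names are hypothetical, and the connection string is assumed to live in an environment variable): upload the zip over a secure connection, then mint a short-lived, read-only SAS for downloads.

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.sas.BlobSasPermission;
import com.azure.storage.blob.sas.BlobServiceSasSignatureValues;

import java.time.OffsetDateTime;

public class SecureBundleUpload {
    public static void main(String[] args) {
        BlobServiceClient service = new BlobServiceClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .buildClient();

        // Hypothetical container and blob names.
        BlobClient blob = service.getBlobContainerClient("bundles")
                .getBlobClient("secure-bundle.zip");
        blob.uploadFromFile("secure-bundle.zip", true); // overwrite if present

        // Read-only SAS valid for one hour; hand this out instead of account keys.
        BlobServiceSasSignatureValues sas = new BlobServiceSasSignatureValues(
                OffsetDateTime.now().plusHours(1),
                new BlobSasPermission().setReadPermission(true));
        System.out.println(blob.getBlobUrl() + "?" + blob.generateSas(sas));
    }
}
```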
If you wanted to be extra secret squirrel, you could even enable CMK (Customer Managed Keys) for the encryption to make doubly sure nobody is looking at your secret sauce.
https://learn.microsoft.com/en-us/azure/storage/common/storage-require-secure-transfer
https://learn.microsoft.com/en-us/azure/storage/common/customer-managed-keys-configure-key-vault?tabs=portal
https://learn.microsoft.com/en-us/azure/storage/blobs/security-recommendations
You could theoretically host your zip file in Key Vault if you serialize it to text (e.g. base64-encode it, since Key Vault secrets are strings), but I think that's a bad idea.
The right service is Azure Storage File Share.

How to override DefaultAWSCredentialsProviderChain with our own implementation of a credentials provider using assume role

I am trying to use Spring Cloud Config Server cross-account, as I am deploying the config server in Kubernetes with an AWS S3 backend.
But due to DefaultAWSCredentialsProviderChain I am unable to connect to the S3 bucket and get a 403 error.
Per the logs, within DefaultAWSCredentialsProviderChain the WebIdentityTokenCredentialsProvider tries to get credentials and receives the 403 error.
But when I try to connect with my own AmazonS3 client using STSAssumeRoleSessionCredentialsProvider, it connects fine.
Is there any way I can provide STSAssumeRoleSessionCredentialsProvider instead of DefaultAWSCredentialsProviderChain?
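Yes. With the AWS SDK for Java v1 you can construct the S3 client with STSAssumeRoleSessionCredentialsProvider explicitly, bypassing the default chain. A minimal sketch, with placeholder role ARN, session name, region, and bucket:

```java
import com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class CrossAccountS3 {
    public static void main(String[] args) {
        // Placeholder role ARN and session name.
        STSAssumeRoleSessionCredentialsProvider provider =
                new STSAssumeRoleSessionCredentialsProvider.Builder(
                        "arn:aws:iam::123456789012:role/config-server-role",
                        "config-server-session")
                .build();

        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withCredentials(provider) // skips DefaultAWSCredentialsProviderChain
                .withRegion("us-east-1")
                .build();

        s3.listObjectsV2("my-config-bucket").getObjectSummaries()
                .forEach(o -> System.out.println(o.getKey()));
    }
}
```

How to get Spring Cloud Config's S3 backend to pick up such a client depends on the Spring Cloud version, so treat this as the underlying SDK mechanism rather than a drop-in config-server patch.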

Go storage client not able to access GCP bucket

I have a Golang service which exposes an API where we try to upload a CSV to a GCP bucket. On my local host, I set the environment variable GOOGLE_APPLICATION_CREDENTIALS
and point it to the file path of the service account JSON. But when deploying to an actual GCP instance, I'm getting the below error while trying to access this API. Ideally, the service should talk to the GCP metadata server, fetch the credentials, and then store them in a JSON file. So there are two problems here:
Service is not querying the metadata service to get the credentials.
If the file is present (I created it manually), it's not accessible due to permission issues.
Any help would be appreciated.
Error while initializing storage Client:dialing: google: error getting credentials using well-known file (/root/.config/gcloud/application_default_credentials.json): open /root/.config/gcloud/application_default_credentials.json: permission denied
Finally, after long debugging and searching the web, I found there's already an open issue for the Go storage client: https://github.com/golang/oauth2/issues/337. I had to make a few changes in the code using this method: https://pkg.go.dev/golang.org/x/oauth2/google#ComputeTokenSource, which basically fetches the token explicitly from the metadata server before calling the subsequent Cloud APIs.
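The linked ComputeTokenSource fix is Go-specific. For contrast, here is the same idea sketched with the Java client libraries (my own illustration, not from the thread): build credentials straight from the Compute Engine metadata server so the well-known ADC file is never consulted.

```java
import com.google.auth.oauth2.ComputeEngineCredentials;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class MetadataCreds {
    public static void main(String[] args) {
        // Pull credentials directly from the GCE metadata server,
        // bypassing the application_default_credentials.json lookup entirely.
        ComputeEngineCredentials creds = ComputeEngineCredentials.create();

        Storage storage = StorageOptions.newBuilder()
                .setCredentials(creds)
                .build()
                .getService();

        // Smoke test: list buckets visible to the instance's service account.
        storage.list().iterateAll()
                .forEach(b -> System.out.println(b.getName()));
    }
}
```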

Create a blob store in S3 for Nexus Sonatype

I'm not able to configure the AWS S3 blob store for Sonatype Nexus. The Nexus server is hosted on AWS EC2 and I want to store all the binaries in S3.
Any help would be highly appreciated.

Connect to Azure Storage Blob via FTP

I'm trying to connect to an Azure Storage blob via FTP for comparison purposes. I tried following the guide for connecting to an Azure website via FTP, but I can't seem to find FTP credentials, as there is no publishing profile.
As in being able to upload/download blobs via an FTP client? If that is what you mean then, unfortunately, this isn't possible. This Stack Overflow post might help if that is the scenario you're looking for.
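If the goal is just moving blobs in and out for comparison, the Storage SDK (or tools like AzCopy and Azure Storage Explorer) covers what an FTP client would have done. A rough Java sketch, with a hypothetical container name:

```java
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobContainerClientBuilder;
import com.azure.storage.blob.models.BlobItem;

public class BlobPull {
    public static void main(String[] args) {
        BlobContainerClient container = new BlobContainerClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("mycontainer") // hypothetical
                .buildClient();

        // Download every blob to a local folder, the way an FTP mirror would.
        for (BlobItem item : container.listBlobs()) {
            container.getBlobClient(item.getName())
                    .downloadToFile("local/" + item.getName(), true);
        }
    }
}
```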
