Packer and AWS credentials: CryptProtectData failed - Windows

I am provisioning a Windows machine using Packer, with a PowerShell script doing most of the provisioning.
An important provisioning step is to download some software from a private S3 bucket. In an attempt to first set AWS credentials, I run this snippet:
echo "Configure AWS"
echo "AWS_ACCESS_KEY_ID: ${env:AWS_ACCESS_KEY_ID}"
echo "AWS_SECRET_ACCESS_KEY: ${env:AWS_SECRET_ACCESS_KEY}"
echo "AWS_DEFAULT_REGION: ${env:AWS_DEFAULT_REGION}"
Set-AWSCredentials -AccessKey ${env:AWS_ACCESS_KEY_ID} -SecretKey ${env:AWS_SECRET_ACCESS_KEY} -StoreAs default
And I invariably get an error when Packer runs it on the machine:
amazon-ebs: Set-AWSCredentials : CryptProtectData failed.
amazon-ebs: At C:\Windows\Temp\script.ps1:15 char:1
amazon-ebs: + Set-AWSCredentials -AccessKey ${env:AWS_ACCESS_KEY_ID} -SecretKey
amazon-ebs: ${env:AWS_SECR ...
If I run this command directly on the Windows instance it works fine.
Thanks,
Jevon

From the PowerShell Tools documentation:
The PowerShell Tools can use either of two credentials stores.
The AWS SDK store, which encrypts your credentials and stores them in your home folder. The AWS SDK for .NET and AWS Toolkit for Visual
Studio can also use the AWS SDK store.
The credentials file, which is also located in your home folder, but stores credentials as plain text. By default, the credentials file is
stored here: C:\Users\username\.aws. The AWS SDKs and the AWS Command
Line Interface can also use the credentials file. If you are running a
script outside of your AWS user context, be sure that the file that
contains your credentials is copied to a location where all user
accounts (local system and user) can access your credentials.
From a Google search, it seems people turn to using BasicAWSCredentials.
I am not sure that is something you can do (it depends on whether you use an SDK or not). If not, you can use the second approach described in the doc: store the credentials in C:\Users\username\.aws and run your S3 commands with the credentials read from that file.
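A minimal sketch of that second approach, writing the plain-text credentials file from the environment variables your script already prints. The profile name default and the file location are the standard ones, but adjust the target profile directory if the Packer provisioner runs under a different account (CryptProtectData typically fails when a script runs without a loaded user profile, which is why the encrypted SDK store misbehaves here):
# Sketch: write the plain-text credentials file instead of the encrypted SDK store.
$awsDir = Join-Path $env:USERPROFILE ".aws"
New-Item -ItemType Directory -Path $awsDir -Force | Out-Null
@"
[default]
aws_access_key_id = $($env:AWS_ACCESS_KEY_ID)
aws_secret_access_key = $($env:AWS_SECRET_ACCESS_KEY)
region = $($env:AWS_DEFAULT_REGION)
"@ | Set-Content -Path (Join-Path $awsDir "credentials") -Encoding ASCII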

Related

Container access to gcloud credentials denied

I'm trying to implement the container that converts data from HL7 to FHIR (https://github.com/GoogleCloudPlatform/healthcare/tree/master/ehr/hl7/message_converter/java) on Google Cloud. However, I can't get the container to run locally on my machine, which I need to do before deploying it to the cloud.
The error always occurs in the credentials authentication part when I try to run the image locally using Docker:
docker run --network=host -v ~/.config:/root/.config hl7v2_to_fhir_converter
/healthcare/bin/healthcare --fhirProjectId=<PROJECT_ID> --fhirLocationId=<LOCATION_ID> --
fhirDatasetId=<DATASET_ID> --fhirStoreId=<STORE_ID> --pubsubProjectId=<PUBSUB_PROJECT_ID> --
pubsubSubscription=<PUBSUB_SUBSCRIPTION_ID> --apiAddrPrefix=<API_ADDR_PREFIX>
I am using Windows and have already run the command below to create the credentials:
gcloud auth application-default login
The credential, after executing the above command, is saved in:
C:\Users\XXXXXX\AppData\Roaming\gcloud\application_default_credentials.json
The -v ~/.config:/root/.config flag is supposed to let Docker find the credentials when running the image, but it does not. The error that occurs is:
The Application Default Credentials are not available. They are available if running in Google
Compute Engine. Otherwise, the environment variable GOOGLE_APPLICATION_CREDENTIALS must be defined
pointing to a file defining the credentials. See
https://developers.google.com/accounts/docs/application-default-credentials for more information.
What am I doing wrong?
Thanks,
A container runs isolated from the rest of the system; that isolation is its strength and is why this packaging method is so popular.
Thus, any configuration in your environment is void unless you pass it into the container's runtime environment, such as the GOOGLE_APPLICATION_CREDENTIALS env var.
I wrote an article on this. Let me know if it helps, and, if not, we will discuss the blocking point!
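As a sketch of how that could look on Windows: mount the gcloud config folder from %APPDATA% (where your application_default_credentials.json actually lives, per the path you quoted) rather than ~/.config, and point GOOGLE_APPLICATION_CREDENTIALS at the mounted file. The container-side paths here are assumptions about the image, and the placeholders are the same ones as in your original command:
# Mount the Windows gcloud config dir and tell the client libraries where the ADC file is.
docker run --network=host `
  -v "$env:APPDATA\gcloud:/root/.config/gcloud" `
  -e GOOGLE_APPLICATION_CREDENTIALS=/root/.config/gcloud/application_default_credentials.json `
  hl7v2_to_fhir_converter /healthcare/bin/healthcare --fhirProjectId=<PROJECT_ID> <other flags as above>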

TeamCity and AWS CLI

I am running TeamCity on a Windows VM and have installed the AWS CLI.
I am trying to pull a zip from AWS S3, but I get this error:
" aws : The term 'aws' is not recognized as the name of a cmdlet, function, script file"
When I run the command in both cmd and PowerShell, it works just fine.
I have also checked that the AWS CLI path is in both the user and system Path variables.
Any ideas?
I figured it out.
The build agent was not running as a service; it was running under a user account that didn't have the correct permissions. I installed a new agent and ran it as a Windows service under a service account.
I hope this helps someone in the future who faces this frustrating issue.
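If you hit the same symptom, a quick diagnostic is a PowerShell build step that reports which account and PATH the build actually runs with; this is a sketch, and the aws lookup only reports whether the executable resolves in that context:
# Print the identity and PATH the agent actually uses, then check whether 'aws' resolves.
Write-Host "Running as: $env:USERDOMAIN\$env:USERNAME"
Write-Host "PATH: $env:Path"
if (Get-Command aws -ErrorAction SilentlyContinue) {
    Write-Host "aws resolves to: $((Get-Command aws).Source)"
} else {
    Write-Host "aws is NOT on this account's PATH"
}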

Problem authenticating to Google GCP with Docker

I need to work on some previously pushed Docker images stored in Google's gcr.io registries.
I am doing this from a Windows 10 machine, with standard installations of Docker and the Google Cloud SDK (no Homebrew or anything like that).
After setting permissions for my Gmail account in GCP's IAM section, I am still getting this error message when using this in PowerShell:
docker pull gcr.io/blabla/blabla:latest
Error response from daemon: unauthorized: You don't have the needed
permissions to perform this operation, and you may have invalid
credentials. To authenticate your request, follow the steps in:
https://cloud.google.com/container-registry/docs/advanced-authentication
Going through the authentication setup again, I get these error messages:
C:\Program Files (x86)\Google\Cloud SDK>gcloud auth configure-docker
WARNING: docker-credential-gcloud not in system PATH. gcloud's
Docker credential helper can be configured but it will not work until
this is corrected.
WARNING: docker not in system PATH. docker and
docker-credential-gcloud need to be in the same PATH in order to
work correctly together. gcloud's Docker credential helper can be
configured but it will not work until this is corrected.
While searching for solutions, I came across this thread, which appears to use macOS commands. I've found the Windows alternative for 'which', which is 'where', giving this:
C:\Program Files (x86)\Google\Cloud SDK>where gcloud
C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gcloud
C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gcloud.cmd
C:\Users\l.cai\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\gcloud
C:\Users\l.cai\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\gcloud.cmd
But I'm having a lot of trouble understanding this post explaining the alternative for readlink. Replacing parts of that syntax with the file paths gives either
' ' is not recognized as an internal or external command
or
The system cannot find the path specified.
Multi-line commands also don't work well in Windows PowerShell or CMD, so I'm not sure where these commands are meant to be entered.
Can anyone please help me along with this? Many thanks in advance.
Your problem is that neither gcloud nor docker is set up correctly for the user you are logged in as. The following is a temporary workaround; you should reinstall Docker and the Cloud SDK.
Verify that both components of the path below are correct and adjust for your installations.
Open a Windows Command Prompt and execute:
set PATH=C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files\Docker\Docker\Resources\bin;%PATH%
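With both directories on PATH in that same prompt, rerunning the credential-helper setup should no longer produce the warnings, and the pull can then be retried (these are the same commands from above, just run in order):
gcloud auth configure-docker
docker pull gcr.io/blabla/blabla:latest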
Found a solution: log into Windows itself with an admin account. None of the other fixes/threads referred to in my OP ended up being relevant.
I had a local administrator account, but since this was set up recently, I was used to logging in to my usual work account (non-admin) and only entering the local admin credentials as needed (e.g. when running programs with elevated privileges).
So Docker, PowerShell, and the Cloud SDK can all be started individually with admin privileges, but somewhere along the chain it breaks, and I'm never prompted for anything. Logging in with the admin account circumvents that.

How do I install Azure CLI for a Service Account on my build server

I have successfully installed the Azure CLI on my build server and can use it at the command line. But when a build executes under a service account, I get the following error:
az : The term 'az' is not recognized as the name of a cmdlet, function, script
file, or operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again.
I am assuming this is because the Azure CLI was installed only for my user. The service account does not have an interactive login, so I can't log in and install the Azure CLI for that account. Is there a way to make the Azure CLI available to my service account?
Add this to your Path environment variable:
C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\wbin
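A sketch of doing that machine-wide, so non-interactive service accounts pick it up too (run from an elevated PowerShell; the path assumes the default install location shown above):
# Append the Azure CLI bin folder to the machine-level Path so services see it.
$cliPath = "C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\wbin"
$machinePath = [Environment]::GetEnvironmentVariable("Path", "Machine")
if ($machinePath -notlike "*$cliPath*") {
    [Environment]::SetEnvironmentVariable("Path", "$machinePath;$cliPath", "Machine")
}
# The build agent/service must be restarted to pick up the new machine Path.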

Creating an Azure self-signed certificate with the PowerShell Invoke-AddCertToKeyVault command

Recently, I tried to create a self-signed certificate for Azure Service Fabric following Microsoft's manual:
Azure Docs: Secure a Service Fabric cluster, step 2.5, 02/05/2016
But the Invoke-AddCertToKeyVault command failed with the following error:
Invoke-AddCertToKeyVault : The term 'Invoke-AddCertToKeyVault' is not
recognized as the name of a cmdlet, function, script file, or operable
program. Check the spelling of the name, or if a path was included,
verify that the path is correct and try again.
I think Azure PowerShell installed successfully on my machine, because I was able to log into my Azure account by running Login-AzureRmAccount. Also, $env:PSModulePath shows that the Azure module paths were added to the path variable (per the article: Azure Docs: How to install and configure Azure PowerShell, 04/22/2016). Here they are:
...\Microsoft SDKs\Azure\PowerShell\ResourceManager\AzureResourceManager\;
...\Microsoft SDKs\Azure\PowerShell\ServiceManagement\;
...\Microsoft SDKs\Azure\PowerShell\Storage\;
Also, I restarted my PC after installing Azure PowerShell.
It is possible that I have missed something, but I am stuck. How can this be resolved?
That cmdlet lives in a script module that has to be imported first:
Import-Module "C:\Users\chackdan\Documents\GitHub\Service-Fabric\Scripts\ServiceFabricRPHelpers\ServiceFabricRPHelpers.psm1"
That is its implementation, by the way, as a reference that it even exists :). Try Import-Module and it should work.
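To confirm the import worked before rerunning the manual's steps, a quick check (the module path is the one from the answer; adjust it to wherever you cloned the Service-Fabric repo):
# Import the helper module, then verify the cmdlet is now visible in the session.
Import-Module "C:\Users\chackdan\Documents\GitHub\Service-Fabric\Scripts\ServiceFabricRPHelpers\ServiceFabricRPHelpers.psm1"
Get-Command Invoke-AddCertToKeyVault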
