Problem authenticating to Google GCP with Docker - Windows

I need to work on some previously pushed Docker images stored in Google's gcr.io registry.
I am doing this from a Windows 10 machine, with standard installations of Docker and Google Cloud SDK (no Homebrew or anything like that).
After setting permissions for my gmail account in GCP's IAM section, I am still getting this error message when using this in PowerShell:
docker pull gcr.io/blabla/blabla:latest
Error response from daemon: unauthorized: You don't have the needed
permissions to perform this operation, and you may have invalid
credentials. To authenticate your request, follow the steps in:
https://cloud.google.com/container-registry/docs/advanced-authentication
On going through the authentication setup again, I get these error messages:
C:\Program Files (x86)\Google\Cloud SDK>gcloud auth configure-docker
WARNING: docker-credential-gcloud not in system PATH. gcloud's
Docker credential helper can be configured but it will not work until
this is corrected.
WARNING: docker not in system PATH. docker and
docker-credential-gcloud need to be in the same PATH in order to
work correctly together. gcloud's Docker credential helper can be
configured but it will not work until this is corrected.
On searching for solutions, I came across this thread, which appears to use macOS commands. I've found that the Windows alternative for 'which' is 'where', which gives this:
C:\Program Files (x86)\Google\Cloud SDK>where gcloud
C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gcloud
C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gcloud.cmd
C:\Users\l.cai\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\gcloud
C:\Users\l.cai\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\gcloud.cmd
But I'm having a lot of trouble understanding this post explaining the alternative for readlink. Replacing parts of that syntax with the file paths gives either
' ' is not recognized as an internal or external command
or
The system cannot find the path specified.
Multi-line commands also don't work well in Windows PowerShell or CMD, so I'm not sure where those commands are meant to be entered.
Can anyone please help me along with this? Many thanks in advance.

Your problem is that neither gcloud nor docker is set up correctly for the user you are logged in as. The following is a temporary workaround; you should reinstall Docker and the Cloud SDK.
Verify that both components of the path below are correct and adjust for your installations.
Open a Windows Command Prompt and execute:
set PATH=C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files\Docker\Docker\Resources\bin;%PATH%
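The PowerShell equivalent for the current session only would be roughly the following (same assumed install locations as above; adjust the two folders if Docker or the Cloud SDK is installed elsewhere):
# Prepend the Cloud SDK and Docker bin folders to PATH for this PowerShell session only
$env:Path = "C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files\Docker\Docker\Resources\bin;" + $env:Path
To make the change permanent, add the same two folders to the PATH variable under System Properties > Environment Variables and open a new terminal.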

Found a solution: Log into Windows itself with an admin account. None of the other fixes/threads referred to in my OP ended up being relevant.
I had a local administrator account, but since this was set up recently, I was used to logging in to my usual work account (non-admin), and only entering the local admin credentials as needed (e.g. when running programs with elevated privileges).
So Docker, PowerShell and the Cloud SDK can all be started individually with admin privileges, but somewhere along the chain it breaks, and I'm not prompted for anything. Logging in with the admin account circumvents that.
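For anyone in a similar situation, a quick sanity check before retrying gcloud auth configure-docker and the docker pull is to confirm which account the current shell is actually running as, and whether it belongs to the local Administrators group (plain Windows commands, nothing specific to Docker or gcloud):
whoami
whoami /groups | findstr /i "Administrators"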

Related

TeamCity and AWS CLI

I am running TeamCity on a Windows VM and have installed the AWS CLI.
I am trying to pull a zip from AWS S3, but I get this error:
" aws : The term 'aws' is not recognized as the name of a cmdlet, function, script file"
When I run the command in both cmd and PowerShell it works just fine.
I have also checked that the AWS CLI path is in both the user and system PATH variables.
Any ideas?
I figured it out.
The build agent was not running as a service; it was running under a user account that didn't have the correct permissions. I installed a new agent and ran it as a Windows service under a service account.
I hope this helps someone in the future who faces this frustrating issue.
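If you want to check which account an existing agent service runs under before reinstalling, something like this in PowerShell should show it (the service name TCBuildAgent is the usual default for a TeamCity Windows agent; yours may differ):
# Show the logon account (StartName) and state of the TeamCity build agent service
Get-CimInstance Win32_Service -Filter "Name='TCBuildAgent'" | Select-Object Name, StartName, State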

Install WebLogic in console mode without Xming

I'm trying to install WebLogic Server on CentOS 7, following Oracle's instructions for console mode. Everything goes fine until the WebLogic file has been extracted on my computer, and then I get this message:
display environment variable failed
I googled it and found Xming as a solution, but is there any way to install WebLogic without Xming?
You need to do a silent install as mentioned. You can find the documentation here.
Basically, you need two files:
A response file
Here, you will set some parameters such as ORACLE_HOME, proxy information if needed, the installation type, etc.
An oraInst.loc file
In this file, you need to do the following (from the documentation):
Replace oui_inventory_directory with the full path to the directory where you want the installer to create the inventory directory. Then, replace oui_install_group with the name of the group whose members have write permissions to this directory.
After doing all of this, you can run the command as follows:
java -jar distribution_name.jar -silent -responseFile file [-options]
I uploaded my own oraInst.loc and response files here for demonstration. I strongly suggest you read the documentation, though. Good luck.
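For illustration only, a complete silent-install invocation combining both files might look something like this (the jar name and file paths are placeholders for your own files; -invPtrLoc is the installer option commonly used on Linux to point at an oraInst.loc):
java -jar fmw_12.2.1.3.0_wls.jar -silent -responseFile /home/oracle/wls.rsp -invPtrLoc /home/oracle/oraInst.loc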

Creating an Azure self-signed certificate with the PowerShell Invoke-AddCertToKeyVault command

Recently, I tried to create a self-signed certificate for Azure Service Fabric, following Microsoft's manual:
Azure Docs: Secure a Service Fabric cluster, step 2.5, 02/05/2016
But the Invoke-AddCertToKeyVault command failed with the following error:
Invoke-AddCertToKeyVault : The term 'Invoke-AddCertToKeyVault' is not
recognized as the name of a cmdlet, function, script file, or operable
program. Check the spelling of the name, or if a path was included,
verify that the path is correct and try again.
I think that Azure PowerShell is successfully installed on my machine because I was able to log in to my Azure account by running Login-AzureRmAccount. Also, $env:PSModulePath shows that the Azure module paths were added to the path variable (in accordance with the article: Azure Docs: How to install and configure Azure PowerShell, 04/22/2016). Here they are:
...\Microsoft SDKs\Azure\PowerShell\ResourceManager\AzureResourceManager\;
...\Microsoft SDKs\Azure\PowerShell\ServiceManagement\;
...\Microsoft SDKs\Azure\PowerShell\Storage\;
Also, I have restarted my PC after installing Azure PowerShell.
It is possible that I have missed something, but I am stuck. How can this be resolved?
That cmdlet is in a module that has to be imported:
Import-Module "C:\Users\chackdan\Documents\GitHub\Service-Fabric\Scripts\ServiceFabricRPHelpers\ServiceFabricRPHelpers.psm1"
That is its implementation, by the way, as a reference that it actually exists :). Run Import-Module as above and it should work.
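Assuming the repository is cloned to the path shown above (adjust it to wherever your copy of Service-Fabric lives), you can confirm the cmdlet is available after the import with:
Get-Command Invoke-AddCertToKeyVault -Module ServiceFabricRPHelpers
If the import succeeded, this lists the cmdlet; otherwise it reports that the command cannot be found.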

rabbitmqctl.bat on Windows XP: unable to connect to node rabbit@MYPCNAME: nodedown

I have just installed RabbitMQ on my WindowsXP PC. I have fulfilled the Erlang OPC15 prereq as well.
My RabbitMQ seems to be working. I did a simple test using pika in Python and it seems to work. The service is running.
The problem is that I cannot do anything with rabbitmqctl.bat. I always get the response:
Status of node rabbit@MYPCNAME ...
Error: unable to connect to node rabbit@MYPCNAME: nodedown
diagnostics:
- nodes and their ports on MYPCNAME: [{rabbit,3097},{rabbitmqctl17251,1132}]
- current node: rabbitmqctl17251@mypcname
- current node home dir: C:\Documents and Settings\Myuser
- current node cookie hash: NOTSUREIFTHISISSENSITIVESOREMOVED==
In my rabbitmq log file I get:
=ERROR REPORT==== 12-Feb-2012::17:01:22 ===
** Connection attempt from disallowed node rabbitmqctl17251@mypcname **
From various forums I deduce this has something to do with cookies. What cookies are we talking about? What do I need to do to be able to manage my RabbitMQ instance using rabbitmqctl.bat? Please word your answer in a way that a non-Erlang, non-functional programmer would understand.
I had the same problem; this instruction straight out of the manual installation guide solved it:
Synchronise Erlang Cookies (when running a manually installed Windows Service)
Erlang Security Cookies used by the service account and the user
running rabbitmqctl.bat must be synchronised for rabbitmqctl.bat to
function.
To ensure Erlang cookie files contain the same string, copy the .erlang.cookie file from the Windows directory (normally C:\WINDOWS\.erlang.cookie) to replace the user .erlang.cookie. The user cookie will be in the user's home directory (%HOMEDRIVE%%HOMEPATH%), e.g. C:\Documents and Settings\%USERNAME%\.erlang.cookie or C:\Users\%USERNAME%\.erlang.cookie (Windows Vista and later).
Shortcut command for @Lining's answer:
copy C:\Windows\.erlang.cookie %HOMEDRIVE%%HOMEPATH%\.erlang.cookie
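To double-check that the copy worked and the two cookie files now contain the same string (same paths as in the command above), the built-in fc file-compare tool can be used from a Command Prompt:
fc C:\Windows\.erlang.cookie %HOMEDRIVE%%HOMEPATH%\.erlang.cookie
It should report that no differences were encountered.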
Try creating a file called .erlang.cookie in your $HOME directory and put a simple passphrase in there.
Then restart RabbitMQ and it might work. If it doesn't, then RabbitMQ is doing something to make sure you cannot put a system-wide cookie in place.
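If RabbitMQ was installed as a Windows service, restarting it from an elevated prompt would look roughly like this (RabbitMQ is the default service name; check services.msc if yours differs):
net stop RabbitMQ
net start RabbitMQ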
It worked for me after copying the ".erlang.cookie" file from C:\Windows into the C:\Documents and Settings\username folder, because the cookies should be the same, as per my understanding.

Web Deploy returns 401 unauthorized when publishing via [proj].deploy.cmd

I'm having a bit of a problem with Web Deploy that I just can't seem to iron out. Every time I try to publish to WMSvc via the [proj].deploy.cmd command in the package, I get "The remote server returned an error: (401) Unauthorized". The command looks like this (the project is called "Web", the server is named "AutoDeploy"):
Web.deploy.cmd /Y /M:https://AutoDeploy:8172/MsDeploy.axd -allowUntrusted
I can publish fine to https://AutoDeploy:8172/MsDeploy.axd via Visual Studio so the service is definitely running and I can successfully authenticate to it as administrator. Running this locally on the machine against the package while logged on as administrator (it's just a little local Win 2k8 VPC) isn't working and adding /U and /P parameters with the administrator account does nothing.
I've enabled failed request tracing and am getting this output so at least there's something to refer to but unfortunately I can't determine what the root cause is. I'm trying to connect to the same service with the same credentials as in Visual Studio but obviously something is different.
Just out of interest, I can publish fine to the Web Deployment Agent Service (MsDepSvc) as follows:
Web.deploy.cmd /Y /M:http://AutoDeploy/MsDeployAgentService /U:AutoDeploy\Administrator /P:...
But I really want to get WMSvc running! Any thoughts?
Sayed's comment above got me pointed in the right direction. After setting the build output verbosity to "Detailed" and also setting UseMsdeployExe to true in the .csproj (another tip from Sayed's blog), I found that the command generated by Visual Studio was setting the authentication type to basic, which, in retrospect, is obvious given the plain-text username and password.
The MSDN post on How to: Install a Deployment Package Using the deploy.cmd File explains that you can just add an /A flag to the command to set this. So in short, here's how it now looks (and actually works):
Web.deploy.cmd /Y /M:https://AutoDeploy:8172/MsDeploy.axd -allowUntrusted /U:AutoDeploy\Administrator /P:... /A:Basic
