Recently, I tried to create a self-signed certificate for Azure Service Fabric according to Microsoft's manual:
Azure Docs: Secure a Service Fabric cluster, step 2.5, 02/05/2016
But the Invoke-AddCertToKeyVault command failed with the following error:
Invoke-AddCertToKeyVault : The term 'Invoke-AddCertToKeyVault' is not
recognized as the name of a cmdlet, function, script file, or operable
program. Check the spelling of the name, or if a path was included,
verify that the path is correct and try again.
I think Azure PowerShell is installed correctly on my machine, because I was able to log in to my Azure account by running Login-AzureRmAccount. Also, $env:PSModulePath shows that the Azure module paths have been added to the path variable (according to the article: Azure Docs: How to install and configure Azure PowerShell, 04/22/2016). Here they are:
...\Microsoft SDKs\Azure\PowerShell\ResourceManager\AzureResourceManager\;
...\Microsoft SDKs\Azure\PowerShell\ServiceManagement\;
...\Microsoft SDKs\Azure\PowerShell\Storage\;
Also, I have restarted my PC after installing Azure PowerShell.
It is possible that I have missed something, but I am stuck. How can this be resolved?
That cmdlet is in a module that has to be imported first:
Import-Module "C:\Users\chackdan\Documents\GitHub\Service-Fabric\Scripts\ServiceFabricRPHelpers\ServiceFabricRPHelpers.psm1"
That file is its implementation, by the way, as a reference that it even exists :). Run Import-Module and it should work.
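As a quick sanity check (just a sketch; the path above assumes the Service-Fabric repository is cloned to that location), you can confirm the function is available after the import:
# After the import, this should list the function; if it errors, the import did not succeed
Get-Command Invoke-AddCertToKeyVault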
I have been going through the Microsoft Azure data engineering course, and in the "Data integration at scale with Azure Data Factory or Azure Synapse Pipeline / Integrate data with Azure Data Factory or Azure Synapse Pipeline" module, it says to use the code below to create a dataset:
Set-AzDataFactoryV2Dataset -DataFactoryName $DataFactory.DataFactoryName -ResourceGroupName $ResGrp.ResourceGroupName -Name "InputDataset" -DefinitionFile ".\InputDataset.json"
When I run it with my data factory name and resource group name, it gives this error:
Set-AzDataFactoryV2 : The term 'Set-AzDataFactoryV2' is not recognized as the name of a cmdlet, function, script file,
or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and
try again.
Should I install a package or something? What is this error?
The error message indicates that you do not have the related module installed, or that the module's install path is not part of the PSModulePath system variable.
To install the module:
Install-Module Az
If the module is installed, add its path to the system variable PSModulePath, or import it explicitly at the beginning of the script:
Import-Module [path]
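For example, a minimal check, assuming the cmdlet comes from the Az.DataFactory module (which is where Set-AzDataFactoryV2Dataset normally lives):
# Check whether the module that provides Set-AzDataFactoryV2Dataset is available
Get-Module -ListAvailable Az.DataFactory

# If nothing is listed, install it for the current user
Install-Module -Name Az.DataFactory -Scope CurrentUser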
I am running TeamCity on a Windows VM and have installed the AWS CLI.
I am trying to pull a zip from AWS S3, but I get this error:
aws : The term 'aws' is not recognized as the name of a cmdlet, function, script file
When I run the command in both cmd and PowerShell, it works just fine.
I have also checked that the AWS CLI path is in both the user and system PATH variables.
Any ideas?
I figured it out.
The build agent was not running as a service; it was running under a user account that didn't have the correct permissions. I installed a new agent and ran it as a Windows service under a service account.
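If you want to confirm what your agent is running as before reinstalling, something like this shows the service's log-on account (the service name pattern is a guess; adjust it to match your TeamCity agent service):
# Show the log-on account for the build agent service (name may differ on your machine)
Get-CimInstance Win32_Service -Filter "Name LIKE 'TCBuildAgent%'" |
    Select-Object Name, StartName, State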
I hope this helps someone in the future that faces this frustrating issue.
I need to work on some previously pushed Docker images stored in Google's gcr.io registries.
I am doing this from a Windows 10 machine, with standard installations of Docker and Google Cloud SDK (no Homebrew or anything like that).
After setting permissions for my Gmail account in GCP's IAM section, I am still getting this error message when running this in PowerShell:
docker pull gcr.io/blabla/blabla:latest
Error response from daemon: unauthorized: You don't have the needed
permissions to perform this operation, and you may have invalid
credentials. To authenticate your request, follow the steps in:
https://cloud.google.com/container-registry/docs/advanced-authentication
When I go through setting up authentication again, I get these warnings:
C:\Program Files (x86)\Google\Cloud SDK>gcloud auth configure-docker
WARNING: docker-credential-gcloud not in system PATH. gcloud's
Docker credential helper can be configured but it will not work until
this is corrected.
WARNING: docker not in system PATH. docker and
docker-credential-gcloud need to be in the same PATH in order to
work correctly together. gcloud's Docker credential helper can be
configured but it will not work until this is corrected.
On searching for solutions, I came across this thread, which appears to use macOS commands. I've found that the Windows alternative for 'which' is 'where', which gives this:
C:\Program Files (x86)\Google\Cloud SDK>where gcloud
C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gcloud
C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin\gcloud.cmd
C:\Users\l.cai\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\gcloud
C:\Users\l.cai\AppData\Local\Google\Cloud SDK\google-cloud-sdk\bin\gcloud.cmd
But I'm having a lot of trouble understanding this post explaining the alternative for readlink. Replacing parts of that syntax with the file paths gives either
' ' is not recognized as an internal or external command
or
The system cannot find the path specified.
Multi-line commands also don't work well in Windows PowerShell or CMD, so I'm not sure where they're entering those commands.
Can anyone please help me along with this? Many thanks in advance.
Your problem is that neither gcloud nor docker is set up correctly for the user you are logged in as. The following is a temporary solution; you should reinstall Docker and the Cloud SDK.
Verify that both components of the path below are correct and adjust for your installations.
Open a Windows Command Prompt and execute:
set PATH=C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files\Docker\Docker\Resources\bin;%PATH%
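If you are working in PowerShell rather than cmd, the equivalent for the current session would be roughly this (assuming the same install locations as above):
# Prepend both directories to PATH for the current PowerShell session only
$env:Path = "C:\Program Files (x86)\Google\Cloud SDK\google-cloud-sdk\bin;C:\Program Files\Docker\Docker\Resources\bin;" + $env:Path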
Found a solution: Log into Windows itself with an admin account. None of the other fixes/threads referred to in my OP ended up being relevant.
I had a local administrator account, but since this was set up recently, I was used to logging in to my usual work account (non-admin), and only entering the local admin credentials as needed (e.g. when running programs with elevated privileges).
So Docker, PowerShell, and the Cloud SDK can all be started individually with admin privileges, but somewhere along the chain it breaks, and I'm not prompted for anything. Logging in with the admin account circumvents that.
I have successfully installed Azure CLI on my build server and can use it at the command line. But when a build executes under a service account, I get the following error:
az : The term 'az' is not recognized as the name of a cmdlet, function, script
file, or operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again.
I am assuming this is because Azure CLI was installed only for my user. The service account does not have an interactive login, so I can't log in and install Azure CLI for that account. Is there a way to make Azure CLI available to my service account?
Add this to your PATH environment variable:
C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\wbin
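Since the build runs under a service account, the directory needs to be on the machine-level PATH rather than only your user PATH. A rough sketch of doing that from an elevated PowerShell session (path taken from the default install location above):
# Append the Azure CLI directory to the machine-level PATH (requires admin)
$cliPath = "C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2\wbin"
$machinePath = [Environment]::GetEnvironmentVariable("Path", "Machine")
[Environment]::SetEnvironmentVariable("Path", "$machinePath;$cliPath", "Machine")
# Restart the build agent service afterwards so it picks up the new PATH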
I just installed Node.js on one of my build servers (Win Server 2008 R2), which hosts a Bamboo remote agent. After completing the installation and doing a reboot, I got stuck in the following situation:
The remote Bamboo build agent is running as a Windows service with user MyDomain\MyUser. When a build with an inline PowerShell task is executing, it fails with this error (from the build agent log):
com.atlassian.utils.process.ProcessNotStartedException: powershell could not be started
...
java.io.IOException: Cannot run program "powershell"
...
java.io.IOException: CreateProcess error=2, The system cannot find the file specified
Logging on to the server as MyDomain\MyUser, I have checked that PowerShell is in the path:
where powershell
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
I have tried restarting the service and rebooting the machine multiple times. No luck. The only thing that works is executing my scripts as a .bat file with an absolute path to powershell.exe - but I do not want that.
I have searched for solutions on this, but even though this one seems related: Hudson cannot find powershell after update to powershell 3 - the proposed solutions do not work.
What am I missing here?
If you do a default installation of Node.js, you will see that it adds nodejs and npm to the path. Sometimes the installer adds a user-level variable named PATH - it might be that the Bamboo agent reads the user path without "merging" it with the system path. I think it would be worth giving that a look.
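To see whether that is what happened, you could compare the two scopes; a rough sketch:
# User-scoped PATH (a Node.js installer may have created this)
[Environment]::GetEnvironmentVariable("Path", "User")

# Machine-scoped PATH (should contain C:\Windows\System32\WindowsPowerShell\v1.0)
[Environment]::GetEnvironmentVariable("Path", "Machine")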
As per the Atlassian support page, this is related to a bug in the Java Service Wrapper. I tried workaround 2, since there was no user PATH variable on my system. I had to uninstall the Bamboo agent service and the 64-bit Java versions from the agent machine to apply workaround 2.