Docker Desktop: How to get current daemon mode (linux vs windows) - windows

I presume I have to use DockerCli.exe in some manner, but I cannot for the life of me find documentation on it anywhere.
I know I can use -SwitchDaemon to switch modes, but I just want to ask "what mode am I running right now?"
Executing & $Env:ProgramFiles\Docker\Docker\DockerCli.exe -h gives very unhelpful output:
Usage: DockerCli.exe [Command]
-h, --help, -Help: Show the help information for this command
That's it, that's all it gives: no list of commands or anything else.
Is there documentation on this tool anywhere at all? When I try to search for it I just keep getting results for the docker command line itself, not DockerCli.exe for Docker Desktop, lol

The following command achieved my desired result!
docker version -f '{{.Server.Os}}'
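The output is just the daemon's OS, so the check is easy to script. A minimal sketch from PowerShell (the sample values are illustrative, assuming Docker Desktop is running):

# prints "linux" in Linux-container mode, "windows" in Windows-container mode
docker version -f '{{.Server.Os}}'

# example: branch on the current mode
if ((docker version -f '{{.Server.Os}}') -eq 'windows') {
    Write-Host 'Docker Desktop is currently in Windows-container mode'
}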

Bash Scripting with LastPass CLI

Edit: As of 01/31/2023 the scripts I am using below ARE working. I will report any patterns of inconsistency I find here. I would like to leave this open in case others have findings or advice they would like to share on bash scripting, the LastPass CLI, or WSL.
I am looking to use the LastPass CLI to make some changes to Shared Sites within our LastPass enterprise. I was able to write the scripts (fortunately with some help from others on here), but I am unable to get the commands to work properly within a script.
One of the commands I WAS having trouble with is lpass share create. This command works directly from the command line, but I was unable to run it successfully within a script. I have a very simple script, similar to the one below:
#!/bin/bash
# first argument: name of the shared folder to create
folderpath=$1

lpassCreateStoreFolder(){
    lpass share create "$folderpath"
}

lpassLogin(){
    # non-interactive login: the password is piped in and pinentry is disabled
    echo 'testPWD' | LPASS_DISABLE_PINENTRY=1 lpass login --trust --force tester@test.com
}

lpassLogin
lpassCreateStoreFolder
I've been invoking my script through the PowerShell command line like so:
wsl "path/to/script" "Shared-00 Test LastPass CLI"
Sometimes this command works within the script and other times it does not. When I tried running this script around mid-December, I had no success at all. The script would run all the way through, and the CLI would even give me a response
Folder Shared-00 Test LastPass CLI created.
and the LastPass Admin Console logs show a "Create Shared Folder" event. The problem is that when I go to my LastPass vault, the Shared Folder was rarely, if ever, actually created. Running the command directly from the command line, without a script, worked almost 100% of the time. I initially chalked this up to inconsistencies on their end, but now I am experiencing the same problems with a different command.
Similarly, I have been using the lpass edit command to make edits to sites within our LastPass vault. Once again, I have a relatively simple script to make the edit:
#!/bin/bash
# first argument: the ID of the LastPass entry to edit
lpassId=$1

lpassSetNotes(){
    printf "Notes:\n What are your notes?\nThese are my notes" | lpass edit --non-interactive --sync=now "$lpassId"
}

lpassLogin(){
    echo 'testPWD' | LPASS_DISABLE_PINENTRY=1 lpass login --trust --force test@test.com
}

lpassLogin
lpassSetNotes
and have been invoking this script through PowerShell like so:
wsl "path/to/script" "000LastPassID000"
Like the lpass share create command, running the script does not produce the desired result. The script runs all the way through and my changes are reflected in the logs, but when I go to the vault the site itself is never changed. The command DOES, however, work when I run it directly from the command line within WSL.
I am relatively new to writing Bash scripts and to the Linux operating system, so I'm not entirely sure whether this is something wrong on my end or just inconsistencies in the vendor's tool that I am using. Any help would be appreciated; I know this issue might be hard to replicate without a LastPass account.
Example LastPass CLI calls that work directly from command line in WSL
lpass share create "Shared-00 Testing LastPass CLI"
printf "Notes:\n What are your notes?\nThese are my notes" | lpass edit --non-interactive --sync=now "$lpassId"
References
LastPass CLI
CLI Manual
CLI GitHub

Is there a docker log viewer or debug mode for windows?

It seems Docker Desktop is always being updated, often with bugs, so it stops working the way it did previously. The default install on Windows is mostly a black box: you can look at the GUI, but it isn't very helpful for learning what is happening, or where the service is in its startup when it says "restarting", for example. The premise of the question is that Docker is targeted at developers, yet it is wrapped in a dumbed-down GUI that makes the service a black box to those developers. I would prefer to start it in some sort of debug mode in a console window and only use the GUI when I actually want to interact with it that way. Is there a way to use Docker from a more programmer-friendly console interface on Windows?
Open an elevated command prompt and then run sc.exe qc docker to get the current command line for the docker service.
From the output of the above command, take the BINARY_PATH_NAME and modify it using the steps below.
1. Escape each " with \
2. Add -D at the end
3. Wrap the whole value in " "
After modifying it, set the new value with sc.exe config and then restart the docker service.
Example
Let's say BINARY_PATH_NAME : "C:\Program Files\Docker\dockerd.exe" --run-service
After modifying, it needs to look like this:
sc.exe config docker binpath= "\"C:\Program Files\Docker\dockerd.exe\" --run-service -D"
After executing the above commands restart the docker service.
sc.exe stop docker
sc.exe start docker
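To confirm the daemon picked up the flag, query the service configuration again and check that BINARY_PATH_NAME now ends in -D (this assumes the service name is docker, as above):

sc.exe qc docker
REM expected: BINARY_PATH_NAME : "C:\Program Files\Docker\dockerd.exe" --run-service -D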

curl command in git-bash

I have a script written in bash and tested working in Linux (CentOS 7) and on MacOS. The script uses cURL to interact with a REST API compliant data platform (XNAT).
I was hoping that Windows users could use the same script within git-bash that comes packaged with Git for Windows. Unfortunately there seems to be an issue when using cURL in git-bash.
The first use I make of cURL is to retrieve a JSESSION cookie:
COOKIE=`curl -k -u $USERNAME https://theaddress/JSESSION`
On Linux, this asks the user for password and stores the cookie in COOKIE.
In git-bash, issuing the command hangs until I use "Ctrl + C" to interrupt it. Strangely, at that point the password prompt is displayed, but too late: the script has already terminated.
I have a suspicion that this may have to do with CR or LF issues, but cannot find any information on this that I understand.
Any pointers would be welcome!
Thank you
EDIT:
It appears the above command works fine if I pass the password in the command like this:
COOKIE=`curl -k -u $USERNAME:$PASSWORD https://theaddress/JSESSION`
However, as pointed out here:
Using cURL with a username and password?
I would rather avoid having the user typing their password as a command argument.
So the question is now "why is cURL not prompting for a password when I use the first command?" in git-bash on Windows, while that command behaves as expected on Linux or macOS:
COOKIE=`curl -k -u $USERNAME https://theaddress/JSESSION`
I ended up answering my own question; I hope this may be useful to someone else.
It appears this issue is a known problem when running cURL from within git-bash, according to this thread:
https://github.com/curl/curl/issues/573
In particular, see the answer of dscho on 30 Dec 2015:
The problem is the terminal emulator we use with Git Bash since Git for Windows 2.5, MinTTY.
This terminal emulator is not associated with a Win32 Console, therefore the user does not see anything when cURL wants to interact with the user via said Console.
This issue has a workaround, which is documented here:
https://github.com/git-for-windows/build-extra/blob/master/ReleaseNotes.md#known-issues
The workaround is to run curl via winpty as follows:
winpty curl [arguments]
Not an issue with CR or LF after all.
Soooo, git-bash may not be the magic-bullet (tm) to run my bash scripts in Windows with zero effort. Sigh...
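If you would rather not depend on winpty, another option is to do the prompting in bash itself (bash's read works fine under MinTTY) and hand the credentials to curl on stdin via its --config option, so they never appear on the command line. This is only a sketch, reusing the same USERNAME variable and URL as above:

#!/bin/bash
read -r -p "Username: " USERNAME
read -r -s -p "Password: " PASSWORD
echo
# -K - makes curl read config from stdin; the 'user' entry replaces -u on the command line
COOKIE=$(printf 'user = "%s:%s"\n' "$USERNAME" "$PASSWORD" | curl -s -k -K - https://theaddress/JSESSION)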

TeamCity and psexec don't work at all

I'm trying to execute a script on a remote machine (the script resides REMOTELY, and NOT in the agent folder or anything like that) through the Command Line Runner:
@echo off
%env.ALLUSERSPROFILE%\JetBrains\TeamCity\plugins\.tools\psexec.exe \\12.34.56.78 -h -u admin -p 12345 T:\Folder\Script\update.cmd T:\Folder\Server
But I always get:
The system cannot find the path specified.
Process exited with code 1
And NOTHING else. TeamCity and psexec just scramble the output and never say what was actually not found or what the problem is. It freaks me out to even use psexec to run something.
What am I doing wrong? What else do I need to specify to JUST run it remotely? (I'm not asking about getting the output of the script, because from what I've found I understand that such simple functionality is not supported by psexec at all.)
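One way to narrow this down (a sketch; I cannot reproduce the setup): psexec does relay the remote process's console output back to the caller, so launching the script through an explicit cmd /c often surfaces the remote shell's own error text about which path is missing. Something along these lines, with -accepteula added so the EULA dialog cannot silently block the agent's service account:

%env.ALLUSERSPROFILE%\JetBrains\TeamCity\plugins\.tools\psexec.exe -accepteula \\12.34.56.78 -h -u admin -p 12345 cmd /c "T:\Folder\Script\update.cmd T:\Folder\Server"

It is also worth checking that the psexec path itself resolves on the build agent, since "The system cannot find the path specified" can just as easily refer to the local executable as to anything remote.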

What does this command do: 'wget -qO- 127.0.0.1'?

In the provisioning part of the Vagrant guide, there is a command wget -qO- 127.0.0.1 to check whether Apache is installed properly.
Can anyone explain the command in more detail? I don't understand what the -qO- option does. Also, what is the meaning of running wget against 127.0.0.1?
Thanks!
The dash after the O tells wget to write the downloaded document to standard output instead of saving it to a file.
The -q option makes wget "quiet", suppressing its own progress and status messages.
Together, the two options mean only the fetched content is emitted, which makes the command easy to use in a pipeline.
As for 127.0.0.1: that is the loopback address, so wget requests the default page from the web server running on the local machine. If Apache is installed and serving, the page comes back; running wget on the command line is faster than bringing up a browser.
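To make that concrete, here is how the check might be used in a provisioning script (a sketch; the grep string assumes the stock Apache landing page, which contains the phrase "It works"):

# print the page served by the local web server; wget's own messages are suppressed
wget -qO- 127.0.0.1

# the same idea as a pass/fail check
if wget -qO- 127.0.0.1 | grep -q "It works"; then
    echo "Apache is up and serving on localhost"
else
    echo "No response from the local web server" >&2
fi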
