How to run bash commands from pwsh?

We have a PowerShell script that is 99% cross-platform, but occasionally we need an IF LINUX THEN branch because of how different Windows and Linux service management is.
We would like to run the bash kill command, but kill is a PowerShell alias for Stop-Process.
How do we run native bash commands like ps, kill and ls from PowerShell?
Note that sh ps and bash ps do not work:
PS > bash ps
/usr/bin/ps: /usr/bin/ps: cannot execute binary file

Assuming that running bash from pwsh starts bash, you want bash -c "ps". Normally the argument to bash would be a script that it tries to execute, hence the error "cannot execute binary file": ps is not a bash script but an executable binary. The -c option, on the other hand, runs arbitrary bash code provided as a command-line argument, which can of course run programs like ps.
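For instance (a minimal illustration; 123 is a placeholder PID), each of these hands a whole command line to bash:
PS > bash -c "ps aux"
PS > bash -c "kill -TERM 123"
Quoting the command string keeps pwsh from splitting it into separate arguments before bash sees it.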

If you know the location of the command you can just run it:
/usr/bin/kill
Usage:
kill [options] <pid|name>...
...
Otherwise, which can find it, and Invoke-Expression (alias iex) can run the result:
which kill | iex
This can get tricky, since which could return multiple lines, leaving you to guess; here we simply take the first. You also need to splice any parameters (e.g. 123) into the command string:
which kill | select -first 1 | % {iex "$_ 123"}
kill: sending signal to 123 failed: No such process
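An alternative sketch that avoids Invoke-Expression altogether is PowerShell's call operator &, which invokes the resolved path directly and passes arguments without string splicing:
& (which kill | select -first 1) 123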

I had lots of trouble running ant -version from pwsh, but this works:
Invoke-Expression "/bin/bash ant -version"
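A variant in the spirit of the -c explanation above should work too, since it hands the whole command line to bash rather than relying on ant being a shell script:
/bin/bash -c 'ant -version'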
Cross-Platform Function
function RunCommand($Command) {
    if ($env:OS -eq 'Windows_NT') {
        CMD /c $Command
    } else {
        # -c makes bash treat $Command as a command line, not a script file,
        # so binaries like ps and kill work too
        Invoke-Expression "/bin/bash -c '$Command'"
    }
}
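A quick usage sketch (hypothetical command string; in practice the same string rarely suits both cmd and bash, so callers may still need per-OS commands):
RunCommand "echo hello"
On Linux this runs /bin/bash -c 'echo hello'; on Windows it runs CMD /c echo hello.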

Related

$? from bash script command executed by Tcl (open pipe) on Windows returns wrong value

I've got a Tcl script with two ways of executing a bash script:
#exec bash ./run.sh
open "|bash ./run.sh r"
The bash script is shown below:
#!/bin/bash
ls
if [ "$?" != "0" ]; then
echo "ERROR: Status failed!" > status
else
echo "Everything is OK!" > status
fi
I'm using tclsh for Windows with bash from git bash. When I use:
exec bash ./run.sh
I've got in status file:
Everything is OK!
otherwise:
open "|bash ./run.sh r"
got:
ERROR: Status failed!
Is there any way to correctly detect the exit code when the script is run through a Tcl pipe?
You don't say whether you get different results out of the ls part of the script. That matters: the ls command is certainly capable of changing its behaviour according to the environment in which it is invoked, and Tcl executes subprocesses (on Windows) directly using the CreateProcess() system call, rather than the various wrapped versions that Cygwin and git bash use. Other possibilities are that you're launching the script in a different directory, and so on.
However, in general we'd expect a script to behave very similarly when launched via exec or via open |… r as they share a common core of functionality. The only differences are to do with how output and termination are waited for.
If you create a subprocess pipeline, by default you won't get to find out about errors from it until you close the pipeline. exec generates any errors “immediately” because it doesn't return control to you until the subprocess has terminated and all output has been read.
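As a minimal Tcl sketch of that close-time reporting (assuming the run.sh from the question):
set pipe [open "|bash ./run.sh" r]
set output [read $pipe]
# a nonzero exit status or stderr output surfaces here, at close time
if {[catch {close $pipe} err]} {
    puts "script failed: $err ($::errorCode)"
}
Note that the access mode r belongs outside the quoted pipeline, as open's second argument.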

Run bash script loop in background which will write result of jar command to file

I'm a novice at running bash scripts. (Feel free to suggest a better title if this one is off.)
I want to run a jar file in a loop using a bash script, and write the output of each run into a file.
Bash file datagenerate.sh
#!/bin/bash
echo Total iterations are 500
for i in {1..500}
do
    the_output="$(java -jar data-generator.jar 10 1 mockData.csv data_200GB.csv)"
    echo $the_output
    echo Iteration $i processed
done
no_of_lines="$(wc -l data_200GB.csv)"
echo "${no_of_lines}"
I'm running the above script with nohup sh datagenerate.sh > datagenerate.log &. I want it to run in the background, so that even if I log out of ssh it keeps running, with its output going into datagenerate.log.
But when I run the command and hit Enter or close the terminal, the process ends. Only Total iterations are 500 gets logged to the output file.
Let me know what I'm missing. I followed these two links to create the shell script: link-1 & link2.
nohup sh datagenerate.sh > datagenerate.log &
nohup should work this way without the screen program, but depending on your distro, sh might be linked to dash, which does not support bashisms like the {1..500} brace expansion in your loop.
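To see what sh resolves to (a quick check; readlink is in coreutils):
readlink -f /bin/sh
If that prints /bin/dash, the script's bash-only syntax explains the early failure; running it via its #!/bin/bash shebang avoids sh entirely.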
Just make your script executable:
chmod +x datagenerate.sh
and run your command like this:
nohup ./datagenerate.sh > datagenerate.log &
You should also check this out:
https://linux.die.net/man/1/screen
With this program you can close your shell while a command or script is still running; it will not be aborted, and you can pick the session up again later.
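A typical screen session might look like this (the session name datagen is arbitrary):
screen -S datagen                      # start a named session
./datagenerate.sh > datagenerate.log   # run the script inside it
# detach with Ctrl-A d, log out, then later:
screen -r datagen                      # reattach to the running session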

How can I open a shell and then execute a command inside it

What I want is to open the default shell, then start another shell inside it and execute a command there.
I was trying something like this:
c:/Windows/System32/bash.exe -c "zsh & zstyle"
or
cmd /k "c:/Windows/System32/bash.exe -c zsh" & zstyle - this opens the shell but doesn't run the command
or
c:/Windows/System32/bash.exe -c "zsh -c 'zstyle'"
Currently I am using a cmder/conemu terminal for windows.
Unfortunately, passing a startup command to zsh with -c while keeping it open for interactive use (with -i) doesn't work.
Disclaimer: The following solutions were tested from a regular Command Prompt (cmd.exe), not cmder/conemu, though I'd expect them to work there too.
To try them from PowerShell (v3+), insert --% as the first argument (right after bash.exe).
Here's a workaround:
c:/Windows/System32/bash.exe -c "zsh -c 'zstyle' && exec zsh -i"
Note that the zstyle command is executed in a different, transient zsh instance, so this approach won't work for commands whose purpose is to modify the environment of the interactive shell that stays open.
If that is a requirement, things get more complicated (this solution courtesy of this answer):
c:/Windows/System32/bash.exe -c "{ { echo 'zstyle'; echo 'exec 0<&3-';} | zsh -i; } 3<&0"
Here, file descriptor 3 temporarily saves the real stdin, the two echo commands are piped into an interactive zsh, and exec 0<&3- then restores stdin so the session stays usable. Note, however, that both commands will be printed before their output, if any, is shown, preceded by the prompt, as if they had been typed interactively.

Cygwin .sh file run from Windows Task Scheduler

I'm having issues getting this shell script to run in Windows Task Scheduler.
#!/bin/bash
# Script to ping the VPN server for testing
RESULT=$(ping 192.168.1.252 | grep "Lost" | awk '{print $10}')
LOG=/home/admin/results.txt
if [ "$RESULT" -gt 0 ]; then
    echo "VPN 192.168.1.252 NOT pinging" >> $LOG
else
    echo "VPN Online"
fi
When I run it in Cygwin, it runs with no issue, but when I attempt to run it from the Command Prompt, I get the following:
C:\cygwin64\bin>bash test.sh
test.sh: line 4: grep: command not found
test.sh: line 4: awk: command not found
test.sh: line 7: [: : integer expression expected
My question is, how do I get it to run with bash instead so that it actually knows the grep and awk commands?
In Windows Scheduler, I have Action: Start A Program
Details: C:\cygwin64\bin\bash.exe
Argument: test.sh
Start in: C:\cygwin64\bin
Am I missing something?
I figured it out.
In the Windows Task Scheduler, I had to pass:
Program/script: C:\cygwin64\bin\bash.exe
Add arguments: -c -l test.sh
Start in: C:\cygwin64\bin
As a correction to what Jimmy found:
Add arguments: -c -l "c:/FileFolder/test.sh"
You don't need the Start in argument anymore.
For the longest time I was experiencing the same issue as the OP: command not found errors when trying to run a shell script from Task Scheduler or the Command Prompt, despite the fact that running the same script from a Cygwin terminal worked fine.
After some more research I eventually realised the reason was that my usual Bash PATH from ~/.bash_profile wasn't being loaded, and that I needed to use Windows' Environment Variables window to add C:\cygwin64\bin to my PATH environment variable (system or user, it doesn't really matter). This directory contains common executables like grep and awk, which is why Bash was unable to locate them until the directory was added to the Windows PATH.
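If you'd rather not edit the Windows-wide PATH, a hedged alternative is to prepend the directory inside the script itself, so it carries its own environment:
#!/bin/bash
PATH=/usr/bin:$PATH   # under Cygwin, /usr/bin maps to C:\cygwin64\bin
# ... rest of the script unchanged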

How to run multiple commands on success

In bash & CMD you can do rm not-exists && ls to string together multiple commands, each running only if the previous command succeeded.
In PowerShell you can do rm not-exists; ls, but the ls will always run, even when rm fails.
How do I easily replicate (in one line) the behaviour that bash & CMD provide?
Most errors in PowerShell are "non-terminating" by default, that is, they do not cause your script to cease execution when they are encountered. That's why ls is executed even after an error in the rm command.
You can change this behavior in a couple of ways, though: globally via the $errorActionPreference variable (e.g. $errorActionPreference = 'Stop'), or for a particular command only by setting the -ErrorAction parameter, which is common to all cmdlets. The latter approach makes the most sense for you.
# setting ErrorAction to Stop will cause all errors to be "Terminating"
# i.e. execution will halt if an error is encountered
rm 'not-exists' -ErrorAction Stop; ls
Or, using some common shorthand
rm 'not-exists' -ea 1; ls
The -ErrorAction parameter is explained in the help: type Get-Help about_CommonParameters.
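If you want to branch on failure rather than just halt, a small try/catch sketch also works, since -ErrorAction Stop turns the error into a terminating one that catch can see:
try { rm 'not-exists' -ErrorAction Stop; ls } catch { Write-Warning "rm failed: $_" }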
To check whether the last PowerShell command succeeded, you can use the automatic variable $?.
For example, the following command will try to remove not-exists and, if that succeeds, will run ls.
rm not-exists; if($?){ ls }
