I am trying to write a Windows batch script to:
Start Windows Subsystem for Linux (WSL) and
cd within WSL to "../myfolder/"
run ./foo first_parameter second_parameter
Wait until finished and exit WSL
cd within Windows to "../myWinFolder/"
run Foo.exe parameter
wait until finished
This is my attempt:
bash -c "cd ../myFolder/ && ./foo first_parameter second_parameter"
cd ..
cd myWinFolder
START /WAIT Foo.exe parameter
But sadly CMD does not wait for WSL to finish before running the EXE.
Anything I can do?
I'm fairly confident that the interop between CMD and WSL does in fact wait for commands to finish. Try the following:
In a file called runit.bat
echo bat01
bash -c "echo bat02; cd ./bash/; ./runit.sh; echo bat03"
echo bat04
In a sub-folder called ./bash/ paste the following in a file called runit.sh
echo sh01
sleep 2s
echo sh02
When you run runit.bat from CMD you will see a 2 second pause between sh01 and sh02.
You have not specified what is inside your ./foo script. I suspect that it is launching a task in the background, or running something that returns immediately. This can be simulated by putting & after the sleep (sleep 2s &) so that it runs in the background within WSL. When you do this you see that there is no pause in the execution of the script.
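For example, modify runit.sh as follows and the pause disappears, because the shell returns as soon as its foreground commands are done:

echo sh01
sleep 2s &
echo sh02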
So I would check ./foo: add some echo debug statements inside it, and run it from within WSL first, to make sure that it does indeed wait until all of its commands have finished before it exits.
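If ./foo does turn out to launch background jobs, a minimal fix (a sketch, assuming you control the script; long_task is a hypothetical placeholder) is to add wait at the end, so the script only exits once all of its children have finished:

#!/bin/sh
long_task "$1" "$2" &   # hypothetical job started in the background
wait                    # block until all background children have exited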
Related
I am working on a Raspberry Pi powered Magic Mirror project, and to start the program I execute a shell script that runs in the background continuously. To make the AI part of my project work I need to open a second shell script that also runs in the background continuously. My problem is that when I execute my xterm commands, the second waits for the first script to complete before it starts. Since both scripts have no designated end point, I am stuck. Is there a way to make both xterm commands execute at the same time?
Here is my current code to start the Xterm sessions:
cd ~/MMStartAll
xterm -e "cd ~/MMStartAll; ./AssistantStart.sh"
xterm -e "cd ~/MMStartAll; ./MMStart.sh"
$SHELL
Each xterm command in your script should end with an &. This means that the two xterms will run as separate processes, each with its own process id (pid).
cd ~/MMStartAll
xterm -e "cd ~/MMStartAll; ./AssistantStart.sh" &
xterm -e "cd ~/MMStartAll; ./MMStart.sh" &
$SHELL
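If you also want the launching script itself to block until both windows are closed, a small variation (a sketch, not part of the original answer) is to add wait after the two backgrounded commands:

cd ~/MMStartAll
xterm -e "cd ~/MMStartAll; ./AssistantStart.sh" &
xterm -e "cd ~/MMStartAll; ./MMStart.sh" &
wait   # returns only after both xterm processes have exited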
I am trying to run a series of tests on a remote Linux server to which I am connecting via ssh.
I don't want to have to stay logged in to the ssh session during the runs -> nohup(?)
I don't want to have to keep checking if one run is done -> for loop(?)
Because of licensing issues, I can only run a single testing process at a time -> sequential
I want to keep working while the test set is being processed -> background
Here's what I tried:
#!/usr/bin/env bash
# Assembling a list of commands to be executed sequentially
TESTRUNS="";
for i in `ls ../testSet/*`;
do
MSG="running test problem ${i##*/}";
RUN="mySequentialCommand $i > results/${i##*/} 2> /dev/null;";
TESTRUNS=$TESTRUNS"echo $MSG; $RUN";
done
#run commands with nohup to be able to log out of ssh session
nohup eval $TESTRUNS &
But it looks like nohup doesn't fare too well with eval.
Any thoughts?
nohup is needed if you want your script to keep running even after the shell is closed, so yes.
And a & inside RUN is not necessary, since you already background the whole command list with the trailing & on the nohup line.
Now your script builds up the command string in the for loop but doesn't execute anything until after the loop, and nohup cannot run a shell builtin like eval directly. Note that you also can't just run the individual commands with &: that would run each command in the background and return to the script immediately, which would execute the next item in the loop. Eventually this would run all files in parallel.
So rather than moving nohup eval $TESTRUNS inside the for loop, what you need to do is run the script itself with nohup; the script then loops through all files one at a time, in the background, even after the shell is closed.
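A minimal sketch of that approach, assuming the loop lives in a file called run_tests.sh (a hypothetical name):

nohup ./run_tests.sh > run_tests.log 2>&1 &   # keeps running after you log out
tail -f run_tests.log                         # check on progress at any time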
You could take a look at screen, an alternative to nohup with additional features. I will replace your test command with while [ 1 ]; do printf "."; sleep 5; done for testing the screen solution.
The screen -ls commands are optional; they just show what is going on.
prompt> screen -ls
No Sockets found in /var/run/uscreens/S-notroot.
prompt> screen
prompt> screen -ls
prompt> while [ 1 ]; do printf "."; sleep 5; done
# You don't get a prompt. Use "CTRL-a d" to detach from your current screen
prompt> screen -ls
# do some work
# connect to screen with batch running
prompt> screen -r
# Press ^C to terminate the batch (script printing dots)
prompt> screen -ls
prompt> exit
prompt> screen -ls
Google for screenrc to see how you can customize the interface.
You can change your script into something like
#!/usr/bin/env bash
# Run the test problems sequentially, one at a time
for i in ../testSet/*; do
echo "Running test problem ${i##*/}"
mySequentialCommand $i > results/${i##*/} 2> /dev/null
done
The script above can be started with nohup scriptname & when you do not use screen, or simply with scriptname inside a screen session.
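screen can also launch the script directly in a detached session, so you never have to attach interactively first (a sketch; tests is an arbitrary session name):

screen -dmS tests ./scriptname   # start a detached session named tests
screen -ls                       # verify it is running
screen -r tests                  # reattach whenever you like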
On Windows 8.1.
I have written a simple .sh script to start up my dev environment. I know I can use a native Windows batch script (which works fine and doesn't have this problem), but I prefer Git Bash. The problem is that every Git Bash window opened by my script is closed on Ctrl+C, and I don't want them to close, only to stop the running processes.
Here is my script. It opens four Git Bash windows and starts a process within each. When I hit Ctrl+C in one of those four windows, the window kills the process (except nginx; nginx keeps running) and then closes. I only want to stop the process, not terminate the window:
#!/bin/bash
cd /c/nginx
start sh.exe --login -i -c "nginx"
cd /c/Users/user/app
start sh.exe --login -i -c "NODE_ENV='development' nodemon"
start sh.exe --login -i -c "NODE_ENV='development' gulp mytask"
start sh.exe --login -i -c "NODE_ENV='development' compass watch"
How to do it?
If you use a wrapper like Console2 or ConsoleZ around Git Bash with its shell pointed to "C:\Program Files\Git\usr\bin\sh.exe" --login -i or "C:\Windows\SysWOW64\cmd.exe /c ""C:\Program Files\Git\usr\bin\sh.exe" --login -i"", it works fine. I'm not sure how to do it without such a wrapper, but it's pretty cool to use one anyway, so you could try it out!
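If you would rather stick with plain Git Bash windows, one unverified workaround (a sketch only; trap and exec are standard shell features, but I have not tested this on Windows 8.1) is to trap SIGINT in the launched shell and drop into an interactive shell once the process exits, so Ctrl+C stops the process but the window survives:

start sh.exe --login -i -c "trap ':' INT; NODE_ENV='development' nodemon; exec bash"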
I have a shell script run.sh:
cd elasticsearch-1.1.0/
./bin/elasticsearch
cd
cd RBlogs/DataFetcher/
mvn clean install assembly:single;
cd target/
java -jar DataFetcher-0.0.1-SNAPSHOT-jar-with-dependencies.jar
Here, if the second line (./bin/elasticsearch) executes, it runs indefinitely, so the next lines never execute. What I need is to run the next lines after 10 seconds, but:
cd elasticsearch-1.1.0/
./bin/elasticsearch
sleep 10
cd
cd RBlogs/DataFetcher/
mvn clean install assembly:single;
cd target/
java -jar DataFetcher-0.0.1-SNAPSHOT-jar-with-dependencies.jar
This also will not execute the next lines, because ./bin/elasticsearch will not complete its execution within 10 seconds. So how can I solve this problem?
Adding & at the end of ./bin/elasticsearch will cause the process to run in a subshell, freeing the current shell up for the next commands.
./bin/elasticsearch &
Change this in your second version of the script and things should run like you want them to.
More information can be found in man bash:
If a command is terminated by the control operator &, the shell
executes the command in the background in a subshell.
The shell does not wait for the command to finish, and the return status is 0.
You may try to put it in the background; & can do this for you:
./bin/elasticsearch &
If you simply want elasticsearch to run in the background while the rest of the script progresses, just use &:
cd elasticsearch-1.1.0/
./bin/elasticsearch &
sleep 10
cd
cd RBlogs/DataFetcher/
However, if you want to run elasticsearch for at most 10 seconds, killing it if necessary, then proceeding with the rest of the script, you need something a little more complicated:
cd elasticsearch-1.1.0/
./bin/elasticsearch &
pid=$!
sleep 10
kill -0 $pid && kill $pid   # if elasticsearch is still running, stop it
cd
cd RBlogs/DataFetcher/
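If GNU coreutils is available on your system, the timeout utility expresses the same "run for at most 10 seconds" logic in one line (a sketch, assuming you really do want elasticsearch stopped after 10 seconds):

cd elasticsearch-1.1.0/
timeout 10 ./bin/elasticsearch   # sends SIGTERM after 10 seconds if still running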
As other answers mentioned, you can:
- use ./bin/elasticsearch & to run the command in the background;
- record the process id of the command run in the background with child_pid=$!, and then stop the process with kill $child_pid after some time, to implement a timeout mechanism.
Meanwhile, you can also synchronize another operation with the command running in the background by using the wait command. An example below:
./bin/elasticsearch &
# do something asynchronously here
wait # wait for ./bin/elasticsearch to finish
# do something synchronously here
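wait also accepts a specific process id (saved from $!), which is useful once you have more than one background job and only need this particular one to finish (a minimal sketch):

./bin/elasticsearch &
es_pid=$!
# ... other asynchronous work here ...
wait $es_pid   # block only until elasticsearch exits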
ssh user@myserver.com <<EOF
cd ../../my/path/
sh runscript.sh
wait
cd ../../temp/path
sh secondscript.sh
EOF
The first script runs and asks me the questions in that script, but before I'm even able to start typing an answer, the second script starts running. From what I've read, this shouldn't be happening, even without the wait.