I am trying to make a shell script run jekyll -w serve and sass --watch style.scss:style.css so I don't have to start each one by hand every time I want to develop locally. So I made a shell script, but obviously, if I put the jekyll command first, it won't run the sass one until jekyll is done. So how can I run two commands at once? Do I have to make a command to open another tab in the terminal and then run sass? There must be a better way of doing that.
You are looking to fork the process so the current shell is not waiting on the command, I assume.
jekyll -w serve && sass --watch style.scss:style.css
The above will wait for the first command to complete, and only run the second if the first exits with a non-error (zero) status.
jekyll -w serve ; sass --watch style.scss:style.css
The above will wait for the first command to complete and run the second regardless of its exit status.
So if I understand correctly, you want multiple commands to run near-simultaneously. For this you put the & control operator at the end of each command.
The Bash & (ampersand) is a builtin control operator used to fork processes. From the Bash man page, "If a command is terminated by the control operator &, the shell executes the command in the background in a subshell".
jekyll -w serve &
sass --watch style.scss:style.css &
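A minimal wrapper script along those lines might look like the sketch below (assuming both tools are on your PATH); the trailing wait keeps the script from exiting while the two watchers are still running:
#!/bin/bash
# Start both watchers in the background so neither blocks the other.
jekyll -w serve &
sass --watch style.scss:style.css &
# Block until both background jobs have exited.
wait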
Related
I have two commands that each change directory and run a program. I'd like to combine them into a single shell script.
The commands are:
cd engine && python cli.py run-engine
cd javascript/services/client && yarn watch
How can I combine them into a single shell script?
Just end your commands with & to run them in the background:
#! /bin/sh
cd engine && python cli.py run-engine &
cd javascript/services/client && yarn watch &
If you need to synchronize with them, you can add a "wait" at the end. The script will only finish when the longest-lived background process has finished.
Or you can just use the ampersand with the first one and then run the second one without backgrounding it.
Or you can use something like the tool "daemon" (apt install daemon) to control the service start/stop.
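To illustrate the "wait" suggestion above, the script would become something like:
#!/bin/sh
# Each line runs in its own background subshell, so the cd in one
# does not affect the other.
cd engine && python cli.py run-engine &
cd javascript/services/client && yarn watch &
# Return only after both background jobs have exited.
wait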
I have two processes that I want to run forever but I want to continue using the same shell script to do commands while they're still running.
I've tried using forever but it doesn't seem to be working properly.
npm install forever -g
forever start scripts/node1_start.sh
forever start scripts/node2_start.sh
Any ideas?
As @rhubarbdog pointed out, running the two external shell scripts in the background will allow you to continue executing commands (unblocked) in the current script.
sh scripts/node1_start.sh &
sh scripts/node2_start.sh &
# Do commands here
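If the same script later needs to stop those two processes, one sketch (the PID variables are illustrative, not part of the original answer) is to capture each PID with $! right after launching it:
sh scripts/node1_start.sh &
node1_pid=$!
sh scripts/node2_start.sh &
node2_pid=$!

# Do commands here

# Stop the two wrapper shells when done; anything they spawned
# may need to be stopped separately.
kill "$node1_pid" "$node2_pid"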
I need to start a couple of processes locally in multiple command-prompt windows. To keep it simple, I have written a shell script, say abc.sh, to run in git-bash, which has the commands below:
cd "<target_dir1>"
<my_command1> &>> output.log &
cd "<target_dir2>"
<my_command2> &>> output.log &
When I run these commands directly in git-bash, I get jobs running in the background, which I can see and manage with the jobs and kill commands. However, when I run them through abc.sh, the processes do run in the background, but the git-bash instance disowns them, and I can no longer see them using jobs.
How can I run them through the abc.sh file and still see them in the jobs list?
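One detail worth knowing: the jobs table is per shell, and background jobs started inside abc.sh belong to the shell that executes the script, not to the interactive git-bash session that launched it. As a sketch of one workaround (assuming you are happy to have the commands run in your current shell), you can source the script instead of executing it, so the jobs land in your interactive shell's job table:
# Run abc.sh in the current shell; its background jobs now
# show up in this shell's jobs list.
. ./abc.sh
jobs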
I'm trying to write a shell script (for use in the Mac OS X Terminal) that will run a command to start a development server (gulp serve). I have it working, except the server runs continuously, so it doesn't let me enter subsequent commands in the same window without stopping the server (Control+C). My question is: is there a way I can run the process in the background and/or suppress any/all output? My goal is to also write a 'stop server' command that will kill the process (which I'm also unsure how to do). I've tried all combinations of ampersands and &>/dev/null and nothing quite works. Here's what I have so far:
if [ "$1" = "server" ]
then
if [ "$2" = "on" ]
then
cd / & gulp serve --gulpfile /server/example/gulpfile.js # the output is still shown
printf "\033[0;32mserver is online.\033[0m\n"
else
killall tail &>/dev/null 2>&1 # this doesn't kill the process
printf "\033[0;32mportals is offline.\033[0m\n"
fi
fi
You're doing the output redirection on killall, not gulp, so gulp will continue to merrily spit out text to your terminal. Try instead:
cd / && gulp serve --gulpfile /server/example/gulpfile.js >/dev/null 2>&1 &
Secondly, your kill command doesn't kill your process because you're not telling it to; you're asking it to kill all tail processes. You want instead:
killall gulp
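Putting both changes back into the script from the question, it might look roughly like this sketch (same paths and messages as above):
if [ "$1" = "server" ]
then
    if [ "$2" = "on" ]
    then
        # Background gulp and discard its output.
        cd / && gulp serve --gulpfile /server/example/gulpfile.js >/dev/null 2>&1 &
        printf "\033[0;32mserver is online.\033[0m\n"
    else
        # Kill the gulp process rather than tail.
        killall gulp >/dev/null 2>&1
        printf "\033[0;32mportals is offline.\033[0m\n"
    fi
fi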
These modifications should be the most direct path to your goal. However, there are a few additional things that may be useful to know.
Process management has a long history in the *nix world, and we've been inventing tools to make it easier for a long time. You can go through re-inventing them yourself (the next step would be to store the PID of your gulp process so that you can be sure you only kill it and not anything else with "gulp" in the name), or you can go all the way and write a proper service definition for a process supervisor. For Linux, that would be SysV init, Upstart, or systemd; I'm not sure what the OS X equivalent is.
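A minimal sketch of that PID-file idea (the /tmp/gulp.pid path is just an example):
# start: background gulp, silence it, and record its PID
gulp serve --gulpfile /server/example/gulpfile.js >/dev/null 2>&1 &
echo $! > /tmp/gulp.pid

# stop: kill only that recorded process, then remove the PID file
kill "$(cat /tmp/gulp.pid)" && rm /tmp/gulp.pid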
However, since you're just doing this for development purposes, not a production website, you probably don't actually need that; your actual goal is to be able to execute ad-hoc shell commands while gulp is running. You can use terminal tabs to do this, or more elegantly use the splitting capabilities of iTerm, screen, or tmux. Tmux in particular is a useful tool for when you find yourself working a lot in a terminal, and would be a useful thing to become familiar with.
First, to run the process in the background
cd / && gulp serve --gulpfile /server/example/gulpfile.js > /tmp/gulp.log &
After the cd you need && (so gulp only runs if the cd succeeds), and a trailing & at the end to run the whole thing in the background.
To kill all gulp processes
killall gulp
I know that there are gulp plugins for executing commands within tasks. What I want to do is run Gulp and, while it is watching files, switch my git branch without having to open a new terminal tab, execute the command there, and then switch back.
You could use the following command to run Gulp in the background and redirect all output to /dev/null:
nohup gulp > /dev/null &
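Because the job is started from your current shell, you can still manage it from that same shell later; for example (the job number is illustrative):
jobs      # list background jobs started from this shell
fg %1     # bring job 1 back to the foreground
kill %1   # or stop it without bringing it to the foreground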