I'm trying to create a make command that will rebuild a project when files change, but I can't find a way to start two commands in parallel and end both of them when I hit Ctrl + C.
I have the following:
watch:
	while inotifywait -qre close_write .; do \
		make build; \
	done

localserver:
	sam local start-api ......

server: build
	make watch & make localserver
This works: it detects file changes, builds the project, and make localserver picks up the changes.
The problem is that pressing Ctrl + C doesn't kill the watch process. I get back to my terminal prompt, but when I change a file the while loop continues and builds the project.
Any ideas to prevent this?
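One pattern worth trying (a sketch, not verified against this exact setup) is to run both targets as background jobs from a single shell and trap the interrupt so everything in the recipe's process group is killed together:
server: build
	@bash -c 'trap "exit" INT TERM; trap "kill 0" EXIT; make watch & make localserver & wait'
The kill 0 in the EXIT trap signals every process in the recipe's process group, so the inotifywait loop goes down along with the local server.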
Related
I want to run a few Gradle apps as background processes for around 5 minutes (more commands will run after the Gradle apps are started, and then the job will finish). They execute just fine on my Ubuntu machine using nohup:
nohup gradle app1 > nohup1.out 2>&1 &
nohup gradle app2 > nohup2.out 2>&1 &
...
Running these commands does not require pressing an interrupt key or Enter, so I can run multiple Gradle applications in the background in a row and start interacting with them.
Today, though, I learned that the GitLab runner cancels all child processes, which makes nohup useless in a GitLab CI job.
Is there a workaround so that I can run multiple Gradle jobs in the background inside a GitLab job?
I also tried another tool, but it did not provide the functionality that nohup did.
To background a job, you do not need nohup; you can simply add & at the end of a command to run it in the background.
As a simple example:
test_job:
  image: python:3.9-slim
  script:
    - python -m http.server &  # Start a server on port 8000 in the background
    - apt update && apt install -y curl
    - echo "contacting background server..."
    - curl http://localhost:8000
And this works. The job will exit (closing the background jobs as well) after the last script: step is run.
Be sure to give enough time for your apps to start up before attempting to reach them.
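If the server needs a moment to come up, a small retry loop (a sketch along the lines of the job above) avoids racing it:
test_job:
  image: python:3.9-slim
  script:
    - python -m http.server &  # background server on port 8000
    - apt update && apt install -y curl
    # retry for up to ~10 seconds instead of contacting the server immediately
    - for i in $(seq 1 10); do curl -sf http://localhost:8000 && break; sleep 1; done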
I have an issue with a startup script I've written and can't for the life of me figure out what's wrong. In essence, I want to start a detached screen, pass a daemon start command into the screen, then change directory and run "npm start" in that directory.
Preferably, I'd like to install the screen script as a service, but I've had no success: the screens won't start at all. I've tried screen -list and even sudo screen -list while the service was installed, and I've tried Type=forking and Type=oneshot, and still nothing.
Crontab is the only thing that has somewhat worked so far.
I've gotten the screen to stay open; however, when I reattach to the screen I get the error:
/path/to/script.sh : 6: /path/to/script: npm: not found.
I used nvm to install node and npm. I tried linking them into /usr/bin or /usr/local/bin with no success: it either errors with "too many links" or "node not found", even though it's been linked.
For the scripts I have, the main script which starts the screens is
#!/bin/sh
SCRIPT_DIR=/path/to/script/folder/in/user/directory
screen -dmS nameofscreen bash -c "$SCRIPT_DIR/script_to_run.sh"
and the script that is supposed to be run is as follows:
#!/bin/sh
USR_BIN=/usr/local/bin
$USR_BIN/daemon_to_start -arg1 -arg2 &&
sleep 2
cd /path/to/npm/app/folder && npm start
sleep 3
exec $SHELL
Once again, with these scripts the screen starts and stays open, but the npm not found error is thrown. Also, if I use the absolute path to npm, node not found becomes the new error. I've spent three days on this and had never written a script before. I'm starting to lose my mind.
PLEASE HELP!
EDIT: With the help of lojza and adding the node binary to my PATH, it somewhat works! Now when the startup script runs, only one screen is started, but the script is supposed to start 4 screens. I've tried appending & and even && to the end with no luck. I will continue searching.
Call npm with its full path:
/path/to/npm/app/folder/npm start
/usr/local/bin/npm start  # in my case
or
cd /path/to/npm/app/folder && ./npm start
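Since node was installed with nvm here, the usual cause of "npm: not found" in a script run by a service or cron is that nvm's bin directory isn't on the PATH of that non-interactive environment. A minimal sketch of the fix (the version directory below is hypothetical; use whatever `which npm` prints in a working shell, minus the trailing /npm):
#!/bin/sh
# Hypothetical nvm location -- adjust to the actual version directory on your system.
export PATH="$HOME/.nvm/versions/node/v18.17.0/bin:$PATH"
cd /path/to/npm/app/folder && npm start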
I am looking for a way to run an Electron app (the npm start command) independently of the terminal itself, meaning that I expect the Electron app to keep running even if the terminal closes.
I am not sure whether it is possible.
I have tried cd electron-directory-path && nohup npm start &. This does allow me to use the terminal instance for other commands and prevents Electron messages from popping up in the terminal, but closing the terminal still kills the Electron app.
Even cd electron-directory-path && npm start & behaves the same way, and I haven't yet been able to find a way to run the Electron app completely independent of the terminal instance...
You start the Electron app with nohup npm start &, but when you close the terminal window, the Electron app also terminates (contrary to expectation).
I can reproduce the behavior, but not every time. In roughly 30% of my experiments, the Electron app was not terminated. I have not yet been able to find the reason for this varying behavior.
Workaround
The following workaround closes the terminal without terminating the Electron app. In my tests, it has worked every time:
Start the Electron app as before: nohup npm start &
Close the running terminal by issuing nohup kill $$ &
The $$ gives the current process id.
Note that kill $$ doesn't work.
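Put together, the workaround amounts to two lines (a sketch; run them from the Electron project directory):
nohup npm start &    # start the app, immune to hangup (output goes to nohup.out)
nohup kill $$ &      # then terminate the invoking terminal's shell ($$ is its PID)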
If you don't necessarily need to run from a terminal, you can also create a desktop file to start the app.
Let pathname be the path to the node app location.
Just use the command:
cd pathname && npm start && ^Z &
cd to change directory to where we need to execute the terminal command.
&& to mean there are other commands to be executed after this one.
npm start to start the npm app.
^Z to suspend the process running in the terminal, and hence disconnect the terminal part of node from the original app.
& to mean that we don't want the terminal to wait for the command to execute.
Now we can close the terminal, and the electron app should keep running...!
Credits:
https://tecadmin.net/close-terminal-without-killing-running-processes-on-linux/
I'm trying to write a shell script (for use in the Mac OS X Terminal) that will run a command to start a development server (gulp serve). I have it working, except the server runs continuously, so it doesn't allow me to enter subsequent commands in the same window without stopping the server (Control+C). My question is: is there a way I can run the process in the background and/or suppress any/all output? My goal is to also write a 'stop server' command that will kill the process (which I'm also unsure how to do). I've tried all combinations of ampersands and &>/dev/null and nothing quite works. Here's what I have so far:
if [ "$1" = "server" ]
then
if [ "$2" = "on" ]
then
cd / & gulp serve --gulpfile /server/example/gulpfile.js # the output is still shown
printf "\033[0;32mserver is online.\033[0m\n"
else
killall tail &>/dev/null 2>&1 # this doesn't kill the process
printf "\033[0;32mportals is offline.\033[0m\n"
fi
fi
You're doing the output redirection on killall, not gulp, so gulp will continue to merrily spit out text to your terminal. Try instead:
cd / && gulp serve --gulpfile /server/example/gulpfile.js >/dev/null 2>&1 &
Secondly, your kill command doesn't kill your process because you're not telling it to; you're asking it to kill all tail processes. You want instead:
killall gulp
These modifications should be the most direct path to your goal. However, there are a few additional things that may be useful to know.
Process management has a long history in the *nix world, so we've been inventing tools to make this easier for a long time. You can re-invent them yourself (the next step would be to store the PID of your gulp process so that you can ensure you only kill it and not anything else with "gulp" in the name), or you can go all the way and write a service definition for the system's process manager. For Linux, this would be SysV, Upstart, or systemd; I'm not sure what the OS X equivalent is.
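As an illustration of the PID-file idea (the file name and paths here are hypothetical):
# "server on": start gulp in the background and remember its PID
gulp serve --gulpfile /server/example/gulpfile.js >/dev/null 2>&1 &
echo $! > /tmp/gulp-serve.pid

# "server off": kill only the process we started
kill "$(cat /tmp/gulp-serve.pid)" && rm -f /tmp/gulp-serve.pid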
However, since you're just doing this for development purposes, not a production website, you probably don't actually need that; your actual goal is to be able to execute ad-hoc shell commands while gulp is running. You can use terminal tabs to do this, or more elegantly use the splitting capabilities of iTerm, screen, or tmux. Tmux in particular is a useful tool for when you find yourself working a lot in a terminal, and would be a useful thing to become familiar with.
First, to run the process in the background
cd / && gulp serve --gulpfile /server/example/gulpfile.js > /tmp/gulp.log &
After cd you need && (and), plus & at the end to run in the background.
To kill all gulp processes
killall gulp
I know that there are gulp plugins for executing commands within tasks. What I want to do is run Gulp, and while it is watching files, switch my git branch without having to open a new terminal tab, execute the command there, and then switch back.
You could use the following command to run Gulp in the background and redirect all output to /dev/null:
nohup gulp > /dev/null &