How to run two Node.js commands independently in Windows (cmd)?

In Windows, I run two commands like this:
node watcher.js && node server.js
The first runs a watcher script and the second runs a server. The problem is that both are persistent and never exit, so the server never starts because the watcher script is still running.
Is there a way to run both without waiting for either script to finish?
Thanks

Try:
start node watcher.js && start node server.js
This will launch each command in its own cmd window, so the two run independently.
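One caveat: start treats a first quoted argument as the window title, so when a command itself needs quoting it is safer to pass a title (or an empty "") explicitly. A hypothetical batch-file variant:

@echo off
rem run_both.bat - start returns immediately, so neither command blocks the other
start "watcher" node watcher.js
start "server" node server.js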

Related

Can't terminate node(js) process without terminating ssh server in docker container

I'm using a Dockerfile that ends with a CMD ["/start.sh"]:
#!/bin/bash
service ssh start
/usr/bin/node /myApp/app.js
If for some reason I need to kill the node process, the ssh server is shut down as well (forcing me to restart the container to reconnect).
Any simple way to avoid this behavior?
Thank You.
The container exits as soon as its main process exits. In your case, the main process inside the container is the start.sh shell script, which starts the ssh service and then runs the Node.js process as a child. Once the Node.js process dies, the shell script exits as well, and so the container exits. What you can do is put the Node.js process in the background:
#!/bin/bash
service ssh start
/usr/bin/node /myApp/app.js &
# Keep the script (and therefore the container) alive even if node exits
while true; do
  sleep 2
done
I do NOT recommend this approach, though. You should have only a single process per container. Read the following answer to understand why:
Running multiple applications in one docker container
If you still want to run multiple processes inside a container, there are better ways to do it, such as using supervisord: https://docs.docker.com/config/containers/multi-service_container/
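For reference, a minimal supervisord.conf sketch for this particular case (a sketch only; the program names are illustrative, and the linked page shows the full pattern):

[supervisord]
nodaemon=true

[program:sshd]
command=/usr/sbin/sshd -D

[program:node]
command=/usr/bin/node /myApp/app.js

The Dockerfile's CMD then runs supervisord as the single foreground process, and it supervises both children.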

Running bash script from ruby not producing the correct pid

I am developing a Ruby framework to run different jobs, and one of the things I need to do is know when these jobs have ended in order to use their outputs and organize everything. I have been using it with no problem, but some colleagues are starting to use it on a different system and something really odd is happening. What I do is run the commands using:
require 'open3'

i, o, e, t = Open3.popen3(job.get_cmd)
p = t.pid
and later I check if the job has ended like this:
begin
  Process.getpgid(p)
rescue Errno::ESRCH
  # The process has ended
end
It works perfectly on the system I am running (Scientific Linux 6), but when a friend of mine started running it on Ubuntu 14.04 (using ruby 1.9.3p484) with a command that is a concatenation of commands such as cmd1 && cmd2 && cmd3, each command is run at the same time by the system, not one after the other, and the pid returned by t.pid is not the pid of any of the processes being run.
I modified the code so that instead of running the concatenation of commands, it creates a script containing all the commands, and the call from popen3 is just Open3.popen3("./script.sh"), but the behaviour is the same... all the commands run at the same time, and the pid that Ruby knows is not the pid of any of the processes...
I am not sure if this is Ruby-related, but since running script.sh by hand behaves as expected, with one command running after the other, it seems that either Ruby is not launching the process correctly or the system is not handling the process as it should. Do you know what might be happening?
Thanks a lot!
EDIT:
The command being run looks like this
./myFit.exe h vlq.config &> output_h.txt && ./myFit.exe d vlq.config &> output_d.txt && ./myFit.exe p vlq.config &> output_p.txt
This command runs perfectly if run by hand, outside the Ruby script, exactly as written. When run from the Ruby script, all the myFit.exe executions run at the same time (but I want them chained with && because each should only run if the previous one succeeds). myFit.exe is a tool that performs a fit; it is not a system command. Again, this command runs perfectly when run by hand.
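For what it's worth, when the command string contains shell operators such as &&, Ruby spawns it via /bin/sh -c, so t.pid is the shell's PID rather than the PID of any myFit.exe process. A minimal sketch of waiting on the wait thread instead of polling getpgid, using the script.sh wrapper from the question:

require 'open3'

i, o, e, t = Open3.popen3('./script.sh')
status = t.value   # blocks until the shell, and the commands it ran in sequence, have exited
puts "done, exit status: #{status.exitstatus}"

Since the shell itself waits for each command in the && chain, t.value only returns once the whole chain has finished.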

How to execute bashscript on multiple ec2 instances at the same time

I have written a bash script. At the moment, just by performing ./script.sh, I can execute it on one node.
But it needs to be executed on multiple nodes. How can I execute one script on multiple nodes at the same time?
At the moment I'm using this:
for ip in $(<ALL_SERVERS_IP); do ...
But this does not perform the installation at the same time: it finishes on the first node and then starts on the second, and so on. I'm working on CentOS 7.
You can try putting an & after your command:
for ip in $(<ALL_SERVERS_IP); do YOUR_COMMAND_OR_SCRIPT & done
The ampersand at the end puts the command in the background, so the loop does not wait for it to end before starting the next one.
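A slightly fuller sketch, assuming key-based ssh access and that ALL_SERVERS_IP contains one IP per line; the final wait blocks until every background job has finished:

#!/bin/bash
for ip in $(<ALL_SERVERS_IP); do
  ssh "$ip" 'bash -s' < ./script.sh &   # run the local script on the remote host
done
wait   # block until all parallel runs are done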

start multiple docker containers with a single command line shell script (without docker-compose)

I've got 3 containers that will run on a single server, which we'll call A, B, and C.
Each container has a script on the host with the commands to start it:
A_start.sh
B_start.sh
C_start.sh
I'm trying to create a swarm script to start them all, but I'm not sure how.
ABC_start.sh
UPDATE:
This seems to work, with the first script's output going to the terminal; Ctrl+C exits out of them all:
./A_start.sh & ./B_start.sh & ./C_start.sh
Swarm will not help you start them at all; it is used to distribute work amongst the Docker machines that are part of a cluster.
There is no good reason not to use docker-compose for this use case: its main purpose is to link containers properly and bring them up, so your collection of scripts could end up being a single docker-compose up command.
In bash,
you can do this:
nohup A_start.sh &
nohup B_start.sh &
nohup C_start.sh &
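If a single foreground ABC_start.sh is preferred over nohup, a hypothetical sketch that keeps all three as background jobs and stops them together on Ctrl+C:

#!/bin/bash
./A_start.sh &
./B_start.sh &
./C_start.sh &
trap 'kill $(jobs -p) 2>/dev/null' INT TERM   # stop all three on Ctrl+C or termination
wait   # stay in the foreground until all three have exited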

start and end shellscript for multiple programs

Following problem:
3 programs:
one Java application which is started via an existing sh script
one node application
one grunt server
I want to write two shell scripts: the first should start all three programs, and the second should end them. For the first script, I simply call the start commands. But the second, which should be a standalone script (as the first should be), has to know all the process IDs in order to kill them. And even if I know those IDs, what if the programs started subprocesses? I would only be killing the parent processes, wouldn't I?
What's the approach here?
Thanks in advance!
Try pkill -KILL -P <parentid>. The -P option selects processes by their parent's PID, so this kills the children of the designated process.
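One way to make this concrete is to have the start script record each top-level PID (for example node app.js & echo $! >> pids.txt, where pids.txt is a made-up name), and have the stop script kill the children before the parents. A sketch under that assumption:

#!/bin/bash
# stop.sh: terminate each recorded process and its direct children
while read -r pid; do
  pkill -TERM -P "$pid" 2>/dev/null   # children of the recorded process first
  kill -TERM "$pid" 2>/dev/null       # then the recorded process itself
done < pids.txt
rm -f pids.txt

This only reaches direct children; for deeper process trees, starting each program in its own process group (for example with setsid) and signalling the whole group with kill -- -$pid would be more thorough.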
