I am writing a web-based Linux shell using Node.js + socket.io. Simple commands like ls and cd work well.
But when I issue a command like ping google.com, stdout prints endlessly.
I tried sending Ctrl+C to stdin, but no luck.
1) Spawn a 'bash' process
var spawn = require('child_process').spawn;
var sh = spawn('bash');
2) Send bash stdout to socket.io
sh.stdout.on('data', function(data) {
  console.log('stdout' + data);
  listener.sockets.emit("stdout", new Buffer(data));
});
3) Send Ctrl-C (\x03) to bash's stdin
var listener = io.listen(server);
listener.set('log level', 1);

listener.sockets.on('connection', function(client) {
  client.on('message', function(data) {
    if (data === "KILL") {
      console.log('!!!!' + data);
      sh.stdin.write('\x03');
      client.broadcast.send(new Buffer("KILLING "));
      //return;
    }
    console.log(data);
    sh.stdin.write(data + "\n");
    client.broadcast.send(new Buffer("> " + data));
  });
});
I am stuck at this point. It seems like writing \x03 to bash's stdin has no effect.
Try process.kill(sh.pid). I use this to kill workers in a cluster when my master process shuts down. sh.signalCode should then equal 'SIGTERM'. Of course, I have no idea whether this works on Windows.
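A minimal sketch of how that suggestion might be wired into the question's socket handler (assuming sh and client are the objects from the question; untested):

client.on('message', function(data) {
  if (data === "KILL") {
    sh.on('exit', function() {
      console.log('signal: ' + sh.signalCode); // expected to be 'SIGTERM'
    });
    process.kill(sh.pid); // sends SIGTERM by default
    return;
  }
  sh.stdin.write(data + "\n");
});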
Try stdout.push(null).
I tried it on Node 12. This will not kill your child process; instead, it tells the output stream that there is no more data coming.
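For illustration, a small standalone sketch of what push(null) does on a readable stream (not tied to the code above):

var Readable = require('stream').Readable;

var r = new Readable({ read: function() {} });
r.on('data', function(chunk) { console.log('data: ' + chunk); });
r.on('end', function() { console.log('stream ended'); });

r.push('hello');
r.push(null); // signals that no more data will follow; nothing is killed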
Related
I have successfully spawned a Ruby script from my Node application. However, in my Ruby script I would like to require some gems and files, and it seems that when I do a require, Node.js doesn't get any response.
Here is how this looks:
var cp = require('child_process');
var ruby_child = cp.spawn('ruby', ['libs/scorer/test.rb']);
var result = '';

ruby_child.stdout.on('data', function(data) {
  result += data.toString();
});

ruby_child.on('close', function() {
  console.log(result);
});
And my Ruby script looks like this:
require 'utils' # if I remove this line, I can get the response.

# Does it have an argument?
if ARGV.nil? || ARGV.empty?
  p 'test'
  exit
end
The require line probably failed: the Ruby program wrote an error to STDERR and then exited, and your JS program didn't capture the STDERR output.
Add some STDERR capture code to your JS program to get diagnostic information. You will probably find that Ruby cannot find the required library in $LOAD_PATH.
ruby_child.stderr.on('data', function(data) {
  console.log("ERROR --- " + data);
});
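It may also help to log the exit status, since a failed require usually makes the script exit with a non-zero code. A small addition using the standard 'close' event (argument names are mine):

ruby_child.on('close', function(code, signal) {
  console.log('ruby exited with code ' + code + ', signal ' + signal);
});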
Hello, I want to send a command to the child_process, for example ping 8.8.8.8 -t, that is, an endless ping. After some iterations I want to stop this command and execute a new one, but without killing the child process itself.
Example:
var spawn = require('child_process').spawn('cmd'),
    iconv = require('iconv-lite');

spawn.stdout.on('data', function (data) {
  console.log('Stdout: ', iconv.decode(data, 'cp866'));
});

spawn.stderr.on('data', function (data) {
  console.log('Stderr: ', iconv.decode(data, 'cp866'));
});

spawn.stdin.write('ping 8.8.8.8 -t' + '\r\n');
spawn.stdin.write(here control-c...); // WRONG
spawn.stdin.write('dir' + '\r\n');
I found your previous question. It looks like you are trying to create/emulate a terminal from within Node.js. You can use readline for reading from and writing to a terminal.
To write a control character, you can see the example from its docs:
rl.write('Delete me!');
// Simulate ctrl+u to delete the line written previously
rl.write(null, {ctrl: true, name: 'u'});
To directly answer the question: to pass special characters you will need to pass their ASCII values. Ctrl+C is ASCII character 0x03.
spawn.stdin.write("\x03");
For the satisfaction of my curiosity gene, I'd like to play with a bash/node combo.
I don't know how to make those two talk together. I just had a great smile on my face finding out about TTY.JS ;-)
How do I feed the terminal's output (stdout?) to node? I thought about redirecting the stream to a file and reading it with node through the 'fs' module, but I bet there is a prettier way.
Thanks in advance.
Something like this should send terminal output to node
var app = require('express').createServer(),
    io = require('socket.io').listen(app),
    sys = require('util'),
    exec = require('child_process').exec;

app.listen(4990);

io.sockets.on('connection', function(socket) {
  socket.on('console', function(command, callBack) {
    // client sends {command: 'ls -al'}
    function puts(error, stdout, stderr) {
      socket.emit('commandresult', stdout);
    }
    exec(command.command, puts);
  });
});
Hope this helps
At:
node js interact with shell application
Trindaz linked to his YouTube video demonstrating how to interact with a bash shell (including interpreters available from the shell):
http://www.youtube.com/watch?v=16nFMucvwYQ
However, I could not follow the frames from 20 seconds to 54 seconds. At 54 seconds the browser window shows:
....................................
connected
$ log in
$
.....................................
What are the steps required to get to this window? Any hints or guidance would be appreciated.
Thank you,
RP
I did not exactly replicate what is in the video, but I hope a sample like the one below will do it.
This is how I configured the server part:
var app = require('express').createServer(),
    io = require('socket.io').listen(app),
    sys = require('util'),
    exec = require('child_process').exec;

app.listen(4990);

io.sockets.on('connection', function(socket) {
  socket.on('console', function(command, callBack) {
    // client sends {command: 'ls -al'}
    function puts(error, stdout, stderr) {
      socket.emit('commandresult', stdout);
    }
    exec(command.command, puts);
  });
});
I hope you can do the client part.
Hope this helps
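For reference, a minimal client-side sketch that would match this server (assuming the page loads the socket.io client script; the port and event names mirror the server code above):

var socket = io.connect('http://localhost:4990');

socket.on('commandresult', function(stdout) {
  console.log(stdout); // or append it to the page
});

// matches the server's expectation: client sends {command: 'ls -al'}
socket.emit('console', { command: 'ls -al' });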
I have a program which is calling another program and processing the child's output, i.e.:
my $pid = open($handle, "$commandPath $options |");
Now I've tried a couple of different ways to read from the handle without blocking, with little or no success.
I found related questions:
perl-win32-how-to-do-a-non-blocking-read-of-a-filehandle-from-another-process
why-does-my-perl-sysread-block-when-reading-from-a-socket
But they suffer from these problems:
ioctl consistently crashes perl
sysread blocks on 0 bytes (a common occurrence)
I'm not sure how to go about solving this problem.
Pipes are not as functional on Windows as they are on Unix-like systems. You can't use the 4-argument select on them and the default capacity is minuscule.
You are better off trying a socket- or file-based workaround.
$pid = fork();
if (defined($pid) && $pid == 0) {
    exit system("$commandPath $options > $someTemporaryFile");
}
open($handle, "<$someTemporaryFile");
Now you have a couple more cans of worms to deal with -- running waitpid periodically to check when the background process has stopped creating output, calling seek $handle,0,1 to clear the eof condition after you read from $handle, cleaning up the temporary file, but it works.
I have written the Forks::Super module to deal with issues like this (and many others). For this problem you would use it like
use Forks::Super;

my $pid = fork { cmd => "$commandPath $options", child_fh => "out" };
my $job = Forks::Super::Job::get($pid);
while (!$job->is_complete) {
    @someInputToProcess = $job->read_stdout();
    ... process input ...
    ... optional sleep here so you don't consume CPU waiting for input ...
}
waitpid $pid, 0;
@theLastInputToProcess = $job->read_stdout();