node.js: Run external command in new window - windows

I'm writing a node.js script that runs permanently in the background and initiates file downloads from time to time. I want to use wget for the downloads, since it seems more robust until I really know what I'm doing with node.js.
The problem is that I want to see the progress of the downloads in question, but I can't find a way to start wget through node.js such that a new shell window is opened and displayed for wget.
Node's exec and spawn functions run external commands in the background and would let me access the output stream. But since my script runs in the background (hidden), there's not much I can do with the output stream.
I've tried opening a new shell window by running "cmd /c wget..." but that didn't make any difference.
Is there a way to run external command-line processes in a new window through node.js?

You can use node-progress and Node's http client (or request) for this, no need for wget:
https://github.com/visionmedia/node-progress
Example:
var ProgressBar = require('progress'); // npm package name; the upstream example used require('../')
var https = require('https');

var req = https.request({
  host: 'download.github.com',
  port: 443,
  path: '/visionmedia-node-jscoverage-0d4608a.zip'
});

req.on('response', function (res) {
  var len = parseInt(res.headers['content-length'], 10);

  console.log();
  var bar = new ProgressBar('  downloading [:bar] :percent :etas', {
    complete: '=',
    incomplete: ' ',
    width: 20,
    total: len
  });

  res.on('data', function (chunk) {
    bar.tick(chunk.length);
  });

  res.on('end', function () {
    console.log('\n');
  });
});

req.end();
UPDATE:
Since you want to do this in a background process and listen for the download progress (in a separate process or what have you), you can achieve that with pub-sub functionality. Either:
use a message queue like Redis, RabbitMQ or ZeroMQ
start a TCP server on a known port / UNIX domain socket and listen on it (see the sketch after the resources below)
Resources:
http://nodejs.org/api/net.html
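As a minimal sketch of the second option (the port number and the JSON message format are arbitrary choices for illustration, not anything the net module prescribes):

var net = require('net');

// In the hidden background downloader: accept viewer connections
// and remember them so progress can be pushed to each one.
var viewers = [];
var server = net.createServer(function (socket) {
  viewers.push(socket);
  socket.on('close', function () {
    viewers.splice(viewers.indexOf(socket), 1);
  });
});
server.listen(7777, '127.0.0.1');

// Call this from the download's 'data' handler.
function publishProgress(received, total) {
  var message = JSON.stringify({ received: received, total: total }) + '\n';
  viewers.forEach(function (socket) {
    socket.write(message);
  });
}

A separate console process (or even netcat) can then connect to 127.0.0.1:7777 in its own visible window and render the incoming progress messages however it likes.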

Related

How to reuse AWS socket?

I am using an AWS (Ada Web Server) webserver, which is being polled by some other script. The problem is that when I start the server twice within a few seconds (with client requests in between), the server fails to start again, saying:
raised AWS.NET.SOCKET_ERROR : Bind : [98] Address already in use
There is this old thread that suggests there may be a reuse_address option, either in an ini file or as a direct parameter, but it says that that does not work either.
Perhaps there is some way to force the OS to abandon the socket?
You need to call AWS.Config.Set.Reuse_Address (Config, True); or set it in the AWS ini file.
For example:
with AWS.Config;
with AWS.Config.Set;
(...)
declare
   HTTP_Server : AWS.Server.HTTP;
   AWS_Config  : AWS.Config.Object := AWS.Config.Default_Config;
begin
   AWS.Config.Set.Reuse_Address (AWS_Config, True);
   AWS.Config.Set.Server_Port (AWS_Config, 80);
   AWS.Server.Start (HTTP_Server,
                     Callback => Respond'Unrestricted_Access,
                     Config   => AWS_Config);
(...)

how to get access from local machine to server-side terminal by using node

I want to get access to a server-side terminal from node.js code (from my local machine). I know that if you want to execute terminal commands from node.js code you can use child_process, but how do I make a connection to a server-side terminal and send commands there? Hope my question is clear.
If SSH is enabled on your server, you can use the node-ssh NPM package:
var node_ssh = require('node-ssh');
var ssh = new node_ssh();

ssh.connect({
  host: 'localhost',
  username: 'steel',
  privateKey: '/home/steel/.ssh/id_rsa'
}).then(function () {
  /* your ssh interactions with the server go here */
  /* use the node-ssh API to execute commands, upload files... */
});
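For instance, inside that callback you can run a command and read its output; a small sketch (the command and cwd are just placeholders):

ssh.execCommand('ls -la', { cwd: '/home/steel' }).then(function (result) {
  console.log('STDOUT: ' + result.stdout);
  console.log('STDERR: ' + result.stderr);
});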

How do I read from a Jupyter IOPub socket?

I'm trying to learn more about the Jupyter wire protocol. I want to collect examples of the messages sent on the IOPub socket.
SETUP:
I start a Jupyter console in one terminal then go find the connection file. In my case the contents are as follows:
{
  "shell_port": 62690,
  "iopub_port": 62691,
  "stdin_port": 62692,
  "control_port": 62693,
  "hb_port": 62694,
  "ip": "127.0.0.1",
  "key": "9c6bbbfb-6ad699d44a15189c4f3d3371",
  "transport": "tcp",
  "signature_scheme": "hmac-sha256",
  "kernel_name": ""
}
I create a simple Python script as follows:
import zmq

iopub_port = "62691"
ip = "127.0.0.1"
transport = "tcp"

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect(f"{transport}://{ip}:{iopub_port}")

while True:
    string = socket.recv()
    print(string)
I open a second terminal and execute the script as follows (it blocks, as expected):
python3 script.py
And then I switch back to the first terminal (with the Jupyter console running) and start executing code.
ISSUE: Nothing prints on the second terminal.
EXPECTED: Some Jupyter IO messages, or at least some sort of error.
Uh, help? Is my code fine, and this is probably an issue with my config? Or is my code somehow broken?
From one of the owners of the Jupyter client repo:
ZMQ subscriber sockets need a subscription set before they'll receive
any messages. The subscription is a prefix of a valid message, and you
can set it to an empty bytes string to subscribe to all messages.
e.g. in my case I need to add
socket.setsockopt(zmq.SUBSCRIBE, b'')
before starting the while loop.
Do you know if it's possible to tell from IOPub whether a process in a Jupyter notebook is finished or not?
I'm looking here (http://jupyterlab.github.io/jupyterlab/services/modules/kernelmessage.html) but it is not very clear.

How to allow fs and multer to write files on AWS EC2

I am new to programming and recently finished a full MEAN stack project which is deployed on AWS EC2.
Project codes:
https://github.com/Cryfoo/13
Deployed at: http://35.166.172.216/
On server side, it uses fs.writeFile to save game logs after each game.
// From server/config/game.js, lines 1361~1364
var filename = 10000 + roomNum;
filename = filename + "_" + endTime;
fs.writeFile("server/logs/" + filename + ".txt", variables[roomNum].logs, function(err) {});
On client side, it sends http request to server and uses multer to upload a profile picture of a user.
// From server/controllers/user.js, lines 3~12
var storage = multer.diskStorage({
  destination: function (req, file, callback) {
    callback(null, "./client/static/profile");
  },
  filename: function (req, file, callback) {
    callback(null, file.originalname);
  }
});
var upload = multer({ storage: storage, limits: { fileSize: 5000000 } }).single("profile");
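For context, the middleware is then invoked inside an Express route handler roughly like this (the route path here is illustrative, not the project's actual one):

app.post('/api/profile', function (req, res) {
  upload(req, res, function (err) {
    if (err) {
      // Surfaces multer errors, e.g. exceeding the 5 MB fileSize limit.
      return res.status(500).json({ error: err.message });
    }
    res.json({ filename: req.file.originalname });
  });
});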
It works well locally on my laptop, but these two features do not work on EC2. I am assuming the problem has to do with permission to write files. How do I allow fs and multer to write files on EC2?
I have done a lot of searching for all the problems during this project and found solutions on Stack Overflow and Google, but for this one I can't figure it out. I apologize if I am not being specific enough (first time posting a question here). Thanks for the help in advance though.
If it's an issue with permissions on the folder, then you just need to change them:
sudo chmod -R a+w server/logs
Edit: write permission (a+w) on the directory is what lets the process create files in it, and you will probably need sudo to change it.
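Separately, note that the writeFile call in the question discards its error, so a permission failure is silent. A small sketch that logs the error and creates the logs directory if it is missing (fs.mkdir's recursive option needs Node 10+; filename and the log contents come from the surrounding game code):

fs.mkdir('server/logs', { recursive: true }, function (mkdirErr) {
  if (mkdirErr) return console.error('mkdir failed:', mkdirErr);
  fs.writeFile('server/logs/' + filename + '.txt', variables[roomNum].logs, function (err) {
    if (err) console.error('writeFile failed:', err);
  });
});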

node.js simple tcp test

Here is the code you can find everywhere on the net:
var net = require('net');

var server = net.createServer(function (socket) {
  socket.write("Echo server\r\n");
  socket.pipe(socket);
});

server.listen(1337, "127.0.0.1");
A simple TCP server that echoes whatever you send it. How do I send data to it? What tools/commands do I need on a Mac to test this server?
Use nc aka netcat. In Terminal.app, while your node app is running:
$ nc localhost 1337
Echo server
Ta-da!
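If you'd rather drive the test from Node itself, a minimal client sketch:

var net = require('net');

var client = net.connect(1337, '127.0.0.1', function () {
  client.write('hello\r\n');
});

client.on('data', function (data) {
  // Prints "Echo server\r\n" from the greeting, then the echoed "hello\r\n".
  process.stdout.write(data.toString());
});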
