I'm attempting to make a small Node executable that can open a local shell and send commands to, and receive commands from, an HTTP server.
It seems simple enough logically, but what would be the best practice for implementing this?
Open a command line using Node API
Listen for commands from server URL /xmlrpc-urlhere/
When a command is received (via Ajax or WebRTC?), execute it in the command line
POST the command line response back to the user
This shell will be for administrative purposes.
I can't think of a good way of doing this. I've checked npm, and while there seem to be some basic command-line modules for Node.js, nothing has this functionality.
The npm modules commander and request ought to do the trick. To get you started, here's a simple command-line tool that issues a search to Google and dumps the raw HTML to the console.
E.g.:
./search.js puppies
#!/usr/bin/env node
var program = require('commander');
var request = require('request');

var search;

program
  .version('0.0.1')
  .arguments('<query>')
  .action(function(query) {
    search = query;
  });

program.parse(process.argv);

if (typeof search === 'undefined') {
  console.error('no query given!');
  process.exit(1);
}

var url = 'http://google.com/search?q=' + encodeURIComponent(search);
request(url, function(err, res, body) {
  if (err) {
    console.error(err);
    process.exit(1);
  }
  console.log('Search result:');
  console.log(body);
  process.exit(0);
});
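From there, the shell piece you described could look something like this rough sketch: poll the server for a command, run it with child_process.exec, and POST the output back. The base URL and the /command and /response endpoints are hypothetical placeholders:

#!/usr/bin/env node
var request = require('request');
var exec = require('child_process').exec;

var base = 'http://example.com'; // hypothetical server URL

function poll() {
  // Ask the server for the next command to run
  request(base + '/command', function(err, res, body) {
    if (err || !body) return setTimeout(poll, 5000);
    // Run the received command in a local shell
    exec(body, function(execErr, stdout, stderr) {
      // POST whatever the shell produced back to the server
      request.post({
        url: base + '/response',
        form: { output: stdout, error: stderr }
      }, function() {
        setTimeout(poll, 5000);
      });
    });
  });
}

poll();

Since this executes arbitrary commands received over the wire, use HTTPS and authenticate both ends before running anything like it for real, even for administrative purposes.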
I have some Parse Cloud Code running on my self-hosted server, but I'm running into an issue where queries aren't doing anything. I can run commands through the terminal and get data back, but when I run a query.find, nothing happens. For example:
Parse.Cloud.job("getall", function(request, response) {
  var itemStatus = Parse.Object.extend('MovieStatus');
  var query = new Parse.Query(itemStatus);
  query.find({
    success: function(results) {
      console.log(results.length)
      response.success(results.length);
    },
    error: function(err) {
      response.error(err);
    },
    useMasterKey : true
  })
})
Nothing happens. No error, no response. I have added console logs to make sure it's at least getting called, and it is, but for some reason nothing ever returns from the server when I do query.find.
I have tried all sorts of things to figure out what the issue is, but this affects all of my cloud code, so it has to be something in there.
You are using an old syntax. Since version 3.0, Parse Server supports the async/await style. Try this:
Parse.Cloud.job("getall", async request => {
  const { log, message } = request;
  const ItemStatus = Parse.Object.extend('MovieStatus');
  const query = new Parse.Query(ItemStatus);
  const results = await query.find({ useMasterKey: true });
  log.info(results.length);
  message(results.length);
})
Note that this is a job and not a cloud function. You can invoke this job from the Parse Dashboard, and you should see the message in the job status section.
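If you need to call it from a client instead of the dashboard, the equivalent cloud function would look something like this (same 3.0+ syntax; treat it as a sketch):

Parse.Cloud.define("getall", async request => {
  const ItemStatus = Parse.Object.extend('MovieStatus');
  const query = new Parse.Query(ItemStatus);
  const results = await query.find({ useMasterKey: true });
  // Whatever you return here becomes the function's response
  return results.length;
});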
I have an automation script in CasperJS controlling a PhantomJS headless browser that logs into a site and enters data across multiple pages/forms.
From the same physical server, I have PHP/MySQL serving up a CRM client website. On the CRM site, I want to have the ability to:
Trigger the remote CasperJS script to go browse a remote site and log in and fill out forms
Read the output stream (i.e., "Page 1 complete, page 2 complete", etc.)
Display the status updates to the client user as the CasperJS script is executing
I am thinking that socket.io is the ticket here. But am I going about this all wrong? I am trying to avoid having a Selenium server running. I checked this answer on SO, but I am not looking for screenshots; I'm looking for the console output from CasperJS to be displayed in the client website.
I had a similar task once and concocted a solution using a local Express.js server with Socket.io.
You would launch this server with Node.js and then pass tasks to it from PHP by making POST requests to http://127.0.0.1:9000 (I used the excellent Requests library).
Here's a simplified version of my script:
var fs = require("fs");
var express = require("express");
var app = express();
var server = require("http").Server(app);
var io = require("socket.io")(server);
var iosocket;

// Express middleware to get variables from POST request
var bodyParser = require('body-parser');
app.use(bodyParser.urlencoded({ extended: true }));

// Create websocket connection
io.on("connection", function(socket){
    console.log('io.js connection');
    iosocket = socket;
});

// Receive task from external POST request
app.post("/scrape", function(req, res){
    res.send("Request accepted");
    // Url to parse
    var url = req.body.url;
    // Variable to collect data from scraper
    var data = [];
    // Launch scraping script
    var spawn = require('child_process').spawn,
        child = spawn('/path/to/casperjs', ['/path/to/scrape/script.js', url]);
    console.log("Spawned parser");
    // Receive data from the script; the parameter is named "chunk" so it
    // doesn't shadow the outer "data" array
    child.stdout.on('data', function (chunk) {
        var message = chunk.toString();
        data.push(message);
        // Send data to the web client (if one is connected)
        if (iosocket) iosocket.emit("message", message);
    });
    // On error
    child.stderr.on('data', function (chunk) {
        console.log('stderr: ' + chunk.toString());
    });
    // On scraper exit
    child.on('close', function (code) {
        console.log("Scraper exited with code: " + code);
        // Put data into a file or a database, for example
        fs.writeFileSync("path/to/file/results_" + (new Date()).getTime() + ".json", JSON.stringify(data));
    });
});

// Bind app to port # localhost
server.listen(9000, "127.0.0.1");
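On the CRM page itself you then just need a small listener to display the updates. A minimal sketch, assuming the Socket.io client script is loaded, port 9000 is reachable from the browser, and a #status element exists on the page:

var socket = io.connect('http://127.0.0.1:9000');
socket.on('message', function(message) {
    // Append each status line from the scraper as it arrives
    document.getElementById('status').textContent += message;
});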
The solution with a CasperJS/PhantomJS server is interesting; however, people have pointed out that it leaks memory, which probably won't happen if you run short-lived CasperJS scripts.
I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on Windows 7, with a virtual host set up for http://test. The solution in the book was to use Node and an almost unknown package called "multipart", which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick, and my file will upload locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book, which was supposed to display the upload progress in a progress element. SO, I looked around and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now I can't for the life of me get them to work together. I can't even get a simple console log message sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;

function handleResponse() {
    if (httpRequest.readyState == 4 && httpRequest.status == 200) {
        document.getElementById("results").innerHTML = httpRequest.responseText;
    }
}

function handleButtonPress(e) {
    e.preventDefault();
    var form = document.getElementById("myform");
    var formData = new FormData(form);
    httpRequest = new XMLHttpRequest();
    httpRequest.onreadystatechange = handleResponse;
    httpRequest.open("POST", form.action);
    httpRequest.send(formData);
}
And here's the corresponding node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function(req, res) {
    if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
        var form = new formidable.IncomingForm(),
            files = [],
            fields = [];
        form.uploadDir = './files/';
        form.keepExtensions = true;
        form
            .on('progress', function(bytesReceived, bytesExpected) {
                console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
            })
            .on('file', function(name, file) {
                files.push([name, file]);
            })
            .on('error', function(err) {
                console.log('ERROR!');
                res.end();
            })
            .on('end', function() {
                console.log('-> upload done');
                res.writeHead(200, "OK", {
                    "Content-Type": "text/html",
                    "Access-Control-Allow-Origin": "http://test"
                });
                res.end('received files: ' + util.inspect(files));
            });
        form.parse(req);
    } else {
        res.writeHead(404, {'content-type': 'text/plain'});
        res.end('404');
    }
    return;
}).listen(8080);
console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');
socket.on('news', function(data){
    console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
    fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);

io.sockets.on('connection', function(socket) {
    socket.emit('news', {hello: "world"});
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form script. I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client, while the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here; I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for one single client and then broadcast the percentage to this room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
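For what it's worth, a rough sketch of that room idea merged into your upload server (assuming socket.io 1.x, where every socket automatically joins a room named after its own ID; the socketId query parameter is something you'd add yourself):

var http = require('http'),
    url = require('url'),
    formidable = require('formidable');

var server = http.createServer(function(req, res) {
    if (req.url.indexOf('/upload') === 0 && req.method.toLowerCase() == 'post') {
        // The client appends its socket ID to the action URL, e.g.
        // httpRequest.open("POST", form.action + "?socketId=" + socket.id);
        var socketId = url.parse(req.url, true).query.socketId;
        var form = new formidable.IncomingForm();
        form.uploadDir = './files/';
        form.on('progress', function(bytesReceived, bytesExpected) {
            var percent = (bytesReceived / bytesExpected * 100).toFixed(0);
            // Emit to the room that contains only this one client
            io.to(socketId).emit('progress', percent);
        });
        form.parse(req, function(err, fields, files) {
            res.writeHead(200, {"Access-Control-Allow-Origin": "http://test"});
            res.end('upload done');
        });
    }
});

var io = require('socket.io')(server);
server.listen(8080);

On the client side, listen with socket.on('progress', ...) and update your progress element from there.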
I'm having a lot of trouble running child processes and getting their output written to the console. In this episode, I'm trying to use spawn to run a Windows mklink command. The error is that I don't have permission to write the file.
My problem, though, is that the error isn't told to me in any way.
The following prints You do not have sufficient privilege to perform this operation. to the console:
mklink /D C:\some\path\to\my\intended\link C:\path\to\my\folder
But running this in node.js only gives me Error: spawn ENOENT - which is a highly useless error message:
require('child_process').spawn('mklink',
    ['/D', 'C:\\some\\path\\to\\my\\intended\\link',
     'C:\\path\\to\\my\\folder'],
    {stdio: 'inherit'})
I get nothing on the console, despite the stdio:'inherit'. I've also tried the following:
var x = require('child_process').spawn('mklink',
    ['/D', 'C:\\some\\path\\to\\my\\intended\\link',
     'C:\\path\\to\\my\\folder'])
x.stdout.pipe(process.stdout)
x.stderr.pipe(process.stderr)
But no dice. No console output at all. Note that I do get console output with exec:
var x = require('child_process')
    .exec('mklink /D C:\\some\\path\\to\\my\\intended\\link C:\\path\\to\\my\\folder')
x.stdout.pipe(process.stdout)
x.stderr.pipe(process.stderr)
This shouldn't need any special knowledge of how Windows mklink works; my problem is simply with error reporting in Node.js spawn.
What am I doing wrong here? Is this a bug in node.js?
Update: It seems this bug was fixed in Node v0.10.29.
For me stdio wasn't working.
Try this:
// Helper function to execute and log out a child process
// (note: grunt and done here come from a Gruntfile task context)
// TODO: implement a better success/error callback
var spawnProcess = function(command, args, options, callback) {
    var spawn = require('child_process').spawn;
    // Named "child" to avoid shadowing the global "process" object
    var child = spawn(command, args, options),
        err = false;
    child.stdout.on('data', function(data) {
        grunt.log.write(data);
    });
    child.stderr.on('data', function(data) {
        err = true;
        grunt.log.errorlns(data);
    });
    if (typeof callback === 'function') {
        child.on('exit', function() {
            if (!err) {
                return callback();
            }
        });
    }
};
spawnProcess('mklink', ['/D', 'C:\\some\\path\\to\\my\\intended\\link', 'C:\\path\\to\\my\\folder'], {}, done);
As a workaround, try the following. mklink is a cmd.exe built-in rather than a standalone executable, which is why spawn can't resolve it and fails with ENOENT:
require('child_process').spawn('cmd',
    ['/C', 'mklink', '/D', 'C:\\some\\path\\to\\my\\intended\\link',
     'C:\\path\\to\\my\\folder'],
    {stdio: 'inherit'})
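Separately, if you want the failure surfaced instead of silence, you can listen for the child process's 'error' event, which fires when the process cannot be spawned at all. A small sketch:

var child = require('child_process').spawn('mklink',
    ['/D', 'C:\\some\\path\\to\\my\\intended\\link', 'C:\\path\\to\\my\\folder'],
    {stdio: 'inherit'});
child.on('error', function (err) {
    // Reports spawn failures such as ENOENT explicitly
    console.error('Failed to spawn:', err);
});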
I'm attempting to create a Node server that serves up a PNG image generated using the node-wkhtml module (basically just a wrapper for the wkhtmltoimage/wkhtmltopdf command-line utility). Here's what I have so far:
var http = require('http');
var Image = require("node-wkhtml").image();

http.createServer(function (request, response) {
    new Image({ url: "www.google.com" }).convert(function (err, stdout) {
        //var theImage = new Buffer(stdout, 'binary');
        response.writeHead(200, {
            'Content-Type': 'image/png',
            'Content-Length': stdout.length
        });
        response.write(stdout, 'binary');
        response.end();
        // write out an error, if there is one
        if (err)
            console.log(err);
    });
}).listen(8124);
Basically, the module calls the command:
wkhtmltoimage www.google.com -
which generates a PNG image and writes it to stdout. The amount of data served seems to be correct, but I can't get the browser to display it (nor does it work if I download it as a file). I tried the following command:
wkhtmltoimage www.google.com - > download.png
and sure enough, download.png was created and contained a snapshot of the Google homepage, meaning the wkhtmltoimage utility is working correctly and the command works. I'm a beginner at Node, so I'm not super familiar with how to serve up a binary file like this; can anyone see any glaring issues? Here's the node module code that works the magic:
Image.prototype.convert = function(callback) {
    exec(util.buildCommand(this), {encoding: 'binary', maxBuffer: 10*1024*1024}, callback);
};
(The buildCommand function just generates the "wkhtmltoimage www.google.com -" command, and I've verified it does this correctly using node inspector.)
UPDATE:
In case anyone finds this later and is interested, I found the solution. The plugin I was using (node-wkhtml) was not properly handling large buffers due to its use of child_process.exec. I changed the plugin code to use child_process.spawn instead, and it worked as desired.
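For reference, a rough sketch of what that spawn-based approach looks like if you bypass the module entirely (assumes wkhtmltoimage is on the PATH; error handling kept minimal):

var http = require('http');
var spawn = require('child_process').spawn;

http.createServer(function (request, response) {
    response.writeHead(200, {'Content-Type': 'image/png'});
    // Stream the PNG from wkhtmltoimage's stdout straight to the response,
    // sidestepping exec's in-memory buffer (and its maxBuffer limit)
    var child = spawn('wkhtmltoimage', ['www.google.com', '-']);
    child.stdout.pipe(response);
    child.stderr.on('data', function (chunk) {
        console.log(chunk.toString());
    });
}).listen(8124);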