I'm building a super simple server in node and in my onRequest listener I'm trying to determine if I should serve a static file (off the disk) or some json (probably pulled from mongo) based on the path in request.url.
Currently I'm trying to stat the file first (because I use mtime elsewhere) and if that doesn't fail then I read the contents from disk. Something like this:
fs.stat(request.url.pathname, function(err, stat) {
    if (!err) {
        fs.readFile(request.url.pathname, function(err, contents) {
            // serve file
        });
    } else {
        // either pull data from mongo or serve a 404 error
    }
});
Other than caching the result of fs.stat for the request.url.pathname, is there something that could speed this check up? For example, would it be just as fast to see if fs.readFile errors out instead of the stat? Or to use fs.createReadStream instead of fs.readFile? Or could I check for the file using something in child_process.spawn? Basically I just want to make sure I'm not spending any extra time messing with file I/O when the request should be sent to mongo for data...
Thanks!
var fs = require('fs');

fs.exists(file, function(exists) {
    if (exists) {
        // serve file
    } else {
        // mongodb
    }
});
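Note that fs.exists has since been deprecated. A rough modern equivalent (a sketch; fs.access only checks that the file is visible to the process, not that it is readable) would be:

var fs = require('fs');

fs.access(file, fs.constants.F_OK, function(err) {
    if (!err) {
        // serve file
    } else {
        // mongodb
    }
});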
I don't think you should be worrying about that, but rather about how you can improve the caching mechanism. fs.stat is really OK for file checking; doing that in another child process would probably slow you down rather than help you here.
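For instance, an in-memory cache of stat results could look like the sketch below (the TTL value and the invalidation policy are assumptions, not part of the original question):

var fs = require('fs');

var statCache = {};      // pathname -> { stat: fs.Stats, at: timestamp }
var cacheTtlMs = 1000;   // assumption: revalidate entries after one second

function cachedStat(pathname, cb) {
    var hit = statCache[pathname];
    if (hit && (Date.now() - hit.at) < cacheTtlMs) {
        // serve the cached result asynchronously, like fs.stat would
        return process.nextTick(function() { cb(null, hit.stat); });
    }
    fs.stat(pathname, function(err, stat) {
        if (!err) statCache[pathname] = { stat: stat, at: Date.now() };
        cb(err, stat);
    });
}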
Connect implemented the staticCache() middleware a few months ago, as described in this blog post: http://tjholowaychuk.com/post/9682643240/connect-1-7-0-fast-static-file-memory-cache-and-more
A Least-Recently-Used (LRU) cache algo is implemented through the Cache object, simply rotating cache objects as they are hit. This means that increasingly popular objects maintain their positions while others get shoved out of the stack and garbage collected.
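That rotation idea can be sketched in a few lines (a simplified illustration, not Connect's actual implementation):

function LRU(limit) {
    this.limit = limit;
    this.map = new Map(); // Map preserves insertion order
}

LRU.prototype.get = function(key) {
    if (!this.map.has(key)) return undefined;
    var val = this.map.get(key);
    // rotate: re-insert so this key becomes the most recently used
    this.map.delete(key);
    this.map.set(key, val);
    return val;
};

LRU.prototype.set = function(key, val) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, val);
    if (this.map.size > this.limit) {
        // evict the least-recently-used entry (oldest insertion)
        this.map.delete(this.map.keys().next().value);
    }
};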
Other resources:
http://senchalabs.github.com/connect/middleware-staticCache.html
The source code for staticCache
This snippet can help you:
var fs = require('fs');
var path = 'sth';

fs.stat(path, function(err, stat) {
    if (err) {
        if ('ENOENT' === err.code) {
            // file didn't exist, so for example send a 404 to the client
        } else {
            // it's a server error, so for example send a 500 to the client
        }
    } else {
        // everything was OK, so for example you can read the file and send it to the client
    }
});
In case you want to serve a file using Express, I would recommend just using the error callback of Express's res.sendFile:
const app = require("express")();

const options = {};
options.root = process.cwd();

var sendFiles = function(res, files) {
    res.sendFile(files.shift(), options, function(err) {
        if (err) {
            console.log(err);
            console.log(files);
            if (files.length === 0) {
                res.status(err.status).end();
            } else {
                sendFiles(res, files);
            }
        } else {
            console.log("Image Sent");
        }
    });
};

app.get("/getPictures", function(req, res, next) {
    const files = [
        "file-does-not-exist.jpg",
        "file-does-not-exist-also.jpg",
        "file-exists.jpg",
        "file-does-not-exist.jpg"
    ];
    sendFiles(res, files);
});

app.listen(8080);
If a file does not exist, sendFile invokes its error callback, which then tries the next file in the list.
I made a github repo here https://github.com/dmastag/ex_fs/blob/master/index.js
I am using Puppeteer to take screenshots of a web page for my company. I need to test multiple people's accounts so that means visiting the page multiple times (150 times in this case). This results in our firewall kicking me out for making too many requests.
My solution is to just fetch the contents of the page and save them locally. Then I use puppeteer on that local file, overriding the function used to get data from our servers to instead just use data already loaded into Node from a CSV.
All of this works, but it looks like it's still making requests to our servers.
I tried giving it a userDataDir so it could cache any resources. In theory, if it's loading the page from file://, it's caching the resources, and there are no Ajax requests, it shouldn't be making any further requests, right?
I also tried installing a debugging proxy but since it's https I can't see what it's trying to request.
This is how I start it:
puppeteer.launch({
    userDataDir: "temp/"
})
.then(browser => {
    next(browser, links);
})
.catch(error => {
    cb(error, null);
});
next will iterate through any links it needs to visit.
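For context, next presumably has roughly the following shape (a hypothetical sketch implied by the description, not the actual code):

const next = (browser, links) => {
    if (links.length === 0) {
        return browser.close();
    }
    const [baseLink, ...rest] = links;
    // decide whether to fetch-and-cache baseLink or go straight to the
    // cached copy, then eventually call next(browser, rest) again
};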
This part saves the page locally:
if (this._linkCache[baseLink] === undefined) {
    fetch(baseLink)
        .then(resp => resp.text())
        .then(contents => {
            fs.writeFile(fullFileName, contents, 'utf8', err => {
                if (err) {
                    cb(err, null);
                } else {
                    this._linkCache[baseLink] = fileUrl;
                    gotoPage(fileUrl);
                }
            });
        })
        .catch(error => {
            cb(error, null);
        });
} else {
    // Go to the cached version
    gotoPage(this._linkCache[baseLink] + queryParams);
}
And this gets the screenshots:
const gotoPage = async (url) => {
    try {
        const page = await browser.newPage();

        // Override 'fetchAccountData' function
        await page.evaluateOnNewDocument(testData => {
            window["fetchAccountData"] = (cb: (err: any, data: any) => void) => {
                cb(null, testData);
            };
        }, data);

        // Go to page and get screenshot
        await page.goto(url);
        const screenie = `${outputPath}${uuid()}.png`;
        await page.screenshot({ fullPage: true, path: screenie, type: "png" });

        pageHtml.push(`<img src="file://${screenie}" />`);
        next(browser, rest);
    } catch (e) {
        cb(e, null);
    }
};
I was hoping this would be able to only make a few requests at the beginning while it saves the html locally and caches all the resources, but it seems to make a request for every link.
How can I stop it?
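For what it's worth, Puppeteer's request interception can show exactly what Chromium requests, even over https, and can block anything that isn't local. A minimal sketch (the file:// filtering rule here is an assumption, inserted where the pages are opened):

const page = await browser.newPage();
await page.setRequestInterception(true);
page.on('request', request => {
    // Log every request the page makes, including https ones.
    console.log(request.method(), request.url());
    // Assumption: allow local files only; abort anything bound for the network.
    if (request.url().startsWith('file://')) {
        request.continue();
    } else {
        request.abort();
    }
});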
I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on windows 7, with a virtual host set up for http://test. The solution in the book was to use node and an almost unknown package called "multipart" which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick and my file will upload locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book which was to display the upload progress in a progress element. SO, I looked around and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now, I can't for the life of me get them to work together. I can't even get a simple console log message to be sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;
function handleResponse() {
if (httpRequest.readyState == 4 && httpRequest.status == 200) {
document.getElementById("results").innerHTML = httpRequest.responseText;
}
}
function handleButtonPress(e) {
e.preventDefault();
var form = document.getElementById("myform");
var formData = new FormData(form);
httpRequest = new XMLHttpRequest();
httpRequest.onreadystatechange = handleResponse;
httpRequest.open("POST", form.action);
httpRequest.send(formData);
}
And here's the corresponding node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function(req, res) {
    if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
        var form = new formidable.IncomingForm(),
            files = [],
            fields = [];
        form.uploadDir = './files/';
        form.keepExtensions = true;
        form
            .on('progress', function(bytesReceived, bytesExpected) {
                console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
            })
            .on('file', function(name, file) {
                files.push([name, file]);
            })
            .on('error', function(err) {
                console.log('ERROR!');
                res.end();
            })
            .on('end', function() {
                console.log('-> upload done');
                res.writeHead(200, "OK", {
                    "Content-Type": "text/html",
                    "Access-Control-Allow-Origin": "http://test"
                });
                res.end('received files: ' + util.inspect(files));
            });
        form.parse(req);
    } else {
        res.writeHead(404, { 'content-type': 'text/plain' });
        res.end('404');
    }
    return;
}).listen(8080);

console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');

socket.on('news', function(data) {
    console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
    fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);

io.sockets.on('connection', function(socket) {
    socket.emit('news', { hello: "world" });
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form handling. I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client, while the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here. I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for one single client and then broadcast the percentage to this room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
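The gist, as a minimal sketch (the 'join' event name and the roomId plumbing are assumptions; the client would have to pass the same roomId to both the socket and the upload request, e.g. as a query parameter):

io.sockets.on('connection', function(socket) {
    socket.on('join', function(roomId) {
        socket.join(roomId); // one room per uploading client
    });
});

// Inside the upload handler, assuming roomId was read off the request:
form.on('progress', function(bytesReceived, bytesExpected) {
    var percent = (bytesReceived / bytesExpected * 100).toFixed(0);
    io.sockets.in(roomId).emit('progress', { percent: percent });
});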
I am a total node.js noobie and trying to figure out the best way to structure my application with proper separation of concerns.
I am using mongodb via mongoose and have successfully gotten my controllers separated out using node.js modules, and am trying to then separate out my models. What I've done appears to work, but when I check the database nothing has been saved. Also, I tried a console.log() in the save callback and nothing gets logged.
From my server.js I have:
app.post(api.urlslug + '/messages', messagesapi.insert);
I then have a /controllers/api/messages.js:
var m = require('../../models/message');

exports.index = function(req, res, next) {
    res.send('all the messages...');
};

exports.insert = function(req, res, next) {
    var message;
    message = new m.Message({
        messagebody: req.body.messagebody
    });
    message.save(function(err) {
        console.log('here we are in the save');
        if (!err) {
            return console.log('message created');
        } else {
            return console.log(err);
        }
    });
    return res.send(message);
};
and my /models/message.js looks like this:
// required modules
var mongoose = require('mongoose')
  , db = require('../models/db');

// setup database connection
mongoose.connect(db.connectionstring());

var Message = exports.Message = mongoose.model('Message', new mongoose.Schema({
    messagebody: String
}));
When I post to the API I get the proper JSON back, and it even has the _id field with what appears to be a mongodb-provided unique id. With that, I am having trouble understanding why it is not actually going into mongodb if it appears to be creating the object and communicating with mongodb correctly.
Sounds like a connection is not being made. Try listening to the open/error events of the default mongoose connection to find out why:
function log(msg) { console.log('connection: ', msg); }

mongoose.connection.on('open', log);
mongoose.connection.on('error', log);
I have a little problem with Express and mongoose in Node.js. I pasted the code on Pastebin for better visibility.
Here is the app.js: http://pastebin.com/FRAFzvjR
Here is the routes/index.js: http://pastebin.com/gDgBXSy6
Since db.js isn't big, I'll post it here:
var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

module.exports = function () {
    mongoose.connect('mongodb://localhost/test',
        function(err) {
            if (err) { throw err; }
        }
    );
};

var User = new Schema({
    username: { type: String, index: { unique: true } },
    mdp: String
});

module.exports = mongoose.model('User', User);
As you can see, I used console.log to debug my app, and I found that, in routes/index.js, only the 'a' appeared. That's weird; it's as if the script stopped (or continued without any response) when
userModel.findOne({username: req.body.username}, function(err, data)
is tried.
Any idea?
You never connect to your database. Your connect method is what db.js exports, but it is never called as a function from your app.
Also, you are overwriting your module.exports. If you want multiple functions/classes to be exported, you must add them as different properties of the module.exports object, i.e.:
module.exports.truthy = function() { return true; };
module.exports.falsy = function() { return false; };
When you then require that module, you must call the function (trueFalse.truthy();) in order to get the value. Since you never execute the function that connects to your database, you are not receiving any data.
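Applied to the db.js above, a fixed version might look like this sketch (the connect property name is my own choice):

var mongoose = require('mongoose'),
    Schema = mongoose.Schema;

// Export connect as a named property so the app can call it explicitly.
module.exports.connect = function() {
    mongoose.connect('mongodb://localhost/test', function(err) {
        if (err) { throw err; }
    });
};

var User = new Schema({
    username: { type: String, index: { unique: true } },
    mdp: String
});

// Export the model as another property instead of overwriting module.exports.
module.exports.User = mongoose.model('User', User);

The app would then call require('./db').connect() once at startup and use the User property for queries.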
A couple of things real quick.
Make sure you're on the latest mongoose (2.5.3). Update your package.json and run npm update.
Try doing a console.log(arguments) before your if (err). It's possible that an error is happening.
Are you sure you're really connecting to the database? Try explicitly connecting at the top of your file (just for testing) mongoose.connect('mongodb://localhost/my_database');
I'll update if I get any other ideas.
I'm working on a project in NodeJS which involves file upload. The upload is done on the client side with the code:
$('#file-upload').bind('change focus click', function() {
    var file = jQuery(this)[0].files[0];
    if (file && file.fileName) {
        var xhr = new XMLHttpRequest();
        xhr.upload.addEventListener('progress', onProgressHandler, false);
        xhr.upload.addEventListener('load', transferComplete, false);
        xhr.open('POST', '/upload', true);
        xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
        xhr.setRequestHeader('X-File-Name', encodeURIComponent(file.fileName));
        xhr.setRequestHeader('Content-Type', 'application/octet-stream');
        xhr.send(file);

        function onProgressHandler(evt) {
            var percentage = evt.loaded / evt.total * 100;
            console.log(percentage);
        }

        function transferComplete(evt) {
            console.log('Done');
        }
    }
});
And on the server-side, I use:
app.post('/upload', function(req, res, next) {
    if (req.xhr) {
        console.log('Uploading...');
        var fName = req.header('x-file-name');
        var fSize = req.header('x-file-size');
        var fType = req.header('x-file-type');
        var ws = fs.createWriteStream('./' + fName);
        req.on('data', function(data) {
            console.log('DATA');
            ws.write(data);
        });
        req.on('end', function() {
            console.log('All Done!!!!');
            ws.end(); // close the write stream once the request finishes
        });
    }
});
This code works on its own, but when combined with the rest of my much larger project, it seems to chop off the beginning of large files and ignore small files altogether. If I upload a small file, the console.log('DATA') never fires; it does fire for large files, but not for the beginning of the file. I believe for some reason the browser starts sending the file early, and by the time my handler picks it up, the beginning (or, in the case of a small file, the entire thing) has already been sent. I don't know what would be causing this, though.
Thanks!
I figured it out. There was so much logic between my route being defined and the actual file upload code running that the 'data' listener wasn't registered in time to catch the file.
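One way to guard against this (a sketch rather than the actual fix; doAsyncSetup is a stand-in for whatever logic runs before the upload code) is to pause the request until the listeners are attached:

app.post('/upload', function(req, res) {
    // Pause immediately so no 'data' events fire during async setup.
    req.pause();

    doAsyncSetup(function(err) { // hypothetical setup work
        if (err) { res.writeHead(500); return res.end(); }

        var ws = fs.createWriteStream('./upload.bin');
        req.on('data', function(chunk) { ws.write(chunk); });
        req.on('end', function() {
            ws.end();
            res.end('ok');
        });

        // Resume now that the listeners are in place.
        req.resume();
    });
});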
I am having this exact same problem. It bothers me that having too much logic between the request and the on('data') event is the problem. I'm testing with a local server, and the amount of logic between the start of the request and registering the on('data') event is negligible, but the fact that I don't even need to cross the internet to do my upload makes the problem that much worse. Are you still experiencing this issue?