Node: Download raw bytes of a JPEG without piping output

Here is what I'm trying to do:
Retrieve raw data of an image (jpeg) from a URL given to me by an API
Pass the raw data or buffer to a function that uploads it to another server
NEVER PIPE THE IMAGE TO THE DISK
I've followed every example I can find (that doesn't pipe to disk), but the content still comes out corrupted. I have tried forcing various accept-encodings (gzip, deflate), but they resolve to essentially the same data, just compressed.
I believe this has something to do with the response encoding rather than how I am asking for the data.
Here's the code so far:
var parsedUrl = require('url').parse(PATH_TO_IMAGE);
var params = {
  hostname: parsedUrl.hostname,
  path: parsedUrl.path
};
return http.get(params, function(photo_res) {
  var photoData = '';
  res.setEncoding('binary');
  photo_res.on('data', function(chunk) {
    photoData += chunk;
  });
  photo_res.on('end', function() {
    // DO STUFF TO UPLOAD IMAGE
  });
  photo_res.on('error', function(err) {
    console.error('Unable to download photo:', err);
    return done(err);
  });
});

You have a simple typographical error which may be causing Node to interpret your data stream as the wrong type. Your error is in this line:
res.setEncoding('binary');
Your callback names the response photo_res, so res is undefined there. To avoid confusion you should keep the response variable named res, and since your data is binary, it is better not to set an encoding at all so the chunks stay Buffers:
http.get(options, function(res) {
  // No setEncoding() call: each 'data' chunk arrives as a Buffer
  var photoData = [];
  res.on('data', function(chunk) {
    photoData.push(chunk);
  });
  res.on('end', function() {
    var photo = Buffer.concat(photoData);
    // DO STUFF TO UPLOAD IMAGE
  });
  res.on('error', function(err) {
    console.error('Unable to download photo:', err);
  });
});
In the example, I store all the chunks of data in an array, then use Buffer.concat() to create a single Buffer. This is better because you were originally appending your image's data onto a string, which may have caused the corruption: converting binary chunks to a string and back re-encodes the bytes and can mangle the data.
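
For completeness, here is a minimal sketch of the whole flow under the asker's constraints, wiring the downloaded Buffer into an upload step. uploadPhoto is a hypothetical stand-in for whatever the destination server expects, and PATH_TO_IMAGE is the URL from the question:

var http = require('http');
var url = require('url');

function downloadPhoto(photoUrl, done) {
  http.get(url.parse(photoUrl), function(res) {
    var chunks = [];
    res.on('data', function(chunk) {
      chunks.push(chunk); // Buffers, since no encoding is set
    });
    res.on('end', function() {
      done(null, Buffer.concat(chunks));
    });
  }).on('error', done); // request-level errors (DNS, socket) land here
}

// Usage: download, then hand the Buffer straight to the uploader.
downloadPhoto(PATH_TO_IMAGE, function(err, photo) {
  if (err) return console.error('Unable to download photo:', err);
  uploadPhoto(photo); // hypothetical upload function
});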

Related

Getting binary file content instead of UTF-escaped using file.get

I'd like to know if it's possible to get exact binary data using a callback from the drive.files.get method of the Node.js Google API. I know that the object returned by calling this API endpoint is a normal request object that could, e.g., be piped like this:
drive.files.get({
  fileId: fileId,
  alt: 'media'
}).pipe(fs.createWriteStream('test'));
However, I would like to know if it's possible to get the binary data from within a callback, using this syntax:
drive.files.get({
  fileId: fileId,
  alt: 'media'
}, function(err, data) {
  // Here I have binary data exposed
});
As far as I know, it should be possible to get that kind of data from request during its creation, by passing { encoding: null } in the request options object like this:
var requestSettings = {
  method: 'GET',
  url: url,
  encoding: null // This is the important part
};
request(requestSettings, function(err, data) { /* ... */ });
However, it seems that Google hides this configuration object away inside its library.
So my question is: is it possible to do this without interfering with or hacking the library?
OK, so I found an answer that could be useful for others :)
The aforementioned drive.files.get method returns a Stream object, so it can be handled directly with the appropriate event handlers. The buffered chunks can then be concatenated into a single Buffer and passed back in a callback like this:
var stream = drive.files.get({
  fileId: fileId,
  alt: 'media'
});

// Build the buffer from the streamed chunks
var chunks = [];
stream.on('data', (chunk) => {
  chunks.push(chunk);
});
stream.on('end', () => {
  return cb(null, Buffer.concat(chunks));
});
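
One caveat worth adding to this answer: if the download fails mid-stream (network drop, expired token), neither handler above fires, so it is prudent to attach an 'error' listener as well; a minimal addition:

stream.on('error', (err) => {
  // Propagate download failures through the same callback
  return cb(err);
});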

Meteor call for sending image through API

Is it possible to send an image through a Meteor call to the API?
Client js
var r = {image : image};
Meteor.apply('callToServer', r, function(error, result){
  console.log(result);
});
Server js
Meteor.methods({
  uploadAndSaveToDB: function(data){
    var result = Meteor.http.post(apiUrl, {
      params: { image : data['image'] }
    });
    var result = JSON.parse(result.content);
    return result;
  },
});
If your question is about how to get the image data and send it to your API, it depends on a couple of factors:
How you are getting the image's data in the first place in your app (a submission form, a URL, some drawing library...)
In what format the API you are calling expects the image data to be sent (URL, raw data, encrypted...)
If you are simply asking if it is doable, then yes, definitely. You will just need to add the http package for this:
meteor add http
You can then make requests to your API pretty much as you wrote them. Just make sure to give the right name to your method call (also, use call rather than apply unless you are submitting an array of arguments):
Client js
var r = {image : image};
Meteor.call('uploadAndSaveToDB', r, function(error, result){
  console.log(result);
});
Server js
Meteor.methods({
  uploadAndSaveToDB: function(data) {
    var response = HTTP.post(apiUrl, {
      params: { image: data['image'] }
    });
    var result = JSON.parse(response.content);
    return result;
  },
});
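
One detail the answer glosses over: a raw File or Blob picked in the browser cannot travel over a method call as-is, so it is usually serialized first. A minimal sketch assuming the image comes from a file input (the #file-input selector is hypothetical), base64-encoding it with FileReader before the call:

// Client: read the chosen file and send it as a base64 string
var file = document.querySelector('#file-input').files[0];
var reader = new FileReader();
reader.onload = function() {
  // reader.result is a data URL: "data:image/jpeg;base64,...."
  var base64 = reader.result.split(',')[1];
  Meteor.call('uploadAndSaveToDB', { image: base64 }, function(error, result) {
    console.log(error || result);
  });
};
reader.readAsDataURL(file);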

Why is node.js breaking incoming data into chunks?

The following code in node.js does not log all incoming data inside a single pair of brackets; rather, it breaks the data into chunks. So, for example, if the incoming data is ABCDEF...XYZ, it is logged as [ABC][DEF]...[XYZ] rather than [ABCDEF...XYZ]. The real data is much larger, of course; the alphabet is just an example.
How should I write this so that all incoming data is logged once inside the brackets and not in parts?
chatServer.on('connection', function(client)
{
  client.on('data', function(data)
  {
    console.log('[' + data.toString() + ']');
  })
})
Well, your data is arriving in packets, so (in this case) you should concatenate the packets into a variable that you define outside the function, and only log it once the stream ends:
var buffer = '';
chatServer.on('connection', function(client)
{
  client.on('data', function(data)
  {
    buffer += data.toString();
  })
  client.on('end', function()
  {
    // The full message has arrived once the client disconnects
    console.log('[' + buffer + ']');
  })
});
Like matthewdavidson said, you are subscribing to every "chunk" of data that is sent rather than to the whole message. More likely you want to capture the data in a closure scoped to the connection and still respond asynchronously. Try the following:
chatServer.on('connection', function(client)
{
  var buffer = '';
  client.on('data', function(data)
  {
    buffer += data;
  })
  client.on('end', function(){
    console.log('[' + buffer + ']');
  })
});
Check out http://www.nodebeginner.org/#handling-post-requests for more information.
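
Note that 'end' only fires when the client disconnects, so for a chat server that keeps connections open, messages are usually framed explicitly instead. A minimal sketch assuming newline-delimited messages (an assumption; the question doesn't specify a protocol):

chatServer.on('connection', function(client) {
  var buffer = '';
  client.on('data', function(data) {
    buffer += data.toString();
    // Split out every complete, newline-terminated message
    var lines = buffer.split('\n');
    buffer = lines.pop(); // keep any trailing partial message
    lines.forEach(function(line) {
      console.log('[' + line + ']');
    });
  });
});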

Fastest way to check for existence of a file in NodeJs

I'm building a super simple server in node and in my onRequest listener I'm trying to determine if I should serve a static file (off the disk) or some json (probably pulled from mongo) based on the path in request.url.
Currently I'm trying to stat the file first (because I use mtime elsewhere) and if that doesn't fail then I read the contents from disk. Something like this:
fs.stat(request.url.pathname, function(err, stat) {
  if (!err) {
    fs.readFile(request.url.pathname, function(err, contents) {
      // serve file
    });
  } else {
    // either pull data from mongo or serve 404 error
  }
});
Other than caching the result of fs.stat for request.url.pathname, is there something that could speed this check up? For example, would it be just as fast to see if fs.readFile errors out instead of the stat? Or to use fs.createReadStream instead of fs.readFile? Or could I check for the file using something in child_process.spawn? Basically I just want to make sure I'm not spending any extra time messing with file I/O when the request should be sent to mongo for data...
Thanks!
var fs = require('fs');

fs.exists(file, function(exists) {
  if (exists) {
    // serve file
  } else {
    // mongodb
  }
});
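Worth noting: fs.exists was later deprecated (its callback takes no error argument, unlike every other fs callback). A minimal equivalent using fs.access, assuming a reasonably modern Node:

var fs = require('fs');

fs.access(file, fs.constants.F_OK, function(err) {
  if (!err) {
    // serve file
  } else {
    // mongodb
  }
});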
I don't think you should be worrying about that, but rather about how you can improve the caching mechanism. fs.stat is perfectly fine for file checking; doing it in another child process would probably slow you down rather than help.
Connect implemented the staticCache() middleware a few months ago, as described in this blog post: http://tjholowaychuk.com/post/9682643240/connect-1-7-0-fast-static-file-memory-cache-and-more
A Least-Recently-Used (LRU) cache algo is implemented through the Cache object, simply rotating cache objects as they are hit. This means that increasingly popular objects maintain their positions while others get shoved out of the stack and garbage collected.
Other resources:
http://senchalabs.github.com/connect/middleware-staticCache.html
The source code for staticCache
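
As a rough illustration of the idea (not the Connect implementation), a minimal sketch of caching fs.stat results in memory; note this plain Map never evicts, so a real LRU like the one described above is still needed for a large site:

var fs = require('fs');

var statCache = new Map(); // pathname -> stats, or null for "missing"

function cachedStat(pathname, cb) {
  if (statCache.has(pathname)) {
    return cb(statCache.get(pathname));
  }
  fs.stat(pathname, function(err, stat) {
    var result = err ? null : stat;
    statCache.set(pathname, result);
    cb(result);
  });
}

// Usage: null means "not a file on disk, go ask mongo"
cachedStat('/some/path', function(stat) {
  if (stat) { /* serve file */ } else { /* mongodb */ }
});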
This snippet can help you:
var fs = require('fs');
var path = 'sth';

fs.stat(path, function(err, stat) {
  if (err) {
    if ('ENOENT' === err.code) {
      // file didn't exist, so e.g. send a 404 to the client
    } else {
      // it is a server error, so e.g. send a 500 to the client
    }
  } else {
    // everything was OK, so e.g. read the file and send it to the client
  }
});
In case you want to serve a file using Express, I would recommend just using the error callback of res.sendFile:
const app = require("express")();

const options = {};
options.root = process.cwd();

var sendFiles = function(res, files) {
  res.sendFile(files.shift(), options, function(err) {
    if (err) {
      console.log(err);
      console.log(files);
      if (files.length === 0) {
        res.status(err.status).end();
      } else {
        sendFiles(res, files);
      }
    } else {
      console.log("Image Sent");
    }
  });
};

app.get("/getPictures", function(req, res, next) {
  const files = [
    "file-does-not-exist.jpg",
    "file-does-not-exist-also.jpg",
    "file-exists.jpg",
    "file-does-not-exist.jpg"
  ];
  sendFiles(res, files);
});

app.listen(8080);
If a file does not exist, sendFile invokes the error callback, which then moves on to the next file in the list.
I made a github repo here https://github.com/dmastag/ex_fs/blob/master/index.js

NodeJS Web App File Upload Chops Off Beginning Of File

I'm working on a project in NodeJS which involves file upload. The upload is done on the client side with the code:
$('#file-upload').bind('change focus click', function() {
  var file = jQuery(this)[0].files[0];
  if (file && file.fileName) {
    var xhr = new XMLHttpRequest();
    xhr.upload.addEventListener('progress', onProgressHandler, false);
    xhr.upload.addEventListener('load', transferComplete, false);
    xhr.open('POST', '/upload', true);
    xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
    xhr.setRequestHeader('X-File-Name', encodeURIComponent(file.fileName));
    xhr.setRequestHeader('Content-Type', 'application/octet-stream');
    xhr.send(file);
    function onProgressHandler(evt) {
      var percentage = evt.loaded / evt.total * 100;
      console.log(percentage);
    }
    function transferComplete(evt) {
      console.log('Done');
    }
  }
});
And on the server-side, I use:
app.post('/upload', function(req, res, next) {
  if (req.xhr) {
    console.log('Uploading...');
    var fName = req.header('x-file-name');
    var fSize = req.header('x-file-size');
    var fType = req.header('x-file-type');
    var ws = fs.createWriteStream('./' + fName);
    req.on('data', function(data) {
      console.log('DATA');
      ws.write(data);
    });
    req.on('end', function() {
      console.log('All Done!!!!');
    });
  }
});
This code works on its own, but when combined with the rest of my much larger project it seems to chop off the beginning of large files and to ignore small files altogether. If I upload a small file, console.log('DATA') never fires; for large files it does fire, but not for the beginning of the file. I believe the browser starts sending the file immediately, and by the time my handler is listening, the beginning (or, for a small file, the entire thing) has already been sent. I don't know what would be causing this, though.
Thanks!
I figured it out. There was so much logic between my route being defined and the actual file upload code running that the 'data' listener wasn't attached yet when the request body started arriving, so the early chunks were never seen.
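In other words, the 'data' and 'end' listeners have to be attached synchronously inside the route handler, before any asynchronous work runs. A minimal sketch of the safe pattern, buffering first and deferring slow logic until 'end' (the route and response here are illustrative):

app.post('/upload', function(req, res) {
  // Attach listeners immediately, in the same tick as the request,
  // so no 'data' chunks are emitted before anyone is listening.
  var chunks = [];
  req.on('data', function(chunk) {
    chunks.push(chunk);
  });
  req.on('end', function() {
    var body = Buffer.concat(chunks);
    // Any slow or asynchronous logic can safely run now.
    res.end('Received ' + body.length + ' bytes');
  });
});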
I am having this exact same problem. It bothers me that having too much logic between the request and the on('data') handler is the cause. I'm testing with a local server, and the amount of logic between the start of the request and registering the 'data' listener is negligible. But is the fact that I don't need to cross the internet to do my upload making this problem that much worse? Are you still experiencing this issue?
