How to let users download files from a Node.js server? - nodejs-server

How can I enable users to download files from a Node.js server using its core modules?

What do you mean by core module? If you are using the Express framework, this is a download example:
const express = require('express');
const fs = require('fs');
const router = express.Router();

router.get('/down', function (req, res) {
  // file name arrives as a query parameter, e.g. /down?fn=report.pdf
  let { fn } = req.query;
  fn = decodeURIComponent(fn);
  fs.access(`./static/${fn}`, function (err) {
    if (err) {
      return res.sendStatus(404);
    }
    res.set({
      'Content-Type': 'application/octet-stream',
      'Content-Disposition': `attachment; filename=${encodeURIComponent(fn)}`
    });
    fs.createReadStream(`./static/${fn}`).pipe(res);
  });
});

module.exports = router;
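If by core modules you mean avoiding Express entirely, a roughly equivalent sketch using only the built-in http, fs, path, and url modules might look like this (the /down route and ./static folder mirror the example above and are assumptions):

const http = require('http');
const fs = require('fs');
const path = require('path');
const url = require('url');

http.createServer(function (req, res) {
  const parsed = url.parse(req.url, true);
  if (parsed.pathname !== '/down') {
    res.writeHead(404);
    return res.end();
  }
  const fn = decodeURIComponent(parsed.query.fn || '');
  const filePath = path.join(__dirname, 'static', path.basename(fn)); // basename guards against path traversal
  fs.access(filePath, function (err) {
    if (err) {
      res.writeHead(404);
      return res.end('Not found');
    }
    res.writeHead(200, {
      'Content-Type': 'application/octet-stream',
      'Content-Disposition': `attachment; filename=${encodeURIComponent(fn)}`
    });
    fs.createReadStream(filePath).pipe(res);
  });
}).listen(3000);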

Related

How can I stop external js script from stopping my fetch POST request?

My app generates a custom product page on a Shopify store. I use Vue 3 for the frontend. There are other apps running JS on the page, e.g. live chat, a push notification pop-up, a GDPR cookie bar, etc. They are injected by the platform and I can't remove them. (By the way, these scripts are minified and hard to read.)
My app has an "add bundle to cart" button on the floating footer that sends a POST request to my server with the Fetch API. But it's blocked by these unrelated apps. I think they are monitoring whether a POST/GET request is sent, and they assume they are working on standard product pages, not custom ones like mine.
I tried to implement a block list with yett, but that passive approach is not good enough; it only fixes things after the issue happens. Is there any way I can protect my fetch request from being interfered with by other JS scripts?
let request = new Request('/create_new_product/', {
  method: 'POST',
  body: JSON.stringify(data),
  headers: {
    'Accept': 'application/json',
    'Content-Type': 'application/json'
  }
});
let vm1 = this;
fetch(request)
  .then(response => response.json())
  .then(data => {
    console.log('Success creating variant:', data);
    console.log('variant_id:', data.variant_id);
    // stopped here by other apps :-(
    if (data.variant_id) {
      vm1.addNewVariantToCart(this.variants, data.variant_id);
      vm1.$emit('clearall');
      setTimeout(function () { vm1.isLoading = false; }, 2000);
    } else if (data.Error) {
      alert(data.Error);
      vm1.isLoading = false;
    }
  })
  .catch((error) => {
    console.error('Error:', error);
    vm1.isLoading = false;
  });
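One possible mitigation (a sketch under assumptions, not an answer from this thread) is to capture a reference to the native fetch before the injected apps get a chance to monkey-patch window.fetch, then call that saved reference from the Vue component. The __safeFetch name is made up for illustration, and this only helps if your snippet can run before the other scripts:

<!-- in the page <head>, as early as possible (assumption: it loads before the injected apps) -->
<script>
  // Save the native fetch before any third-party script can replace it.
  window.__safeFetch = window.fetch.bind(window);
</script>

// later, in the Vue component, use the saved reference instead of the (possibly patched) global
window.__safeFetch(request)
  .then(response => response.json())
  .then(data => { /* same success handling as above */ });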

POST binary data from browser to JFrog / Artifactory server without using form-data

So we get a file (an image file) in the front-end like so:
//html
<input type="file" ng-change="onFileChange">

//javascript
$scope.onFileChange = function (e) {
  e.preventDefault();
  let file = e.target.files[0];
  // I presume this is just a binary file
  // I want to HTTP POST this file to a server
  // without using form-data
};
What I want to know is: is there a way to POST this file to a server without including the file as form-data? The problem is that the server I am sending the HTTP POST request to doesn't really know how to store form-data when it receives a request.
I believe this is the right way to do it, but I am not sure.
fetch('https://www.example.net', { // Your POST endpoint
  method: 'POST',
  headers: {
    'Content-Type': 'image/jpeg'
  },
  body: e.target.files[0] // the file
})
  .then(
    response => response.json() // if the response is a JSON object
  )
You can directly attach the file to the request body. Artifactory doesn't support form uploads (and it doesn't look like they plan to).
You'll still need to proxy the request somehow to avoid CORS issues, and if you're using user credentials, you should be cautious in how you treat them. Also, you could use a library like http-proxy-middleware to avoid having to write/test/maintain the proxy logic.
<input id="file-upload" type="file" />
<script>
function upload(data) {
var file = document.getElementById('file-upload').files[0];
var xhr = new XMLHttpRequest();
xhr.open('PUT', 'https://example.com/artifactory-proxy-avoiding-cors');
xhr.send(file);
}
</script>
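A rough sketch of the http-proxy-middleware approach mentioned above; the mount path, target host, and credentials are placeholders rather than values from this thread:

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Forward anything under /artifactory to the Artifactory host so the browser
// only ever talks to our own origin (no CORS) and never sees the credentials.
app.use('/artifactory', createProxyMiddleware({
  target: 'https://artifactory.example.com', // placeholder host
  changeOrigin: true,
  headers: {
    Authorization: 'Basic ' + Buffer.from('user:password').toString('base64') // placeholder credentials
  }
}));

app.listen(3000);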
Our front-end could not HTTP POST directly to the JFrog/Artifactory server, so we ended up using a Node.js server as a proxy, which is not ideal.
Front-end:
// in an AngularJS controller:
$scope.onAcqImageFileChange = function (e) {
  e.preventDefault();
  let file = e.target.files[0];
  $scope.acqImageFile = file;
};

// in an AngularJS service
createNewAcqImage: function (options) {
  let file = options.file;
  return $http({
    method: 'POST',
    url: '/proxy/image',
    data: file,
    headers: {
      'Content-Type': 'image/jpeg'
    }
  });
},
Back-end:
const express = require('express');
const http = require('http');
const uuid = require('uuid');

const router = express.Router();

router.post('/image', function (req, res, next) {
  const filename = uuid.v4();
  // stream the incoming request body straight through to Artifactory
  const proxy = http.request({
    method: 'PUT',
    hostname: 'engci-maven.nabisco.com',
    path: `/artifactory/cdt-repo/folder/${filename}`,
    headers: {
      'Authorization': 'Basic ' + Buffer.from('cdt-deployer:foobar').toString('base64'),
    }
  }, function (resp) {
    resp.pipe(res).once('error', next);
  });
  req.pipe(proxy).once('error', next);
});

module.exports = router;
Note that we had to use a PUT request to send an image to Artifactory, not POST, something to do with Artifactory (the engci-maven.nabisco.com server is an Artifactory server). As I recall, I got CORS issues when trying to POST directly from our front-end to the other server, so we had to use our own server as a proxy, which is something I'd rather avoid, but oh well for now.

Stripe Payments with parse server cloud code

I have this code included in my main.js:
var stripe = require("/cloud/stripe.js")("sk_test_*********");
//create customer
Parse.Cloud.define('createCustomer', function (req, res) {
  stripe.customers.create({
    description: req.params.fullName,
    source: req.params.token
    //email: req.params.email
  }, function (err, customer) {
    // asynchronously called
    res.error("something went wrong with creating a customer");
  });
});
After pushing this code to my Heroku server, the logs indicate: Error: Cannot find module '/cloud/stripe.js'
I have also tried var stripe = require("stripe")("sk_test_*********"); but this returns the same error. Whenever I try to add this new module to my server, the whole server becomes dysfunctional. What workarounds are there for this? Thanks
Have you added Stripe to the dependencies in your package.json file for your Node project? If so, you should be able to reference it using require('stripe') as opposed to what you're doing.
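A minimal sketch of how main.js might look once stripe is listed under dependencies in package.json and installed (e.g. with npm install --save stripe); the error/success handling here is an assumption for illustration:

// main.js, assuming "stripe" appears under "dependencies" in package.json
var stripe = require('stripe')('sk_test_*********');

Parse.Cloud.define('createCustomer', function (req, res) {
  stripe.customers.create({
    description: req.params.fullName,
    source: req.params.token
  }, function (err, customer) {
    if (err) {
      return res.error('Something went wrong while creating the customer');
    }
    res.success(customer);
  });
});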
I'll tell you what worked for me; I racked my brain over this for a day. Instead of using Cloud Code to make a charge, create a route in index.js, something like this:
// in index.js, where `app` is the existing Express app
var stripe = require('stripe')('sk_test_****');
var bodyParser = require('body-parser');

app.use(bodyParser.urlencoded({
  extended: false
}));

app.post('/charge', function (req, res) {
  var token = req.body.token;
  var amount = req.body.amount;
  stripe.charges.create({
    amount: amount,
    currency: 'usd',
    source: token,
  }, function (err, charge) {
    if (err) {
      res.status(500).send(err.message);
    } else {
      res.send('Payment successful!');
    }
  });
});
I call this using jQuery post but you could also use a form.
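For example, the jQuery call might look roughly like this (the token and amount values are placeholders):

// client side, after getting a token from Stripe.js/Checkout
$.post('/charge', { token: stripeToken, amount: 1999 }, function (response) {
  console.log(response); // 'Payment successful!'
});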

Downloading image in titanium from parse

I need to download images from Parse and I am new to Titanium. How do I do this? I have searched the web but found no help regarding downloading; there is some code available for uploading images.
Hi, this can help you with the images on Parse.
var request = Titanium.Network.createHTTPClient({
  onload: function (e) {
    var result = JSON.parse(this.responseText);
    console.log(result.url); // URL of the uploaded file on Parse
  },
  onerror: function (e) {
    alert(e.message);
  }
});
// Upload the image to Parse's files endpoint; `image` is the image blob/data to send
request.open('POST', 'https://api.parse.com/1/files/pic.jpg', true);
request.setRequestHeader('X-Parse-Application-Id', 'MY_APP_KEY');
request.setRequestHeader('X-Parse-REST-API-Key', 'MY_REST_KEY');
request.setRequestHeader('Content-Type', 'image/jpeg');
request.send(image);
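Since the question is about downloading, a rough sketch of fetching the file once you have its URL (for example the result.url logged above) and saving it locally might look like this; the file name and variable wiring are assumptions:

var downloader = Titanium.Network.createHTTPClient({
  onload: function () {
    // responseData is a Blob containing the downloaded image
    var file = Titanium.Filesystem.getFile(Titanium.Filesystem.applicationDataDirectory, 'pic.jpg');
    file.write(this.responseData);
    console.log('Saved to ' + file.nativePath);
  },
  onerror: function (e) {
    alert('Download failed');
  }
});
downloader.open('GET', result.url); // URL returned by Parse for the uploaded file
downloader.send();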

a file upload progress bar with node (socket.io and formidable) and ajax

I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on Windows 7, with a virtual host set up for http://test. The solution in the book was to use Node and an almost unknown package called "multipart", which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick: my file uploads locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book that was supposed to display the upload progress in a progress element. SO, I looked around and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now, I can't for the life of me get them to work together. I can't even get a simple console log message to be sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;
function handleResponse() {
if (httpRequest.readyState == 4 && httpRequest.status == 200) {
document.getElementById("results").innerHTML = httpRequest.responseText;
}
}
function handleButtonPress(e) {
e.preventDefault();
var form = document.getElementById("myform");
var formData = new FormData(form);
httpRequest = new XMLHttpRequest();
httpRequest.onreadystatechange = handleResponse;
httpRequest.open("POST", form.action);
httpRequest.send(formData);
}
And here's the corresponding Node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function (req, res) {
  if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
    var form = new formidable.IncomingForm(),
        files = [],
        fields = [];
    form.uploadDir = './files/';
    form.keepExtensions = true;
    form
      .on('progress', function (bytesReceived, bytesExpected) {
        console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
      })
      .on('file', function (name, file) {
        files.push([name, file]);
      })
      .on('error', function (err) {
        console.log('ERROR!');
        res.end();
      })
      .on('end', function () {
        console.log('-> upload done');
        res.writeHead(200, "OK", {
          "Content-Type": "text/html", "Access-Control-Allow-Origin": "http://test"
        });
        res.end('received files: ' + util.inspect(files));
      });
    form.parse(req);
  } else {
    res.writeHead(404, { 'content-type': 'text/plain' });
    res.end('404');
  }
  return;
}).listen(8080);
console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');
socket.on('news', function (data) {
  console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function (req, res) {
  fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);
io.sockets.on('connection', function (socket) {
  socket.emit('news', { hello: "world" });
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form-handling code. I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client, while the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here; I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for one single client and then broadcast the percentage to this room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
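A rough sketch of that room-based idea, combining the two scripts above; the socketId handoff via a query parameter and the event names are assumptions, not part of the linked answer:

var http = require('http'),
    url = require('url'),
    formidable = require('formidable');

var server = http.createServer(function (req, res) {
  var parsed = url.parse(req.url, true);
  if (parsed.pathname === '/upload' && req.method.toLowerCase() === 'post') {
    // assumption: the client appends its socket id to the upload URL,
    // e.g. httpRequest.open("POST", form.action + "?socketId=" + socket.id)
    var socketId = parsed.query.socketId;
    var form = new formidable.IncomingForm();
    form.uploadDir = './files/';
    form.on('progress', function (bytesReceived, bytesExpected) {
      var percent = Math.round(bytesReceived / bytesExpected * 100);
      // each connected socket joined a room named after its own id (see below),
      // so only the uploading client receives these events
      io.sockets.in(socketId).emit('progress', { percent: percent });
    });
    form.parse(req, function (err, fields, files) {
      res.end('upload done');
    });
  } else {
    res.writeHead(404, { 'content-type': 'text/plain' });
    res.end('404');
  }
});

var io = require('socket.io').listen(server); // assigned before any request arrives
io.sockets.on('connection', function (socket) {
  socket.join(socket.id); // a "room" containing just this one client
});

server.listen(8080);

// client side
var socket = io.connect('http://test:8080');
socket.on('progress', function (data) {
  document.querySelector('progress').value = data.percent; // assumes a <progress max="100"> element
});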
