ERR_CONNECTION_RESET in XMLHttpRequest for file download - AJAX

Hi, I am running this code to download some (4-5) zip files with the response type set to blob. Approximately 5 calls fire at a time, as the code below runs in a loop. But on the production server, which is AWS, it returns ERR_CONNECTION_RESET in the console log after 2 files download. Can anyone highlight what may cause this error? My guess is that too many requests at a time are refused by the server, but I'm not sure. Any help is highly appreciated. Thanks
self.ajax_call_1 = function (url, type, callback) {
    return new Promise(function (resolve, reject) {
        var xhr = new XMLHttpRequest();
        xhr.responseType = 'blob';
        xhr.onreadystatechange = function () {
            if (this.readyState == 4 && this.status == 200) {
                resolve(xhr.response);
            }
        };
        xhr.onerror = function () {
            // without this, a reset connection leaves the promise pending forever
            reject(new Error('Network error while downloading ' + url));
        };
        xhr.open(type, url, true);
        xhr.send();
    });
};
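If the resets really are the server refusing parallel downloads, one mitigation worth trying (a sketch, not a confirmed fix) is to cap how many downloads run at once instead of firing all five together. The helper below is generic and hypothetical: it takes an array of functions that each return a promise (for example, functions wrapping `ajax_call_1`) and a concurrency limit.

```javascript
// Run promise-returning tasks with at most `limit` in flight at once.
// `tasks` is an array of zero-argument functions returning promises.
function runWithLimit(tasks, limit) {
    var results = new Array(tasks.length);
    var next = 0;
    return new Promise(function (resolve, reject) {
        var active = 0;
        function launch() {
            // start tasks until the limit is reached or none remain
            while (active < limit && next < tasks.length) {
                (function (i) {
                    active++;
                    tasks[i]().then(function (value) {
                        results[i] = value; // keep results in input order
                        active--;
                        if (next === tasks.length && active === 0) {
                            resolve(results); // everything finished
                        } else {
                            launch(); // a slot freed up; start the next task
                        }
                    }, reject);
                })(next++);
            }
            if (tasks.length === 0) resolve(results);
        }
        launch();
    });
}
```

Since `ajax_call_1` already returns a promise, each task can simply be `function () { return self.ajax_call_1(url, 'GET'); }`, and a limit of 2 would mirror the behavior that works on your production server.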

I had a similar issue with file upload; in the AJAX call I was using an ASP.NET service to upload the file(s). If you have access to the server the service is hosted on, and it is an ashx or asmx (ASP.NET) service, you have to make the following change to web.config:
<httpRuntime maxRequestLength="51200" executionTimeout="0"/>
Here:
executionTimeout="6000" specifies the maximum number of seconds a request is allowed to execute before being automatically shut down by ASP.NET. This time-out applies only if the debug attribute in the compilation element is False.
maxRequestLength="102400" is measured in KB: 1 MB = 1024 KB, so 100 MB = 102400.
<httpRuntime> goes inside <system.web>.

Related

Taking value from text-field and sending it with AJAX request Javascript

I am trying to sync the client-side and server-side scripts so that the client takes a value from the textbox and sends it to the server, upon which the server displays that input as a cookie.
Here is the code that I have so far
function loadCookie() {
    //[1] make a new request object
    var xhttp = new XMLHttpRequest();
    //[2] set the request options
    xhttp.open("GET", "index.html", true);
    //[3] define what you will do when you get a response (callback)
    xhttp.onreadystatechange = function(){
        if (this.readyState == 4 && this.status == 200) {
            document.getElementById("input_response").innerHTML = this.responseText;
        }
    };
    //[4] finally send out the request
    xhttp.send();
}
I have the input and the button, but I am having the issue of the page reloading itself instead of taking the value of the input and showing it as a cookie on the server. I suspect it has something to do with using index.html as the URL.
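The reload almost certainly comes from the default form submission; calling `preventDefault()` in the button's click handler stops it. Also note that requesting `index.html` just returns the page's own HTML; the URL needs to point at whatever server script sets the cookie. Below is a sketch: the endpoint name `setcookie` and the element ids are assumptions, and the browser wiring is shown as comments.

```javascript
// Build a GET URL carrying the textbox value as a query parameter.
// The 'setcookie' endpoint name is a hypothetical placeholder.
function buildCookieUrl(endpoint, value) {
    return endpoint + '?value=' + encodeURIComponent(value);
}

// Browser wiring (element ids are assumptions):
// document.getElementById('send_button').addEventListener('click', function (e) {
//     e.preventDefault(); // stop the form from reloading the page
//     var xhttp = new XMLHttpRequest();
//     xhttp.onreadystatechange = function () {
//         if (this.readyState == 4 && this.status == 200) {
//             document.getElementById("input_response").innerHTML = this.responseText;
//         }
//     };
//     xhttp.open("GET", buildCookieUrl('setcookie', document.getElementById('input_box').value), true);
//     xhttp.send();
// });
```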

How to download from firebase storage string written by putString() in web application [duplicate]

I uploaded a raw string 'Test' to Firebase Storage using the sample provided here, and it went through successfully.
But when I tried to "download" the string I uploaded using the sample below (apparently the only example of how to download data from Firebase Storage), it returned the URL of the string file.
storageRef.child('path/to/string').getDownloadURL().then(function(url) {
    // I get the url of course
}).catch(function(error) {
    // Handle any errors
});
How do I get the contents of the file at the callback URL, which is 'Test' (the string I uploaded)?
The short answer is that in the Web Storage SDK you can only get a download URL that represents that data. You'll need to "download" the file using an XMLHttpRequest (or equivalent):
storageRef.child('path/to/string').getDownloadURL().then(function(url) {
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.onreadystatechange = function() {
        if (xmlHttp.readyState == 4 && xmlHttp.status == 200) {
            var response = xmlHttp.responseText; // should have your text
        }
    };
    xmlHttp.open("GET", url, true); // true for asynchronous
    xmlHttp.send(null);
}).catch(function(error) {
    // Handle any errors from Storage
});

Client side API requests faster than proxy requests via node

Currently I am developing an app that fires off hundreds of concurrent requests to an external API service (like Instagram, for example) using AJAX on the client side. Response time is very fast.
However, I am migrating the request-handling part to a Node backend using request + JSONStream, but I always get a socket hang up error once concurrency exceeds 5 requests (even after changing maxSockets to higher values). Overall it is much, much slower than making the API requests directly on the client side with AJAX.
My question is: how can I make the proxy server faster/more responsive? Or maybe make requests on Node in a way similar to AJAX on the client side?
Server side: when client hits endpoint /fetchmedia/, node directs to this function.
var fetchInstagram = function(user_id, max_id, min_timestamp, max_timestamp, array, callback) {
    http.request({
        host: 'endpoint',
        path: 'endpoint',
        method: 'get'
    }, function(res) {
        var body = '';
        res.on('data', function(chunk) {
            body += chunk;
        });
        res.on('end', function() {
            var data = JSON.parse(body);
            array = array.concat(data.data);
            if (data.pagination.next_max_id != undefined) {
                fetchInstagram(user_id, data.pagination.next_max_id, min_timestamp, max_timestamp, array, callback);
            } else {
                callback(array);
            }
        });
    }).on('error', function(e) {
        console.log("Got error: ", e);
    }).end();
};
Client side: Backbone sends hundreds of requests (/fetchmedia) at the same time, each of which calls fetchInstagram. Previously I was sending the AJAX requests directly, also hundreds concurrently, and that was handled very well. Node hangs up even with 20 users, while direct AJAX handles 1000+ users.
Thanks

a file upload progress bar with node (socket.io and formidable) and ajax

I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on windows 7, with a virtual host set up for http://test. The solution in the book was to use node and an almost unknown package called "multipart" which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick and my file will upload locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book which was to display the upload progress in a progress element. SO, I looked around and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now, I can't for the life of me get them to work together. I can't even get a simple console log message to be sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;

function handleResponse() {
    if (httpRequest.readyState == 4 && httpRequest.status == 200) {
        document.getElementById("results").innerHTML = httpRequest.responseText;
    }
}

function handleButtonPress(e) {
    e.preventDefault();
    var form = document.getElementById("myform");
    var formData = new FormData(form);
    httpRequest = new XMLHttpRequest();
    httpRequest.onreadystatechange = handleResponse;
    httpRequest.open("POST", form.action);
    httpRequest.send(formData);
}
And here's the corresponding Node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function(req, res) {
    if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
        var form = new formidable.IncomingForm(),
            files = [],
            fields = [];
        form.uploadDir = './files/';
        form.keepExtensions = true;
        form
            .on('progress', function(bytesReceived, bytesExpected) {
                console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
            })
            .on('file', function(name, file) {
                files.push([name, file]);
            })
            .on('error', function(err) {
                console.log('ERROR!');
                res.end();
            })
            .on('end', function() {
                console.log('-> upload done');
                res.writeHead(200, "OK", {
                    "Content-Type": "text/html", "Access-Control-Allow-Origin": "http://test"
                });
                res.end('received files: ' + util.inspect(files));
            });
        form.parse(req);
    } else {
        res.writeHead(404, {'content-type': 'text/plain'});
        res.end('404');
    }
    return;
}).listen(8080);
console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');
socket.on('news', function(data){
    console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
    fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);
io.sockets.on('connection', function(socket) {
    socket.emit('news', {hello: "world"});
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form handler. I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client, while the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here. I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for a single client and then broadcast the percentage to that room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
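One way to wire the two packages together (a sketch; the header name, event name, and helper names below are all assumptions, not formidable or socket.io API): have the client generate an upload id and send it both with the XHR (say, as an X-Upload-Id header) and over the socket when it connects, then forward formidable's progress events to the socket registered under that id.

```javascript
// Registry linking an upload id to the socket that should receive
// progress events. registerSocket/reportProgress are hypothetical names.
var sockets = {};

function registerSocket(uploadId, socket) {
    sockets[uploadId] = socket; // called from io's 'connection' handler
}

function reportProgress(uploadId, bytesReceived, bytesExpected) {
    var socket = sockets[uploadId];
    if (!socket) return null; // no client listening for this upload
    var pct = Math.round(bytesReceived / bytesExpected * 100);
    socket.emit('progress', { pct: pct }); // push the percentage to that one client
    return pct;
}

// Inside the formidable handler, read the id and forward progress:
// var uploadId = req.headers['x-upload-id'];
// form.on('progress', function (received, expected) {
//     reportProgress(uploadId, received, expected);
// });
```

On the client, `socket.on('progress', ...)` would then update the progress element, replacing the console.log in the server's progress handler.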

NodeJS Web App File Upload Chops Off Beginning Of File

I'm working on a project in NodeJS which involves file upload. The upload is done on the client side with the code:
$('#file-upload').bind('change focus click', function() {
    var file = jQuery(this)[0].files[0];
    if (file && file.fileName) {
        var xhr = new XMLHttpRequest();
        xhr.upload.addEventListener('progress', onProgressHandler, false);
        xhr.upload.addEventListener('load', transferComplete, false);
        xhr.open('POST', '/upload', true);
        xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
        xhr.setRequestHeader('X-File-Name', encodeURIComponent(file.fileName));
        xhr.setRequestHeader('Content-Type', 'application/octet-stream');
        xhr.send(file);

        function onProgressHandler(evt) {
            // use the handler's own argument, not the global `event`
            var percentage = evt.loaded / evt.total * 100;
            console.log(percentage);
        }

        function transferComplete(evt) {
            console.log('Done');
        }
    }
});
And on the server-side, I use:
app.post('/upload', function(req, res, next) {
    if (req.xhr) {
        console.log('Uploading...');
        var fName = req.header('x-file-name');
        var fSize = req.header('x-file-size');
        var fType = req.header('x-file-type');
        var ws = fs.createWriteStream('./' + fName);
        req.on('data', function(data) {
            console.log('DATA');
            ws.write(data);
        });
        req.on('end', function() {
            console.log('All Done!!!!');
        });
    }
});
This code does work alone, but when combined with the rest of my much larger project, it seems to chop off the beginning of large files and ignore small files altogether. If I upload a small file, the console.log('DATA') never fires; it does fire for large files, but not for the beginning of the file. I believe for some reason the browser starts sending the file early, and by the time my function picks it up, the beginning (or, in the case of a small file, the entire thing) has already been sent. I don't know what would be causing this, though.
Thanks!
I figured it out. There was so much logic between my route being defined and the actual file upload code running that it wasn't ready listening for the file.
I am having this exact same problem. It bothers me that having too much logic between the request and the on('data') event is the problem. I'm testing with a local server, and the amount of logic between the start of the request and registering the on('data') listener is negligible; yet the fact that I don't need to cross the internet to do my upload seems to make this problem even worse. Are you still experiencing this issue?
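The classic remedy for this race on Node streams is to pause() the request as soon as the route fires and resume() only after the 'data'/'end' listeners are attached, so chunks arriving during the slow setup are buffered instead of lost. In the question's code that would mean req.pause() at the top of the handler and req.resume() right after the req.on('end', ...) line. A generic sketch (collectAfterDelay and its arguments are illustrative names, not a real API):

```javascript
// Pause the source before any slow setup, attach listeners,
// then resume so chunks buffered in the meantime are replayed.
function collectAfterDelay(source, delayedSetup, done) {
    source.pause(); // chunks arriving now are buffered, not dropped
    delayedSetup(function () {
        var chunks = [];
        source.on('data', function (c) { chunks.push(c.toString()); });
        source.on('end', function () { done(chunks.join('')); });
        source.resume(); // replay everything buffered while paused
    });
}
```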
