XHR progress events arrive too quickly - ajax

I'm doing a direct upload to S3 using an HTML form and ajax. It works: the files get uploaded. My problem is that the progress events arrive too quickly: all of them, right up to 99.9%, fire immediately at the start of the upload.
var fd = new FormData();
// put data from the form to FormData object
$.each($('#upload-form').serializeArray(), function(i, field) {
    fd.append(field.name, field.value);
});
// add selected filename to the form data
var file = document.getElementById('path-to-file').files[0];
fd.append("file", file);

var xhr = getXmlHttpRequest(); // cross-browser implementation
xhr.upload.addEventListener("progress", function(e) {
    console.log(e.loaded + "/" + e.total);
});
xhr.open('POST', 'http://mybucket.s3.amazonaws.com/', true);
xhr.send(fd);
I also tried it this way:
xhr.upload.onprogress = function(evt) {
    if (evt.lengthComputable) {
        console.log(evt.loaded + "/" + evt.total);
    }
};
The browser log looks like this:
[22:54:47.245] POST http://mybucket.s3.amazonaws.com/
[22:54:47.287] 359865/5680475
[22:54:47.330] 1408441/5680475
[22:54:47.408] 2751929/5680475
[22:54:47.449] 3964345/5680475
[22:54:47.509] 5668281/5680475
and then nothing for the time it actually takes to upload a 5 MB file.
If browser info is relevant, I'm on Firefox 20.0.1.

Related

How to download the model as a JSON file?

My model is held in a JavaScript object on the client side, where the user can edit its properties via UI controls. I want to offer the user an option to download a JSON file representing the model they're editing. I'm using ASP.NET Core MVC with .NET 6.
What I've tried
Action method (using Newtonsoft.Json to serialize the model to JSON):
public IActionResult Download([FromForm] SomeModel someModel)
{
    var json = JsonConvert.SerializeObject(someModel);
    var characters = json.ToCharArray();
    var bytes = new byte[characters.Length];
    for (var i = 0; i < characters.Length; i++)
    {
        bytes[i] = (byte)characters[i];
    }
    var stream = new MemoryStream();
    stream.Write(bytes);
    stream.Position = 0;
    return this.File(stream, "APPLICATION/octet-stream", "someFile.json");
}
Code in the view to call this method:
<button class="btn btn-primary" onclick="download()">Download</button>
And the event handler for this button (using jQuery's ajax magic):
function download() {
    $.ajax({
        url: 'https://hostname/ControllerName/Download',
        method: 'POST',
        data: { someModel: someModel },
        success: function (data) {
            console.log('downloading', data);
        },
    });
}
What happened
The browser console shows that my model has been posted to the server, serialized to JSON, and the JSON has been returned to the browser. However, no file is downloaded.
Something else I tried
I also tried a link like this to call the action method:
@Html.ActionLink("Download", "Download", "ControllerName")
What happened
This time a file was downloaded. However, ActionLink can only make GET requests, which have no request body, so the user's model isn't passed to the server and the downloaded file instead represents a default instance of SomeModel.
The ask
So I know I can post my model to the server, serialize it to JSON and return that JSON to the client, and I know I can get the browser to download a JSON-serialized version of a model, but how can I do both in the same request?
Edit: What I've done with the answer
I've accepted Xinran Shen's answer because it works as-is, but since I believe that just copying code from Stack Overflow without understanding what it does or why isn't good practice, I did a bit of digging, and my version of the saveData function now looks like this:
function saveData(data, fileName) {
    // Convert the data to a JSON string and store it in a blob, a file-like
    // object which can be downloaded without it existing on the server.
    // See https://developer.mozilla.org/en-US/docs/Web/API/Blob
    var json = JSON.stringify(data);
    var blob = new Blob([json], { type: "octet/stream" });

    // Create a URL from which the blob can be downloaded - see
    // https://developer.mozilla.org/en-US/docs/Web/API/URL/createObjectURL
    var url = window.URL.createObjectURL(blob);

    // Add a hidden hyperlink to the page, which will download the file when clicked
    var a = document.createElement("a");
    a.style = "display: none";
    a.href = url;
    a.download = fileName;
    document.body.appendChild(a);

    // Trigger the click event on the hyperlink to download the file
    a.click();

    // Release the blob's URL.
    // Browsers do this when the page is unloaded, but it's good practice to
    // do so as soon as it's no longer needed.
    window.URL.revokeObjectURL(url);

    // Remove the hidden hyperlink from the page
    a.remove();
}
Hope someone finds this useful
First, your code is right. If you call this action method without ajax, you'll find it downloads the file successfully. But you can't achieve this with ajax alone, because the success callback only receives the response data and JavaScript can't write it to disk by itself; you need to use a Blob to save the file. Change your JavaScript like this:
function download() {
    $.ajax({
        url: 'https://hostname/ControllerName/Download',
        method: 'POST',
        data: { someModel: someModel },
        success: function (data) {
            var fileName = "my-download.json";
            saveData(data, fileName);
        },
    });
}
var saveData = (function () {
    var a = document.createElement("a");
    document.body.appendChild(a);
    a.style = "display: none";
    return function (data, fileName) {
        var json = JSON.stringify(data),
            blob = new Blob([json], { type: "octet/stream" }),
            url = window.URL.createObjectURL(blob);
        a.href = url;
        a.download = fileName;
        a.click();
        window.URL.revokeObjectURL(url);
    };
}());
I think you may need FileStreamResult; you also need to set the MIME type to a text or JSON content type.
// instead of this
return this.File(stream, "APPLICATION/octet-stream", "someFile.json");

// try this
return new FileStreamResult(stream, new MediaTypeHeaderValue("text/plain"))
{
    FileDownloadName = "someFile.txt"
};

// or
return new FileStreamResult(stream, new MediaTypeHeaderValue("application/json"))
{
    FileDownloadName = "someFile.json"
};
Reference: https://www.c-sharpcorner.com/article/fileresult-in-asp-net-core-mvc2/

How can I catch data sent from a form via AJAX in ColdFusion and save it to a file?

I have the following JavaScript code:
function upload(blob) {
    var xhr = new XMLHttpRequest();
    var url = "test.cfm";
    xhr.onload = function(e) {
        if (this.readyState === 4) {
            console.log("Server returned: ", e.target.responseText);
        }
    };
    var fd = new FormData();
    fd.append("randomname", blob);
    xhr.open("POST", url, true);
    xhr.send(fd);
}
How can I catch it on the server side in ColdFusion and save the blob object to a file?
Could someone please share a code sample? Thanks.
PS: I'm pretty new to CF.
Since you are using FormData, you can access the form variable from the ajax request just like you would with a normal HTTP request:
#form.randomname#
#form['randomname']#
So you could save the content to a file with:
<cfscript>
    fileWrite( 'c:\myfile.txt', form.randomname );
</cfscript>

a file upload progress bar with node (socket.io and formidable) and ajax

I was in the middle of teaching myself some Ajax, and this lesson required building a simple file upload form locally. I'm running XAMPP on windows 7, with a virtual host set up for http://test. The solution in the book was to use node and an almost unknown package called "multipart" which was supposed to parse the form data but was crapping out on me.
I looked for the best package for the job, and that seems to be formidable. It does the trick and my file will upload locally and I get all the details back through Ajax. BUT, it won't play nice with the simple JS code from the book which was to display the upload progress in a progress element. SO, I looked around and people suggested using socket.io to emit the progress info back to the client page.
I've managed to get formidable working locally, and I've managed to get socket.io working with some basic tutorials. Now, I can't for the life of me get them to work together. I can't even get a simple console log message to be sent back to my page from socket.io while formidable does its thing.
First, here is the file upload form by itself. The script inside the upload.html page:
document.getElementById("submit").onclick = handleButtonPress;
var httpRequest;
function handleResponse() {
if (httpRequest.readyState == 4 && httpRequest.status == 200) {
document.getElementById("results").innerHTML = httpRequest.responseText;
}
}
function handleButtonPress(e) {
e.preventDefault();
var form = document.getElementById("myform");
var formData = new FormData(form);
httpRequest = new XMLHttpRequest();
httpRequest.onreadystatechange = handleResponse;
httpRequest.open("POST", form.action);
httpRequest.send(formData);
}
And here's the corresponding node script (the important part being form.on('progress')):
var http = require('http'),
    util = require('util'),
    formidable = require('formidable');

http.createServer(function(req, res) {
    if (req.url == '/upload' && req.method.toLowerCase() == 'post') {
        var form = new formidable.IncomingForm(),
            files = [],
            fields = [];
        form.uploadDir = './files/';
        form.keepExtensions = true;
        form
            .on('progress', function(bytesReceived, bytesExpected) {
                console.log('Progress so far: ' + (bytesReceived / bytesExpected * 100).toFixed(0) + "%");
            })
            .on('file', function(name, file) {
                files.push([name, file]);
            })
            .on('error', function(err) {
                console.log('ERROR!');
                res.end();
            })
            .on('end', function() {
                console.log('-> upload done');
                res.writeHead(200, "OK", {
                    "Content-Type": "text/html",
                    "Access-Control-Allow-Origin": "http://test"
                });
                res.end('received files: ' + util.inspect(files));
            });
        form.parse(req);
    } else {
        res.writeHead(404, {'content-type': 'text/plain'});
        res.end('404');
    }
    return;
}).listen(8080);
console.log('listening');
Ok, so that all works as expected. Now here's the simplest socket.io script which I'm hoping to infuse into the previous two to emit the progress info back to my page. Here's the client-side code:
var socket = io.connect('http://test:8080');
socket.on('news', function(data) {
    console.log('server sent news:', data);
});
And here's the server-side node script:
var http = require('http'),
    fs = require('fs');

var server = http.createServer(function(req, res) {
    fs.createReadStream('./socket.html').pipe(res);
});

var io = require('socket.io').listen(server);
io.sockets.on('connection', function(socket) {
    socket.emit('news', {hello: "world"});
});

server.listen(8080);
So this works fine by itself, but my problem comes when I try to place the socket.io code inside my form handling code. I've tried placing it anywhere it might remotely make sense, and I've tried the asynchronous mode of fs.readFile too, but it just won't send anything back to the client, while the file upload portion still works fine. Do I need to establish some sort of handshake between the two packages? Help me out here. I'm a front-end guy, so I'm not too familiar with this back-end stuff. I'll put this aside for now and move on to other lessons.
Maybe you can create a room for one single client and then broadcast the percentage to this room.
I explained it here: How to connect formidable file upload to socket.io in Node.js
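As a rough sketch of that idea (not taken from the linked answer), the client could generate an upload ID, join a socket.io room named after it, and pass the same ID along with the upload so the server knows which room to emit formidable's progress events to. The query parameter id, the event names start-upload and upload-progress, and the room naming are all assumptions made for this example:

// server-side sketch: emit formidable's progress into a per-upload room
var http = require('http'),
    url = require('url'),
    formidable = require('formidable');

var server = http.createServer(function(req, res) {
    var parsed = url.parse(req.url, true);
    if (parsed.pathname == '/upload' && req.method.toLowerCase() == 'post') {
        // the client sends its upload ID as ?id=..., telling us which room to target
        var uploadId = parsed.query.id;
        var form = new formidable.IncomingForm();
        form.uploadDir = './files/';

        form.on('progress', function(bytesReceived, bytesExpected) {
            io.sockets.in(uploadId).emit('upload-progress', {
                percent: Math.round(bytesReceived / bytesExpected * 100)
            });
        });

        form.parse(req, function(err, fields, files) {
            res.writeHead(200, {
                'Content-Type': 'text/plain',
                'Access-Control-Allow-Origin': 'http://test'
            });
            res.end('upload done');
        });
    } else {
        res.writeHead(404, {'content-type': 'text/plain'});
        res.end('404');
    }
}).listen(8080);

var io = require('socket.io').listen(server);
io.sockets.on('connection', function(socket) {
    // each client joins a room named after its own upload ID
    socket.on('start-upload', function(data) {
        socket.join(data.uploadId);
    });
});

// client-side sketch
var socket = io.connect('http://test:8080');
var uploadId = String(Math.random()).slice(2); // any unique-enough ID will do

socket.emit('start-upload', { uploadId: uploadId });
socket.on('upload-progress', function(data) {
    console.log('upload is ' + data.percent + '% done');
    // e.g. update a <progress> element here
});

// then POST the FormData to '/upload?id=' + uploadId instead of form.action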

How to send data from server to client via http?

I want to send the file path of a file on my server to the client so it can be played in a media player. How can I retrieve that string on the client side so I can concatenate it into the src attribute of a <video> element, without using sockets?
Server snippet:
res.set('content-type', 'text/plain');
res.send('/files/download.mp4');
This is how you make a request to the server without any frameworks. "/path_to_page" is the route you set to the page that is supposed to process the request.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/path_to_page', true);
xhr.onload = function(e) {
    if (this.status == 200) {
        console.log(this.responseText); // output will be "/files/download.mp4"
    }
};
xhr.send();
You might also want to send some params.
var formdata = new FormData();
formdata.append("param_name", "value");
So you might for instance want to send the filename or such.
You just need to change two lines from the first code snippet:
xhr.open('POST', '/path_to_page', true); // set to post to send the params
xhr.send(formdata); // send the params
To get the params on the server, if you are using express, they are in req.body.param_name
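If it helps, here's a minimal Express sketch of that server side. It assumes multer for parsing the body, because FormData is posted as multipart/form-data, which Express's built-in parsers don't handle; the route and parameter names are just the placeholders used above:

var express = require('express');
var multer = require('multer');
var app = express();

// multer().none() parses multipart/form-data bodies that contain only text
// fields, so req.body.param_name is available inside the handler
app.post('/path_to_page', multer().none(), function(req, res) {
    console.log('received:', req.body.param_name); // "value" from the FormData above
    res.set('content-type', 'text/plain');
    res.send('/files/download.mp4');
});

app.listen(8080);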
Which framework are you using? You can point the ajax request at the base path of your project directory followed by your file:
jQuery.ajax({
    type: "GET",
    url: "/files/download.mp4",
});
Since you are using express (on node), you could use socket.io:
Server:
var io = require('socket.io').listen(80),
    fs = require('fs');

io.sockets.on('connection', function (socket) {
    socket.on('download', function(req) {
        fs.readFile(req.path, function (err, data) {
            if (err) throw err;
            socket.emit('video', { video: data });
        });
    });
});
Client:
<script src="/socket.io/socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost');
    ...
    // request a download
    socket.emit('download', { path: '/files/download.mp4' });

    // receive a download
    socket.on('video', function (data) {
        // do sth with data.video;
    });
    ...
</script>
Edit: I didn't notice that you didn't want to use sockets. Still, it is a viable solution.

NodeJS Web App File Upload Chops Off Beginning Of File

I'm working on a project in NodeJS which involves file upload. The upload is done on the client side with the code:
$('#file-upload').bind('change focus click', function() {
    var file = jQuery(this)[0].files[0];
    if (file && file.fileName) {
        var xhr = new XMLHttpRequest();
        xhr.upload.addEventListener('progress', onProgressHandler, false);
        xhr.upload.addEventListener('load', transferComplete, false);
        xhr.open('POST', '/upload', true);
        xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
        xhr.setRequestHeader('X-File-Name', encodeURIComponent(file.fileName));
        xhr.setRequestHeader('Content-Type', 'application/octet-stream');
        xhr.send(file);

        function onProgressHandler(evt) {
            var percentage = evt.loaded / evt.total * 100;
            console.log(percentage);
        }

        function transferComplete(evt) {
            console.log('Done');
        }
    }
});
And on the server-side, I use:
app.post('/upload', function(req, res, next) {
    if (req.xhr) {
        console.log('Uploading...');
        var fName = req.header('x-file-name');
        var fSize = req.header('x-file-size');
        var fType = req.header('x-file-type');
        var ws = fs.createWriteStream('./' + fName);

        req.on('data', function(data) {
            console.log('DATA');
            ws.write(data);
        });

        req.on('end', function() {
            console.log('All Done!!!!');
        });
    }
});
This code does work on its own, but when combined with the rest of my much larger project it seems to chop off the beginning of large files and ignore small files altogether. If I upload a small file, the console.log('DATA') never fires; for large files it does fire, but not for the beginning of the file. I believe the browser is sending the file early, and by the time my handler picks it up the beginning (or, in the case of a small file, the entire thing) has already been sent. I don't know what would be causing this, though.
Thanks!
I figured it out. There was so much logic between my route being defined and the actual file upload code running that it wasn't yet listening for the file data.
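In case it helps anyone else, here's a minimal sketch of the fix implied by that: pause the request before doing anything asynchronous, attach the data/end listeners, then resume so no chunks arrive before you're listening. doSomeAsyncSetup is a hypothetical stand-in for whatever logic used to run between the route and the upload code:

var fs = require('fs');

// app is the same Express app as in the question
app.post('/upload', function(req, res, next) {
    req.pause(); // buffer incoming chunks until the listeners are attached

    doSomeAsyncSetup(function(err) { // hypothetical async work that used to run first
        if (err) { return next(err); }

        var ws = fs.createWriteStream('./' + req.header('x-file-name'));
        req.on('data', function(chunk) {
            ws.write(chunk);
        });
        req.on('end', function() {
            ws.end();
            res.send('All Done!!!!');
        });

        req.resume(); // let the body flow now that we're listening
    });
});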
I am having this exact same problem. It bothers me that having too much logic between the request and the on('data') handler is the cause. I'm testing with a local server, and the amount of logic between the start of the request and registering the data handler is negligible, yet the fact that I don't need to cross the internet to do my upload seems to make the problem that much worse. Are you still experiencing this issue?
