I really love the DropZoneJS component and am currently wrapping it in an EmberJS component (you can see a demo here). In any event, the wrapper works just fine, but I wanted to listen in on one of Dropzone's events and introspect the file contents (not the meta info like size, lastModified, etc.). The file type I'm dealing with is an XML file and I'd like to look "into" it to validate it before sending it.
How can one do that? I would have thought the contents would hang off of the file object that you can pick up on many of the events but unless I'm just missing something obvious, it isn't there. :(
This worked for me:
Dropzone.options.PDFDrop = {
  maxFilesize: 10, // MB
  accept: function(file, done) {
    var reader = new FileReader();
    reader.addEventListener("loadend", function(event) {
      console.log(event.target.result);
      done(); // accept the file once its contents have been read
    });
    reader.readAsText(file);
  }
};
You could also use reader.readAsBinaryString() if the data is binary!
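Since the question is about validating an XML file before upload, a variant of the same accept hook could run a quick parse check. This is only a sketch: the DOMParser-based check and the XMLDrop option key are assumptions, not part of the original answer.
Dropzone.options.XMLDrop = {
  accept: function(file, done) {
    var reader = new FileReader();
    reader.addEventListener("loadend", function(event) {
      // DOMParser reports XML parse failures via a <parsererror> element
      var doc = new DOMParser().parseFromString(event.target.result, "application/xml");
      if (doc.getElementsByTagName("parsererror").length > 0) {
        done("Not a valid XML file"); // passing a message rejects the file
      } else {
        done(); // calling done() with no argument accepts it
      }
    });
    reader.readAsText(file);
  }
};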
OK, I've answered my own question and, since others appear interested, I'll post my answer here. You can find a working demo of this here:
https://ui-dropzone.firebaseapp.com/demo-local-data
In the demo I've wrapped the Dropzone component in the EmberJS framework but if you look at the code you'll find it's just Javascript code, nothing much to be afraid of. :)
The things we'll do are:
Get the file before the network request
The key thing we need to become familiar with is the HTML5 File API. The good news is that it's quite simple. Take a look at this code and maybe that's all you need:
/**
 * Replaces the XHR's send operation so that the stream can be
 * retrieved on the client side instead of being sent to the server.
 * The function name is a little confusing (other than it replaces the "send"
 * from Dropzonejs) because really what it's doing is reading the file and
 * NOT sending it to the server.
 */
_sendIntercept(file, options={}) {
  return new RSVP.Promise((resolve, reject) => {
    if (!options.readType) {
      const mime = file.type;
      const textType = a(_textTypes).any(type => {
        const re = new RegExp(type);
        return re.test(mime);
      });
      options.readType = textType ? 'readAsText' : 'readAsDataURL';
    }
    let reader = new window.FileReader();
    reader.onload = () => {
      resolve(reader.result);
    };
    reader.onerror = () => {
      reject(reader.result);
    };
    // run the reader
    reader[options.readType](file);
  });
},
https://github.com/lifegadget/ui-dropzone/blob/0.7.2/addon/mixins/xhr-intercept.js#L10-L38
The code above returns a Promise which resolves once the file that's been dropped into the browser has been "read" into JavaScript. This should be very quick as it's all local (do be aware that if you're reading really large files you might want to "chunk" them ... that's a more advanced topic; a rough sketch follows below).
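For very large files, chunked reading with File.slice() might look roughly like this; the function and callback names are just illustrative, not part of the addon.
function readInChunks(file, chunkSize, onChunk, onDone) {
  var offset = 0;
  function readNext() {
    if (offset >= file.size) { onDone(); return; }
    var reader = new FileReader();
    reader.onload = function (e) {
      onChunk(e.target.result, offset); // hand each chunk to the caller
      offset += chunkSize;
      readNext();                       // then read the next slice
    };
    reader.readAsText(file.slice(offset, offset + chunkSize));
  }
  readNext();
}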
Hook into Dropzone
Now we need to find somewhere to hook into in Dropzone to read the file contents and stop the network request that we no longer need. Since the HTML5 File API just needs a File object you'll notice that Dropzone provides all sorts of hooks for that.
I decided on the "accept" hook because it would give me the opportunity to download the file and validate all in one go (for me it's mainly about dragging and dropping XML files, so the content of the file is part of the validation process) and, crucially, it happens before the network request.
Now it's important you realise that we're "replacing" the accept function, not listening to the event it fires. If we just listened we would still incur a network request. So to overload accept we do something like this:
this.accept = this.localAcceptHandler; // replace "accept" on Dropzone
This will only work if this is the Dropzone object. You can achieve that by:
including it in your init hook function
including it as part of your instantiation (e.g., new Dropzone({ accept: ... }))
Both options are sketched below.
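For illustration, the two options might look roughly like this; the element selector, option key, and handler wiring are assumptions for the sketch, not the addon's actual code.
// Option 1: inside the Dropzone init hook
Dropzone.options.myDropzone = {
  init: function () {
    this.accept = localAcceptHandler; // `this` is the Dropzone instance here
  }
};

// Option 2: at instantiation time
var dz = new Dropzone("#my-dropzone", { accept: localAcceptHandler });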
Now we've referred to the "localAcceptHandler", let me introduce it to you:
localAcceptHandler(file, done) {
  this._sendIntercept(file).then(result => {
    file.contents = result;
    if (typeOf(this.localSuccess) === 'function') {
      this.localSuccess(file, done);
    } else {
      done(); // empty done signals success
    }
  }).catch(result => {
    if (typeOf(this.localFailure) === 'function') {
      file.contents = result;
      this.localFailure(file, done);
    } else {
      done(`Failed to download file ${file.name}`);
      console.warn(file);
    }
  });
}
https://github.com/lifegadget/ui-dropzone/blob/0.7.2/addon/mixins/xhr-intercept.js#L40-L64
In quick summary it does the following:
read the contents of the file (via _sendIntercept)
based on mime type read the file either via readAsText or readAsDataURL
save the file contents to the .contents property of the file
Stop the send
To intercept the sending of the request on the network but still maintain the rest of the workflow we will replace a function called submitRequest. In the Dropzone code this function is a one liner and what I did was replace it with my own one-liner:
this._finished(files,'locally resolved, refer to "contents" property');
https://github.com/lifegadget/ui-dropzone/blob/0.7.2/addon/mixins/xhr-intercept.js#L66-L70
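Roughly, the replacement boils down to something like this; treat it as a sketch of the idea rather than the addon's literal code, and assume `this` is the Dropzone instance (e.g., inside its init hook):
// replace Dropzone's submitRequest so no XHR is ever sent
this.submitRequest = (xhr, formData, files) => {
  this._finished(files, 'locally resolved, refer to "contents" property');
};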
Provide access to retrieved document
The last step is just to ensure that our localAcceptHandler is put in place of the accept routine that dropzone supplies:
https://github.com/lifegadget/ui-dropzone/blob/0.7.2/addon/components/drop-zone.js#L88-L95
Using the FileReader() solution is working amazingly well for me:
Dropzone.autoDiscover = false;
var dz = new Dropzone("#demo-upload", {
  autoProcessQueue: false,
  url: 'upload.php'
});
dz.on("drop", function drop(e) {
  var files = [];
  for (var i = 0; i < e.dataTransfer.files.length; i++) {
    files[i] = e.dataTransfer.files[i];
  }
  var reader = new FileReader();
  reader.onload = function(event) {
    var line = event.target.result.split('\n');
    for (var i = 0; i < line.length; i++) {
      console.log(line[i]);
    }
  };
  reader.readAsText(files[files.length - 1]);
});
I'd like to create a simple Firefox extension to write a simple message to the Javascript console when a push notification is received from any site. I see there is a notification show event that seems like it should work although I can't seem to find an example of it in use. Is this possible?
I'm not sure that API would handle your use case because the event is not global; it is placed on the individual Notification object, e.g.:
var notify = new Notification("Hi there!");
notify.addEventListener('show', e => console.log("We showed it! ", e));
I can't think of a better solution if you want to watch for global events than what is mentioned in https://stackoverflow.com/a/36868084/4875295 -- Monkey Patching!
Copying the code from that answer for posterity:
function setNotificationCallback(callback) {
  const OldNotify = window.Notification;
  const newNotify = (title, opt) => {
    callback(title, opt);
    return new OldNotify(title, opt);
  };
  newNotify.requestPermission = OldNotify.requestPermission.bind(OldNotify);
  Object.defineProperty(newNotify, 'permission', {
    get: () => {
      return OldNotify.permission;
    }
  });
  window.Notification = newNotify;
}
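A quick usage example (what you do inside the callback is up to you; logging is just an illustration):
// log every notification the page creates
setNotificationCallback(function (title, opt) {
  console.log("Notification shown:", title, opt);
});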
I am working on a new web application that uses Web API as the business layer and Knockout.js as the client-side framework for binding. I have a requirement to pass certain search criteria to a Web API controller, get the data from the DB, and create and send an Excel/MS Word file on the fly as downloadable content.
I am new to both Web API and Knockout. I have searched the net and found only partial solutions, so I am looking here for a more optimal solution for this use case.
Below is my code:
Client:
function GetExcelFile() {
  var $downloadForm = $("<form method='POST'>")
    .attr("action", baseUrl + "api/FileHandler/GetExcelFileTest")
    .attr("target", "_blank");
  $("body").append($downloadForm);
  $downloadForm.submit();
  $downloadForm.remove();
}
On button click, this code snippet creates a form on the fly and gets the response from the Web API.
Web API Code:
[HttpPost]
public HttpResponseMessage GetExcelFileTest()
{
    var response = new HttpResponseMessage();
    // Create the file in the web app's physical folder
    string fileName = Guid.NewGuid().ToString() + ".xls";
    string filePath = HttpContext.Current.Server.MapPath(String.Format("~/FileDownload/{0}", fileName));
    StringBuilder fileContent = new StringBuilder();
    // Get data here
    DataTable dt = GetData();
    if (dt != null)
    {
        string str = string.Empty;
        foreach (DataColumn dtcol in dt.Columns)
        {
            fileContent.Append(str + dtcol.ColumnName);
            str = "\t";
        }
        fileContent.Append("\n");
        foreach (DataRow dr in dt.Rows)
        {
            str = "";
            for (int j = 0; j < dt.Columns.Count; j++)
            {
                fileContent.Append(str + Convert.ToString(dr[j]));
                str = "\t";
            }
            fileContent.Append("\n");
        }
    }
    // Write the data to the Excel file (use the full path so it matches FileProvider.Open below)
    using (StreamWriter sw = new StreamWriter(filePath, false))
    {
        sw.Write(fileContent.ToString());
    }
    IFileProvider FileProvider = new FileProvider();
    // Get the file stream
    FileStream fileStream = FileProvider.Open(filePath);
    // Set the response
    response.Content = new StreamContent(fileStream);
    response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment");
    response.Content.Headers.ContentDisposition.FileName = fileName;
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/ms-excel");
    response.Content.Headers.ContentLength = fileStream.Length;
    // Delete the file
    //if (File.Exists(filePath))
    //{
    //    File.Delete(filePath);
    //}
    return response;
}
Using this code I am able to download an Excel file, but I still have some open questions about how to make it more optimal.
Q1) How do I pass the view model (search criteria) to the API controller using the dynamically created form? Or is there a better way to get an Excel file from Web API?
Q2) I am sure it's not a good approach to create the Excel file in a physical folder, get a FileStream, and send it as the response. How can this be done on the fly, or are there other, more optimal ways?
Please suggest better ways to do this. Thanks.
Q1) You can quite easily pass the view-model, but it's also similarly easy to pull that information from the posted form.
Passing the view-model
If you want to pass the view-model to a WebAPI method then remember that said method must take as a parameter an object with the same properties. So if the object that you wish to post back always has the same properties then it's trivial to build a server-side class with the same properties and receive an instance of that class.
To post back this client-side object you can do something like this (uses jQuery, which I see you're already using):
$.ajax({
  contentType: "application/json",
  data: myViewModel.toJSON(),
  type: "POST",
  url: baseUrl + "api/FileHandler/GetExcelFileTest"
});
I haven't attached any success or error handlers here because the JavaScript isn't concerned with the return, but you might wish to add some handlers in case an exception is thrown in your WebAPI method. I recommend doing that by adding the following to the above $.ajax() call:
statusCode: {
  500: function(jqXhr, textStatus, errorThrown) {
  },
  [other HTTP error codes]
}
[Read the documentation for the $.ajax() call here.]
One additional tip here: when you call myViewModel.toJSON() (or self.toJSON(), if called from within your view-model) Knockout will first of all determine whether your view-model contains a toJSON() method. If so, it will use this method; if not, it will call the browser's implementation of this function. However, the browser's implementation will serialise everything, which can be particularly lengthy if you have, for example, long select lists in your view-model. Therefore, if you wish only to send back a subset of the view-model's properties, define your own toJSON function on your view-model like so:
var toJSON = function() {
  return {
    Property1: ...,
    Property2: ...
  };
}
[Read more about converting a view-model to JSON here.]
Posting the form as-is
If you don't wish to expend the effort to do the view-model wiring then you can just post the form exactly like you have in your question. You can then retrieve the values from the form by using
Request.Form["my-field"];
Q2)
You're probably right in pointing out that it's not wise to create the Excel file in the physical folder. However, as far as I'm aware (interested if someone says otherwise) you'll have to use a 3rd-party library for this. Microsoft do offer an Office automation library, but I have a suspicion that you also need Office to be installed on the server.
Creating Excel spreadsheets dynamically is something I've done several times but for the actual creation I use Aspose.Cells, which requires a license. Although I do create a physical version and then delete it, I believe Aspose.Cells may allow you to create it as a stream. But take a look around, there are certainly other libraries which offer Excel automation.
Returning the File from the Server
Calling $.ajax({...}) alone won't allow you to present the user with a "Save as..." dialog. What I do in this situation - and this won't work if you wish to store the generated file only in memory (FileStream, for example) and not on the file system - is to respond to the $.ajax({...}) call with a filename for the generated file.
The next step is to direct the user towards that filename.
So I have something like this in my JavaScript:
$.ajax({
  dataType: "json",
  type: "GET", // you'll probably want POST in your case
  url: ...,
  success: function(response) {
    if (response && response.Uri && response.Uri.length) {
      window.location.href = [root URL] + response.Uri;
    }
  }
});
But don't be alarmed by this redirect. That window.location.href points directly to the generated file in a folder on the server, no controller needed. Because the browser then receives a file, it presents the "Save as..." dialog while remaining on the same webpage.
I'm trying to implement a simple picture upload from the client to my MongoDB.
I've read many explanations but I can't find a way from start to finish.
My clientside -
function profilePic(input) {
  if (input.files && input.files[0]) {
    var file = input.files[0];
    localStorage.setItem('picture', JSON.stringify(file));
  }
}
Later on I take this JSON from localStorage and send it to my server side like this:
var request = false;
var result = null;
request = new XMLHttpRequest();
if (request) {
  request.open("POST", "usersEditProf/");
  request.onreadystatechange = function() {
    if (request.readyState == 4 && request.status == 200) {
      // ...More code to handle the server response
    }
  };
  request.setRequestHeader('content-type', 'application/json');
  request.send(JSON.stringify(localStorage.getItem('picture')));
}
On my serverside:
app.post('/usersEditProf/',users.editProfile);
/** Edits the Profile - sends the new one **/
exports.editProfile = function(req, res) {
var toEdit = req.body;
var newPic = toEdit.picture;
And that's where I get lost. Is newPic actually holding the picture? I doubt it...
Do I need to change the path? What is the new path I need to give the picture?
How do I put it in my DB? Do I need GridFS?
When I try to simply put that in my collection, it looks like this (example with an image called bar.jpg):
picture: "{\"webkitRelativePath\":\"\",\"lastModifiedDate\":\"2012-10-08T23:34:50.000Z\",\"name\":\"bar.jpg\",\"type\":\"image/jpeg\",\"size\":88929}",
If you want to upload a blob through XMLHttpRequest(), you need to use an HTML5 FormData object:
https://developer.mozilla.org/en-US/docs/Web/API/FormData
It allows you to specify a filename to push, then you handle the incoming file as you would with a MIME form post. Note the limitations on browser support when you use the FormData object. Your alternative is a form POST to a hidden frame, which works OK but is not nearly as clean looking in code as FormData.
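A rough sketch of that approach, assuming an <input type="file" id="pic"> element and your existing /usersEditProf/ endpoint (the field name 'picture' is an assumption):
var input = document.getElementById('pic');
var formData = new FormData();
formData.append('picture', input.files[0], input.files[0].name); // append the raw File, not JSON

var request = new XMLHttpRequest();
request.open('POST', '/usersEditProf/');
request.send(formData); // the browser sets the multipart/form-data headers for you
On the server you would then read the file from the multipart body with a multipart form parser instead of from req.body.picture.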
Is there a way to detect if a particular file that is being downloaded is a Gmail attachment?
I am looking for a way to write a Greasemonkey script which would help me organize the downloads, based on their download sources, say Gmail email attachments would have a different behavior from other stuff.
So far, I've found out that attachments redirect to https://mail-attachment.googleusercontent.com/attachment/u/0/ , which I guess is not sufficient.
EDIT
Since an add-on would be more powerful than a userscript, I've decided to pursue the add-on idea. However, the problem of detection remains unsolved.
This is too complicated for just one question; it has at least these major parts:
Do you want to redirect downloads when the user clicks, or automatically download select files? Clarify the question.
Your GM script must identify the appropriate download links, on which pages, and for which views. For Gmail this is not a trivial task, and the question needs to be clearer; it's worthy of a whole question just on this issue, given the variety of views and AJAX involved.
Once identified, the script probably needs to intercept clicks on those links. (Depends on your goal (clarify!) and what the Firefox extension can do.)
Greasemonkey needs to interact with an extension that either intercepts the user-initiated download, or allows for an automatic download. I've detailed the auto-download approach, below.
Once your script has identified the appropriate file URLs and/or links (Open a new question for more help with that, and include pictures of the types of pages and links you want.), it can interface with a Firefox add-on, like the one below, to automatically save those files.
Automatically saving files from Greasemonkey with the help of an additional Add-on:
WARNING: The following is a working proof of concept for education only. It has no security features, and if you use it as-is, for actual surfing, some webpage or script writer or extension writer will use it to completely pwn your computer.
If you use the Add-on builder or SDK to install or "Test" the DANGER. DANGER. DANGER. File download utility,
Then you can use a Greasemonkey script, like this, to automatically save files:
// ==UserScript==
// @name     _Call our File download add-on to trigger a file download.
// @include  https://mail.google.com/mail/*
// @include  https://stackoverflow.com/questions/14440362/*
// @require  http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js
// @grant    GM_addStyle
// ==/UserScript==
/*- The @grant directive is needed to work around a design change
    introduced in GM 1.0. It restores the sandbox.
*/
var fileURL = "http://userscripts.org/scripts/source/29222.user.js";
var savePath = "D:\\temp\\";
var extensionLoaded = false;

window.addEventListener ("ImAlivefromExtension", function (zEvent) {
    console.log ("The test extension appears to be loaded!", zEvent.detail);
    extensionLoaded = true;
} );

window.addEventListener ("ReplyToDownloadRequest", function (zEvent) {
    //var xxxx = JSON.parse (zEvent.detail);
    console.log ("Extension replied: ", zEvent.detail);
} );

$("body").prepend ('<button id="gmFileDownloadBtn">Click to File download request.</button>');

$("#gmFileDownloadBtn").click ( function () {
    if (extensionLoaded) {
        detailVal = JSON.stringify (
            {targFileURL: fileURL, targSavePath: savePath}
        );
        var zEvent = new CustomEvent (
            "SuicidalDownloadRequestToAddOn",
            {"detail": detailVal }
        );
        window.dispatchEvent (zEvent);
    }
    else {
        alert ("The file download extension is not loaded!");
    }
} );
You can test the script on this SO question page.
Note that any other extension, userscript, web page, or plugin can listen to or send spoof events, the only security, so far, is to limit which pages the extension runs on.
For reference, the extension source files are below. The rest is supplied by Firefox's Add-on SDK.
The content script:
var zEvent = new CustomEvent ("ImAlivefromExtension",
    {"detail": "GM, DANGER, DANGER, DANGER, File download utility" }
);
window.dispatchEvent (zEvent);

window.addEventListener ("SuicidalDownloadRequestToAddOn", function (zEvent) {
    console.log ("Extension received download request: ", zEvent.detail);
    //-- Relay request to extension main.js
    self.port.emit ("SuicidalDownloadRequestRelayed", zEvent.detail);

    //-- Reply back to GM, or whoever is pretending to be GM.
    var zEvent = new CustomEvent ("ReplyToDownloadRequest",
        {"detail": "Your funeral!" }
    );
    window.dispatchEvent (zEvent);
} );
The background JS:
//--- For security, MAKE THESE AS RESTRICTIVE AS POSSIBLE!
const includePattern = [
    'https://mail.google.com/mail/*',
    'https://stackoverflow.com/questions/14440362/*'
];

let {Cc, Cu, Ci} = require ("chrome");
Cu.import ("resource://gre/modules/Services.jsm");
Cu.import ("resource://gre/modules/XPCOMUtils.jsm");
Cu.import ("resource://gre/modules/FileUtils.jsm");

let data = require ("sdk/self").data;
let pageMod = require ('sdk/page-mod');
let dlManageWindow = Cc['@mozilla.org/download-manager-ui;1'].getService (Ci.nsIDownloadManagerUI);

let fileURL = "";
let savePath = "";
let activeWindow = Services.wm.getMostRecentWindow ("navigator:browser");

let mod = pageMod.PageMod ( {
    include: includePattern,
    contentScriptWhen: 'end',
    contentScriptFile: [ data.url ('ContentScript.js') ],

    onAttach: function (worker) {
        console.log ('DANGER download utility attached to: ' + worker.tab.url);

        worker.port.on ('SuicidalDownloadRequestRelayed', function (message) {
            var detailVal = JSON.parse (message);
            fileURL = detailVal.targFileURL;
            savePath = detailVal.targSavePath;
            console.log ("Received request to \ndownload: ", fileURL, "\nto:", savePath);

            downloadFile (fileURL, savePath);
        } );
    }
} );

function downloadFile (fileURL, savePath) {
    dlManageWindow.show (activeWindow, 1);
    try {
        let newFile;
        let fileURIToDownload = Services.io.newURI (fileURL, null, null);
        let persistWin = Cc['@mozilla.org/embedding/browser/nsWebBrowserPersist;1']
                            .createInstance (Ci.nsIWebBrowserPersist);
        let fileName = fileURIToDownload.path.slice (fileURIToDownload.path.lastIndexOf ('/') + 1);
        let fileObj = new FileUtils.File (savePath);
        fileObj.append (fileName);

        if (fileObj.exists ()) {
            console.error ('*** Error! File "' + fileName + '" already exists!');
        }
        else {
            let newFile = Services.io.newFileURI (fileObj);
            let newDownload = Services.downloads.addDownload (
                0, fileURIToDownload, newFile, fileName, null, null, null, persistWin, false
            );
            persistWin.progressListener = newDownload;
            persistWin.savePrivacyAwareURI (fileURIToDownload, null, null, null, "", newFile, false);
        }
    } catch (exception) {
        console.error ("Error saving the file! ", exception);
        dump (exception);
    }
}
From what you are saying, the only thing you can do is make an add-on (Firefox) and an extension (for Chrome, if you want).
If you have a closer look at downloading an attachment, it happens when:
1) You click on the attachment's icon
2) You click download
For these two cases you can find the click event on the <a> tag containing the download_url value. You can easily do that with JS/jQuery when creating the extension (a rough sketch follows below).
So you can hook in your own functionality when the user tries to download an attachment.
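A minimal sketch of that idea, assuming Gmail still renders its attachment links with a download_url attribute (Gmail's markup changes often, so treat that as an assumption):
// listen for clicks on attachment links in the capture phase, before Gmail handles them
document.addEventListener('click', function (e) {
  var link = e.target.closest && e.target.closest('a[download_url]');
  if (link) {
    console.log('Attachment download clicked:', link.getAttribute('download_url'));
    // custom handling (e.g., messaging your add-on) would go here
  }
}, true);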
You could use Gmail contextual gadgets to modify the behavior on the Google side:
Gmail Contexual Gadgets
Contextual Gadgets don't have direct access to attachments but server side, you could use IMAP to access the attachment (based on the Gmail message ID identified by the gadget):
Gmail IMAP Extensions
Using gadgets and server-side IMAP has the advantage of being browser-agnostic.
It's not entirely clear what you want to do differently with the downloaded Gmail attachment as opposed to any given download (save it to a different location? Perform actions upon the attachment data?) But the contextual gadget and IMAP should give you some chance to modify the attachment data as needed before the browser download begins.
So I'm building a multipart form uploader over ajax on node.js, and sending progress events back to the client over socket.io to show the status of their upload. Everything works just fine until I have multiple clients trying to upload at the same time. Originally what would happen is while one upload is going, when a second one starts up it begins receiving progress events from both of the forms being parsed. The original form does not get affected and it only receives progress updates for itself. I tried creating a new formidable form object and storing it in an array along with the socket's session id to try to fix this, but now the first form stops receiving events while the second form gets processed. Here is my server code:
var http = require('http'),
    formidable = require('formidable'),
    fs = require('fs'),
    io = require('socket.io'),
    mime = require('mime'),
    forms = {};

var server = http.createServer(function (req, res) {
  if (req.url.split("?")[0] == "/upload") {
    console.log("hit upload");
    if (req.method.toLowerCase() === 'post') {
      socket_id = req.url.split("sid=")[1];
      forms[socket_id] = new formidable.IncomingForm();
      form = forms[socket_id];
      form.addListener('progress', function (bytesReceived, bytesExpected) {
        progress = (bytesReceived / bytesExpected * 100).toFixed(0);
        socket.sockets.socket(socket_id).send(progress);
      });
      form.parse(req, function (err, fields, files) {
        file_name = escape(files.upload.name);
        fs.writeFile(file_name, files.upload, 'utf8', function (err) {
          if (err) throw err;
          console.log(file_name);
        });
      });
    }
  }
});

var socket = io.listen(server);
server.listen(8000);
If anyone could be any help on this I would greatly appreciate it. I've been banging my head against my desk for a few days trying to figure this one out, and would really just like to get this solved so that I can move on. Thank you so much in advance!
Can you try putting console.log(socket_id);
after form = forms[socket_id]; and
after progress = (bytesReceived / bytesExpected * 100).toFixed(0);, please?
I get the feeling that you might have to wrap that socket_id in a closure, like this:
form.addListener(
  'progress',
  (function (socket_id) {
    return function (bytesReceived, bytesExpected) {
      progress = (bytesReceived / bytesExpected * 100).toFixed(0);
      socket.sockets.socket(socket_id).send(progress);
    };
  })(socket_id)
);
The problem is that you aren't declaring socket_id and form with var, so they're actually global.socket_id and global.form rather than local variables of your request handler. Consequently, separate requests step over each other since the callbacks are referring to the globals rather than being proper closures.
rdrey's solution works because it bypasses that problem (though only for socket_id; if you were to change the code in such a way that one of the callbacks referenced form you'd get in trouble). Normally you only need to use his technique if the variable in question is something that changes in the course of executing the outer function (e.g. if you're creating closures within a loop).
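For completeness, a minimal sketch of that fix: declare the per-request variables with var inside the request handler so each request gets its own closure (only the changed lines are shown; the rest of the handler stays the same).
if (req.method.toLowerCase() === 'post') {
  var socket_id = req.url.split("sid=")[1];  // now local to this request
  var form = new formidable.IncomingForm();  // no shared global
  forms[socket_id] = form;
  form.addListener('progress', function (bytesReceived, bytesExpected) {
    var progress = (bytesReceived / bytesExpected * 100).toFixed(0);
    socket.sockets.socket(socket_id).send(progress);
  });
  // form.parse(req, ...) as before
}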