How do I get file names in Fine Uploader's onAllComplete event

I'm trying to retrieve the filenames for the successfully uploaded files in the onAllComplete() event. My code below returns "undefined" for the getName() call. I'm not sure if the file name is in fact no longer defined at this point or if I'm attempting to access it incorrectly. Any help will be much appreciated. Thanks!
var uploader = $('#fine-uploader').fineUploader({
    ...
    callbacks: {
        onAllComplete: function (succeeded, failed) {
            if (failed.length > 0) {
                alert("Error: Some files were not uploaded");
            } else {
                if (succeeded.length > 0) {
                    alert("Success!");
                }
                this.reset();
            }
            for (var id in succeeded) {
                alert(this.getName(id))
            }
        },
        ...
    },
    ...
});

The culprit is this.reset(): it clears the uploader's state, including the stored file names, before the loop reads them.
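A reordered callback that reads the names first should behave as expected (a minimal sketch; note that succeeded is an array of file IDs, so it is iterated directly rather than with for...in, which yields array indices instead of IDs):

onAllComplete: function (succeeded, failed) {
    if (failed.length > 0) {
        alert("Error: Some files were not uploaded");
    } else if (succeeded.length > 0) {
        alert("Success!");
    }
    // Read the names while the uploader still has them...
    succeeded.forEach(function (id) {
        alert(this.getName(id));
    }, this);
    // ...and only reset afterwards, since reset() discards all file state.
    this.reset();
}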


If table contains something then do something

How can I code this: if the table contains some text, then do something?
I tried it with contains, but that throws an error if the table does not contain the text.
if (cy.get(tableCode).contains('td', value)) {
    cy.get(tableCode).contains('td', value).click()
} else {
    cy.reload()
}
Thanks for your time.
The proper way to use jQuery :contains() would be:
cy.get(tableCode).then($table => {
    // Interpolate the variable -- 'td:contains(value)' would match the literal text "value"
    const $cell = $table.find(`td:contains(${value})`)
    if ($cell.length) {
        $cell.click()
    } else {
        cy.reload()
    }
})
You can use the jQuery length property to check whether the element is present in the table or not:
cy.get(tableCode).then(($ele) => {
    // jQuery objects have no .contains() method; use the :contains selector instead
    if ($ele.find(`td:contains(${value})`).length > 0) {
        // Element found
        cy.wrap($ele).find(`td:contains(${value})`).click()
    } else {
        // Element not found
        cy.reload()
    }
})
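When this check is needed in more than one test, the same logic can be wrapped in a custom command (a sketch under the same assumptions; clickCellOrReload is a hypothetical name):

Cypress.Commands.add('clickCellOrReload', (tableSelector, value) => {
    cy.get(tableSelector).then(($table) => {
        // Plain jQuery traversal keeps the check synchronous, so the
        // if/else runs without Cypress retry-ability getting in the way
        const $cell = $table.find(`td:contains(${value})`)
        if ($cell.length) {
            cy.wrap($cell.first()).click()
        } else {
            cy.reload()
        }
    })
})

A test would then call cy.clickCellOrReload(tableCode, value).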

FineUploader 5.14.0 - setParams not working with large files

I've had the following code working for quite a while now (over a year), but a user has tried to upload a 14MB file and the extra data I post along with the upload no longer seems to get posted.
In Chrome dev tools I look at the headers of the (single) XHR and I see the data in the "Form Data" section, but nothing gets to the server, which I don't understand.
Files that are a few MB or smaller work without issue. I've not found a magic MB limit yet.
The extra data is set in the onUpload callback. board_hash is defined in the head of the page.
var fu_instance = new qq.FineUploader({
    element: $uploader[0],
    template: 'agenda_file_template',
    debug: true,
    request: {
        endpoint: '/m/upload',
        forceMultipart: false,
        customHeaders: {
            Accept: 'application/json'
        }
    },
    autoUpload: false,
    messages: {
        noFilesError: "There are no files to upload. Select or drag and drop some files to upload."
    },
    failedUploadTextDisplay: {
        mode: 'custom',
        responseProperty: 'error'
    },
    callbacks: {
        onSubmit: function (id, filename) {
            // File added to upload
            $uploader.addClass('hide-drop-msg');
            $btn_submit_upload.html('Upload').show();
            unsaved = true;
        },
        onUpload: function (id, name) {
            fu_instance.setParams({'board_hash': board_hash, 'parent': $parent.val()});
        },
        onCancel: function (id, name) {
            // Actually onBeforeCancel
            if ($uploader.find('ul.qq-upload-list li').length == 1) {
                // There is currently 1 & it's about to be axed
                $uploader.removeClass('hide-drop-msg');
                $btn_reset_uploads.hide();
                $btn_submit_upload.html('Upload').show();
                unsaved = false;
            }
        },
        onError: function (id, name, reason, resp) {
            // Specific file error
            if (resp.hasOwnProperty('auth_expired')) {
                window.location.href = auth_url;
            }
        },
        onComplete: function (id, name, resp) {
            if (resp.success) {
                var $parent_el = $('#' + $parent.val());
                $parent_el.find('.files').append(resp.html);
                $parent_el.find('.no-agenda-files').hide();
            }
        },
        onAllComplete: function (succeeded, failed) {
            // Every file is done
            $btn_submit_upload.hide();
            $btn_reset_uploads.show();
            unsaved = false;
        }
    }
});
My understanding is that chunking is off by default. Have I configured this wrong, or am I using the wrong callback?
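No accepted answer is recorded here, but two things may be worth checking (a sketch, not a confirmed fix): chunking is indeed off unless chunking.enabled is set, and setParams accepts an optional file id as a second argument, so the parameters can be scoped to the upload being sent; parameters that never change can also be declared up front via the request.params option:

request: {
    endpoint: '/m/upload',
    forceMultipart: false,
    customHeaders: { Accept: 'application/json' },
    // Static parameters can be declared at construction time
    params: { board_hash: board_hash }
},
callbacks: {
    onUpload: function (id, name) {
        // The optional second argument scopes the params to this one upload
        fu_instance.setParams({ board_hash: board_hash, parent: $parent.val() }, id);
    }
}

Whether either change fixes the 14MB case would also need testing against the server's request-size limits.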

How to wait until all bulk writes are completed in the Elasticsearch API

I'm using the NodeJS Elasticsearch client and trying to write a data importer to bulk import documents from MongoDB. The problem I'm having is that the index refresh doesn't seem to wait until all documents are written to Elasticsearch before checking the counts.
I'm using the streams API in Node to read the records into a batch, then using the Elasticsearch bulk command to write the records, as shown below:
function rebuildIndex(modelName, queryStream, openStream, done) {
    logger.debug('Rebuilding %s index', modelName);
    async.series([
        function (next) {
            deleteType(modelName, function (err, result) {
                next(err, result);
            });
        },
        function (next) {
            var Model;
            var i = 0;
            var batchSize = settings.indexBatchSize;
            var batch = [];
            var stream;
            if (queryStream && !openStream) {
                stream = queryStream.stream();
            } else if (queryStream && openStream) {
                stream = queryStream;
            } else {
                Model = mongoose.model(modelName);
                stream = Model.find({}).stream();
            }
            stream.on("data", function (doc) {
                logger.debug('indexing %s', doc.userType);
                batch.push({
                    index: {
                        "_index": settings.index,
                        "_type": modelName.toLowerCase(),
                        "_id": doc._id.toString()
                    }
                });
                var obj;
                if (doc.toObject) {
                    obj = doc.toObject();
                } else {
                    obj = doc;
                }
                obj = _.clone(obj);
                delete obj._id;
                batch.push(obj);
                i++;
                if (i % batchSize == 0) {
                    console.log(chalk.green('Loaded %s records'), i);
                    client().bulk({
                        body: batch
                    }, function (err, resp) {
                        if (err) {
                            next(err);
                        } else if (resp.errors) {
                            next(resp);
                        }
                    });
                    batch = [];
                }
            });
            // When the stream ends write the remaining records
            stream.on("end", function () {
                if (batch.length > 0) {
                    console.log(chalk.green('Loaded %s records'), batch.length / 2);
                    client().bulk({
                        body: batch
                    }, function (err, resp) {
                        if (err) {
                            logger.error(err, 'Failed to rebuild index');
                            next(err);
                        } else if (resp.errors) {
                            logger.error(resp.errors, 'Failed to rebuild index');
                            next(resp);
                        } else {
                            logger.debug('Completed rebuild of %s index', modelName);
                            next();
                        }
                    });
                } else {
                    next();
                }
                batch = [];
            });
        }
    ], function (err) {
        if (err)
            logger.error(err);
        done(err);
    });
}
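Separate from the refresh question, there is a timing hazard in the loop above: the "data" handler issues client().bulk without waiting for it, so the "end" handler (and therefore the series callback) can fire while earlier batches are still in flight. Pausing the stream until each batch is acknowledged avoids that (a sketch using the stream's pause/resume; only the batch-flush branch is shown):

stream.on("data", function (doc) {
    // ... build up batch exactly as above ...
    if (i % batchSize === 0) {
        stream.pause(); // stop reading until this batch is acknowledged
        client().bulk({ body: batch }, function (err, resp) {
            if (err || resp.errors) return next(err || resp);
            batch = [];
            stream.resume();
        });
    }
});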
I use this helper to check the document counts in the index. Without the timeout, the counts in the index are wrong, but with the timeout they're okay.
/**
 * A helper function to count the number of documents in the search index for a particular type.
 * @param type The type, e.g. User, Customer etc.
 * @param done A callback to report the count.
 */
function checkCount(type, done) {
    async.series([
        function (next) {
            setTimeout(next, 1500);
        },
        function (next) {
            refreshIndex(next);
        },
        function (next) {
            client().count({
                "index": settings.index,
                "type": type.toLowerCase(),
                "ignore": [404]
            }, function (error, count) {
                if (error) {
                    next(error);
                } else {
                    next(error, count.count);
                }
            });
        }
    ], function (err, count) {
        if (err)
            logger.error({"err": err}, "Could not check index counts.");
        done(err, count[2]);
    });
}
And this helper is supposed to refresh the index after the update completes:
// required to get results to show up immediately in tests. Otherwise there's a 1 second delay
// between adding an entry and it showing up in a search.
function refreshIndex(done) {
    client().indices.refresh({
        "index": settings.index,
        "ignore": [404]
    }, function (error, response) {
        if (error) {
            done(error);
        } else {
            logger.debug("refreshed index");
            done();
        }
    });
}
The loader works okay, except this test fails because of timing between the bulk load and the count check:
it('should be able to rebuild and reindex customer data', function (done) {
    this.timeout(0); // otherwise the stream reports a timeout error
    logger.debug("Testing the customer reindexing process");
    // pass null to use the generic find all query
    searchUtils.rebuildIndex("Customer", queryStream, false, function () {
        searchUtils.checkCount("Customer", function (err, count) {
            th.checkSystemErrors(err, count);
            count.should.equal(volume.totalCustomers);
            done();
        });
    });
});
I observe random results in the counts from the tests. With the artificial delay (the setTimeout in the checkCount function) the counts match, so I conclude that the documents are eventually written to Elasticsearch and the test would pass. I thought indices.refresh would essentially force a wait until the documents are all written to the index, but it doesn't seem to work with this approach.
The setTimeout hack is not really sustainable when the volume goes up to actual production levels, so how can I ensure the bulk calls are completely written to the Elasticsearch index before checking the document count?
Take a look at the "refresh" parameter (see the Elasticsearch documentation).
For example:
let bulkUpdatesBody = [ /* bulk actions / docs to index go here */ ]
client.bulk({
    refresh: "wait_for",
    body: bulkUpdatesBody
});
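Applied to the rebuild loop in the question, each client().bulk call would gain that option so its callback only fires once the documents are visible to search (note that refresh: "wait_for" requires Elasticsearch 5.0 or later):

client().bulk({
    refresh: "wait_for", // callback fires only after the docs are searchable
    body: batch
}, function (err, resp) {
    if (err || resp.errors) return next(err || resp);
    batch = [];
});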
I'm not sure if this is the answer or not, but I flushed the index prior to checking the count. It appears to work, but I don't know if that's just because of the timing between the calls. Perhaps someone from the Elastic team knows whether flushing the index will really solve the issue?
function checkCount(type, done) {
    async.series([
        function (next) {
            client().indices.flush({
                "index": settings.index,
                "ignore": [404]
            }, function (error, resp) {
                // flush doesn't return a count; just pass any error along
                next(error);
            });
        },
        function (next) {
            refreshIndex(next);
        },
        function (next) {
            client().count({
                "index": settings.index,
                "type": type.toLowerCase(),
                "ignore": [404]
            }, function (error, count) {
                if (error) {
                    next(error);
                } else {
                    next(error, count.count);
                }
            });
        }
    ], function (err, count) {
        if (err)
            logger.error({"err": err}, "Could not check index counts.");
        done(err, count[2]);
    });
}

TypeError: productlist.products.push is not a function

Here is the code for the factory:
LifeStyleFactoryMOdule.factory("PurchaseFactory", function () {
    var productlist = { products: [] };
    return {
        getpurchaseCart: function () {
            return productlist;
        },
        addPurchaseCart: function (products) {
            productlist.products.push(products);
        },
        deletePurchase: function (idx) {
            productlist.products.splice(idx, 1);
        }
    };
});
Services
LifeStyleServiceModule.service("PurchaseService", function (PurchaseFactory) {
    this.getAllPurchase = function () {
        return PurchaseFactory.getpurchaseCart();
    };
    this.addPurchase = function (products) {
        PurchaseFactory.addPurchaseCart(products);
    };
    this.deletePurchase = function (idx, id) {
        PurchaseFactory.deletePurchase(idx, id);
    };
});
Controller with the function:
LifeStyleController.controller("PurchaseController", function ($scope, PurchaseService) {
    $scope.savepurchase = function (products) {
        if ($scope.products._id == undefined) {
            $scope.products = angular.extend($scope.products, $scope.sizes);
            PurchaseService.addPurchase($scope.products);
            $scope.products = {};
            $scope.sizes = { size: [] };
        }
    };
});
In the HTML I have a button through which all the data is sent.
The first time I push the data it works successfully, but the second time it shows the error productlist.products.push is not a function.
Knowing the answer would be a great help.
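No answer is recorded for this one, but spelling the dependencies out in AngularJS's array annotation makes the dependency list explicit and survives minification (a sketch of the same controller registration):

LifeStyleController.controller("PurchaseController", ["$scope", "PurchaseService",
    function ($scope, PurchaseService) {
        $scope.savepurchase = function (products) {
            if ($scope.products._id == undefined) {
                $scope.products = angular.extend($scope.products, $scope.sizes);
                PurchaseService.addPurchase($scope.products);
                $scope.products = {};
                $scope.sizes = { size: [] };
            }
        };
    }
]);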

Problems with using nsIURIContentListener in Firefox Extension

I am developing a small extension that has to redirect certain URLs to another site. It's working fine, except for one situation: if I open a link with "Context Menu -> Open in New Tab", the current page is redirected to my page and a second tab opens with the link that should have been redirected. What am I doing wrong? Is there a better way to achieve what I want?
var myListener =
{
    QueryInterface: function (iid)
    {
        if (iid.equals(Components.interfaces.nsIURIContentListener) ||
            iid.equals(Components.interfaces.nsISupportsWeakReference) ||
            iid.equals(Components.interfaces.nsISupports))
            return this;
        throw Components.results.NS_NOINTERFACE;
    },
    onStartURIOpen: function (aUri)
    {
        if (check_url(aUri)) {
            getBrowser().mCurrentTab.linkedBrowser.loadURI(######REDIRECT IS HERE#############);
            return true;
        }
        return false;
    },
    doContent: function (aContentType, aIsContentPreferred, aRequest, aContentHandler)
    {
        throw Components.results.NS_ERROR_NOT_IMPLEMENTED;
    },
    canHandleContent: function (aContentType, aIsContentPreferred, aDesiredContentType)
    {
        throw Components.results.NS_ERROR_NOT_IMPLEMENTED;
    },
    isPreferred: function (aContentType, aDesiredContentType)
    {
        try
        {
            var webNavInfo =
                Components.classes["@mozilla.org/webnavigation-info;1"]
                    .getService(Components.interfaces.nsIWebNavigationInfo);
            return webNavInfo.isTypeSupported(aContentType, null);
        }
        catch (e)
        {
            return false;
        }
    },
    GetWeakReference: function ()
    {
        throw Components.results.NS_ERROR_NOT_IMPLEMENTED;
    }
}
The complete extension can be found here: http://github.com/bitboxer/firefox-detinyfy
Okay, I did some research. The hook was the wrong approach. I changed the code now. Look into the Git repository to find out more...
