Extjs store load success handler not getting fired - ajax

I have a store load method which returns data via an ajax request. I can see that the data is being returned using Firebug, but my success handler is not getting called:
this.getCategoriesStore().load({params:{'id':d.data.category_id}}, {
    success: function(category) {
        console.log("Category: " + category.get('name'));
    },
    error: function(e) {
        console.log(e);
    }
});
I am returning a success parameter, along with the data:
{"success":true,"categories":{"id":5,"name":"Frying","section_id":2}}
Is there something missing, or am I doing something wrong?

Well I suppose you are looking for this:
store.load({
    params: {'id': d.data.category_id},
    scope: this,
    callback: function(records, operation, success) {
        if (success) {
            // 'records' is the array of loaded records, not a single model
            console.log("Category: " + records[0].get('name'));
        } else {
            console.log('error');
        }
    }
});
It is not that obvious from the API docs that your additional params can be placed there too, but ExtJS often uses config objects to wrap things up like this.
Edit to answer comment:
The short answer is: Yes
Now the longer version:
In the case of the store, it is up to you whether to provide anonymous (or named) callbacks directly or to register event listeners. Both will work the same in your situation here.
But you can only have one callback, while you can have many event listeners. In other scenarios you will find that events fit much better, or that they are the only option at all; that is always the case when you need to keep listening over time. Here are some notes on that:
Make use of the { single: true } option when you only need a callback once. Example: store.on('load', function(s) { /* do something */ }, scope, { single: true }). The listener will be removed after it has been called. This is needed because an anonymous function cannot otherwise be removed by hand.
Make use of mon() in most cases where you bind listeners directly in class definitions, to ensure the listeners get destroyed along with the instance of the class (see the sketch after this list).
Both will save you browser memory.
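For example, a minimal sketch of the mon() pattern (Ext JS 4 class syntax; the store reference on the panel is an assumption here, not from the question):
Ext.define('MyApp.view.CategoryPanel', {
    extend: 'Ext.panel.Panel',

    initComponent: function() {
        this.callParent(arguments);
        // mon() ties the listener to this component's lifecycle,
        // so it is removed automatically when the component is destroyed
        this.mon(this.store, 'load', this.onStoreLoad, this);
    },

    onStoreLoad: function(store, records, successful) {
        // react to the loaded records here
    }
});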

Try this:
store.load({
    scope: this,
    callback: function(records, operation, success) {
        // the operation object contains all of the details
        // of the load operation
        console.log(records);
    }
});
According to the docs (http://docs.sencha.com/ext-js/4-1/#!/api/Ext.data.Store-method-load) there are no success and error callbacks on load, only callback.
As an alternative to providing callback, you can also add a "load" event listener on the store for the same effect, for example:
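A minimal sketch of that, reusing the question's store and params (Ext JS 4 load event signature assumed):
store.on('load', function(store, records, successful) {
    if (successful && records.length) {
        console.log("Category: " + records[0].get('name'));
    }
}, this, { single: true }); // single: true removes the listener after one call

store.load({params: {'id': d.data.category_id}});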


AJAX + jQuery Deferred: execution sequence

Task: get data from the server with $.post, process it in the success handler, and after that call some function.
var t;
$.when($.post("get_json.php", function(res) {
    t = res;
}, 'json')).done(function() {
    console.log(t);
});
Do I understand correctly that the Deferred method .done() is executed after the success handler is done (i.e. after t = res)?
But why does "console.log(t)" show "undefined"?
Does .done() fire after the request, but before the success handler?
Passing a "success" callback to $.post() is an alternative to the (preferred) chaining of .done(...). Do one or the other, not both; then you don't need to worry about the execution order.
Also, unless you have a decent caching strategy for async data, you shouldn't be setting t as an outer var.
$.post("get_json.php", ...).done(function(t) {
    console.log(t);
    //do awesome things with t here
});
Caching would be something like this:
var asyncCache = {};
...
function get_t() {
    return (asyncCache.t) ? $.when(asyncCache.t) : $.post("get_json.php", ...).done(function(t) {
        asyncCache.t = t;
    });
}
...
get_t().done(function(t) {
    console.log(t);
    //do awesome things with t here
});
$.when() is ONLY needed when you have multiple promises and you want to wait for all of them to complete. You simply don't need it at all for your single ajax call. You can just do it like this:
$.post("get_json.php").done(function(t) {
    // use the results of the ajax call here
    console.log(t);
});
In addition, your code example was using BOTH a success callback function AND a .done() handler. Pick one or the other, not both, as they are different ways of getting a callback when the ajax call is done. I'd suggest the promise implementation above because it's more flexible. But, you could also use just the success handler:
$.post("get_json.php", function(t) {
    // use the results of the ajax call here
    console.log(t);
}, 'json');
Note, when you have an asynchronous operation like this, you need to consume the results of the ajax call in the success callback (or the promise callback) or call some function from there and pass it the data. You do not want to put the data into a global variable or a variable in a higher scope because other code will simply have no way of knowing when the data is ready and when it is not. Put your action in the callback.
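For instance, a small sketch of "call some function from there and pass it the data" (useCategories is just a hypothetical name):
function useCategories(data) {
    // everything that depends on the response lives here,
    // or is called from here
    console.log(data);
}

$.post("get_json.php").done(useCategories);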
If you have more than one ajax call that you want to wait for, then you can do it like this:
$.when($.post("get_json.php"), $.post("get_json2.php")).done(function(r1, r2) {
    // use the results of the ajax calls here
    console.log(r1[0]); // results from first ajax call
    console.log(r2[0]); // results from second ajax call
});

Setting both $(document).ajaxSuccess and $.ajax().done()

I have several ajax calls and when they succeed I need, firstly, to do the same check for all of the responses and then, if the check does not fail, do different things with the responses.
At the moment I'm using the success option in each of the calls and inserting this check into each of them, like this:
$.ajax({
    success: function (data) {
        if (response_has_errors(data)) {
            return;
        }
        // do stuff
    }
});
So, I have this idea: use $(document).ajaxSuccess() for doing the same checking and then use $.ajax().done() or the success option with each of the calls.
But I need the handler in $(document).ajaxSuccess() to always be executed first, and if it returns false, not to execute the individual handlers.
How do I do that?
First you should make sure your server does not return a success status (200 OK) when the response is in fact an error. This saves you one step in your processing.
Then you could use jQuery's when() to process requests together.
$.when(
    $.ajax(...), $.ajax(...), $.ajax(...), $.ajax(...)
)
.then(function (result1, result2, result3, result4) {
    // all requests have successfully returned
})
.fail(function () {
    // handle error (inspect arguments)
})
.always(function () {
    // stop throbbers or other clean up work if necessary
});
Be sure to thoroughly read the documentation on Deferreds if you've never used them before.
Note that you can also pass an array of jQuery XHRs using apply():
$.when.apply($, allRequests).then( /* ... */ );
I could suggest using "dataFilter" from $.ajax(): http://api.jquery.com/jQuery.ajax/
You can put some error handling in this dataFilter, and if it fails, update some flag directly in the data and have all "success" handlers check for that flag before executing.
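A hedged sketch of that idea, assuming JSON responses and reusing the question's response_has_errors check (ajaxSetup makes the dataFilter apply to every call):
$.ajaxSetup({
    dataFilter: function (rawData, type) {
        if (type === 'json') {
            var parsed = JSON.parse(rawData);
            // set the flag once, centrally
            parsed.hasErrors = response_has_errors(parsed);
            return JSON.stringify(parsed); // jQuery parses it again for the handlers
        }
        return rawData;
    }
});

$.ajax({
    dataType: 'json',
    success: function (data) {
        if (data.hasErrors) { return; }
        // do stuff specific to this call
    }
});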

NodeJS wait for callback to finish on event emit

I have an application written in NodeJS with Express and am attempting to use EventEmitter to create a kind of plugin architecture, with plugins hooking into the main code by listening to emitted events.
My problem comes when a plugin function makes an async request (to get data from Mongo in this case). This causes the plugin code to finish and return control back to the original emitter, which will then complete execution before the async request in the plugin code finishes.
E.g:
Main App:
// We want to modify the request object in the plugin
self.emit('plugin-listener', request);
Plugin:
// Plugin function listening to 'plugin-listener', 'request' is an arg
console.log(request);
// Call to DB (async)
this.getFromMongo(some_data, function(response) {
    // this may not get called until the plugin function has finished!
});
My reason for avoiding a callback function back to the main code from the 'getFromMongo' function is that there may be 0 or many plugins listening to the event. Ideally I want some way to wait for the DB stuff to finish before returning control to the main app.
Many Thanks
Using the EventEmitter for plugin/middleware management is not ideal, because you cannot ensure that the listeners are executed sequentially if they have asynchronous code. This is especially a problem when these listeners interact with each other or with the same data.
That's why, for example, connect/express middleware functions are stored in an array and executed one after the other instead of through an EventEmitter; each one needs to call a next() function when it is done doing its task.
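A minimal sketch of that array-of-middleware idea (not connect's actual internals), reusing getFromMongo, some_data and the request object from the question:
function runPlugins(plugins, request, done) {
    var i = 0;
    (function next(err) {
        // stop on error or when every plugin has run
        if (err || i >= plugins.length) return done(err);
        var plugin = plugins[i++];
        plugin(request, next); // each plugin calls next() when its async work is done
    })();
}

// usage: plugins may modify the request, even asynchronously
runPlugins([
    function(request, next) {
        getFromMongo(some_data, function(err, response) {
            request.fromMongo = response;
            next(err);
        });
    }
], request, function(err) {
    // all plugins have finished (or one failed); safe to continue here
});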
You can't mix asynchronous calls with synchronous behavior. If you're going to stick with the event emitter (which may not be ideal for you, as Klovadis pointed out), you'll need to have your plugin emit an event that triggers a function in the main app containing the code that you want to 'wait' to execute. You would also have to keep track of all the plugin calls you made that you are waiting on, so that your main code doesn't run until all the plugins have finished their MongoDB callbacks.
var callList = ['pluginArgs1', 'pluginArgs2', 'pluginArgs3'];
var pending = callList.length;

self.on('plugin-callback', function(i) {
    // count completed callbacks instead of splicing by index,
    // since indices shift once elements are removed
    pending--;
    if (pending < 1) {
        //we're done, do something
    }
});

for (var i = 0; i < callList.length; i++) {
    self.emit('plugin-listener', callList[i], i);
}
Had the same kind of decision to make about some events that I sometimes need to wait for before returning the response to the client and sometimes not (when not in an HTTP request context).
The easiest way for me was to add a callback as the last argument of the event.
Stuff.emit('do_some_stuff', data, data2, callback);
In the event handler, check if there is a callback:
Stuff.on('do_some_stuff', function(data, data2, callback) {
    // stuff to do
    // ...
    if (typeof callback === "function") return callback(err, result);
});
I know that mixing events and callbacks can be messy, but that works fine for what I need.
The other solution I see is the one proposed by #redben: add an emit at the end of the event handler. The problem in an HTTP context is that you need unique keys so your events don't get mixed up when they do different things per user.
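A rough sketch of that per-request key idea (the 'done:' event name and doSomeAsyncWork are made up for illustration):
// emitter side, e.g. inside an Express handler
var requestId = Date.now() + '-' + Math.random();
Stuff.once('done:' + requestId, function(err, result) {
    // respond to this particular client here
});
Stuff.emit('do_some_stuff', data, requestId);

// listener side
Stuff.on('do_some_stuff', function(data, requestId) {
    doSomeAsyncWork(data, function(err, result) {
        Stuff.emit('done:' + requestId, err, result);
    });
});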
Haven't tried it myself, but you could use a property on the event's data object as an array of functions to be executed by the code that emitted the event:
Listeners
foo.on('your-event', function(data) {
    console.log(data);
    // Then add the asynchronous code to a callbacks array
    // in the event data object
    data.callbacks.push(function(next) {
        getFromMongo(some_data, function(err, result) { next(err); });
    });
});
Emitter
data.callbacks = []; // initialise the callbacks array before emitting
self.emit('your-event', data);
// listeners have modified the data object,
// some might have added callbacks to data.callbacks
// (suppose you use async)
async.series(data.callbacks);
This seems quite dangerous, but I have to do it anyway...
const { EventEmitter } = require("events");
const ee = new EventEmitter();

if (ee.listeners("async-event").length > 0) {
    await new Promise((resolve) => {
        ee.emit("async-event", data1, data2, resolve);
    });
}
Otherwise, just emit the event back-and-forth.
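For completeness, the listener side of the await sketch above might look like this (getFromMongo borrowed from the question; the last argument is the promise's resolve function):
ee.on("async-event", (data1, data2, done) => {
    getFromMongo(data1, (err, result) => {
        done(); // resolves the awaited Promise in the emitter
    });
});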

Extjs store.on('save', afterSave(resp));

I have a simple ExtJs (3.4) Grid with a Writer. When the user makes some changes the store is saved to the server as follows:
store.on('save', afterSave(resp));
All is fine. However, I want to get a response as to whether the record has been saved successfully, failed, or an update conflict happened. How best to do this?
Are you using Ext.data.proxy.Ajax to load your stores? If so, you can use the reader property to evaluate and handle the server responses.
Another option would be to make AJAX calls directly and handle the responses from there as well.
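A rough sketch of that direct-Ajax option in Ext JS 3.4 terms (the URL and the record payload here are assumptions, not from the question):
Ext.Ajax.request({
    url: '/categories/update', // hypothetical endpoint
    jsonData: record.data,
    success: function(response) {
        var result = Ext.util.JSON.decode(response.responseText);
        console.log(result.success ? 'UPDATE OK' : 'UPDATE FAILED');
    },
    failure: function(response) {
        console.log('UPDATE FAILED (HTTP ' + response.status + ')');
    }
});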
I used an exception listener to parse the data, as suggested here. But is this the right way to do it?
Ext.data.DataProxy.addListener('exception', function(proxy, type, action, options, res) {
    if (type == 'response') {
        var success = Ext.util.JSON.decode(res.responseText).success;
        if (success) {
            console.log('UPDATE OK');
        } else {
            console.log('UPDATE FAILED');
        }
    }
});

Calling multiple views in CouchApp query

I need to search CouchDB based on several criteria entered in a form: name, an array of tags, and so on. I would then need various views to index on these fields. Ultimately, all the results will be collated in data.js and provided to mustache.html. Say there are 3 views - docsByName, docsByTags, docsById.
What I don't know is how to query all these views in query.js. Can this be done, and how?
Or should the approach instead be to write one view that makes multiple emits for each search somehow?
Thank you.
From what you say I assume you are using Evently, so I will quote from the Evently primer:
The async function is the main star, which in this case makes an Ajax request (but it can do anything it wants). Another important thing to note is that the first argument to the async function is a callback which you use to tell Evently when you are done with your asynchronous action. [...] Whatever you pass to the callback function then becomes the first item passed to the data function.
In short: put your Ajax requests in async.js.
As a side note: Evently is only one of the possible choices to write a couchapp and it is not clear if it is maintained. However it works and it is easy to rearrange the code to not use it.
EDIT: here is a sample async function (cut&paste from an old program):
function(cb, e) {
    var app = $$(this).app;

    app.db.openDoc('SOMEDOCID', {
        error: function(code, error, reason) {
            alert("Error(" + code + " " + error + "): " + reason);
        },
        success: function(doc) {
            app.view('SOMEVIEWNAME', {
                include_docs: true,
                error: function(code, error, reason) {
                    alert("Error(" + code + " " + error + "): " + reason);
                },
                success: function(resp) {
                    resp.doc = doc;
                    cb(resp);
                }
            });
        }
    });
}
