I'm currently observing some Ember arrays like so:
observe_array: function() {
  this.get("my_array").forEach(function(e, i) {
    // do something
  });
}.observes("my_array.@each");
Most of the time, when my_array is updated, multiple elements are added at once.
However, the observer fires one by one as each element is added, which becomes extremely inefficient. Is there any way to do this more efficiently? Essentially, I need to be able to maintain a mutated array based on "my_array".
For reference, realistic sizes of my_array will be between 600 and 1200 elements. The "do something" block involves operations that take a little more time: creating Date objects from strings and converting each element to a JSON representation.
Instead of an observer I also tried a property with the cacheable() method/flag, but that didn't seem to speed things up very much.
Assuming (via the comments) that your array is an ember-data populated one, you should try observing the array's isUpdating property. I had success with this one.
The only drawback is that it is only set when using .findAll() (so Model.find()).
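A minimal sketch of that approach, keeping the names from the question (it assumes my_array is a record array returned by Model.find(), since that's when isUpdating is set):
observe_array: function() {
  // isUpdating flips back to false once the bulk load completes,
  // so the expensive work runs once per load instead of once per element.
  if (!this.get("my_array.isUpdating")) {
    this.get("my_array").forEach(function(e, i) {
      // do something
    });
  }
}.observes("my_array.isUpdating");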
I'm in a loop where I add several new keys (about 1 to 3) to an IndexedDB row.
The Dexie table looks like:
{
  event_id: <event_id>,
  <event data>,
  groups: {
    <group_id>: { group_name: <name>, <group data> }
  }
}
I add the keys using Dexie's modify() callback, in a loop:
newGroupNr++
db.event.where('event_id').equals(event_id).modify(x => x.groups[newGroupNr]=objData)
objData is a simple object containing some group attributes.
However, when I add two or three groups this way, only one group is actually written to the database. I've tried wrapping the calls in a transaction(), but no luck.
I have the feeling that the modify() calls overlap each other, since they run asynchronously. I'm not sure whether that's true, nor how to deal with this scenario.
Dexie modify():
https://dexie.org/docs/Collection/Collection.modify()
Related:
Dexie : How to add to array in nested object
EDIT: I found the problem, and it's not related to Dexie. It comes down to how JavaScript closures work: the modify() callback closes over the variable newGroupNr itself, not over its value at the moment of the call. In the next iteration of the loop, before Dexie was able to finish, the counter was incremented, so two callbacks effectively wrote to the same key. Copying the value into a block-scoped local fixed it:
newGroupNr++
let newGroupNrLocal = newGroupNr
db.event.where('event_id').equals(event_id).modify(x => x.groups[newGroupNrLocal]=objData)
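For completeness, this is how the whole loop would look with the per-iteration copy (a sketch; groupsToAdd is a hypothetical array of group-attribute objects standing in for the question's loop source):
for (const objData of groupsToAdd) {
  newGroupNr++
  // Copy the counter into a block-scoped local so each modify()
  // callback captures its own value rather than the shared variable.
  let newGroupNrLocal = newGroupNr
  db.event.where('event_id').equals(event_id).modify(x => x.groups[newGroupNrLocal] = objData)
}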
There's a bug in Safari that hits Dexie's modify() method in Dexie versions below 3. If that's the case, upgrade Dexie to the latest version. If it's not that, try debugging and nailing down when the modify() callbacks actually run. A transaction won't help, as all IndexedDB operations go through transactions anyway, and the modification you make should by no means overwrite the other.
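For instance, you could log inside the callback to see when each one actually runs (a quick sketch reusing the names from the question):
db.event.where('event_id').equals(event_id).modify(x => {
  // Log which key is written and when, to spot overlapping callbacks.
  console.log('writing group', newGroupNrLocal, 'at', Date.now())
  x.groups[newGroupNrLocal] = objData
})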
I never quite understood the "if needed" part of the description.
.fetchAll()
Fetches the given list of Parse.Object.
.fetchAllIfNeeded()
Fetches the given list of Parse.Object if needed.
What is the situation where I might use this and what exactly determines the need? I feel like it's something super elementary but I haven't been able to find a satisfactory and clear definition.
In the example in the API, I notice that fetchAllIfNeeded() has:
// Objects were fetched and updated.
in its success callback, while fetchAll only has:
// All the objects were fetched.
So does fetchAllIfNeeded() also save stuff too? Very confused here.
UPDATES
TEST 1
Going on some of the hints #danh left in the comments, I tried the following things.
var todos = [];
var x = new Todo({content:'Test A'}); // Parse.Object
todos.push(x);
x.save();
// So now we have a todo saved to parse and x has an id. Async assumed.
x.set({content:'Test B'});
Parse.Object.fetchAllIfNeeded(todos);
So in this scenario, my client-side x is different from the server's copy. But x.hasChanged() is false, since we used the set function and the change event has already been triggered. fetchAllIfNeeded returns no results, so it isn't simply comparing the object outright to what is on the server in order to sync and fetch.
I notice that in the request payload, running fetchAllIfNeeded sends the following interesting thing:
{where: {objectId: {$in: []}}, _method: "GET",…}
So it seems that on the client side, something determines whether an object is "needed".
TEST 2
So now, based on the comments, I tried manipulating the changed state of the object by setting with {silent: true}.
x.set({content:'Test C'}, {silent:true});
x.hasChanged(); // true
Parse.Object.fetchAllIfNeeded(todos);
Still nothing interesting. Clearly the server state ("Test A") is different from the client state ("Test C"), yet I still get [] as results, and the request payload is:
{where: {objectId: {$in: []}}, _method: "GET",…}
UPDATE 2
Figured it out by looking at the Parse source. See answer.
After many manipulations, and then a look at the source, I figured this out. Basically, fetchAllIfNeeded will fetch the models in an array that have no data, meaning they have no attribute properties or values.
So the use case would be: you have, let's say, a parent object with an array of nested Parse Objects. When you fetch the parent object, the nested child objects in the array will not be included (unless you have the include query constraint set). Instead, the pointers are sent back to the client side, and in your client those pointers are translated into "empty" models with no data, basically just blank Parse.Objects with ids.
Specifically, the Parse.Object has an internal Boolean property called _hasData, which seems to be toggled to true any time something like set or fetch gives that model attributes.
So, let's say you need to fetch those child objects. You can just do something like
var childObjects = parent.get('children'); // Array
Parse.Object.fetchAllIfNeeded(childObjects);
And it will fetch those children that are currently only represented as empty objects with an id.
It's useful, as opposed to fetchAll, in that you might go through the children array and lazily load one at a time as needed, then at a later time need to "get the rest". fetchAllIfNeeded essentially just filters down to "the rest" and sends a query with an $in constraint that limits fetching to those child objects that have no data.
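A sketch of that lazy-loading pattern (assuming parent has already been fetched and its children attribute is an array of pointers; the promise style here stands in for the callback style used in the docs):
var children = parent.get('children');

// Lazily load just the first child when it's needed...
children[0].fetch().then(function() {
  // ...then later, "get the rest": only the still-empty shells are fetched.
  return Parse.Object.fetchAllIfNeeded(children);
}).then(function(fetched) {
  // Every child now has data; already-populated ones were skipped.
});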
In the Parse documentation, they have a comment in the callback response to fetchAllIfNeeded as:
// Objects were fetched and UPDATED.
I think they mean the client-side objects were updated. fetchAllIfNeeded definitely sends GET calls, so I doubt anything is updated on the server side. So this isn't some sync function. This really confused me, as I instantly thought of server-side updating, when they really mean:
// Client objects were fetched and updated.
I am using d3.map() in an update pattern to map some values.
My code looks like this:
selectedNeighborhood.map(function(d) { rateById.set(d.endneighborhood, d.rides); });
My issue is that when I make a new selection and run a new update, instead of replacing the existing map with a new set of values, the map is expanded. I would like my map to reset back to empty every time I run my update function. How do I go about this?
One working method (not clean) is to set the map object to {} and then assign a fresh map to the variable name.
Re-setting rateById to a new, blank map is not entirely unclean, but it could cause bugs if some object or function out there retains a reference to the old map in a separate variable, in which case that existing reference wouldn't update to point to the newly created map.
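A contrived illustration of that pitfall (variable names hypothetical):
var rateById = d3.map();
var alias = rateById;  // some other code keeps a reference to the same map
rateById = d3.map();   // "resetting" by reassignment
alias.set('a', 1);     // writes to the old map; the new rateById stays empty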
You want to clear the map "in place" (i.e. mutate it, so that the variable rateById continues to point to the same d3.map). You can do so by looping over its entries and removing them one by one:
rateById.forEach(function(key) { rateById.remove(key); });
As a side note: it's not a big deal, but using Array's map() for looping, as in selectedNeighborhood.map(...), ends up instantiating and returning a new array of undefined values. If selectedNeighborhood were a giant array, this would be wasteful in terms of memory and CPU. Using selectedNeighborhood.forEach(...) instead achieves the same result without creating the extra array, so it's more appropriate. The sketch below puts both points together.
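An update function combining the in-place clear with forEach (a sketch, assuming d3 v3's d3.map API as used in the question):
function update(selectedNeighborhood) {
  // Clear the existing map in place so outside references stay valid.
  rateById.forEach(function(key) { rateById.remove(key); });
  // Repopulate with forEach: same effect as map(), minus the throwaway array.
  selectedNeighborhood.forEach(function(d) {
    rateById.set(d.endneighborhood, d.rides);
  });
}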
I have a line graph that is updated every 5 seconds as new data is pulled from a MySQL database.
https://gist.github.com/Majella/5fc4cd5f41a6ddf2df23
How do I remove the first/oldest element from the array each time the data is fetched, to stop the line/path from being compressed?
I've tried adding data.shift() in the update function just after the data is fetched, but it only works for the first call.
I don't know the details of what lives behind getdata.php, but I assume it returns progressively more data points each time, so removing only the first one still leaves you with a larger data set than you want. You have a couple of choices:
Change the server-side of getdata.php to return only the latest x data points (or maybe add a querystring parameter for how many points/minutes/whatever to retrieve)
Change the client side in updateData to check the length of the array and .slice() off the elements starting at lengthYouReceived minus lengthYouWant (assuming the data is already sorted correctly), as sketched below
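A sketch of that second option (maxPoints is a hypothetical window size; data is assumed to be the array returned by getdata.php, sorted oldest-first):
var maxPoints = 60; // hypothetical: keep only the newest 60 points
if (data.length > maxPoints) {
  // Drop everything before the last maxPoints elements.
  data = data.slice(data.length - maxPoints);
}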
I have an array of users who are managers.
However, there are repeated Users.
I would like to group them so that there is only one instance of each User in the array.
What would be the best way to go about this?
#managers.sort_by{ |obj| obj.id } # Just sorted the data but did not eliminate duplicates
#managers.group_by{ |u| u.name } # Just created a bunch of arrays, one for each name
Use the uniq method, which returns a new array with duplicates removed.
#managers.uniq
If by duplicate you mean the same object ID, then you can do the following:
#managers.uniq.group_by(&:name)
Filtering the array feels like fixing symptoms. Why does the array contain rubbish in the first place?
I would suggest adding a manager? method to your User model that returns true if the user is a manager. Then you could do something like
#managers = User.select &:manager?
and get an array that only contains managers.
You can also do
Manager.select('DISTINCT user_id')
to get a clean array in the first place, with better performance.