Multiple parallel Increments on Parse.Object - parse-platform

Is it acceptable to perform multiple increment operations on different fields of the same object on Parse Server?
e.g., in Cloud Code:
node.increment('totalExpense', cost);
node.increment('totalLabourCost', cost);
node.increment('totalHours', hours);
return node.save(null,{useMasterKey: true});
MongoDB seems to support it, based on this answer, but does Parse?

Yes. One thing you can't do is both add and remove something from the same array within the same save. You can only do one of those operations. But, incrementing separate keys shouldn't be a problem. Incrementing a single key multiple times might do something weird but I haven't tried it.
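For reference, under the hood a save like the one above maps onto MongoDB's $inc operator, which can touch several fields in one atomic update. A rough, untested sketch of the equivalent shell command (the collection name, placeholder id, and values are assumptions based on the question's code):

```
// Rough MongoDB-shell equivalent of the Parse save above: a single
// atomic update that increments several fields of one document.
db.Node.updateOne(
  { _id: "someNodeId" },   // placeholder id
  { $inc: { totalExpense: 12.5, totalLabourCost: 12.5, totalHours: 2 } }
)
```

Since all three increments land in one $inc document, they are applied together; there is no partial state where only some of the fields have been incremented.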
FYI you can also use the .increment method on a key for a shell object. I.e., this works:
var node = new Parse.Object("Node");
node.id = request.params.nodeId;
node.increment("myKey", value);
return node.save(null, {useMasterKey:true});
Even though we never fetched the object, we don't need to know the previous value in order to increment it on the database. Note, however, that since the object's data wasn't fetched, you can't access any of its other attributes here.

Related

How to update Redis for dynamic values?

I am working with some caching using Redis in Node.js.
Here is my code implementation:
const value = await redis.get(key)
if (!value) {
  await redis.set(key, JSON.stringify({ "SomeValue": "SomeValue", "SomeAnotherValue": "SomeAnotherValue" }))
}
return redis.get(key)
Up to this point everything works well.
But consider a situation where I get the value from a function call, set it in Redis, and from then on read the same value from Redis whenever I want; in that case I don't need to call the function again and again to get the value.
Now suppose the values change, or more values are added by my actual API call; I then need to call that function again to update the values in Redis under the same key.
But I don't know how I can do this.
Any help would be appreciated.
Thank you in advance.
First thing: your initial code has a bug. You should use the set-if-not-exists functionality that Redis provides natively (SET with the NX option) instead of doing separate check-and-set calls, which can race.
What you are describing is called cache invalidation, and it is one of the hardest problems in software development.
You need to do a 'notify' in some way when the value changes, so that the fetchers know it is time to grab the most up-to-date value.
One simple way would be to keep a dirty boolean flag that is set to true when the value is updated; when fetching, check that flag: if it is dirty, read the fresh value and set the flag back to false, otherwise return the value from before.
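A minimal sketch of that dirty-flag pattern, using an in-memory Map as a stand-in for Redis so it is self-contained (fetchFromApi, updateSource, and the key names are invented for illustration):

```javascript
// In-memory stand-in for Redis so the sketch runs without a server.
const cache = new Map();

// Made-up source of truth; in the question this is the real API call.
let apiValue = { SomeValue: "v1" };
function fetchFromApi() {
  return JSON.stringify(apiValue);
}

let dirty = true; // true whenever the source value may have changed

function getValue(key) {
  if (dirty || !cache.has(key)) {
    cache.set(key, fetchFromApi()); // refresh the cached copy
    dirty = false;
  }
  return cache.get(key); // all other reads come from the cache
}

// When the underlying data changes, mark the cache dirty (the "notify"):
function updateSource(newValue) {
  apiValue = newValue;
  dirty = true;
}

console.log(getValue("myKey")); // fetched from the "API"
console.log(getValue("myKey")); // served from cache
updateSource({ SomeValue: "v2" });
console.log(getValue("myKey")); // refreshed after invalidation
```

With real Redis, the initial check-and-set pair would additionally collapse into a single atomic SET with the NX option, as the answer suggests.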

How to add multiple nested object keys in Dexie?

I'm in a loop where I add several new keys (about 1 to 3) to an indexeddb row.
The dexie table looks like:
{
event_id: <event_id>
<event data>
groups: {
<group_id> : { group_name: <name>, <group data> }
}
}
I add the keys using Dexie's modify() callback, in a loop:
newGroupNr++
db.event.where('event_id').equals(event_id).modify(x => x.groups[newGroupNr]=objData)
objData is a simple object containing some group attributes.
However, when I add two or three groups this way, only one group is actually written to the database. I've tried wrapping the calls in a transaction(), but no luck.
I have the feeling that the modify() calls overlap each other, as they run asynchronously. I'm not sure if this is true, nor how to deal with this scenario.
Dexie modify():
https://dexie.org/docs/Collection/Collection.modify()
Related:
Dexie : How to add to array in nested object
EDIT: I found the problem, and it's not related to Dexie. I don't fully understand why this fix works; I suspect it has to do with how JavaScript closures capture variables. My theory is that the modify() callback captured the newGroupNr variable itself rather than its value at that moment, and before Dexie had finished, the next iteration of the loop incremented it, effectively creating the same key twice. This fixed it:
newGroupNr++
let newGroupNrLocal = newGroupNr
db.event.where('event_id').equals(event_id).modify(x => x.groups[newGroupNrLocal]=objData)
There's a bug in Safari that affects Dexie's modify() method in Dexie versions below 3. If that's your case, upgrade Dexie to the latest version. If it's not that, try debugging and nailing down when the modify callbacks actually run. A transaction won't help, as all IndexedDB operations go through transactions anyway, and the modification you make should by no means overwrite the other.
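The capture behaviour the asker suspects can be reproduced without Dexie at all. A self-contained sketch, where invented callbacks stand in for Dexie's asynchronous modify() and run only after the loop has finished:

```javascript
// Simulate async callbacks (like Dexie's modify()) that read a shared
// counter only after the loop has finished running.
function demo() {
  const written = {};
  const pending = [];

  let groupNr = 0;
  for (const data of ["a", "b", "c"]) {
    groupNr++;
    // BUG version: the callback closes over the shared groupNr variable,
    // which has already advanced to 3 by the time the callbacks run, so
    // every callback writes the same key:
    // pending.push(() => { written[groupNr] = data; });

    // FIX: copy the current value into a block-scoped local, as in the edit.
    const groupNrLocal = groupNr;
    pending.push(() => { written[groupNrLocal] = data; });
  }

  pending.forEach(cb => cb()); // callbacks run later, as with IndexedDB
  return written;
}

console.log(demo()); // { '1': 'a', '2': 'b', '3': 'c' }
```

With the commented-out BUG version, only one key survives, which matches the symptom of "only one group is actually written".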

Doing an update instantly after read

Suppose I have multiple instances of an app reading a single row with a query like the following
r.db('main').
table('Proxy').
filter(r.row('Country').eq('es').and(r.row('Reserved').eq(false))).
min(r.row('LastRequestTimeMS'))
The 'Reserved' field is a bool.
I want to guarantee that the same instance that read that value also does an update to set 'Reserved' to true, preventing other instances from reading it.
Can someone please point me to how I can make this guarantee?
Generally the way you want to do something like this is to issue a conditional update instead, and look at the returned changes to get the read. Something like:
document.update(function(row) {
  return r.branch(row('Reserved'), r.error('reserved'), { Reserved: true });
}, { returnChanges: true })
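Applied to the Proxy table from the question, that might look like the following untested sketch (it assumes the JavaScript driver and that the row has already been selected by its primary key, here a made-up proxyId):

```
// Atomically reserve the row: the branch is evaluated on the server as
// part of the update, so no other client can slip in between the read
// and the write.
r.db('main').table('Proxy').get(proxyId).update(function(row) {
  return r.branch(
    row('Reserved'),        // already taken?
    r.error('reserved'),    // then abort this update
    { Reserved: true }      // otherwise claim it
  );
}, { returnChanges: true }).run(conn, function(err, result) {
  // On success, result.changes[0].old_val is the row as it was at the
  // moment this instance claimed it.
});
```

The instance that wins the race gets the change back; any other instance gets the 'reserved' error and can move on to the next candidate row.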

How to get initial documents when calling cursor.changes() with RethinkDB

The docs are explicitly vague about this:
http://rethinkdb.com/docs/changefeeds/javascript/
Point changefeeds will always return initial values and have an initializing state; feeds that return changes on unfiltered tables will never return initial values. Feeds that return changes on more complex queries may or may not return initial values, depending on the kind of aggregation.
Is there a way to force the initial documents through the changes feed?
Suppose I have an arbitrary query. We can call query.changes.run(//...) and get the change feed, but I want to make sure I get the initial documents. At the very least, I want consistency!
Currently there's no optarg you can put there to get that, but in the 2.2 release you'll be able to use the include_initial optarg for that: https://github.com/rethinkdb/rethinkdb/issues/3579
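Once on 2.2+, the JavaScript driver spells the optarg in camelCase. An untested sketch (conn and query are assumed to exist):

```
// Ask the changefeed to emit the current matching documents first,
// then switch over to streaming subsequent changes.
query.changes({ includeInitial: true }).run(conn, function(err, cursor) {
  if (err) throw err;
  cursor.each(function(err, change) {
    // Initial rows arrive as { new_val: ... } with no old_val;
    // later changes carry both old_val and new_val.
  });
});
```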

How are d3.map() values reset back to default?

I am using d3.map() in an update pattern to map some values.
My code looks like this:
selectedNeighborhood.map(function(d) { rateById.set(d.endneighborhood, d.rides); }) ;
My issue is, when I make a new selection and a new update, instead of replacing the existing map with a new set of values, the map is expanded. I would like my map to reset back to default every time I run my update function. How do I go about this?
One working method (though not clean) is to reassign the variable to a new, empty map, redefining the map altogether under the same variable name.
Re-setting rateById to a new, blank map is not entirely unclean, but it could cause bugs if some objects/functions out there retain a reference to the old map in a separate variable; those existing references wouldn't update to point to the newly created map.
You want to clear the map "in place" (i.e. mutate it, so that the var rateById continues to point to the same d3.map). You can do so by looping over its entries and removing them one by one:
rateById.forEach(function(key) { rateById.remove(key); });
As a side note: it's not a big deal, but using Array map() for looping, as in selectedNeighborhood.map(...), ends up instantiating and returning a new array of undefineds. If selectedNeighborhood were a giant array, this would be wasteful in terms of memory and CPU. Using selectedNeighborhood.forEach(...) instead achieves the same result without creating the new array, so it's more appropriate.
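The aliasing point is easy to demonstrate without the library. Here is a self-contained sketch using a tiny invented stand-in that mimics the relevant bits of d3 v3's map API (set/remove/forEach/size):

```javascript
// Minimal stand-in for a d3 v3 map: just the methods used in the answer.
function makeMap() {
  const store = {};
  return {
    set: (k, v) => { store[k] = v; },
    remove: k => { delete store[k]; },
    forEach: cb => { Object.keys(store).forEach(k => cb(k, store[k])); },
    size: () => Object.keys(store).length
  };
}

let rateById = makeMap();
const alias = rateById;    // e.g. a reference held by another closure
rateById.set("a", 1);
rateById.set("b", 2);

// Clear in place, as in the answer: the alias still sees the same map.
rateById.forEach(function(key) { rateById.remove(key); });
console.log(alias.size()); // 0: the alias sees the cleared map

// Re-assigning instead leaves the alias pointing at the old, stale map.
rateById = makeMap();
rateById.set("c", 3);
console.log(alias.size()); // still 0: the alias did not follow the re-assignment
```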