Query based on Pointer Values - parse-platform

I have read the majority of related questions, which haven't quite helped me.
I have three classes: User (the Parse default), DataSet, and Datapoint.
DataSet is linked to User by a pointer.
Datapoint is linked to DataSet by a pointer (the object ID of the DataSet).
I can easily load the DataSets for each Parse User. What I would like to do is load the Datapoints for a given DataSet.
I am using Angular, and I can pass the dataset object in; based on its ID I want to be able to get the Datapoints for that DataSet.
Here is what I have so far.
getMyDataPoint: function getMyDataPoint(dataset, callback) {
  // get the object ID of the dataset
  var datasetid = dataset.id;
  var query = new Parse.Query(Datapoint);
  // inDataset is the column name of the pointer linking to DataSet
  query.include('inDataset');
  query.equalTo('inDataset', datasetid);
  query.find({
    success: function(results) {
      callback(results);
      alert("found some here " + results);
    },
    error: function(error) {
      alert("Error: no datapoint found " + error.message);
    }
  });
}
As you can probably tell, this doesn't quite work. Any help?

Let's assume you fix up your classes as follows:
DataSet class
user: Pointer<User>
Datapoint class
inDataset: Pointer<DataSet>
Now you can query for Datapoint rows where the inDataset column matches a DataSet quite easily:
query.equalTo('inDataset', dataset);
Under the covers Parse's SDK extracts the object ID and matches on that, but using an actual pointer makes other types of queries possible that you couldn't do with just the strings as you are currently doing it.
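For reference, this is roughly what the SDK puts on the wire: a pointer constraint in Parse's REST format. The sketch below is illustrative only; `toPointer` is a made-up helper and the class name and object ID are placeholders.

```javascript
// Sketch of how an equalTo('inDataset', dataset) constraint is encoded
// in Parse's REST format (the helper and the objectId are hypothetical).
function toPointer(className, objectId) {
  return { __type: "Pointer", className: className, objectId: objectId };
}

var where = { inDataset: toPointer("DataSet", "xWMyZ4YEGZ") };
console.log(JSON.stringify(where));
// {"inDataset":{"__type":"Pointer","className":"DataSet","objectId":"xWMyZ4YEGZ"}}
```

Passing the full `dataset` object to `equalTo` lets the SDK build this structure for you, which is why the string-only version fails to match.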

Related

Using afterDelete trigger to modify a lot of users

When a specific object is deleted, I need to use an afterDelete trigger to remove references to the object that was just deleted. Specifically, the User class has a column that is a pointer to an object of the type that was just deleted. Therefore I need to unset that column for users who had it set to the deleted object. To do this I am querying for the users, looping over the results of the query, unsetting the attribute, then calling saveAll. My worry is that the query may return a lot of users, and I need to ensure all of them are updated.
My question is, do Cloud Code triggers have the 1000 max query limit? Is there a better way to unset this pointer once that object is deleted? Is there no automatic removal of pointers to this deleted object?
Parse.Cloud.afterDelete("Book", function(request) {
  Parse.Cloud.useMasterKey();
  var book = request.object;
  var userQuery = new Parse.Query(Parse.User);
  userQuery.equalTo("Favorite_Book", book);
  userQuery.limit(1000);
  userQuery.find({
    success: function(users) {
      for (var i = 0; i < users.length; i++) {
        users[i].unset("Favorite_Book");
      }
      Parse.Object.saveAll(users, {
        success: function(users) {},
        error: function(users, error) {
          console.error("Failed to update users: " + error.code + ": " + error.message);
        }
      });
    },
    error: function(error) {
      console.error("Failed to fetch users: " + error.code + ": " + error.message);
    }
  });
});
There are two main issues you need to be aware of:
A Parse query returns a maximum of 1000 records. To process more records, you need to paginate the results using the skip method on your query object. You can use promises in series to process all your records in batches of 1000.
On the Parse free plan, you are limited to 1800 requests per minute. This means that you cannot save/update a large number of records over a short time span.
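The skip/limit batching idea can be sketched in plain JavaScript. Here `fetchAllInBatches` and `fakeQuery` are hypothetical stand-ins for a real `query.find()`: the sketch chains promises in series, advancing `skip` by the batch size until a partial batch signals the end.

```javascript
// Sketch: page through results in series, batchSize records at a time.
// runQuery(skip, limit) stands in for a real query.find() call.
function fetchAllInBatches(runQuery, batchSize, handleBatch) {
  function page(skip) {
    return runQuery(skip, batchSize).then(function (results) {
      handleBatch(results);
      // a full batch means there may be more records; recurse in series
      return results.length === batchSize ? page(skip + batchSize) : undefined;
    });
  }
  return page(0);
}

// demo with a fake in-memory "query" of 2500 records
var records = [];
for (var i = 0; i < 2500; i++) records.push(i);
function fakeQuery(skip, limit) {
  return Promise.resolve(records.slice(skip, skip + limit));
}

var total = 0;
fetchAllInBatches(fakeQuery, 1000, function (batch) {
  total += batch.length;
}).then(function () {
  console.log("processed", total, "records"); // processed 2500 records
});
```

Note that this does not lift the rate limit: with very large classes you still need to spread the batches out over time.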

Data binding in D3 fails when using "cloned" data

D3 data binding seems to behave differently when using the original data object versus a cloned version of it. I have a function tableUpdate which updates an array of tables based on the passed array of arrays. If an array (representing one new table row) is added to the array of arrays and passed to the update function, all works as expected (the row is added to the table). If, however, we make a shallow copy (clone) of this data structure and pass it to the update function, the data binding fails and no table row is added. Please note that the original data structure and the clone are two different objects, but with identical values.
Please see this JSFiddle example. Two tables are generated, one fed the original data, the other the cloned data. The two tables are clearly different, as the second table (built using cloned data) does NOT contain the third row.
'use strict';
d3.select("body").append("h3").text("D3 Data Binding Issue");
// create two divs to hold one table each
var tableDiv1 = d3.select("body").append("div");
d3.select("body").append("hr");
var tableDiv2 = d3.select("body").append("div");
// define data
// here, an array of a single item (which represents a table), containing an array of arrays,
// each destined for a table row
var data = [
  { table: "Table1", rows: [
      { table: "Table1", row: "Row1", data: "DataT1R1" },
      { table: "Table1", row: "Row2", data: "DataT1R2" }
    ]
  }
];
// run update on the initial data
update(data);
// add 3rd array to the data structure (which should add a third row in each table)
data[0].rows.push({ table: "Table1", row: "Row3", data: "DataT1R3" });
// run update again
// observe that the lower table (which is using cloned data) does NOT update
update(data);
/*
// remove first array of the data structure
data[0].rows.shift();
// run update again
// observe that the lower table (which again is using cloned data) does NOT update
update(data);
*/
// function to run the tableUpdate function targeting two different divs, one with the
// original data, and the other with cloned data
function update(data) {
  // the contents of the two data structures are equal
  console.log("\nAre object values equal? ", JSON.stringify(data) == JSON.stringify(clone(data)));
  tableUpdate(data, tableDiv1, "Using Original Data");      // update first table
  tableUpdate(clone(data), tableDiv2, "Using Cloned Data"); // update second table
}
// generic function to manage array of tables (in this simple example only one table is managed)
function tableUpdate(data, tableDiv, title) {
  console.log("data", JSON.stringify(data));
  // get all divs in this table div
  var divs = tableDiv.selectAll("div")
      .data(data, function(d) { return d.table; }); // disable default by-index eval
  // remove div(s)
  divs.exit().remove();
  // add new div(s)
  var divsEnter = divs.enter().append("div");
  // append header(s) in new div(s)
  divsEnter.append("h4").text(title);
  // append table(s) in new div(s)
  var tableEnter = divsEnter.append("table")
      .attr("id", function(d) { return d.table; });
  // append table body in new table(s)
  tableEnter.append("tbody");
  // select all tr elements in the divs update selection
  var tr = divs.selectAll("table").selectAll("tbody").selectAll("tr")
      .data(function(d, i, a) { return d.rows; }, function(d, i, a) { return d.row; }); // disable by-index eval
  // remove any row(s) with missing data array(s)
  tr.exit().remove();
  // add row(s) for new data array(s)
  tr.enter().append("tr");
  // bind data to table cells
  var td = tr.selectAll("td")
      .data(function(d, i) { return d3.values(d); });
  // add new cells
  td.enter().append("td");
  // update contents of table cells
  td.text(function(d) { return d; });
}
// source: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm
function clone(objectToBeCloned) {
  return JSON.parse(JSON.stringify(objectToBeCloned));
}
Can anybody shed some light on this behavior? I believe I'm using the key functions properly, but could be wrong. In my application I need to re-generate the data structure before each table update, and I don't have option of reusing the original object.
The root of the problem is that you have a nested structure and .selectAll() doesn't update the data bound to the elements (but .append() automatically "inherits" the data). So the data that you use to render the table is simply not updated -- you can fix this by using .select() instead of .selectAll() (see the updated example).
The subtle difference between .select() and .selectAll() is that the former (similar to .append()) "inherits" the data bound to the elements in the current selection to the newly selected elements, while .selectAll() does not.
So why does it work for the original data? Well, D3 doesn't copy the data when it binds it to an element, but references it. By modifying the original data, you're also modifying what's bound to the elements. Hence simply running the code without rebinding any data works. The cloned data isn't updated as you're not modifying it directly.
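The reference-vs-clone distinction can be seen without D3 at all. A minimal sketch:

```javascript
// D3 binds data by reference; clone() produces an independent copy.
var original = [{ table: "Table1", rows: ["Row1", "Row2"] }];
var bound = original[0];                              // like D3's binding: a reference
var cloned = JSON.parse(JSON.stringify(original))[0]; // like clone(): a copy

original[0].rows.push("Row3"); // mutate the original, as the example code does

console.log(bound.rows.length);  // 3 -- the bound reference sees the mutation
console.log(cloned.rows.length); // 2 -- the clone is unaffected
```

This is exactly why the upper table "just works": the arrays already attached to its elements are the very arrays you mutated.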
Actually, the problem is due to an anti-pattern that you are using to "muscle" the tr structure.
The problem
During the second pass through tableUpdate, the key function finds a match on d.table for both the original and the cloned data. This is because the key is converted to a string during the binding process, so even though
d.table === data.table; // false
it's still a match because
d.table == data.table; // true
Therefore the enter selection is empty in both cases and all of this code
var divsEnter = divs.enter().append("div");
// append header(s) in new div(s)
divsEnter.append("h4").text(title);
// append table(s) in new div(s)
var tableEnter = divsEnter.append("table")
.attr("id", function(d) { return d.table });
// append table body in new table(s)
tableEnter.append("tbody");
does nothing.
So the original data is not re-bound and the new, cloned data is not bound. But...
The data bound to the first table now has three rows because, as Lars pointed out, it is bound by reference. So, for the first table,
divs.datum() === data; // true
and it now has three rows.
In the case of the cloned data, the key function also finds a match because you haven't changed the key: even though the data has an extra row, d.table is still "Table1". So you are telling the key function that it's the same table. Consequently, the enter selection is also empty and the new, cloned data is not bound either, so, for the second table,
divs.datum() === data; // false
d.table == data.table == "Table1" // um, true true
and it still has two rows.
The problem is that you use an anti-pattern to bind the data and build the tr elements.
Instead of selecting and binding the data by following the hierarchy of its structure, you go off piste: you go back to the div and ram the data straight down to the tr elements. This is dangerous because the returned tr elements are unqualified. None of the important context you gained from carefully selecting/creating the correct tbody element is used to ensure that these are the correct tr elements; they are, in fact, whatever tr elements happen to be lying around inside the div, regardless of which table they belong to.
In both cases you simply rebuild the tr elements using the original arrays that are still attached, which is fine for the first table but for the second one... not so much.
My "current theory" of best practice is to build your data structure to model the intended structure of your visualisation first, and then construct the DOM elements by walking that data structure, binding at each level and kicking the remaining data ahead of you as you go, until finally it's all bound.
The solution
You need to be truly "data driven" and strictly follow the data structure when building and binding your elements. I re-built your tableUpdate function below...
'use strict';
d3.select("body").append("h3").text("D3 Data Binding Issue").style({margin: 0});
// create two divs to hold one table each
var tableDiv1 = d3.select("body").append("div");
var tableDiv2 = d3.select("body").append("div");
// define data
// here, an array of a single item (which represents a table), containing an array of arrays,
// each destined for a table row
var data = [{
  table: "Table1",
  rows: [{
    table: "Table1",
    row: "Row1",
    data: "DataT1R1"
  }, {
    table: "Table1",
    row: "Row2",
    data: "DataT1R2"
  }]
}];
// run update on the initial data
update(data);
update(data);
// add 3rd array to the data structure (which should add a third row in each table)
data[0].rows.push({
  table: "Table1",
  row: "Row3",
  data: "DataT1R3"
});
// run update again
update(data);
/*
// remove first array of the data structure
data[0].rows.shift();
// run update again
update(data);
*/
// function to run the tableUpdate function targeting two different divs, one with the
// original data, and the other with cloned data
function update(data) {
  // the contents of the two data structures are equal
  console.log("\nAre object values equal? ", JSON.stringify(data) == JSON.stringify(clone(data)));
  tableUpdate(data, tableDiv1, "Using Original Data");      // update first table
  tableUpdate(clone(data), tableDiv2, "Using Cloned Data"); // update second table
}
// generic function to manage array of tables (in this simple example only one table is managed)
function tableUpdate(data, tableDiv, title) {
  console.log("data", JSON.stringify(data));
  // get all divs in this table div
  var divs = tableDiv.selectAll("div")
      .data(data, function (d) {
        return d.table;
      }); // disable default by-index eval
  // remove div(s)
  divs.exit().remove();
  // add new div(s)
  var divsEnter = divs.enter().append("div");
  // append header(s) in new div(s)
  divsEnter.append("h4").text(title);
  // append or replace table(s) in new div(s)
  var table = divs.selectAll("table")
      .data(function (d) {
        // the 1st dimension determines the number of elements
        // this needs to be 1 (one table)
        return [d.rows];
      }, function (d) {
        // need a unique key to differentiate table generations
        var sha256 = new jsSHA("SHA-256", "TEXT");
        return (sha256.update(JSON.stringify(d)),
          console.log([this.length ? "data" : "node", sha256.getHash('HEX')].join("\t")),
          sha256.getHash('HEX'));
      });
  table.exit().remove();
  // the table body will have the same data pushed down from the table
  // it will also be the array of array of rows
  table.enter().append("table").append("tbody");
  console.log(table.enter().size() ? "new table" : "same table");
  var tBody = table.selectAll("tbody");
  // select all tr elements in the divs update selection
  var tr = tBody.selectAll("tr")
      .data(function (d, i, a) {
        // return one element of the rows array
        return d;
      }, function (d, i, a) {
        return d.row;
      }); // disable by-index eval
  // remove any row(s) with missing data array(s)
  tr.exit().remove();
  // add row(s) for new data array(s)
  tr.enter().append("tr");
  // bind data to table cells
  var td = tr.selectAll("td")
      .data(function (d, i) {
        return d3.values(d);
      });
  // add new cells
  td.enter().append("td");
  // update contents of table cells
  td.text(function (d) {
    return d;
  });
}
// source: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm
function clone(objectToBeCloned) {
  return JSON.parse(JSON.stringify(objectToBeCloned));
}
table, th, td {
  border: 1px solid gray;
}
body > div { display: inline-block; margin: 10px; }
<body>
<script src="https://cdnjs.cloudflare.com/ajax/libs/d3/3.5.6/d3.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jsSHA/2.0.1/sha.js"></script>
The interesting thing(s)
The interesting thing is that the table bound to the original data never gets replaced. The reason being that, again, as mentioned by @Lars, the data is bound by reference.
As an experiment (and inspired by my love-hate relationship with git) I used a 256-bit SHA as a key, feeding it the stringified data. If you're managing a bunch of tables in the same space then maybe this is the way to go. If you always clone the data and calculate a SHA, that feels like a very secure approach.
By way of illustration, here is a redacted log (I added a second update with the same data at the start...)
This is the first pass where there are no nodes yet. The key function is only invoked once on each data element because the update selection is empty.
Are object values equal? true
data [{"table":"Table1","rows":[{"tab...,"data":"DataT1R2"}]}]
data a09a5ef8f6b81669eed13c93f609884...
new table ...
data [{"table":"Table1","rows":[{"tab...,"data":"DataT1R2"}]}]
data a09a5ef8f6b81669eed13c93f609884...
new table ...
...
This is the second call with the same data. You can see that the key function is called twice for each table and that the SHA is the same for both, hence the "same table" annotation.
Are object values equal? true ...
data [{"table":"Table1","rows":[{"tab...,"data":"DataT1R2"}]}]
node a09a5ef8f6b81669eed13c93f609884...
data a09a5ef8f6b81669eed13c93f609884...
same table ...
data [{"table":"Table1","rows":[{"tab...,"data":"DataT1R2"}]}]
node a09a5ef8f6b81669eed13c93f60...
data a09a5ef8f6b81669eed13c93f60...
same table
Here is the interesting case where, even though the data has changed, the key function returns the same sha for node and data for the first table. The second table is as expected, with different sha for node and data and a new table generated.
Are object values equal? true
data [{"table":"Table1","rows":[{...,"data":"DataT1R3"}]}]
node 7954982db25aee37483face1602...
data 7954982db25aee37483face1602...
same table ...
data [{"table":"Table1","rows":[{...,"data":"DataT1R3"}]}]
node a09a5ef8f6b81669eed13c93f60...
data 7954982db25aee37483face1602...
new table

Is it possible to retrieve data of all classes in Parse using single REST URL request?

Just a scenario:
I have 4 classes created in Parse cloud database for a particular Application - ClassA, ClassB, ClassC, ClassD.
I can retrieve data related to ClassA using REST URL like - https://api.parse.com/1/classes/ClassA
Is it possible to retrieve data of all 4 classes using single REST URL ?
No, it's not possible to do this. You can query only a single class at a time, and retrieve a maximum of 1,000 objects per query.
A cloud function can make multiple queries and merge the results, meaning that a single REST call (to call the function) could return results from multiple classes (but a maximum of 1,000 objects per query). Something like this:
Parse.Cloud.define("GetSomeData", function(request, response) {
  var query1 = new Parse.Query("ClassA");
  var query2 = new Parse.Query("ClassB");
  query1.limit(1000);
  query2.limit(1000);
  var output = {};
  query1.find().then(function(results) {
    output['ClassA'] = results;
    return query2.find();
  }).then(function(results) {
    output['ClassB'] = results;
    response.success(output);
  }, function(error) {
    response.error(error);
  });
});

How to select distinct row in CouchDB?

I have the view:
function (doc) {
  var obj = {
    one: doc.document.someParameter1,
    two: doc.document.someParameter2
  };
  emit(doc.document.id, obj);
}
On request it returns several rows, something like:
{"total_rows":511,"offset":381,"rows":[
{"id":"CDOC_2.16.840.1.113883.3.59.3:0947___QCPR___80717","key":"7012979","value":{"one":"one","two":"two"}},
{"id":"CDOC_2.16.840.1.113883.3.59.3:0947___QCPR___80921","key":"7012979","value":{"one":"one","two":"two"}}
]}
Is there a way to get just one result per key instead of several?
Of course, I could filter on the application side, but that could be very expensive, since I would have to transfer all the unnecessary results.
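One common approach (sketched here as an assumption, not taken from the question) is to add a reduce to the view and query it with ?group=true, so CouchDB collapses duplicate keys server-side and returns one row per distinct key. The design document below is hypothetical, "_count" is CouchDB's built-in counting reduce, and the groupRows helper merely mimics in plain JavaScript what ?group=true does to the emitted rows:

```javascript
// Hypothetical design document: emit the id as key, reduce with _count.
// GET /db/_design/app/_view/by_id?group=true then yields one row per
// distinct doc.document.id, with the duplicate count as the value.
//
// {
//   "views": {
//     "by_id": {
//       "map": "function (doc) { emit(doc.document.id, null); }",
//       "reduce": "_count"
//     }
//   }
// }

// Plain-JS illustration of what ?group=true does to the emitted rows:
function groupRows(rows) {
  var counts = {};
  rows.forEach(function (r) {
    counts[r.key] = (counts[r.key] || 0) + 1;
  });
  return Object.keys(counts).map(function (k) {
    return { key: k, value: counts[k] };
  });
}

var emitted = [
  { key: "7012979", value: null },
  { key: "7012979", value: null }
];
console.log(groupRows(emitted)); // [ { key: '7012979', value: 2 } ]
```

If you still need the someParameter values, a follow-up request with keys limited to the distinct ids (or include_docs) avoids transferring every duplicate row.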

How can I fill text fields in a page dynamically based on a combo box selection?

Suppose the database has one table (for example, student records) with a number of columns, but my HTML page only displays a few of them. For example: a table with the columns sno, sname, addr, age, dept, and dob, and my page having only 3 fields: sno, sname, and dept. Here I display the dept field as a combo box control and the rest of the fields as empty text fields.
My requirement is: when I select the dept from the combo box, the corresponding row values, like sno and sname, have to display automatically in the text fields. How can I do this?
You make an ajax call to your server when you change the department. Your server returns your object as JSON. Your success handler in your ajax call takes the fields it needs from your JSON object and sets the appropriate html elements to those fields. For example, if you're using jQuery:
var myUrl = "http://some.domain/action/";
var success = function(response) {
  $('#textField1').val(response.sno);
  $('#textField2').val(response.sname);
};
$('#myDropDown').change(function() {
  $.get(myUrl + $(this).val(), success);
});
If you don't want to send back all the properties of your object because you want to minimize bandwidth, you can form your own JSON object, but that's kind of a pain in the neck. If your object is not huge, you might as well use whatever JSON serializer is available in your framework and just serialize the whole object and send it over the wire.
Or here's another option. You could just make one initial AJAX call and then create a map from department to the other fields you want to set. Example:
var map = {};
var success = function(response) {
  // note: for..in over an array yields indices, not the objects themselves,
  // so iterate the array of returned objects explicitly
  response.objects.forEach(function(obj) {
    map[obj.department] = { sno: obj.sno, sname: obj.sname };
  });
};
$.get(myUrl, success);
$('#myDropDown').change(function() {
  var obj = map[$(this).val()];
  $('#textField1').val(obj.sno);
  $('#textField2').val(obj.sname);
});
Hope that helps.
