Parse server - get related classes - rest api - parse-platform

Assume I have a class User and a class Profile.
The Profile class has a field called "sex" and a field called "user", which is a pointer to the User class.
If I GET the Profile endpoint with https://myapi.back4app.io/classes/Profile, I get the Profile object:
{
"results": [
{
"objectId": "sIE6lOZP7R",
"user": {
"__type": "Pointer",
"className": "_User",
"objectId": "asP3EFYSR4"
},
"sex": "male",
"createdAt": "2020-05-25T17:15:49.324Z",
"updatedAt": "2020-05-25T17:15:49.324Z"
}
]
}
If I want to include the user of this profile, I can request https://myapi.back4app.io/classes/Perfil?include=user, which returns:
{
"results": [
{
"objectId": "sIE6lOZP7R",
"user": {
"objectId": "asP3EFYSR4",
"username": "fabiojansen",
"createdAt": "2020-05-25T17:15:16.273Z",
"updatedAt": "2020-05-25T17:15:16.273Z",
"ACL": {
"*": {
"read": true
},
"asP3EFYSR4": {
"read": true,
"write": true
}
},
"__type": "Object",
"className": "_User"
},
"sex": "male",
"createdAt": "2020-05-25T17:15:49.324Z",
"updatedAt": "2020-05-25T17:15:49.324Z"
}
]
}
That works, but what if I want to get all the users together with their profile information in one query? Is that possible? My User class has no pointer to the Profile class; the pointer only exists on the Profile class.
Is there any way?
Thanks

You have several options:
1) You can use an aggregation pipeline and $lookup the user in the Perfil class, which performs a LEFT JOIN. However, this will not return an array of Parse.Object; you'd have to parse the results manually. From the MongoDB docs:
{
$lookup:
{
from: <collection to join>,
localField: <field from the input documents>,
foreignField: <field from the documents of the "from" collection>,
as: <output array field>
}
}
2) You can make two requests: first fetch all the users, then fetch their profiles by user ID (see the sketch after this list).
3) You can change your data model and add a pointer to Perfil in your User class. If you run this query at scale, that may be worthwhile.
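A rough sketch of option 2 with the JavaScript SDK, assuming the profile class is called Perfil and its pointer field is user (adjust the names to your schema):

// Option 2 as code: two queries, stitched together client-side.
// Note: find() returns at most 100 objects by default; raise query.limit() if needed.
const Parse = require('parse/node');

async function usersWithProfiles() {
  const users = await new Parse.Query(Parse.User).find();

  // Fetch all Perfil rows whose "user" pointer is one of the fetched users.
  const profileQuery = new Parse.Query('Perfil');
  profileQuery.containedIn('user', users);
  const profiles = await profileQuery.find();

  // Index profiles by the pointed-to user's objectId and pair them up.
  const byUserId = new Map(profiles.map((p) => [p.get('user').id, p]));
  return users.map((u) => ({ user: u, profile: byUserId.get(u.id) || null }));
}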

Related

Strapi mongodb de-populate

Is there any way to prevent a call like this strapi.services.MODEL_NAME.find(query) from populating its relations?
In my specific case I have a simple Message model:
"attributes": {
"body": {
"type": "string",
"minLength": 1,
"required": true,
"maxLength": 300
},
"chat": {
"model": "chat"
},
"user": {
"model": "user",
"plugin": "users-permissions"
}
}
and in a particular case I wish not to populate user & chat, just reference their IDs.
I believe in my case the solution would be to add an empty array:
strapi.services.MODEL_NAME.find(query, [])
You can set the autoPopulate option to false:
"user": {
"model": "user",
"plugin": "users-permissions",
"autoPopulate": false
}
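If you only want to skip population for a single call rather than globally, the empty-array idea from the question should also work on Strapi v3, where the generated core service's find takes a populate array as its second argument (treat this signature as an assumption for your Strapi version):

// Return relations as plain IDs for this call only.
const messages = await strapi.services.message.find(query, []);

// Or populate selectively, e.g. only the chat relation.
const withChat = await strapi.services.message.find(query, ['chat']);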

GraphQL - How to get field types from the retrieved schema?

Knowing the schema (fetched via getIntrospectionQuery), how could I get the type of a particular field?
For example, say I run this query:
query {
User {
name
lastUpdated
friends {
name
}
}
}
and get this result:
{
"data": {
"User": [
{
"name": "alice",
"lastUpdated": "2018-02-03T17:22:49+00:00",
"friends": []
},
{
"name": "bob",
"lastUpdated": "2017-09-01T17:08:49+00:00",
"friends": [
{
"name": "eve"
}
]
}
]
}
}
I'd like to know the types of the fields and construct something like this:
{
"name": "String",
"lastUpdated": "timestamptz",
"friends": "[Friend]"
}
How could I do that without extra requests to the server?
After retrieving the schema, you can build it into a JSON object (if your GraphQL framework does not already do that for you).
Using a JSON parser, you can then retrieve the type of each field.
I won't go into detail, as it depends on the technology you are using.
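For instance, with the graphql-js reference implementation (an assumption about your stack), you can rebuild a schema object from the introspection result and read the field types off it. A minimal sketch:

const { buildClientSchema } = require('graphql');

// `introspectionData` is the `data` field of the result you got back
// from running getIntrospectionQuery() against the server.
function fieldTypesOf(introspectionData, typeName) {
  const schema = buildClientSchema(introspectionData);
  const fields = schema.getType(typeName).getFields();

  const result = {};
  for (const [name, field] of Object.entries(fields)) {
    // field.type stringifies to e.g. "String", "timestamptz" or "[Friend]"
    // (with a trailing "!" for non-null types).
    result[name] = String(field.type);
  }
  return result;
}

// fieldTypesOf(introspectionData, 'User')
// -> { name: 'String', lastUpdated: 'timestamptz', friends: '[Friend]' }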

DynamoDB DocumentClient returns Set of strings (SS) attribute as an object

I'm new to DynamoDB.
When I read data from the table with the AWS.DynamoDB.DocumentClient class, the query works but I get the result in the wrong format.
Query:
{
TableName: "users",
ExpressionAttributeValues: {
":param": event.pathParameters.cityId,
":date": moment().tz("Europe/London").format()
},
FilterExpression: ":date <= endDate",
KeyConditionExpression: "cityId = :param"
}
Expected:
{
"user": "boris",
"phones": ["+23xxxxx999", "+23xxxxx777"]
}
Actual:
{
"user": "boris",
"phones": {
"type": "String",
"values": ["+23xxxxx999", "+23xxxxx777"],
"wrapperName": "Set"
}
}
Thanks!
The unmarshall function from AWS.DynamoDB.Converter is one solution if your data comes back looking like this:
{
"Attributes": {
"last_names": {
"S": "UPDATED last name"
},
"names": {
"S": "I am the name"
},
"vehicles": {
"NS": [
"877",
"9801",
"104"
]
},
"updatedAt": {
"S": "2018-10-19T01:55:15.240Z"
},
"createdAt": {
"S": "2018-10-17T11:49:34.822Z"
}
}
}
Notice the object/map per attribute, holding the attribute type: it means you are using the low-level AWS.DynamoDB class and not the DynamoDB.DocumentClient.
unmarshall will convert such a DynamoDB record into a JavaScript object, as stated in the AWS docs: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/Converter.html#unmarshall-property
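A minimal sketch of that unmarshall route with the AWS SDK for JavaScript v2, using the question's attributes; as far as I can tell, string/number sets still come back as Set wrapper objects, so the .values step below remains necessary:

const AWS = require('aws-sdk');

// Raw (low-level) DynamoDB item, as returned by the AWS.DynamoDB class.
const raw = {
  user: { S: 'boris' },
  phones: { SS: ['+23xxxxx999', '+23xxxxx777'] },
};

const item = AWS.DynamoDB.Converter.unmarshall(raw);
console.log(item.user);           // 'boris'
console.log(item.phones.values);  // ['+23xxxxx999', '+23xxxxx777']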
Nonetheless, I faced the exact same use case as yours: only one set attribute (type NS in my case), and I had to handle it manually. Here is a snippet:
// setName is your set attribute's name, e.g. 'phones'
const setName = 'phones';
ddbTransHandler.update(params).promise().then((value) => {
  // Replace the Set wrapper with its plain array of values.
  value.Attributes[setName] = value.Attributes[setName].values;
  return value; // or value.Attributes
});
Cheers,
Hamlet

Apollo readQuery Fails Even Though Target Object is Present?

I'm working on a call to readQuery. I'm getting an error message:
modules.js?hash=2d0033b4773d9cb6f118946043f7a3d4385825fe:25847
Error: Can't find field resolutions({"id":"Resolution:DHSzPa8bvPCDjuAac"})
on object (ROOT_QUERY) {
"resolutions": [
{
"type": "id",
"id": "Resolution:AepgCCio9KWGkwyMC",
"generated": false
},
{
"type": "id",
"id": "Resolution:DHSzPa8bvPCDjuAac", // <==ID I'M SEEKING
"generated": false
}
],
"user": {
"type": "id",
"id": "User:WWv57KsvqWeAoBNHY",
"generated": false
}
}.
The object with that id appears to be plainly visible as the second entry in the list of resolutions.
Here's my query:
const GET_CURRENT_RESOLUTION_AND_GOALS = gql`
query Resolutions($id: String!) {
resolutions(id: $id) {
_id
name
completed
goals {
_id
name
completed
}
}
}
`;
...and here's how I'm calling it:
<Mutation
mutation={CREATE_GOAL}
update={(cache, {data: {createGoal}}) => {
let id = 'Resolution:' + resolutionId;
const {resolutions} = cache.readQuery({
query: GET_CURRENT_RESOLUTION_AND_GOALS,
variables: {
id
},
});
}}
>
What am I missing?
Update
Per the GraphQL Dev Tools extension for Chrome, here's the whole GraphQL data store:
{
"data": {
"resolutions": [
{
"_id": "AepgCCio9KWGkwyMC",
"name": "testing 123",
"completed": false,
"goals": [
{
"_id": "TXq4nvukpLcqQhMRL",
"name": "test goal abc",
"completed": false,
"__typename": "Goal"
}
],
"__typename": "Resolution"
},
{
"_id": "DHSzPa8bvPCDjuAac",
"name": "testing 345",
"completed": false,
"goals": [
{
"_id": "PEkg5oEEi2tJ6i8LH",
"name": "goal abc",
"completed": false,
"__typename": "Goal"
},
{
"_id": "X4H4dFzGm5gkq5bPE",
"name": "goal bcd",
"completed": false,
"__typename": "Goal"
},
{
"_id": "hYunrXsMq7Gme7Xck",
"name": "goal cde",
"completed": false,
"__typename": "Goal"
}
],
"__typename": "Resolution"
}
],
"user": {
"_id": "WWv57KsvqWeAoBNHY",
"__typename": "User"
}
}
}
Posted as an answer for fellow Apollo users with similar problems:
Remove the Resolution: prefix; the query should only take the id.
Then the question arises: how is your data store filled?
To read a query from the cache, the query needs to have been called with exactly the same arguments against the remote API before; that is how Apollo knows what the result of a field with specific arguments is. If you never called the remote endpoint with the arguments you want to use, but you know what the result would be, you can work around that and resolve the query locally by implementing a cache resolver. Have a look at the example in the documentation, where the store contains a list of books (in your case resolutions) and a query for a single book by id is resolved with a simple cache lookup.
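For reference, the documented cache resolver (cacheRedirects in apollo-cache-inmemory, i.e. Apollo Client 2.x) looks roughly like this when adapted to a hypothetical single-resolution field; for a field that returns a list you would return a list of cache keys instead:

import { InMemoryCache } from 'apollo-cache-inmemory';

const cache = new InMemoryCache({
  cacheRedirects: {
    Query: {
      // Resolve resolution(id: ...) from objects already normalized in the cache
      // instead of requiring an earlier remote call with the same arguments.
      // The default dataIdFromObject also checks _id, which is what these documents use.
      resolution: (_, args, { getCacheKey }) =>
        getCacheKey({ __typename: 'Resolution', _id: args.id }),
    },
  },
});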

Parse Query by subfield/dot notation

tl;dr
Can ParseCloud/MongoDB filter by Pointer<class>.field? By Pointer<class>.Pointer<class>? By the existence of data in that field?
Long question:
Round is an object that will be played automatically when its time comes.
Payment is an object that indicates a user made a payment. When the payment is spent, we set its round field.
Player links an online User with a Payment.
I need to query Player for a few conditions:
1) Player is online and has a valid payment (no round, and valid equal to 'valid').
2) Player's user equals a specific user and has no payment.
3) Player's user equals a specific user and has a valid payment (no round, and valid equal to 'valid').
I got everything to work except validating the Payment inside the Player query.
Here is condition 1 from the list:
var query = new Parse.Query(keys.Player);
query.skip(0);
query.limit(oneRoundMaxPlayers);
query.greaterThanOrEqualTo(keys.last_online_date, lastAllowedOnline);
// looks like no filter applied here
query.doesNotExist("payment.round");
query.exists(keys.payment);
// This line will make query return 0 elements
// query.equalTo("payment.valid", "valid");
query.include(keys.user);
query.include(keys.payment);
Here are conditions 2 and 3, combined with an OR query:
var queryPaymentExists = new Parse.Query(keys.Player);
queryPaymentExists.skip(0);
queryPaymentExists.limit(1);
queryPaymentExists.exists(keys.payment);
//This line not filtering
queryPaymentExists.doesNotExist(keys.payment + "." + keys.round);
queryPaymentExists.equalTo(keys.user, user);
// This line makes query always return 0 elements
// queryPaymentExists.equalTo(keys.payment + "." + keys.valid, keys.payment_valid);
var queryPaymentDoesNotExist = new Parse.Query(keys.Player);
queryPaymentDoesNotExist.skip(0);
queryPaymentDoesNotExist.limit(1);
queryPaymentDoesNotExist.doesNotExist(keys.payment);
queryPaymentDoesNotExist.equalTo(keys.user, user);
var compoundQuery = Parse.Query.or(queryPaymentExists, queryPaymentDoesNotExist);
compoundQuery.include(keys.user);
compoundQuery.include(keys.payment);
compoundQuery.include(keys.payment + "." + keys.round);
I've checked the server logs and they look like the following:
verbose: REQUEST for [GET] /classes/Player: {
"include": "user,payment,payment.round",
"where": {
"$or": [
{
"payment": {
"$exists": true
},
"payment.round": {
"$exists": false
},
"user": {
"__type": "Pointer",
"className": "_User",
"objectId": "ASPKs6UVwb"
}
},
{
"payment": {
"$exists": false
},
"user": {
"__type": "Pointer",
"className": "_User",
"objectId": "ASPKs6UVwb"
}
}
]
}
}
Here is the response:
verbose: RESPONSE from [GET] /classes/Player: {
"response": {
"results": [
{
"objectId": "VHU9uwmLA7",
"last_online_date": {
"__type": "Date",
"iso": "2017-10-28T15:15:23.547Z"
},
"user": {
"objectId": "ASPKs6UVwb",
"username": "cn92Ekv5WPJcuHjkmTajmZMDW",
},
"createdAt": "2017-10-22T11:43:16.804Z",
"updatedAt": "2017-10-25T09:23:20.035Z",
"ACL": {
"*": {
"read": true
},
"ASPKs6UVwb": {
"read": true,
"write": true
}
},
"__type": "Object",
"className": "_User"
},
"createdAt": "2017-10-27T21:03:35.442Z",
"updatedAt": "2017-10-28T15:15:23.556Z",
"payment": {
"objectId": "nr7ln7U3eJ",
"payment_date": {
"__type": "Date",
"iso": "2017-10-27T23:42:50.614Z"
},
"user": {
"__type": "Pointer",
"className": "_User",
"objectId": "ASPKs6UVwb"
},
"createdAt": "2017-10-27T23:42:50.624Z",
"updatedAt": "2017-10-28T15:12:30.131Z",
"valid": "valid",
"round": {
"objectId": "jF9gqG4ndh",
"round_date": {
"__type": "Date",
"iso": "2017-10-28T15:12:00.027Z"
},
"createdAt": "2017-10-28T15:11:00.036Z",
"updatedAt": "2017-10-28T15:12:30.108Z",
,
"ACL": {
"*": {
"read": true
}
},
"__type": "Object",
"className": "Round"
},
"ACL": {
"ASPKs6UVwb": {
"read": true
}
},
"__type": "Object",
"className": "Payment"
},
"ACL": {
"ASPKs6UVwb": {
"read": true
}
}
}
]
}
}
You can see that the response contains payment.round.
My questions are the following:
Can ParseCloud/MongoDB filter by Pointer<class>.field? By Pointer<class>.Pointer<class>? By the existence of data in that field?
How can I work around this when I need to check for field presence, given that a User can have many Players and a User can have many Payments?
UPD
As far as I can tell, MongoDB should support filtering by "dot notation"
mongodb query by sub-field
So what am I doing wrong?
Short answer:
No
Simplify your data structure
Long answer:
Dot notation can be used to:
include documents behind pointers, as you already do in your code, e.g. include(keys.user)
filter on properties within a field, e.g. {propertyA: 1, propertyB: 2}, where all the data lives in that field itself, not in another document in another collection referenced by a Parse pointer.
Dot notation cannot be used as a filter parameter for referenced pointers in a Parse query. MongoDB does not support such filtering either: the concept of a pointer belongs to Parse, not to MongoDB. In a NoSQL environment like MongoDB there are no relations between tables that can be used in the query language, as it is not a "relational database" like an SQL database. However, Parse provides some SQL-like comfort for simple queries with its concepts of pointers, compound queries and matchesKeyInQuery.
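As a concrete illustration of the Parse-side alternatives, condition 1 from the question can be approximated without dot notation by putting the payment constraints into an inner query and combining it with matchesQuery (a sketch reusing the field names from the question):

// Inner query: payments that are valid and not yet attached to a round.
const validPayment = new Parse.Query('Payment');
validPayment.equalTo('valid', 'valid');
validPayment.doesNotExist('round');

// Outer query: online players whose payment pointer matches the inner query.
const playerQuery = new Parse.Query('Player');
playerQuery.greaterThanOrEqualTo('last_online_date', lastAllowedOnline);
playerQuery.matchesQuery('payment', validPayment);
playerQuery.include('user');
playerQuery.include('payment');
const players = await playerQuery.find();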
If that is not sufficient in your case, simply add the fields to the collection itself, at the expense of having the same fields and data in multiple collections, but with the advantage of faster query execution.
Finding the right data structure is one of the big topics for NoSQL, as there is no generally right structure. Collections and document structures are essentially designed as a trade-off between:
execution performance
query necessity / frequency
security (access level)
data storage size
They are also fluid and can change over time. As your app and its queries evolve, you would change the data structure as well, whenever the long-term gain outweighs the one-time effort.
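For example, under the denormalized approach you could maintain a hypothetical payment_valid flag directly on Player whenever a Payment is attached or its round is set (e.g. in a Cloud Code save hook), so condition 1 becomes a flat query on a single collection:

// When linking the payment (and whenever its round changes), keep the flag in sync.
player.set('payment_valid', payment.get('valid') === 'valid' && !payment.has('round'));
await player.save();

// Querying then needs neither dot notation nor an inner query.
const query = new Parse.Query('Player');
query.greaterThanOrEqualTo('last_online_date', lastAllowedOnline);
query.equalTo('payment_valid', true);
const onlinePlayersWithValidPayment = await query.find();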

Resources