Limit nested object results in a RethinkDB query

I would like to limit the number of nested objects inside a RethinkDB query. Suppose I have conversations with nested messages.
[conversations]
[{
  id: "fgh675",
  name: "Some conversation",
  messages: [{
    id: "jhu432",
    contents: "Hello world!",
    createdAt: "2016-01-01 00:01:01"
  },
  {
    id: "bgj876",
    contents: "Hello earth",
    createdAt: "2016-01-01 00:01:01"
  }]
}]
How can I limit the number of message objects?
Even better, how can I write a query returning only the last message, e.g. .merge(function(c) { return {msg: c("messages").slice(-1)}; })? I can't find how to order the messages first... (and would that query be efficient if there are many messages?)

limit can limit the number of messages:
conversations.merge(conversation => ({
  messages: conversation('messages').limit(3)
}))
orderBy can be used to sort the array:
conversations.merge(conversation => ({
  messages: conversation('messages').orderBy('createdAt')
}))
If you sort the messages on every query, it may be more efficient to store the message list already sorted.
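To answer the second part of the question, the two can be combined to return only the most recent message. A minimal sketch, assuming createdAt sorts correctly in its stored string format:
conversations.merge(conversation => ({
  // sort oldest to newest, then keep only the last (most recent) message
  msg: conversation('messages').orderBy('createdAt').slice(-1)
}))
Because this sorts the nested array on every read, storing the message list already sorted (as noted above) is the more efficient option when conversations grow large.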

Related

What is the recommended schema for paginated GraphQL results

Let's say I have a list of users to be returned. What would be the best schema strategy among the following?
In the first strategy, the users query returns only the user data, and a separate query is used for the pagination details. The downside is that we need to pass the same filters to both the users and usersCount queries.
query {
  users(skip: 0, limit: 100, filters: someFilter) {
    name
  },
  usersCount(filters: someFilters)
}
Which returns the following:
{
  results: {
    users: [
      { name: "Foo" },
      { name: "Bar" },
    ],
    usersCount: 1000,
  }
}
In the second strategy, the pagination details are part of the users query itself, so we don't need to pass the filters twice. However, I feel this query is not as nice to read.
query {
  users(skip: 0, limit: 100, filters: someFilter) {
    items {
      name
    },
    count
  }
}
Which returns the following result
{
  results: {
    users: {
      items: [
        { name: "Foo" },
        { name: "Bar" },
      ],
      count: 1000,
    }
  }
}
I am curious to know which strategy is the recommended way to design paginated results.
I would recommend following the official GraphQL cursor connections specification:
you need to switch to cursor-based pagination.
This type of pagination uses a record or a pointer to a record in the dataset to paginate results. The cursor will refer to a record in the database.
You can follow the example in the link.
GraphQL Cursor Connections Specification
Also check out how GitHub does it here: https://docs.github.com/en/graphql/reference/interfaces#node
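For reference, a minimal sketch of what a Relay-style connection for the users query could look like (the UserConnection, UserEdge, and UserFilter names are illustrative, and totalCount is an optional extra that GitHub, for example, exposes on its connections):
type Query {
  users(first: Int, after: String, filters: UserFilter): UserConnection!
}
type UserConnection {
  edges: [UserEdge!]!
  pageInfo: PageInfo!
  totalCount: Int # optional, if an overall count is still needed
}
type UserEdge {
  cursor: String!
  node: User!
}
type PageInfo {
  hasNextPage: Boolean!
  hasPreviousPage: Boolean!
  startCursor: String
  endCursor: String
}
Clients then page forward by passing the endCursor of the previous page as the after argument.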

Apollo mixes two different arrays of the same query seemingly at random

With a schema like
schema {
  query: QueryRoot
}
scalar MyBigUint
type Order {
  id: Int!
  data: OrderCommons!
  kind: OrderType!
}
type OrderBook {
  bids(limit: Int): [Order!]!
  asks(limit: Int): [Order!]!
}
type OrderCommons {
  quantity: Int!
  price: MyBigUint! # it doesn't matter whether it's MyBigUint or a simple Int - the issue occurs anyway
}
enum OrderType {
  BUY
  SELL
}
type QueryRoot {
  orderbook: OrderBook!
}
And a query: query { orderbook { bids { data { price } }, asks { data { price } } } }
In the GraphQL playground of my GraphQL API (and at the network level of my Apollo app too) I receive a result like:
{
  "data": {
    "orderbook": {
      "bids": [
        {
          "data": {
            "price": "127"
          }
        },
        {
          "data": {
            "price": "74"
          }
        },
        ...
      ],
      "asks": [
        {
          "data": {
            "price": "181"
          }
        },
        {
          "data": {
            "price": "187"
          }
        },
        ...
      ]
    }
  }
}
where, for the purpose of this question, the bids are ordered in descending order by price like ["127", "74", "73", "72"], etc, and asks are ordered in ascending order, accordingly.
However, in Apollo, after a query is done, I notice that one of the arrays gets seemingly random data.
For the purpose of the question, the useQuery React hook is used, but the same happens when I query imperatively from a freshly initialized ApolloClient.
const { data, subscribeToMore, ...rest } = useQuery<OrderbookResponse>(GET_ORDERBOOK_QUERY);
console.log(data?.orderbook?.bids?.map(r => r.data.price));
console.log(data?.orderbook?.asks?.map(r => r.data.price));
Here, corrupted data is printed for the bids, i.e. ['304', '306', '298', '309', '277', '153', '117', '108', '87', '76'] (notice the order being wrong, at the very least), whereas the asks data looks just fine. Inspecting the network, I find that the bids are not only properly ordered there, but also have different (correct, straight from the DB) values!
Therefore, it seems something's getting corrupted on the way while Apollo delivers the data.
What could be the issue here I wonder, and where to start debugging such kind of an issue? There seem to be no warnings from Apollo either, it seems to just silently corrupt the data.
I'm clearly doing something wrong, but what?
The issue seems to stem from how Apollo caches data.
By default, InMemoryCache normalizes objects by their __typename plus id. My bids and asks can carry the same numeric IDs while sharing the same Order GraphQL type, so Apollo assumes a bid and an ask with the same ID are the same object, and the resulting data gets wrecked as a consequence.
An easy fix is to show Apollo that there's a complex key to the Order type on cache initialization:
cache: new InMemoryCache({
typePolicies: {
Order: {
keyFields: ['id', 'kind'],
}
}
})
This way it will understand that Order entities for an ask and a bid with the same ID are indeed different pieces of data.
Note that the kind field must also be added to the query strings accordingly.
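For illustration, the query from the question would then select both key fields on each Order so the cache can build the composite key:
query {
  orderbook {
    bids { id, kind, data { price } }
    asks { id, kind, data { price } }
  }
}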

How can I respond to the client based on what fields they are querying in GraphQL?

I am using AWS AppSync as a GraphQL server and have a schema like:
type Order {
  id: ID!
  price: Int
  refundAmount: Int
  period: String!
}
type Query {
  orders(userId: ID!): [Order]
}
It supports querying orders by user id and responds with an array of orders for different time periods. The response could be:
[{
  id: xxx,
  price: 100,
  refundAmount: 10,
  period: '2021-01-01'
}, {
  id: xxx,
  price: 200,
  refundAmount: 0,
  period: '2021-01-03'
},
...
]
If both the price and the refundAmount for a period are 0, I don't include an element for that period in the array. In the above example, the price and refundAmount on 2021-01-02 are 0, so there is no element for that date.
My problem is: how can I shape the response based on what the frontend queries? If the client only queries the refundAmount field, I don't want to return the 2021-01-03 period, since its refund amount is 0. How do I know which fields the frontend wants to show in the response?
e.g.
If the client sends this query:
query {
  orders (userId: "someUserId") {
    refundAmount
  }
}
I would respond with the data below, but I don't want the second element to be there, since its value is 0.
[{
  id: xxx,
  refundAmount: 10,
  period: '2021-01-01'
}, {
  id: xxx,
  refundAmount: 0,
  period: '2021-01-03'
}]
My problem is: how can I shape the response based on what the frontend queries?
GraphQL does that out of the box for you, provided you have resolvers for the fields in the query. Use the appropriate resolver for your underlying data source.
How do I know which fields the frontend wants to show in the response?
That is something the frontend decides; it can send a different query depending on the fields it is interested in. A few examples below.
If the frontend is interested in only one field, i.e. refundAmount, it would send a query something like this:
query {
  orders (userId: "someUserId") {
    refundAmount
  }
}
If it is interested in more than one field, say price and refundAmount, then the query would be something like this:
query {
  orders (userId: "someUserId") {
    price,
    refundAmount
  }
}
Update: Filter response:
Now based on the updated question, you need to enhance your resolver to do this additional filtering.
The resolver can always do this filtering (hard-coded, like refundAmount > 0).
Alternatively, support filter criteria in the query model, e.g. orders(userId: ID!, filter: OrderFilterInput): [Order], and define the criteria by which you want to filter. Then support those filter criteria in the resolvers when querying the underlying data source, and accept the filter criteria from the client.
Look at the ModelPostFilterInput generated model in this example.
Edit 2: Adds the changed schema for a filter
Let's say you change your schema to support filtering, there are no additional VTL request/response mappers, and you talk directly to a Lambda.
This is how the schema would look (your mutations and subscriptions are omitted here):
input IntFilterInput { # All the kinds of filtering you want to support for Int data types
  ne: Int
  eq: Int
  le: Int
  lt: Int
  ge: Int
  gt: Int
}
type Order {
  id: ID!
  price: Int
  refundAmount: Int
  period: String!
}
input OrderFilterInput { # This only supports filtering by refundAmount. You can add more filters if you need them.
  refundAmount: IntFilterInput
}
type Query {
  orders(userId: ID!, filter: OrderFilterInput): [Order] # Here you add an optional filter input
}
schema {
  query: Query
}
Let's say you attach the Lambda resolver to the orders query.
In this case, the Lambda needs to return an array/list of Orders.
If you forward this query to some table/API, you need to interpret the filter and create an appropriate query or API call for the downstream system.
Below is a simple Lambda with a hard-coded response. If we bring in the filter, this is what changes:
const getFilterFunction = (operator, key, value) => {
  switch (operator) {
    case "ne":
      return x => x[key] != value;
    case "eq":
      return x => x[key] == value;
    case "le":
      return x => x[key] <= value;
    case "lt":
      return x => x[key] < value;
    case "ge":
      return x => x[key] >= value;
    case "gt":
      return x => x[key] > value;
    default:
      throw Error("Unsupported filter operation");
  }
};
exports.handler = async (event) => {
  let response = [{
    "id": "xxx1",
    "refundAmount": 10,
    "period": '2021-01-01'
  }, {
    "id": "xxx2",
    "refundAmount": 0,
    "period": '2021-01-03'
  }];
  const filter = event.arguments.filter;
  if (filter) { // If possible, push the filter down to your downstream system rather than handling it in the Lambda
    if (filter.refundAmount) {
      const refundAmountFilters = Object.keys(filter.refundAmount)
        .map(operator => getFilterFunction(operator + "", "refundAmount", filter.refundAmount[operator]));
      refundAmountFilters.forEach(filterFunction => { response = response.filter(filterFunction); });
    }
  }
  return response; // You don't have to trim the response to the fields the query asks for; AppSync takes care of that. Just return a list of orders.
};
With the above in place, you can send various queries like
query MyQuery {
  orders(userId: "1") { # without any filters
    id
    refundAmount
  }
}
query MyQuery {
  orders(userId: "1", filter: {refundAmount: {ne: 0}}) { # the filter you are interested in
    id
    refundAmount
  }
}
query MyQuery {
  orders(userId: "1", filter: {refundAmount: {ne: 0, gt: 5}}) { # mix and match filters
    id
    refundAmount
  }
}
You don't have to support all the filter operators; you can focus only on ne or != and simplify things further. Look at this blog for a simpler version where the filter operation is assumed.
Finally, the other possibility, filtering without modifying the schema, is to change only your Lambda so that it returns a filtered set of results, either by doing the filtering itself or by sending an appropriate query/request to the underlying system.
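As a rough sketch of that hard-coded variant (reusing the illustrative data from above and the refundAmount > 0 rule mentioned earlier; no filter argument is involved):
exports.handler = async (event) => {
  const orders = [
    { id: "xxx1", refundAmount: 10, period: '2021-01-01' },
    { id: "xxx2", refundAmount: 0, period: '2021-01-03' }
  ];
  // Hard-coded rule: drop periods whose refund amount is 0,
  // instead of accepting a filter argument from the client.
  return orders.filter(order => order.refundAmount > 0);
};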

Creating a GraphQLObjectType with an indexable field signature?

I'm currently in the process of transforming a REST API into GraphQL, but I've hit a bit of a snag in one of the endpoints.
Currently, this endpoint returns an object whose keys can be an unlimited set of strings, and whose values all match a certain shape.
So, as a rudimentary example, I have this situation...
// response
{
  foo: { id: 'foo', count: 3 },
  bar: { id: 'bar', count: 6 },
  baz: { id: 'baz', count: 1 },
}
Again, the keys are not known at runtime and can be an unlimited set of strings.
In TypeScript, for example, this sort of situation is handled by creating an interface using an indexable field signature, like so...
interface Data {
  id: string;
  count: number;
}
interface Response {
  [key: string]: Data;
}
So, my question is: is this sort of thing possible with GraphQL? How would I go about creating a type/schema for this?
Thanks in advance!
I think one solution could be to use the JSON.stringify() method:
exampleQuery: {
  type: GraphQLString,
  resolve: (root, args, context) => {
    let obj = {
      foo: { id: 'foo', count: 3 },
      bar: { id: 'bar', count: 6 },
      baz: { id: 'baz', count: 1 }
    };
    return JSON.stringify(obj);
  }
}
Then, after retrieving the result of the GraphQL query, you can use JSON.parse(result) (assuming the part performing the query is also written in JavaScript; otherwise you would have to use the other language's equivalent method to parse the incoming JSON response).
The disadvantage of such a solution is that you lose the ability to choose which fields of obj you want to retrieve from the query, but, as you said, the returned object can have an unlimited set of keys that probably are not known on the front end of the application, so there is no need to select its keys, am I right?
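A minimal client-side sketch of that parsing step (the /graphql endpoint and plain fetch call are illustrative assumptions; exampleQuery is the string field defined above):
async function loadExample() {
  // Hypothetical endpoint; replace with however your client sends GraphQL queries.
  const res = await fetch('/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: '{ exampleQuery }' })
  });
  const { data } = await res.json();
  // The field arrives as a JSON string, so parse it back into an object.
  return JSON.parse(data.exampleQuery); // e.g. { foo: { id: 'foo', count: 3 }, ... }
}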

Query nested object based on key in MongoDB

My Schema Sample:
{
  _id: '1234',
  daily: {
    '12-06-03': {
      a: 1,
      b: 2
    },
    '12-06-04': {
      c: 1,
      d: 2
    },
    '12-06-05': {
      e: 1,
      f: 2
    },
    '12-06-06': {
      a: 1,
      b: 2
    }
  }
}
My query: I want to query all of the daily object's nested objects whose keys are greater than or less than a particular date (assume: 12-06-05).
I understand one method is to retrieve the entire daily object and then compare dates by iterating over each of its keys, as sketched below.
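A minimal sketch of that client-side approach with the Node.js MongoDB driver (the docs collection name and cutoff value are assumptions; string comparison works here because the keys are zero-padded yy-mm-dd strings):
async function dailyAfter(db, cutoff) {
  // Fetch the whole document, then filter the daily sub-object by key.
  const doc = await db.collection('docs').findOne({ _id: '1234' });
  return Object.fromEntries(
    Object.entries(doc.daily).filter(([date]) => date > cutoff)
  );
}
// usage: const recent = await dailyAfter(db, '12-06-05');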
