How to use secondary indexes for a "contains" query - rethinkdb

The RethinkDB docs have this example for improving getAll/contains queries with a secondary index:
// Create the index
r.table("users").indexCreate("userEquipment", function(user) {
    return user("equipment").map(function(equipment) {
        return [ user("id"), equipment ];
    });
}, {multi: true}).run(conn, callback);
// Query equivalent to:
// r.table("users").getAll(1).filter(function (user) {
//     return user("equipment").contains("tent");
// });
r.table("users").getAll([1, "tent"], {index: "userEquipment"}).distinct().run(conn, callback);
My question is whether there's a way to do the same thing when querying with multiple tags. What would be the equivalent to make this query possible with a secondary index?
r.table("users").getAll(1).filter(function (user) {
return user("equipment").contains("tent", "tent2");
});

We can probably do this:
r.table("users").getAll([1, "tent"], {index: "userEquipment"}).filter(function (user) {
    return user("equipment").contains("tent2");
});
So build the multi index as in the docs example and do the getAll first, so that part is served efficiently by the index, then filter the (much smaller) result to ensure the equipment array also contains the remaining tags.
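Putting it together with the distinct() from the docs example (which removes the duplicates a multi index can produce), the full query would look roughly like this:
r.table("users")
    .getAll([1, "tent"], {index: "userEquipment"})
    .distinct()
    .filter(function (user) {
        return user("equipment").contains("tent2");
    })
    .run(conn, callback);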

Related

Updating Apollo Cache Without Variables

Originally, I make a GraphQL call as follows:
query getItems($filter_ids: [Int!], $filter: item_records_bool_exp) {
    items(order_by: { negative: asc, parent: asc }, where: { level: { _in: [2, 3] } }) {
        i18n {
            value
        }
        id
        parent
        negative
    }
    filters(where: { category: { _eq: "PLAN" } }) {
        id
        value
    }
}
Now, when I insert a new item, I update the cache using the update function in the mutation options, and I'm supposed to use readQuery/readFragment and writeQuery/writeFragment to interact with the Apollo cache as described here.
My question is, my readQuery calls always fail if I do not provide the exact same variables that I had previously provided to the original GraphQL query. Is there a way around this? In other words, can I just read objects from the cache by their ID, irrespective of the original query that was used to fetch them?
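As a rough illustration of the "read by ID" idea (this is not from the original post): Apollo's readFragment reads a single normalized object by its cache ID, independent of any query variables. A minimal sketch, assuming Apollo Client 3 and its default cache ID format "Typename:id"; the Item typename and fields here are placeholders:
import { gql } from '@apollo/client';
// Inside the mutation's update(cache, result) callback:
const cachedItem = cache.readFragment({
    id: 'Item:42', // "Typename:id"; cache.identify(obj) builds this for you
    fragment: gql`
        fragment ItemFields on Item {
            id
            parent
            negative
        }
    `,
});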

updating an array of nested documents rethinkdb

I have a document schema like this:
{
    "name": "",
    "type": "",
    "posts": [
        {
            "url": "",
            "content": "",
            ...
        },
        {
            "url": "",
            "content": "",
            ...
        }
        ...
    ]
}
...
I forgot to create ids for each post on insertion into the database, so I'm trying to write a query for that:
r.db('test').table('crawlerNovels').filter(function (x) {
    return x.keys().contains('chapters')
}).map(function (x) {
    return x('chapters')
}).map(function (x) {
    return x.merge({id: r.uuid()})
})
Instead, this query returns all posts with an id but doesn't actually update them in the database. I tried using forEach instead of map at the end, but that doesn't work either.
After lots of tweaking and frustration I figured it out (note that r.uuid() is non-deterministic, which is why the update needs nonAtomic: true):
r.db('test').table('crawlerNovels').filter(function (x) {
    return x.keys().contains('chapters')
}).update(function (novel) {
    return {"chapters": novel('chapters').map(function (chapter) {
        return chapter.merge({"id": r.uuid()})
    })}
}, {nonAtomic: true})
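A quick way to double-check the result, not part of the original answer, is to count chapters that are still missing an id, e.g.:
r.db('test').table('crawlerNovels')
    .concatMap(function (novel) {
        return novel('chapters').default([])
    })
    .filter(function (chapter) {
        return chapter.hasFields('id').not()
    })
    .count()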

ReQL Updating object inside an embedded array emits "create" and "delete" events when listening to changes

I am hitting the following problem: Suppose that I have the following structure:
{
    "id": 1,
    "data": {
        "arr": [{"text": "item1"}]
    }
}
And the following query:
r.db('test').table('test').get(1).update(function (item) {
    return {
        data: {
            arr: item('data')('arr').map(function (row) {
                return r.branch(
                    row('text').eq('item1'),
                    row.merge({updated: true}),
                    row
                )
            })
        }
    }
})
I am listening for changes in this specific array only, and when the item is updated both create and delete events are emitted. I really need to receive an update event, i.e. one where old_val is not null and new_val is not null.
Thanks in advance guys
In the end, I decided to drop the embedded array and use table joins, which avoids all possible hacks.
You can use something like this:
r.db('test').table('test')('data')('arr').changes()
    .filter(function (doc) {
        return doc('new_val').ne(null).and(doc('old_val').ne(null))
    })
This will only show updates to the array. If you need access to other document fields, try this:
r.db('test').table('test').changes()
    .filter(function (doc) {
        return doc('new_val')('data')('arr').ne(null).and(doc('old_val')('data')('arr').ne(null))
    })

Relay how to write outputFields, getFatQuery, getConfigs for create new Item

How do I write outputFields, getFatQuery, and getConfigs to create a new item and update the items list?
Please take a look at the gist or the live example.
The questions are:
getFatQuery() {
    return Relay.QL`
        ???
    `;
}
getConfigs() {
    return [???];
}
outputFields: {
    ???
},
The outputFields in your schema make up the GraphQL type CreateActivityPayload that will be generated from your schema.js file. A mutation is like a regular query, but with side effects. In outputFields you get to decide what's queryable. Since your store is the only thing in your app that can change as a result of this mutation, we can start with that.
outputFields: {
    store: {
        type: storeType,
        resolve: () => store,
    },
}
The fat query operates on these output fields. Here you tell Relay what could possibly change as a result of this mutation. Adding an activity could change the following fields:
getFatQuery() {
    return Relay.QL`
        fragment on CreateActivityPayload @relay(pattern: true) {
            store {
                activities
            }
        }
    `;
}
Finally, the config tells Relay what to do with the query when it gets it, or even if it needs to be made at all. Here, you're looking to update a field after creating a new activity. Use the FIELDS_CHANGE config to tell Relay to update your store.
getConfigs() {
    return [{
        type: 'FIELDS_CHANGE',
        fieldIDs: {
            store: this.props.storeId,
        },
    }];
}
See more: https://facebook.github.io/relay/docs/guides-mutations.html

Chaining queries in rethinkdb

Let's say I have a "category" table. Each category has associated rows in the "data" table, and each data row has associated rows in another table, "associated". I want to remove a category together with all of its associated data.
What I'm currently doing is something like this:
getAllDataIdsFromCategory()
    .then(removeAllAssociated)
    .then(handleChanges)
    .then(removeDatas)
    .then(handleChanges)
    .then(removeCategory)
    .then(handleChanges);
Is there a way to chain these queries on the db-side?
my functions currently look like this:
var getAllDataIdsFromCategory = () => {
    return r
        .table('data')
        .getAll(categoryId, { index: 'categoryId' })
        .pluck('id').map(r.row('id')).run();
}
var removeAllAssociated = (_dataIds: string[]) => {
    dataIds = _dataIds;
    return r
        .table('associated')
        .getAll(r.args(dataIds), { index: 'dataId' })
        .delete()
        .run()
}
var removeDatas = () => {
    return r
        .table('data')
        .getAll(r.args(dataIds))
        .delete()
        .run()
}
Notice that I cannot use r.expr() or r.do(), since I want to run each query based on the result of the previous one.
The problem with my approach is that it won't scale to large amounts of "data": I have to bring all of the ids to the client side, and paging through them client-side in a loop seems like a workaround.
You can use forEach for this:
r.table('data').getAll(categoryId, {index: 'categoryId'})('id').forEach(function (id) {
    return r.table('associated').getAll(id, {index: 'dataId'}).delete().merge(
        r.table('data').get(id).delete())
});
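The category row itself still has to be removed once the forEach completes, e.g. (assuming the category table is literally named 'category', which the question doesn't spell out):
r.table('category').get(categoryId).delete().run();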
