Is there an equivalent of the MongoDB $addToSet command in RethinkDB?

Say I have a users table with an embedded followers array property.
{
  "followers": ["bar"],
  "name": "foo"
}
In RethinkDB, what's the best way to add a username to that followers property, i.e. add a follower to a user?
In MongoDB I would use the $addToSet operator to add a unique value to the embedded array. Should I use the merge command?

RethinkDB has a setInsert command. You can write table.get(ID).update({followers: r.row('followers').setInsert(FOLLOWER)}).
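setInsert treats the array as a set: the value is appended only if it is not already present, so running the update twice adds no duplicate. A minimal plain-JavaScript sketch of that behavior (for illustration only, outside RethinkDB):

```javascript
// Plain-JS illustration of RethinkDB's setInsert semantics:
// append the value only when it is not already in the array.
function setInsert(arr, value) {
  return arr.includes(value) ? arr.slice() : [...arr, value];
}

const user = { followers: ["bar"], name: "foo" };
user.followers = setInsert(user.followers, "baz"); // added
user.followers = setInsert(user.followers, "bar"); // already present, no duplicate
console.log(user.followers); // ["bar", "baz"]
```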


Reverse mapping in Aerospike

I have a few records in Aerospike in the following key-value form:
Key: "1234"
Value: {
  "XYZ": {
    "B": [1, 3],
    "C": [3, 4]
  }
}
Key: "5678"
Value: {
  "XYZ": {
    "B": [1, 3, 5],
    "C": [3, 4]
  }
}
I want to get all the keys from the set where field "B" in the JSON value contains, let's say, 3. Is there any way to query all such keys in Go?
Yes, you can build a secondary index on the values in map key "B" at that nested level and then run a secondary index query to get all matching records.
You can do the same in Go using the equivalent APIs.
There are many interactive Java code examples at https://developer.aerospike.com/tutorials/java/cdt_indexing, including a top-level example with string values and another that builds a secondary index on a nested sublevel.
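What the secondary index query returns is every record whose nested list under map key "B" contains the target value. A plain-JavaScript sketch of that selection logic (illustration of the semantics only, not the Aerospike client API):

```javascript
// Records keyed by their Aerospike key; values mirror the documents above.
const records = {
  "1234": { XYZ: { B: [1, 3], C: [3, 4] } },
  "5678": { XYZ: { B: [1, 3, 5], C: [3, 4] } },
};

// Return all keys whose nested "B" list contains the given value --
// the matching a secondary index query on that nested level performs.
function keysWhereBContains(recs, value) {
  return Object.keys(recs).filter((k) => recs[k].XYZ.B.includes(value));
}

console.log(keysWhereBContains(records, 3)); // ["1234", "5678"]
console.log(keysWhereBContains(records, 5)); // ["5678"]
```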

Accessing other documents inside update and reindex script

Is there any way to run a query and fetch the matching documents inside a script?
For example, I want to add all of the documents matching the query x to the field f of my document d, like this:
d:
{
  ...other fields,
  "f": [ {doc 1 matching the query}, {doc 2 matching the query}, ... ]
}
In SQL it is possible to use a query inside another query. Does Elasticsearch have anything similar?
I tried writing all kinds of code in scripts to fetch data from the indexes in my database, but it didn't work.

How can I filter if any value of an array is contained in another array in RethinkDB/ReQL?

I want to find any user who is a member of a group I can manage (using the web interface/JavaScript):
Users:
{
  "id": 1,
  "member_in_groups": ["all", "de-south"]
},
{
  "id": 2,
  "member_in_groups": ["all", "de-north"]
}
I tried:
r.db('mydb').table('users').filter(r.row('member_in_groups').map(function(p) {
return r.expr(['de-south']).contains(p);
}))
but both users are always returned. Which command do I have to use, and how can I use an index for this? (I read about multi indexes at https://rethinkdb.com/docs/secondary-indexes/python/#multi-indexes, but there only a single value is searched for.)
I got the correct answer in the Slack channel, so I'm posting it here in case anyone else reaches this thread through googling:
First create a multi index as described in
https://rethinkdb.com/docs/secondary-indexes/javascript/, e.g.
r.db('<db-name>').table('<table-name>').indexCreate('<some-index-name>', {multi: true}).run()
(you can omit .run() if using the web admin)
Then query the data with
r.db('<db-name>').table('<table-name>').getAll('de-north', 'de-west', {index:'<some-index-name>'}).distinct()
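getAll with a multi index matches a row when any of its indexed array values equals any of the supplied keys; distinct() removes rows that matched more than once. A plain-JavaScript sketch of that containment test (illustration of the semantics, not the driver API):

```javascript
const users = [
  { id: 1, member_in_groups: ["all", "de-south"] },
  { id: 2, member_in_groups: ["all", "de-north"] },
];

// A row matches when its group list shares at least one entry with the
// searched groups -- the same semantics as getAll on a multi index.
function membersOfAny(rows, groups) {
  return rows.filter((u) => u.member_in_groups.some((g) => groups.includes(g)));
}

console.log(membersOfAny(users, ["de-south"]).map((u) => u.id)); // [1]
console.log(membersOfAny(users, ["de-north", "de-west"]).map((u) => u.id)); // [2]
```

Note that the original filter attempt failed because it tested each of the user's groups against the search list and filter treats any non-empty mapped array as truthy; the fix is to ask whether at least one group overlaps, which is exactly what the multi-index getAll does.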

For 1 billion documents, populate data from one field to other fields in the same collection using MongoDB

I need to populate data from one field into multiple fields in the same collection. For example:
Currently I have a document like this:
{ _id: 1, temp_data: {temp1: [1,2,3], temp2: "foo bar"} }
I want to populate it into two different fields in the same collection, like this:
{ _id: 1, temp1: [1,2,3], temp2: "foo bar" }
I have one billion documents to migrate. What is an efficient way to update all of them?
In your favorite language, write a tool that runs through all documents, migrates them, and stores them in a new database.
Some hints:
When iterating the results, make sure they are sorted (e.g. on _id) so you can resume should your migration code crash at 90%.
Do batch inserts: read, say, 1000 items, migrate them, then write those 1000 items in a single batch to the new database. Reads are automatically batched.
Create indexes after the migration, not before. That will be faster and lead to less fragmentation.
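The hints above can be sketched in plain JavaScript over an in-memory stand-in for the collections; the helper names, the batch size, and the numeric _id sort are illustrative assumptions, not MongoDB driver calls:

```javascript
// In-memory stand-in for the source collection.
const source = [
  { _id: 1, temp_data: { temp1: [1, 2, 3], temp2: "foo bar" } },
  { _id: 2, temp_data: { temp1: [4], temp2: "baz" } },
];

// Flatten temp_data into top-level fields -- the migration itself.
function migrate(doc) {
  return { _id: doc._id, temp1: doc.temp_data.temp1, temp2: doc.temp_data.temp2 };
}

// Read in _id order starting after lastId so a crashed run can resume,
// migrate one batch, then hand the whole batch to a single batched write.
function migrateBatch(docs, lastId, batchSize, writeBatch) {
  const batch = docs
    .filter((d) => d._id > lastId)
    .sort((a, b) => a._id - b._id)
    .slice(0, batchSize)
    .map(migrate);
  if (batch.length > 0) writeBatch(batch);
  return batch.length > 0 ? batch[batch.length - 1]._id : lastId; // resume point
}

const target = [];
let lastId = 0; // persist this between runs to support resume
lastId = migrateBatch(source, lastId, 1000, (b) => target.push(...b));
console.log(target[0]); // { _id: 1, temp1: [1, 2, 3], temp2: "foo bar" }
```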
Here is a query you can use to migrate your data:
// Copy each document into collection_new with temp_data flattened
// into top-level temp1 and temp2 fields.
db.collection.find().forEach(function (myDoc) {
  db.collection_new.update(
    { _id: myDoc._id },
    {
      $set: {
        temp1: myDoc.temp_data.temp1,
        temp2: myDoc.temp_data.temp2
      }
    },
    { upsert: true }
  );
});
To learn more, see the MongoDB documentation on cursor.forEach().
You can use the $limit and $skip operators to migrate the data in batches. The update uses upsert because an existing document will be updated, and otherwise a new entry will be inserted.
Thanks

In Couchbase or N1QL how can I check if the values in an array match

In Couchbase I have the following document structure:
{
  "name": "bob",
  "permissions": [2, 4, 6]
}
I need to be able to create a view, or N1QL query which will check if the permissions for "bob" are contained within a given array.
e.g. I have an array with contents [1,2,3,4,5,6]. I need the "bob" document to be returned, because my array contains 2, 4, and 6, and so does "bob".
If my array contained [1,3,4,5,6], "bob" should not be selected, because my array does not contain 2.
Essentially I want to match any documents whose permission entries are all contained in my array.
The solution can be either a view or an N1QL query.
Using N1QL, you can do the following:
SELECT * FROM my_bucket WHERE EVERY p IN permissions SATISFIES p IN [ 1,2,3,4,5,6 ] END;
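The EVERY ... SATISFIES ... END clause is a subset test: the document matches only when every one of its permissions appears in the given array. A plain-JavaScript sketch of the same check:

```javascript
const bob = { name: "bob", permissions: [2, 4, 6] };

// True when every entry of the document's permissions appears in allowed --
// the condition the EVERY ... SATISFIES clause expresses.
function allPermitted(doc, allowed) {
  return doc.permissions.every((p) => allowed.includes(p));
}

console.log(allPermitted(bob, [1, 2, 3, 4, 5, 6])); // true
console.log(allPermitted(bob, [1, 3, 4, 5, 6]));    // false (2 is missing)
```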
