RethinkDB insert query results into a table

I'm trying to insert the results of a query from one table into another table. However, when I attempt to run the query I am receiving an error.
{
"deleted": 0 ,
"errors": 1 ,
"first_error": "Expected type OBJECT but found ARRAY." ,
"inserted": 0 ,
"replaced": 0 ,
"skipped": 0 ,
"unchanged": 0
}
Here is the insert and query:
r.db('test').table('destination').insert(
  r.db('test').table('source').map(function(doc) {
    var result = doc('result');
    return result('section_list').concatMap(function(section) {
      return section('section_content').map(function(item) {
        return {
          "code": item("code"),
          "name": item("name"),
          "foo": result("foo"),
          "bar": result("bar"),
          "baz": section("baz"),
          "average": item("average"),
          "lowerBound": item("from"),
          "upperBound": item("to")
        };
      });
    });
  })
);
Is there a special syntax for this, or do I have to retrieve the results and then run a separate insert?

The problem is that your inner query is returning a stream of arrays. You can't insert arrays into a table (only objects), so the query fails. If you change the outermost map into a concatMap it should work.

The problem here was that the result was a sequence of arrays of objects, i.e.:
[ [ { a:1, b:2 }, { a:1, b:2 } ], [ { a:2, b:3 } ] ]
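As a quick illustration of the difference (a runnable sketch for the Data Explorer, not part of the original query), map keeps the nesting while concatMap flattens it one level:
r.expr([[{a: 1}, {a: 2}], [{a: 3}]]).map(function(x) { return x; })
// => [ [ {a: 1}, {a: 2} ], [ {a: 3} ] ]
r.expr([[{a: 1}, {a: 2}], [{a: 3}]]).concatMap(function(x) { return x; })
// => [ {a: 1}, {a: 2}, {a: 3} ]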
Therefore, I had to change the outer map call to a concatMap call. The query then becomes:
r.db('test').table('destination').insert(
  r.db('test').table('source').concatMap(function(doc) {
    var result = doc('result');
    return result('section_list').concatMap(function(section) {
      return section('section_content').map(function(item) {
        return {
          "code": item("code"),
          "name": item("name"),
          "foo": result("foo"),
          "bar": result("bar"),
          "baz": section("baz"),
          "average": item("average"),
          "lowerBound": item("from"),
          "upperBound": item("to")
        };
      });
    });
  })
);
Thanks go to AtnNn on the #rethinkdb Freenode channel for pointing me in the right direction.

Related

Is there a more efficient way to refactor the iteration of the hash in Ruby?

I have an iteration here:
container = []
summary_data.each do |_index, data|
container << data
end
The structure of the summary_data is listed below:
summary_data = {
"1" => { orders: { fees: '25.00' } },
"3" => { orders: { fees: '30.00' } },
"6" => { orders: { fees: '45.00' } }
}
I want to remove the numeric key, e.g., "1", "3".
And I expect to get the following container:
[
{
"orders": {
"fees": "25.00"
}
},
{
"orders": {
"fees": "30.00"
}
},
{
"orders": {
"fees": "45.00"
}
}
]
Is there a more efficient way to refactor the code above?
I'd appreciate any help.
You can use the Hash#values method, like this:
container = summary_data.values
If the inner hashes all have the same structure, the only interesting information is the fees:
summary_data.values.map{|h| h[:orders][:fees] }
# => ["25.00", "30.00", "45.00"]
If you want to do some calculations with those fees, you could convert them to numbers:
summary_data.values.map{|h| h[:orders][:fees].to_f }
# => [25.0, 30.0, 45.0]
It might be even better to work with cents as integers to avoid any floating point error:
summary_data.values.map{|h| (h[:orders][:fees].to_f * 100).round }
# => [2500, 3000, 4500]
You need an array of the values of the provided hash. You can get it directly with the values method.
summary_data.values

GraphQL Mutation with JSON Patch

Are there any data types in GraphQL that can be used to describe a JSON Patch operation?
The structure of a JSON Patch operation is as follows.
{ "op": "add|replace|remove", "path": "/hello", "value": ["world"] }
Where value can be any valid JSON literal or object, such as:
"value": { "name": "michael" }
"value": "hello, world"
"value": 42
"value": ["a", "b", "c"]
op and path are always simple strings, value can be anything.
If you need to return a JSON type, GraphQL has a custom scalar JSON that can hold any JSON value and can be returned wherever you need it.
Here is the schema:
`
scalar JSON
type Response {
status: Boolean
message: String
data: JSON
}
type Test {
value: JSON
}
type Query {
getTest: Test
}
type Mutation {
# For the mutation, pass the data as a JSON.stringify'd string or as plain JSON
updateTest(value: JSON): Response
}
`
In the resolver you can return anything in JSON format under the key "value":
// Query resolver
getTest: async (_, {}, { context }) => {
// return { "value": "hello, world" }
// return { "value": 42 }
// return { "value": ["a", "b", "c"] }
// return anything in json or string
return { "value": { "name": "michael" } }
},
// Mutation resolver
async updateTest(_, { value }, { }) {
// Pass data in JSON.stringify
// value : "\"hello, world\""
// value : "132456"
// value : "[\"a\", \"b\", \"c\"]"
// value : "{ \"name\": \"michael\" }"
console.log( JSON.parse(value) )
// JSON.parse returns the parsed data in the required format
return { status: true,
message: 'Test updated successfully!',
data: JSON.parse(value)
}
},
The only thing you need is to return the data under the "value" key, so it can be selected in both the query and the mutation.
Query
{
getTest {
value
}
}
// Which returns:
{
"data": {
"getTest": {
"value": {
"name": "michael"
}
}
}
}
Mutation
mutation {
updateTest(value: "{ \"name\": \"michael\" }") {
data
status
message
}
}
// Which returns:
{
"data": {
"updateTest": {
"data": null,
"status": true,
"message": "success"
}
}
}
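To back the scalar JSON declaration with an actual implementation, one common choice is the graphql-type-json package (an assumption here; the answer above does not name a specific package). A minimal wiring sketch, with typeDefs, getTest and updateTest being the SDL string and resolvers shown above:
const { makeExecutableSchema } = require('graphql-tools');
const { GraphQLJSON } = require('graphql-type-json');

const schema = makeExecutableSchema({
  typeDefs,               // the SDL string shown above
  resolvers: {
    JSON: GraphQLJSON,    // concrete parse/serialize logic for the custom scalar
    Query: { getTest },
    Mutation: { updateTest }
  }
});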

loopback REST API filter by nested data

I would like to filter from REST API by nested data. For example this object:
[
{
"name": "Handmade Soft Fish",
"tags": "Rubber, Rubber, Salad",
"categories": [
{
"name": "women",
"id": 2,
"parent_id": 0,
"permalink": "/women"
},
{
"name": "kids",
"id": 3,
"parent_id": 0,
"permalink": "/kids"
}
]
},
{
"name": "Tasty Rubber Soap",
"tags": "Granite, Granite, Chair",
"categories": [
{
"name": "kids",
"id": 3,
"parent_id": 0,
"permalink": "/kids"
}
]
}
]
is coming from GET /api/products?filter[include]=categories
and I would like to get only the products that have the category name "women". How do I do this?
LoopBack does not support filters based on related models.
This is a limitation that we have never had bandwidth to solve, unfortunately :(
For more details, see the discussion and linked issues here:
Filter on level 2 properties: https://github.com/strongloop/loopback/issues/517
Filter by properties of related models (use SQL JOIN in queries): https://github.com/strongloop/loopback/issues/683
Maybe you want to get this data through the Category REST API instead. For example:
GET /api/categories?filter[include]=products&filter[where][name]=women
The result will be a category object with all related products. For this to work, the relation must be declared on the models; a sketch of that is below.
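A minimal sketch of how such a relation could be declared programmatically, assuming model names Category and Product (the boot script and foreign key are invented for illustration; if a product can belong to several categories you would use a hasMany ... through relation instead):
// server/boot/relations.js -- hypothetical boot script
module.exports = function(app) {
  var Category = app.models.Category;
  var Product = app.models.Product;
  // each Category has many Products; each Product points back to its Category
  Category.hasMany(Product, { as: 'products', foreignKey: 'categoryId' });
  Product.belongsTo(Category, { as: 'category', foreignKey: 'categoryId' });
};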
Try it like this. It has worked for me.
const filter = {
  where: {
    'categories.name': {
      inq: ['women']
    }
  }
};
Pass this filter to the request as a query string parameter, and the request would look like the one below:
GET /api/categories?filter=%7B%22where%22:%7B%22categories.name%22:%7B%22inq%22:%5B%22women%22%5D%7D%7D%7D
Can you share what it looks like without filter[include]=categories, please?
[edit]
After a few questions in the comments, I'd build a remote method in common/models/myModel.js (inside the module function):
function getItems(filter, categorieIds = []) {
  return new Promise((resolve, reject) => {
    let newInclude;
    if (filter.hasOwnProperty('include')) {
      if (Array.isArray(filter.include)) {
        newInclude = [].concat(filter.include, "categories");
      } else {
        if (filter.include.length > 0) {
          newInclude = [].concat(filter.include, "categories");
        } else {
          newInclude = "categories";
        }
      }
    } else {
      newInclude = "categories";
    }
    myModel.find(Object.assign({}, filter, {include: newInclude}))
      .then(data => {
        if (data.length <= 0) return resolve(data);
        if (categorieIds.length <= 0) return resolve(data);
        // there goes your specific filter on categories
        const tmp = data.filter(
          item => item.categories.findIndex(
            categorie => categorieIds.indexOf(categorie.id) > -1
          ) > -1
        );
        return resolve(tmp);
      })
      .catch(reject);
  });
}
myModel.remoteMethod('getItems', {
  accepts: [{
    arg: "filter",
    type: "object",
    required: true
  }, {
    arg: "categorieIds",
    type: "array",
    required: true
  }],
  returns: {arg: 'getItems', type: 'array'}
});
I hope it answers your question...

Get specific object field with condition and perform an operation on it

I have objects like this:
{
buildings: {
"1": {
"l": 0 ,
"r": 0 ,
"s": 0 ,
"type": "GoldMine" ,
"x": 2 ,
"y": 15
} ,
"10": {
"l": 0 ,
"r": 6 ,
"s": 2 ,
"type": "MagicMine" ,
"x": 26 ,
"y": 22
}
} ,
[...]
}
I want to get objects with buildings of type "GoldMine".
I tried something with map:
r.table("Characters").map(function(row) {
return row("planet")("buildings")
})
With keys() I can iterate it:
r.db("Unnyworld").table("Characters").map(function(row) {
return row("planet")("buildings").keys().map(function(key) {
return "need to get only buildings with type == GoldMine";
})
}).limit(2)
But it returns all buildings. I want to get only buildings with type == GoldMine and change field x.
Something like this may work:
r.table('Characters')
.concatMap(function(doc) {
return doc("planet")("buildings").keys().map(function(k) {
return {id: doc('id'), key: k, type: doc("planet")("buildings")(k)('type'), x: doc("planet")("buildings")(k)('x')}
})
})
.filter(function(building) {
return building('type').eq('GoldMine')
})
.forEach(function(doc) {
return r.table('Characters').get(doc('id'))
.update({
planet: {buildings: r.object(doc('key'), {x: 1111111})}
})
})
Basically, create a flat array from the buildings using concatMap, then filter it. With the resulting data, we can iterate over it and update the value that we want.
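One detail worth adding (RethinkDB behaviour, not something the answer states explicitly): update() merges nested objects, so the query above only changes the x field of the matched building and leaves its other fields untouched. If you ever need to replace the whole building object instead of merging into it, wrap it in r.literal():
r.table('Characters').get(doc('id'))
  .update({
    planet: {buildings: r.object(doc('key'), r.literal({x: 1111111}))}
  })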

Returning MAX() and MIN() in one query

On my RethinkDB 1.16.2-1 on Linux, I have a "products" table that has a "categories" array and a "models" array like this:
{
"name": "ABC Cable Series" ,
"categories": [
"Analog Audio>Instrument>Cables" ,
"Analog Audio>Microphone Cables"
] ,
"models": [
{
"modelCode": "ABC-1" ,
"ssp": 11.95 , ...
} ,
{
"modelCode": "ABC-2" ,
"ssp": 15.95 , ...
}
]
} , ...
I need to get both the minimum and maximum price (ssp) range of models in products that contain the given product category. I can currently get the maximum price like this:
r.db("store").table("products").filter(function(prod) {
return prod("categories").contains(
function(cat){return cat.match("^Analog Audio>")
})
}).concatMap(function(doc) {
return doc("models")("ssp")
}).max()
Other than running 2 queries, is there a more efficient way to get both MAX and MIN values in one query?
Presuming you want an object with both values, you can do the following:
r.db('test').table('products').filter(function(prod) {
return prod("categories").contains(
function(cat){return cat.match("^Analog Audio>")
})
}).concatMap(function(doc) {
return doc("models")("ssp")
})
.coerceTo('array') // Convert Stream to Array
.do(function (rows) { // Pass the array to `.do`
return { // Return Object
max: rows.max(),
min: rows.min()
}
})
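A small caveat with this approach (not mentioned in the answer): coerceTo('array') pulls the whole stream into memory, and ReQL caps arrays at 100,000 elements by default. If the filtered set could be larger, the limit can be raised with the arrayLimit run option (query, conn and callback below are placeholder names):
query.run(conn, {arrayLimit: 200000}, callback)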
You can also use reduce (http://rethinkdb.com/api/javascript/reduce/) to compute both values without converting all data to an array first:
r.db("store").table("products").filter(function(prod) {
return prod("categories").contains(
function(cat){return cat.match("^Analog Audio>")
})
}).map(function(doc) {
return {
min: doc("models")("ssp").min(),
max: doc("models")("ssp").max()
}
}).reduce(function (le, ri) {
return {
min: r.expr([le("min"), ri("min")]).min(),
max: r.expr([le("max"), ri("max")]).max()
}
})
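One caveat worth noting, taken from the reduce documentation rather than the answer itself: reduce raises a runtime error on an empty sequence, so if no product matches the category filter the query fails. Appending default() catches that case; a minimal illustration:
r.expr([]).reduce(function (le, ri) { return le.add(ri) }).default(0)
// => 0, instead of an empty-stream error from reduce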

Resources