Dynamic fields and slow queries - performance

Currently, I'm managing a set of lists containing a number of members.
Every list can look different, when it comes to fields and the naming of these fields.
Typically, a basic list member could look like so (from my members collection):
{
  "_id" : ObjectId("52284ae408edcb146200009f"),
  "list_id" : 1,
  "status" : "active",
  "imported" : 1,
  "fields" : {
    "firstname" : "John",
    "lastname" : "Doe",
    "email" : "john@example.com",
    "birthdate" : ISODate("1977-09-03T23:08:20.000Z"),
    "favorite_color" : "Green",
    "interests" : [
      { "id" : 8, "value" : "Books" },
      { "id" : 10, "value" : "Travel" },
      { "id" : 12, "value" : "Cooking" },
      { "id" : 15, "value" : "Wellness" }
    ]
  },
  "created_at" : ISODate("2012-05-06T15:12:26.000Z"),
  "updated_at" : ISODate("2012-05-06T15:12:26.000Z")
}
All the fields under the "fields" key are unique to the current list ID, and they can change for every list ID, which means a new list could look like this:
{
  "_id" : ObjectId("52284ae408edcb146200009f"),
  "list_id" : 2,
  "status" : "active",
  "imported" : 1,
  "fields" : {
    "fullname" : "John Doe",
    "email" : "john@example.com",
    "cell" : 123456787984
  },
  "created_at" : ISODate("2012-05-06T15:12:26.000Z"),
  "updated_at" : ISODate("2012-05-06T15:12:26.000Z")
}
Currently, my application allows users to search dynamically in each of the custom fields, but since they have no indexes, this process can be very slow.
I don't believe it's an option to let list creators select which fields should be indexed, but I really need to speed this up.
Is there any solution for this?

If you refactor your documents so that "fields" is an array of name/value pairs, you can leverage indexes:

fields: [
  { name: 'fullName', value: 'John Doe' },
  { name: 'email', value: 'john@example.com' },
  ...
]

Create a compound index on fields.name and fields.value.
Of course this is not a solution for "deeper" values like your interests list.
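As a minimal sketch of the refactoring itself (plain Node.js; the function name is my own, not from the question), a dynamic "fields" object can be flattened into that indexable name/value array:

```javascript
// Hedged sketch: flatten a dynamic "fields" object into the attribute
// pattern -- an array of { name, value } pairs that a compound index on
// "fields.name" / "fields.value" can serve. Function name is an assumption.
function toAttributeArray(fields) {
  return Object.entries(fields).map(([name, value]) => ({ name, value }));
}

const fields = { fullname: "John Doe", email: "john@example.com" };
console.log(toAttributeArray(fields));
// → [ { name: 'fullname', value: 'John Doe' },
//     { name: 'email', value: 'john@example.com' } ]
```

A lookup can then pair both halves of an attribute with $elemMatch, e.g. db.members.find({ fields: { $elemMatch: { name: "email", value: "john@example.com" } } }), so both sides of the pair hit the index.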

Related

Elastic Search single index vs multiple index

I want to insert a nested structure into Elasticsearch.
For example:
[
  {
    "Product" : "P1",
    "Desc" : "productDesc",
    "Items" : [
      {
        "I1" : "i1",
        "I_desc" : "i1_desc",
        "prices" : [
          { "id" : "price1", "value" : 10 },
          { "id" : "price2", "value" : 20 }
        ]
      },
      {
        "I2" : "i2",
        "I_desc" : "i2_desc",
        "prices" : [
          { "id" : "price1", "value" : 10 },
          { "id" : "price", "value" : 20 }
        ]
      }
    ]
  },
  {
    "Product" : "P12",
    "Desc" : "product2Desc",
    "Items" : [
      {
        "I1" : "i1",
        "I_desc" : "i1_desc",
        "prices" : [
          { "id" : "price11", "value" : 12 },
          { "id" : "price12", "value" : 10 }
        ]
      },
      {
        "I2" : "i3",
        "I_desc" : "i3_desc",
        "prices" : [
          { "id" : "price11", "value" : 12 },
          { "id" : "price31", "value" : 33 }
        ]
      }
    ]
  }
]
I want to insert documents with this nested structure into Elasticsearch, with index pro and ids P1 and P12 (2 inserted documents).
Then query the data like:
1. Give me all product ids that have a price with id = price11.
2. All products which have item = i1.
Should I use a single index per id, or index all the attributes like Item, productDesc, prices, id and value?
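As a hedged sketch of query 1, and assuming the Items.prices objects are mapped with the Elasticsearch nested type (the question does not show the mapping), the query body could be built like this:

```javascript
// Hedged sketch: an Elasticsearch "nested" query body, written as a plain
// JavaScript object. It matches products containing a price entry with
// id "price11"; the nested mapping for "Items.prices" is an assumption.
const body = {
  query: {
    nested: {
      path: "Items.prices",
      query: {
        term: { "Items.prices.id": "price11" }
      }
    }
  }
};

console.log(JSON.stringify(body));
```

Without a nested mapping, the inner objects are flattened and the ids and values of different prices can cross-match, which is exactly what the nested type prevents.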

Mongoose + GraphQL (Apollo Server) Schema

We have a db collection which is a little complicated. Many of our keys are JSON objects whose fields aren't fixed and change based on input given by the user in the UI. How should we write the Mongoose and GraphQL schemas for such a complex type?
{
  "_id" : ObjectId("5ababb359b3f180012762684"),
  "item_type" : "Sample",
  "title" : "This is sample title",
  "sub_title" : "Sample sub title",
  "byline" : "5c6ed39d6ed6def938b71562",
  "lede" : "Sample description",
  "promoted" : "",
  "slug" : [ "myurl" ],
  "categories" : [ "Technology" ],
  "components" : [
    {
      "type" : "Slide",
      "props" : {
        "description" : {
          "type" : "",
          "props" : { "value" : "Sample value" }
        },
        "subHeader" : {
          "type" : "",
          "props" : { "value" : "" }
        },
        "ButtonWorld" : {
          "type" : "a-button",
          "props" : {
            "buttonType" : "product",
            "urlType" : "Internal Link",
            "isServices" : false,
            "title" : "Hello World",
            "authors" : [
              {
                "__dataID__" : "Qm9va0F1dGhvcjo1YWJhYjI0YjllNDIxNDAwMTAxMGNkZmY=",
                "_id" : null,
                "First_Name" : "John",
                "Last_Name" : "Doe",
                "Display_Name" : "John Doe",
                "Slug" : "john-doe",
                "Role" : 1
              }
            ],
            "isbns" : [ "9781497603424" ],
            "image" : "978-cover.jpg",
            "price" : "8.99",
            "bisacs" : [],
            "customCategories" : []
          },
          "salePrice" : {
            "type" : "",
            "props" : { "value" : "" }
          }
        }
      }
    }
  ],
  "tags" : [
    { "id" : "5abab58042e2c90011875801", "name" : "Tag Test 1" },
    { "id" : "5abab5831242260011c248f9", "name" : "Tag Test 2" },
    { "id" : "592450e0b1be5055278eb5c6", "name" : "horror" },
    { "id" : "59244a96b1be5055278e9b0e", "name" : "Special Report", "_id" : "59244a96b1be5055278e9b0e" }
  ],
  "created_at" : ISODate("2018-03-27T21:44:21.412Z"),
  "created_by" : ObjectId("591345bda429e90011c1797e")
}
I believe Mongoose has a Mixed type, but how do I represent such a complex type in the Apollo GraphQL server and the Mongoose schema? Also, my resolver is currently just models.product.find(), so I need to understand what updates to make to it for such a complex type.
It would be great to get a complete solution covering the GraphQL Apollo schema, Mongoose schema and resolver for my data.
I finally found a solution for this problem.
You can declare a new type and reference it in the typeDef for the GraphQL schema.
In the Mongoose model, you can reference it as { type: Array }.
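A minimal sketch of what that can look like (all names here, such as Tag and Product, are my own for illustration, not from the question):

```javascript
// Hedged sketch: declare an extra GraphQL type and reference it from the
// document type; in the Mongoose model the flexible key is a plain Array.
// Type and field names (Tag, Product, tags) are assumptions.
const typeDefs = `
  type Tag {
    id: String
    name: String
  }

  type Product {
    item_type: String
    title: String
    tags: [Tag]
  }
`;

// Mongoose-style field definitions for the same document:
const productSchemaFields = {
  item_type: String,
  title: String,
  tags: { type: Array } // flexible, user-shaped content
};
```

The resolver itself can stay a plain models.product.find(); GraphQL only returns the fields the declared types expose.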

Spring Boot MongoDB find records by value in list

I have a news collection in my MongoDB:
{
  "_id" : ObjectId("593a97cdb17cc6535522d16a"),
  "title" : "Title",
  "text" : "Test",
  "data" : "9.06.2017, 14:39:33",
  "author" : "Admin",
  "categoryList" : [
    { "_id" : null, "text" : "category1" },
    { "_id" : null, "text" : "category2" },
    { "_id" : null, "text" : "category3" }
  ]
}
Every news record has a list of categories. I would like to find all news that have category1 in categoryList. I tried
newsRepository.findByCategoryList("category1");
but it is not working. How can I do that?
With your current repository method the generated query is

{ "categoryList" : "category1" }

What you need is

{ "categoryList.text" : "category1" }

You can create the query in two ways.
Using a derived repository method:

List<News> findByCategoryListText(String category);

Using the @Query annotation:

@Query("{'categoryList.text': ?0}")
List<News> findByCategoryList(String category);
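The difference can be illustrated in plain JavaScript (no database involved; this only mimics the matching semantics):

```javascript
// Hedged sketch: why { "categoryList": "category1" } fails while
// { "categoryList.text": "category1" } matches. The array elements are
// objects, so the comparison has to target the embedded "text" field.
const news = {
  title: "Title",
  categoryList: [{ text: "category1" }, { text: "category2" }]
};

// Analogue of { "categoryList": "category1" }: compares whole elements.
const plainMatch = news.categoryList.includes("category1"); // false

// Analogue of { "categoryList.text": "category1" }: compares the field.
const textMatch = news.categoryList.some(c => c.text === "category1"); // true
```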

Index Type.Relationship value from KeystoneJS into elastic search

I'm new to KeystoneJS and I want to index a Types.Relationship field in Elasticsearch using mongoosastic. The issue is that in Mongo, the data for a Types.Relationship field is stored as an array of ObjectIds, not the value of the actual relationship, as there may be multiple of each (see categories and platform below):
{
  "_id" : ObjectId("56d49e8ebe469b9614119259"),
  "slug" : "philips-hue",
  "title" : "Philips Hue",
  "categories" : [
    ObjectId("56d4a03abe469b961411925a"),
    ObjectId("56d4a047be469b961411925b")
  ],
  "state" : "published",
  "__v" : 3,
  "author" : ObjectId("56d499ad995a533813adef63"),
  "publishedDate" : ISODate("2016-02-29T13:00:00.000Z"),
  "manufacturer" : ObjectId("56d5eeec1ffcbb0517f07e1b"),
  "platform" : [
    ObjectId("56d5ee0ab85dbdf916007009"),
    ObjectId("56d5ee37b85dbdf91600700a"),
    ObjectId("56d5ee34uj3i223io23h2394")
  ]
}
These ObjectIds of platform have the following format:

{
  "_id" : ObjectId("56d5ee0ab85dbdf916007009"),
  "key" : "homekit",
  "name" : "Homekit",
  "__v" : 0
}
Can anyone put me on the right path to add to the elastic search index name of the ObjectIds that are referenced?
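One possible direction, as a hedged sketch in plain Node.js (no Keystone or mongoosastic APIs involved): resolve the referenced ObjectIds to their name values before handing the document to the index. In a real app the lookup map would come from a populated Mongoose query; the helper names and the second map entry below are assumptions.

```javascript
// Hedged sketch: map platform ObjectIds to their "name" values so the
// search index stores readable names instead of raw ids.
const platformsById = {
  "56d5ee0ab85dbdf916007009": { key: "homekit", name: "Homekit" },
  "56d5ee37b85dbdf91600700a": { key: "hue", name: "Hue" } // assumed entry
};

const post = {
  title: "Philips Hue",
  platform: ["56d5ee0ab85dbdf916007009", "56d5ee37b85dbdf91600700a"]
};

// Document shape to send to Elasticsearch instead of the raw post:
const indexDoc = {
  title: post.title,
  platformNames: post.platform
    .map(id => (platformsById[id] || {}).name)
    .filter(Boolean)
};

console.log(indexDoc.platformNames); // [ 'Homekit', 'Hue' ]
```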

Update object in array with new fields mongodb

I have a MongoDB document where horses is an array of objects with id, name and type:
{
  "_id" : 33333333333,
  "horses" : [
    { "id" : 72029, "name" : "Awol", "type" : "flat" },
    { "id" : 822881, "name" : "Give Us A Reason", "type" : "flat" },
    { "id" : 826474, "name" : "Arabian Revolution", "type" : "flat" }
  ]
}
I need to add new fields to one of the array elements. I tried something like this, but it did not work as I expected:

horse = {
  "place" : 1,
  "body" : 11
}
Card.where({'_id' => 33333333333}).find_and_modify({'$set' => {'horses.' + index.to_s => horse}}, upsert: true)

All existing fields are removed and the new ones are inserted. How can I add new fields while keeping the existing ones?
Indeed, this command will overwrite the whole subdocument:

'$set': {
  'horses.0': {
    "place" : 1,
    "body" : 11
  }
}

You need to set the individual fields instead:

'$set': {
  'horses.0.place': 1,
  'horses.0.body': 11
}
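The behaviour can be mimicked in plain JavaScript (no database; this only illustrates the replace-vs-merge semantics of the two $set forms):

```javascript
// Hedged sketch: setting 'horses.0' replaces the whole subdocument,
// while setting 'horses.0.place' / 'horses.0.body' merges new fields in.
const horse = { id: 72029, name: "Awol", type: "flat" };

// Analogue of {'$set': {'horses.0': {place: 1, body: 11}}}:
const replaced = { place: 1, body: 11 }; // id, name, type are gone

// Analogue of {'$set': {'horses.0.place': 1, 'horses.0.body': 11}}:
const merged = { ...horse, place: 1, body: 11 }; // existing fields survive

console.log(merged);
// → { id: 72029, name: 'Awol', type: 'flat', place: 1, body: 11 }
```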
