$pull and $push Query in Elasticsearch

MongoDB supports the $pull and $push operators to remove or add an element in an array. Is a similar type of query available in Elasticsearch that can be used to add or remove an element from an array?

You can use scripts for this and update your documents with the _update_by_query API:
POST <index>/_update_by_query
{
  "script": {
    "inline": "ctx._source.<name_of_array>.add(params.<name_of_variable>)",
    "params": {"<name_of_variable>": 1}
  },
  "query": {
    // specify a query here if you only want to update filtered documents
  }
}
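For the $pull-like case (removing an element), the same approach works with a script that removes the value instead of adding it. A minimal sketch, assuming a hypothetical index my_index with a numeric tags array; note it removes only the first occurrence of the value:
POST my_index/_update_by_query
{
  "script": {
    "lang": "painless",
    "source": "if (ctx._source.tags.contains(params.value)) { ctx._source.tags.remove(ctx._source.tags.indexOf(params.value)) }",
    "params": {"value": 1}
  },
  "query": {
    "match_all": {}
  }
}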

Related

Insert data when no match by update_by_query in Elasticsearch

I have this command that doesn't match any data in Elasticsearch, and I want to insert the document when that happens.
POST localhost:9200/my_index/my_topic/_update_by_query
{
  "script": {
    "source": "ctx._source.NAME = params.NAME",
    "lang": "painless",
    "params": {
      "NAME": "kevin"
    }
  },
  "query": {
    "terms": {
      "_id": [
        999
      ]
    }
  }
}
I tried using upsert but it returns the error Unknown key for a START_OBJECT in [upsert].
I don't want to use update + doc_as_upsert because I have a case where I won't send an id in my update query.
How can I insert this with update_by_query? Thank you.
If Elasticsearch doesn't support this, I think I will check whether I have an id or not, and use the Index API to create and the Update API to update.
_update_by_query runs on existing documents contained in an existing index. What _update_by_query does is scroll over all documents in your index (that optionally match a query) and perform some logic on each of them via a script or an ingest pipeline.
Hence, logically, you cannot create/upsert data that doesn't already exist in the index. The Index API will always overwrite your document. Upsert only works in conjunction with the _update endpoint, which is probably what you should use.
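A minimal sketch of that _update + upsert call, reusing the index, type, id, and field from the question: if document 999 does not exist yet, the upsert body is indexed as a new document; otherwise the script runs against the existing one.
POST my_index/my_topic/999/_update
{
  "script": {
    "source": "ctx._source.NAME = params.NAME",
    "lang": "painless",
    "params": {
      "NAME": "kevin"
    }
  },
  "upsert": {
    "NAME": "kevin"
  }
}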

Filtering documents by an unknown value of a field

I'm trying to create a query to filter my documents by one (it can be any one) value of a field (in my case "host.name"). The point is that I don't know the unique values of this field in advance. I need to find them and choose one to be used in the query.
I tried the query below using a Painless script, but I haven't been able to achieve the goal.
{
  "sort": [{"@timestamp": "desc"}, {"host.name": "asc"}],
  "query": {
    "bool": {
      "filter": {
        "script": {
          "script": {
            "source": """
              String k = doc['host.name'][0];
              return doc['host.name'].value == k;
            """,
            "lang": "painless"
          }
        }
      }
    }
  }
}
I'd appreciate it if anyone could help me improve this idea or suggest a new one.
TL;DR you can't.
The script query context operates on one document at a time, so you won't have access to the other docs' field values. You could use a scripted_metric aggregation, which does allow iterating through all docs, but it's just that -- an aggregation -- and not a query.
I'd suggest first running a simple terms agg to figure out what values you're working with and then building your queries accordingly.
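A sketch of that first step, assuming host.name is indexed as a keyword field (so it has doc values to aggregate on):
GET my_index/_search
{
  "size": 0,
  "aggs": {
    "host_names": {
      "terms": {
        "field": "host.name",
        "size": 100
      }
    }
  }
}
Pick one of the returned keys and use it in a plain term query, e.g. "term": {"host.name": "<picked value>"}.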

Multi_match elasticsearch on all fields with boost to specific fields

I am using Elastic 6.1+
I have created an index and added some values to it; the index mapping contains text and number fields.
I want to create a multi_match on all of the fields in the index, query a text or a number, and get the results back.
I would also like the score of field1 in the index to be boosted.
For some reason, once I add the fields array it only searches on those fields (I added it in order to define which field I want to boost and by how much), and if I add "*" as a field to the fields array it returns an error.
GET MyIndex/_search
{
  "query": {
    "multi_match": {
      "query": "test1",
      "fields": [
        "field1^3",
        "*"
      ]
    }
  }
}
Thank you
Apparently adding
"lenient": true
to the query solved the problem
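For reference, the query from the question with that flag added; lenient ignores format-based failures such as a text query string being run against a numeric field, which is what the "*" expansion was tripping over:
GET MyIndex/_search
{
  "query": {
    "multi_match": {
      "query": "test1",
      "fields": [
        "field1^3",
        "*"
      ],
      "lenient": true
    }
  }
}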

Elasticsearch DSL Query for Update

I understand that I can update a particular document via http://localhost:9200/[index_name]/[index_type]/[_id], but I have documents where the _id contains # symbols, which Sense couldn't find.
I understand that the Query DSL can perform a search where I indicate the _id outside of the URL.
Resource: https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-ids-query.html
Can I check with you: how can I do the same for updating a document?
If you don't want to put the ID in the URL, the only option you have is to use the update by query API, like this:
POST index/_update_by_query
{
  "query": {
    "ids": {
      "values": ["2323#23423"]
    }
  },
  "script": {
    "source": "do some update here"
  }
}
URL-encode the # in the ID, using %23 in place of #: localhost:9200/index/type/ID%23.
So if the _id is 10#, the URL will look like localhost:9200/index/type/10%23
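For instance, updating the document with _id 2323#23423 from the first answer via curl (the index, type, and doc body here are placeholders):
curl -XPOST "localhost:9200/my_index/my_type/2323%2323423/_update" -H "Content-Type: application/json" -d '
{
  "doc": { "some_field": "new value" }
}'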

Update all documents of Elasticsearch using existing column value

I have a field "published_date" in Elasticsearch that holds a full date like yyyy-MM-dd'T'HH:mm:ss.
I want to create 3 more columns for year, month and day, and I have to use the existing published_date to populate the 3 new columns.
Is there any built-in API to do this kind of work in ES? I am using Elasticsearch 5.
You can use the update-by-query API in order to do this. It would simply boil down to running something like this:
POST your_index/_update_by_query
{
  "script": {
    "inline": "ctx._source.year = ctx._source.published_date.date.getYear(); ctx._source.month = ctx._source.published_date.date.getMonthOfYear(); ctx._source.day = ctx._source.published_date.date.getDayOfMonth();",
    "lang": "groovy"
  },
  "query": {
    "match_all": {}
  }
}
Also note that you need to enable dynamic scripting in order for this to work.
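On 5.x, where dynamic Groovy scripting is disabled by default, a Painless version of the same idea might look like this. This is only a sketch: it assumes published_date is stored as a yyyy-MM-dd'T'HH:mm:ss string and that java.time.LocalDateTime is available in the Painless whitelist of your version.
POST your_index/_update_by_query
{
  "script": {
    "lang": "painless",
    "inline": "LocalDateTime d = LocalDateTime.parse(ctx._source.published_date); ctx._source.year = d.getYear(); ctx._source.month = d.getMonthValue(); ctx._source.day = d.getDayOfMonth();"
  },
  "query": {
    "match_all": {}
  }
}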
