Elastic search query using python list - elasticsearch

How do I pass a list as the query string to a match_phrase query?
This works:
{"match_phrase": {"requestParameters.bucketName": {"query": "xxx"}}},
This does not:
{
"match_phrase": {
"requestParameters.bucketName": {
"query": [
"auditloggingnew2232",
"config-bucket-123",
"web-servers",
"esbck-essnap-1djjegwy9fvyl",
"tempexpo",
]
}
}
}

match_phrase simply does not support multiple values.
You can either use a should query:
GET _search
{
"query": {
"bool": {
"should": [
{
"match_phrase": {
"requestParameters.bucketName": {
"value": "auditloggingnew2232"
}
}
},
{
"match_phrase": {
"requestParameters.bucketName": {
"value": "config-bucket-123"
}
}
}
]
},
...
}
}
or, as @Val pointed out, a terms query:
{
"query": {
"terms": {
"requestParameters.bucketName": [
"auditloggingnew2232",
"config-bucket-123",
"web-servers",
"esbck-essnap-1djjegwy9fvyl",
"tempexpo"
]
}
}
}
which functions like an OR on exact terms.
I'm assuming that 1) the bucket names in question are unique and 2) you're not looking for partial matches. If that's the case, and there is little to no analysis applied to the bucketName field, match_phrase may not even be needed; terms will do just fine. The difference between term and match_phrase queries is nicely explained here.
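Since the original question is about passing a Python list, here is a minimal sketch of feeding that list straight into the terms query with the official Python client (the host and index name are assumptions, adjust to your setup):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # adjust host as needed

bucket_names = [
    "auditloggingnew2232",
    "config-bucket-123",
    "web-servers",
    "esbck-essnap-1djjegwy9fvyl",
    "tempexpo",
]

# The Python list is passed as-is as the value of the terms query.
# Depending on your mapping, the exact-match field may be
# requestParameters.bucketName.keyword instead.
resp = es.search(
    index="cloudtrail-logs",  # hypothetical index name
    body={"query": {"terms": {"requestParameters.bucketName": bucket_names}}},
)
print(resp["hits"]["total"])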

Related

multi fields search query for elasticsearch golang

I have a situation where I need to do an Elasticsearch query based on multiple fields. For example: I have multiple fields in my postindex and I want to apply a condition on four of these fields (i.e. userid, channelid, createat, teamid) to meet my search requirement. When the values of all these fields match, the search query should return results; if one of them does not match the values in postindex, it should return no results.
I am trying to build a multi-field search query with go-elasticsearch to search data from my post index. For the search query result, all four fields must match, otherwise it should return 0 hits / no results.
So, I think you need to write the following query:
GET postindex/_search
{
"query": {
"bool": {
"minimum_should_match": 1,
"should": [
{
"bool": {
"must": [
{
"term": {
"userid": {
"value": "mcqmycxpyjrddkie9mr13txaqe"
}
}
},
{
"term": {
"channelid": {
"value": "dnoihmrinins3qrm6bb9175ume"
}
}
},
{
"range": {
"createat": {
"gt": 1672909114890
}
}
}
]
}
},
{
"term": {
"teamid": {
"value": "qomrg11o8b8ijxoy8hrcnweoay"
}
}
}
]
}
}
}
Here there is a bool query with should at the parent scope, which acts like OR, and inside the should there is another bool query with must, which acts like AND. The query could be written more compactly, but this form should be easier to understand.
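The question itself is about go-elasticsearch, where this JSON body is passed to the search request as-is. Purely to illustrate the nesting, here is an equivalent structure built with the Python elasticsearch_dsl helper (field names and values taken from the question; client setup omitted):

from elasticsearch_dsl import Q, Search

# inner AND block: userid AND channelid AND createat range
inner = Q("bool", must=[
    Q("term", userid="mcqmycxpyjrddkie9mr13txaqe"),
    Q("term", channelid="dnoihmrinins3qrm6bb9175ume"),
    Q("range", createat={"gt": 1672909114890}),
])

# outer OR block: (inner AND block) OR teamid
query = Q("bool",
          should=[inner, Q("term", teamid="qomrg11o8b8ijxoy8hrcnweoay")],
          minimum_should_match=1)

s = Search(index="postindex").query(query)  # call .using(client) before executing
print(s.to_dict())  # prints a body equivalent to the query above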

Elastic Search 7.9: exact on multiple fields

I am using the latest version of Elasticsearch (7.9) and I'm trying to find a good way to do a multiple-term query as an AND.
Essentially what I want to do is:
select * where field1 === 'word' AND field2 === 'different word'
I am currently using term to do an exact keyword match on field1, but adding in the second field is causing me some jip.
{
"query": {
"term": {
"field1": {
"value": "word"
}
}
}
}
This is my current query; I have tried using bool. I came across answers for previous versions where I could maybe use a filtered query, but I can't seem to get that to work either.
Can someone help me please? It's really doing my nut.
EDIT
Here is what I've tried with a bool / must with multiple terms, but I get no results even though I know that in this case this query should return data:
{
"query": {
"bool": {
"must": [
{
"term": {
"field1": {
"value": "word"
}
}
},
{
"term": {
"field2": {
"value": "other word"
}
}
}
]
}
}
}
OK, so, fun fact: you can do the bool / must as I have tried. What you should remember is that keyword matches (which is what I'm using) are case-sensitive. ES doesn't do any analyzing or filtering of the data when you set the type of a field to keyword.
It's also possible to use a bool with a filter to get the same results (when you factor in the type):
{
"query": {
"bool": {
"must":
{
"term": {
"field1": {
"value": "word"
}
}
},
"filter":
{
"term": {
"field2": {
"value": "other word"
}
}
}
}
}
}
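For completeness, a rough sketch of the same AND of two exact matches from Python with elasticsearch_dsl (the index name and client setup are assumptions); bear in mind the case-sensitivity caveat above, since term queries against keyword fields are not analyzed:

from elasticsearch import Elasticsearch
from elasticsearch_dsl import Search

client = Elasticsearch("http://localhost:9200")   # adjust as needed

# .query() adds to the scored must context, .filter() adds to the filter context
s = (Search(using=client, index="my-index")       # hypothetical index name
     .query("term", field1="word")
     .filter("term", field2="other word"))

for hit in s.execute():
    print(hit.to_dict())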

How to transform a Kibana query to `elasticsearch_dsl` query

I have a query
GET index/_search
{
"query": {
"bool": {
"should": [
{
"match": {
"key1": "value"
}
},
{
"wildcard": {
"key2": "*match*"
}
}
]
}
}
}
I want to make the same call with the elasticsearch_dsl package.
I tried with
s = Search(index=index).query({
"bool": {
"should": [
{
"match": {
"key1": "value"
}
},
{
"wildcard": {
"key2": "*match*"
}
}
]
}
})
s.using(self.client).scan()
But the results are not the same. Am I missing something here?
Is there a way to represent my query with elasticsearch_dsl?
I tried this, with no results:
s = Search(index=index).query('wildcard', key2='*match*').query('match', key1=value)
s.using(self.client).scan()
It seems to me that you forgot the stars in the query:
s = Search(index=index).query('wildcard', key='*match*').query('match', key=value)
This query worked for me
s = (Search(index=index)
     .query('match', key1=value)
     .query('wildcard', key2='*match*')
     .source(fields))
Also, if the key has an underscore in it, like key_1, Elasticsearch behaves differently and the query matches results which do not actually match your query. So try to choose keys which do not have underscores.
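One more thing worth noting: chaining .query() calls combines the clauses with AND (a bool must), whereas the original Kibana query uses should (OR). If the OR semantics matter, a sketch with Q objects (field names from the question; index and client as in the question) preserves them:

from elasticsearch_dsl import Q, Search

q = Q("bool", should=[
    Q("match", key1="value"),
    Q("wildcard", key2="*match*"),
])

s = Search(index=index).query(q)
for hit in s.using(client).scan():  # client: an Elasticsearch instance
    print(hit.to_dict())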

Elasticsearch : filter results based on the date range

I'm using Elasticsearch 6.6, trying to extract multiple results/records based on multiple values (email_address) passed to the query (bool) over a date range. For example: I want to extract information about a few employees based on their email_address (annie@test.com, charles@test.com, heman@test.com) and from the period, i.e. project_date (2019-01-01).
I did use a should expression, but unfortunately it's pulling all the records from Elasticsearch based on the date range, i.e. it's even pulling other employees' information from project_date 2019-01-01.
{
"query": {
"bool": {
"should": [
{ "match": { "email_address": "annie#test.com" }},
{ "match": { "email_address": "chalavadi#test.com" }}
],
"filter": [
{ "range": { "project_date": { "gte": "2019-08-01" }}}
]
}
}
}
I also tried a must expression but am getting no results. Could you please help me with finding employees using their email_address within the date range?
Thanks in advance.
Should (OR) clauses are optional
Quoting from this article.
"In a query, if must and filter queries are present, the should query occurrence then helps to influence the score. However, if bool query is in a filter context or has neither must nor filter queries, then at least one of the should queries must match a document."
So in your query, should only influences the score and does not actually filter the documents. You must wrap the should inside a must, or move it into filter (if scoring is not required).
GET employeeindex/_search
{
"query": {
"bool": {
"filter": {
"range": {
"projectdate": {
"gte": "2019-01-01"
}
}
},
"must": [
{
"bool": {
"should": [
{
"term": {
"email.raw": "abc#text.com"
}
},
{
"term": {
"email.raw": "efg#text.com"
}
}
]
}
}
]
}
}
}
You can also replace the should clause with a terms clause, as in @AlwaysSunny's answer.
You can do it with terms and range along with your existing query inside filter, in a shorter way. Your existing query doesn't work as expected because of the should clause; it makes your filter weaker. Read more here:
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-bool-query.html
{
"query": {
"bool": {
"filter": [
{
"terms": {
"email_address.keyword": [
"annie#test.com", "chalavedi#test.com"
]
}
},
{
"range": {
"project_date": {
"gte": "2019-08-01"
}
}
}
]
}
}
}
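As a sketch from Python, the list of addresses can again come straight from a Python list; here with elasticsearch_dsl, whose .filter() method puts both clauses into the bool filter context (index name assumed, field names per the question, client setup omitted):

from elasticsearch_dsl import Search

emails = ["annie@test.com", "chalavadi@test.com"]

s = (Search(index="employees")   # hypothetical index name
     .filter("terms", email_address__keyword=emails)  # double underscore maps to email_address.keyword
     .filter("range", project_date={"gte": "2019-08-01"}))

print(s.to_dict())  # prints a body equivalent to the query above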

Elasticsearch terms query on array of values

I have data in an Elasticsearch index that looks like this:
{
"title": "cubilia",
"people": [
"Ling Deponte",
"Dana Madin",
"Shameka Woodard",
"Bennie Craddock",
"Sandie Bakker"
]
}
Is there a way for me to do a search for all the people whose name starts with
"ling" (should be case-insensitive) and get distinct terms, properly cased ("Ling Deponte", not "ling deponte")?
I am fine with changing mappings on the index in any way.
Edit: this does what I want, but it is a really bad query:
{
"size": 0,
"aggs": {
"person": {
"filter": {
"bool":{
"should":[
{"regexp":{
"people.raw":"(.* )?[lL][iI][nN][gG].*"
}}
]}
},
"aggs": {
"top-colors": {
"terms": {
"size":10,
"field": "people.raw",
"include":
{
"pattern": ["(.* )?[lL][iI][nN][gG].*"]
}
}
}
}
}
}
}
people.raw is not_analyzed
Yes, and you can do it without a regular expression by taking advantage of Elasticsearch's full text capabilities.
GET /test/_search
{
"query": {
"match_phrase": {
"people": "Ling"
}
}
}
Note: This could also be match or match_phrase_prefix in this case. The match_phrase* queries imply an order of the values in the text. match simply looks for any of the values. Since you only have one value, it's pretty much irrelevant.
The problem is that you cannot limit the document responses to just that name because the search API returns documents. With that said, you can use nested documents and get the desired behavior via inner_hits.
You want to avoid leading-wildcard patterns whenever possible because they simply do not work at scale. To put it in SQL terms, that's like doing a full table scan; you effectively lose the benefit of the inverted index because it has to be walked entirely to find the actual start.
Combining the two should work pretty well though. Here, I use the query to whittle down results to what you are interested in, then I use your inner aggregation to only include buckets based on the value.
{
"size": 0,
"query": {
"match_phrase": {
"people": "Ling"
}
},
"aggs": {
"person": {
"terms": {
"size":10,
"field": "people.raw",
"include": {
"pattern": ["(.* )?[lL][iI][nN][gG].*"]
}
}
}
}
}
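To read the properly cased names back you only need the aggregation buckets, not the hits. A rough sketch of running it from Python (the index name is an assumption; note that on recent Elasticsearch versions the terms aggregation's include takes the regex as a plain string rather than the older object form shown above):

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # adjust as needed

body = {
    "size": 0,
    "query": {"match_phrase": {"people": "Ling"}},
    "aggs": {
        "person": {
            "terms": {
                "size": 10,
                "field": "people.raw",
                "include": "(.* )?[lL][iI][nN][gG].*",
            }
        }
    },
}

resp = es.search(index="people-index", body=body)  # hypothetical index name
for bucket in resp["aggregations"]["person"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])  # e.g. "Ling Deponte" 1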
Hi, please find the query below; it may help with your request.
GET skills/skill/_search
{
"query": {
"filtered": {
"query": {
"match_all": {}
},
"filter": {
"bool": {
"must": [
{
"wildcard": {
"skillNames.raw": "jav*"
}
}
]
}
}
}
}
}
My intention is to find documents where the value starts with "jav".
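Note that the filtered query used above was removed in Elasticsearch 5.0; on the versions discussed elsewhere on this page (6.x/7.x), the equivalent is a bool query with the wildcard in a filter clause. A rough sketch with the Python elasticsearch_dsl client (field and index names taken from the answer above):

from elasticsearch_dsl import Search

# bool + filter replaces the removed filtered query
s = Search(index="skills").filter("wildcard", skillNames__raw="jav*")
print(s.to_dict())  # {"query": {"bool": {"filter": [{"wildcard": {"skillNames.raw": "jav*"}}]}}}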
