Exact match in Elasticsearch query

I want to exactly match the string ":Feed:" in a message field, going back one day, and pull all such records. The JSON I have seems to also match the plain word "feed". I am not sure where I am going wrong. Do I need to add "constant_score" to this query JSON? The JSON I currently have is shown below:
{
  "query": {
    "bool": {
      "must": [
        {
          "query_string": {
            "fields": ["message"],
            "query": "\\:Feed\\:"
          }
        },
        {
          "range": {
            "timestamp": {
              "gte": "now-1d",
              "lte": "now"
            }
          }
        }
      ]
    }
  }
}

As stated in Finding Exact Values: since the field was analyzed when indexed, you have no way of exact-matching its tokens (the ":" characters are stripped by the analyzer). If those tokens need to be searchable, the field should be mapped as "not_analyzed" and the data re-indexed.
If you want to be able to easily match only ":feed:" inside the message field, you may want to customize an analyzer that does not tokenize on ":", so you can query the field with a simple "match" query instead of wildcards.
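For example, here is a minimal sketch of such an index. The index name my-index and the analyzer name keep_colons are made up for illustration, and the mapping uses current syntax (older versions would use type "string"); the whitespace tokenizer does not strip ":", so ":feed:" survives analysis and a plain match query can find it:
PUT my-index
{
  "settings": {
    "analysis": {
      "analyzer": {
        "keep_colons": {
          "tokenizer": "whitespace",
          "filter": ["lowercase"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "message": {
        "type": "text",
        "analyzer": "keep_colons"
      }
    }
  }
}
# after re-indexing, this should match ":Feed:" but not the bare word "feed"
GET my-index/_search
{
  "query": {
    "bool": {
      "must": [
        { "match": { "message": ":Feed:" } },
        { "range": { "timestamp": { "gte": "now-1d", "lte": "now" } } }
      ]
    }
  }
}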

This is not possible with query_string alone, but I managed to do it by creating a custom normalizer and then using a "match" or "term" query.
The following steps worked for me.
Create a custom normalizer (available since v5.2):
"settings": {
"analysis": {
"normalizer": {
"my_normalizer": {
"type": "custom",
"filter": ["lowercase"]
}
}
}
}
Create a mapping with type "keyword"
{
  "mappings": {
    "default": {
      "properties": {
        "title": {
          "type": "text",
          "fields": {
            "normalize": {
              "type": "keyword",
              "normalizer": "my_normalizer"
            },
            "keyword": {
              "type": "keyword"
            }
          }
        }
      }
    }
  }
}
Use a match or term query:
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "title.normalize": "string to match"
          }
        }
      ]
    }
  }
}

Use match_phrase:
GET /_search
{
  "query": {
    "match_phrase": {
      "message": "7000-8900"
    }
  }
}
In Java, use matchPhraseQuery from QueryBuilders:
QueryBuilders.matchPhraseQuery(fieldName, searchText);

Simple & Sweet Soln:
use term query..
GET /_search
{
  "query": {
    "term": {
      "message.keyword": "7000-8900"
    }
  }
}
Use a term query instead of match_phrase: match_phrase matches against the analyzed tokens of the stored document, so it also matches documents whose field merely contains those words. It does not require the whole field to be exactly equal to the search text; the term query on the keyword sub-field does.
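To illustrate the difference (the index name test-index is hypothetical, and message.keyword assumes the default dynamic mapping that adds a keyword sub-field), a document whose message merely contains the phrase is found by match_phrase but not by the term query:
PUT test-index/_doc/1
{
  "message": "price range is 7000-8900 per unit"
}
# match_phrase finds document 1: the phrase occurs inside the sentence
GET test-index/_search
{
  "query": {
    "match_phrase": { "message": "7000-8900" }
  }
}
# the term query on message.keyword does not: the whole field is not exactly "7000-8900"
GET test-index/_search
{
  "query": {
    "term": { "message.keyword": "7000-8900" }
  }
}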

Related

Not able to search in a compound query using an analyzer

I have a "problem" index with multiple fields, e.g. tags (a comma-separated string of tags), author, and tester. I am creating a global search where problems can be searched by all these fields at once.
I am using a bool query, e.g.:
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "author": "author_username"
          }
        },
        {
          "match": {
            "tester": "tester_username"
          }
        },
        {
          "match": {
            "tags": "<tag1,tag2>"
          }
        }
      ]
    }
  }
}
Without the analyzer I am able to get results, but it uses whitespace as the separator, e.g. "python 3" is searched as "python" or "3". I want "Python 3" to be treated as a single query term, so I created an analyzer for the tags field so that each comma-separated tag is treated as one token rather than being split on whitespace.
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "my_tokenizer"
        }
      },
      "tokenizer": {
        "my_tokenizer": {
          "type": "pattern",
          "pattern": ","
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "tags": {
        "type": "text",
        "analyzer": "my_analyzer",
        "search_analyzer": "standard"
      }
    }
  }
}
But now I am not getting any results. Please let me know what I am missing. I could not find how analyzers are used in compound queries in the documentation: https://www.elastic.co/guide/en/elasticsearch/reference/current/compound-queries.html
Adding an example:
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "author": "test1"
          }
        },
        {
          "match": {
            "tester": "test2"
          }
        },
        {
          "match": {
            "tags": "test3, abc 4"
          }
        }
      ]
    }
  }
}
The results should match all the fields, and for the tags field the query should be split on commas, not on whitespace; i.e. the query should search for the tags "test3" and "abc 4", whereas the query above searches for "test3", "abc", and "4".
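The _analyze API shows the mismatch; this assumes the settings above are applied to an index (called my-index here purely for illustration):
# my_analyzer splits only on commas, so "abc 4" stays together as one token
GET my-index/_analyze
{
  "analyzer": "my_analyzer",
  "text": "test3, abc 4"
}
# the standard analyzer splits the same text on whitespace into "test3", "abc", "4"
GET my-index/_analyze
{
  "analyzer": "standard",
  "text": "test3, abc 4"
}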
You need to either remove search_analyzer from your mapping or pass my_analyzer in the match query:
GET tags/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "match": {
            "tags": {
              "query": "python 3",
              "analyzer": "my_analyzer"
            }
          }
        }
      ]
    }
  }
}
(without the explicit analyzer parameter, the search analyzer is used)
By default, queries will use the analyzer defined in the field mapping, but this can be overridden with the search_analyzer setting.
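For the other option, here is a sketch of the same index with search_analyzer removed, so the match query analyzes the search text with my_analyzer automatically and no analyzer parameter is needed in the query:
PUT tags
{
  "settings": {
    "analysis": {
      "analyzer": {
        "my_analyzer": { "tokenizer": "my_tokenizer" }
      },
      "tokenizer": {
        "my_tokenizer": { "type": "pattern", "pattern": "," }
      }
    }
  },
  "mappings": {
    "properties": {
      "tags": {
        "type": "text",
        "analyzer": "my_analyzer"
      }
    }
  }
}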

Is it possible to do a term query with asciifolding?

I need to match the whole field, but using the lowercase and asciifolding token filters. Is this possible in Elasticsearch?
For example, I have a "Title" field for products and a product title is "Potovalni Kovček". If the user's search query is "potovalni kovcek", I need to return this product as the result, but only if the whole title matches the search query. If the search query is "potovalni", "Potovalni", or "kovcek", no results should be returned.
Can I create a term query with the lowercase and asciifolding token filters? I couldn't figure out how to do that.
What I would do is define the title field as a keyword and use a custom normalizer to do the job.
First, let's create the index:
PUT test
{
  "settings": {
    "analysis": {
      "normalizer": {
        "exact": {
          "type": "custom",
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "title": {
          "type": "keyword",
          "normalizer": "exact"
        }
      }
    }
  }
}
Then, we index a sample document:
PUT test/doc/1
{
  "title": "Potovalni Kovček"
}
Finally, we can search:
# Record 1 is returned
POST test/_search
{
  "query": {
    "term": {
      "title": "Potovalni Kovček"
    }
  }
}
# Record 1 is returned
POST test/_search
{
  "query": {
    "term": {
      "title": "potovalni kovcek"
    }
  }
}
# No record is returned
POST test/_search
{
  "query": {
    "term": {
      "title": "potovalni"
    }
  }
}
# No record is returned
POST test/_search
{
  "query": {
    "term": {
      "title": "kovcek"
    }
  }
}

Elasticsearch aggregation on a field containing spaces

I have a field called "CompanyName" whose values contain spaces, e.g. "ABC Client", "BCD CLIENT 123", "EFG CLIENT HIJ".
When I index the data I set the mapping to "index": "not_analyzed". When I run an aggregation without any other queries, it appears to work fine.
The issue is that if I first run another query and then aggregate over those results, the aggregation breaks: it splits the company names on the spaces, so it looks like the aggregation runs over tokenized values rather than over the field as I set it up.
The query:
{
  "size": 0,
  "query": {
    "filtered": {
      "filter": {
        "bool": {
          "must": [
            {
              "term": {
                "Stuff": "1"
              }
            },
            {
              "term": {
                "filename": "FileOfData.sourcedata"
              }
            }
          ]
        }
      }
    }
  },
  "aggs": {
    "users": {
      "terms": {
        "field": "CompanyName"
      }
    }
  }
}
I have also tried adding a custom analyzer using:
"analysis": {
"analyzer": {
"companynamestring": {
"type": "custom",
"tokenizer": "keyword",
"filter": "lowercase"
}
}
}
}
And it is still not working. Does anyone know how I can run a query and then get an aggregation that returns only the full, untokenized CompanyName values?
Thanks!
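For reference, here is a sketch of what explicitly mapping CompanyName as not_analyzed looks like, using the pre-5.x syntax that matches the filtered query above (index and type names are hypothetical; on 5.x and later you would map the field as keyword, or aggregate on a CompanyName.keyword sub-field). The mapping has to be in place before the documents are indexed, or the data re-indexed, otherwise the terms aggregation still runs over tokenized values:
PUT myindex/_mapping/mytype
{
  "properties": {
    "CompanyName": {
      "type": "string",
      "index": "not_analyzed"
    }
  }
}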

Elasticsearch: Unable to search with wordforms

I am trying to set up Elasticsearch; I created an index and added some records, but I cannot make it return results for word forms (for example, records containing "dreams" when I search for "dream").
My records look like this (index "myindex", type "movies"):
{
  "id": 1,
  "title": "What Dreams May Come",
  ... other fields
}
The configuration I tried to use:
{
  "settings": {
    "analysis": {
      "analyzer": {
        "stem": {
          "tokenizer": "standard",
          "filter": [
            "standard",
            "lowercase",
            "stop",
            "porter_stem"
          ]
        }
      }
    }
  },
  "mappings": {
    "movies": {
      "dynamic": true,
      "properties": {
        "title": {
          "type": "string",
          "analyzer": "stem"
        }
      }
    }
  }
}
And the query looks like this:
{
  "query": {
    "query_string": {
      "query": "Dream"
    }
  }
}
I can get results back using the word "dreams" but not "dream".
Am I doing something wrong?
Should I install porter_stem somehow first?
You haven't done anything wrong; you are just searching the wrong field.
query_string searches the _all field by default, and _all has its own analyzer.
So you either need to apply the same analyzer to _all or point your query at the title field, like below:
{
  "query": {
    "query_string": {
      "query": "dream",
      "default_field": "title"
    }
  }
}
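For the first option, here is a sketch of a mapping that gives _all the same stem analyzer. The index name is hypothetical, and _all only exists on older Elasticsearch versions, so treat this as era-appropriate syntax rather than something to copy onto a current cluster:
# settings (the analysis block from the question) omitted for brevity
PUT myindex
{
  "mappings": {
    "movies": {
      "_all": {
        "analyzer": "stem"
      },
      "properties": {
        "title": {
          "type": "string",
          "analyzer": "stem"
        }
      }
    }
  }
}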

Elasticsearch exact matches when query text is a substring

I have data in my Elasticsearch with a field
PUT /logs/visited_domains/1
{
  "visited_domain": "microsoft.com"
}
PUT /logs/visited_domains/2
{
  "visited_domain": "not-microsoft.com"
}
The mapping is:
{
  "properties": {
    "visited_domain": {
      "type": "string",
      "index": "not_analyzed"
    }
  }
}
When I run a search like:
{
  "query": {
    "filtered": {
      "filter": {
        "term": {
          "visited_domain": "microsoft.com"
        }
      }
    }
  }
}
I get both results back, but I only want the exact match. Any ideas on how I can alter the query or improve the mapping?
EDIT: I changed one of my examples from notmicrosoft.com to not-microsoft.com because the dash is causing a lot of the trouble. notmicrosoft.com is not returned, but not-microsoft.com is, when searching for microsoft.com.
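The _analyze API makes the cause visible. The request below uses the newer body syntax (older versions take analyzer and text as query-string parameters); the standard analyzer typically splits "not-microsoft.com" into the tokens "not" and "microsoft.com", which is why an analyzed search for "microsoft.com" also hits that document and suggests the not_analyzed mapping was not actually applied to the indexed data:
POST /_analyze
{
  "analyzer": "standard",
  "text": "not-microsoft.com"
}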
Use query_string, which gives an exact match when used with quotes:
"query": {
"query_string": {
"default_field": "visited_domain",
"query": "\"microsoft.com\""
}
}
