Elasticsearch doesn't return results for a specific term search - elasticsearch

I am attempting to run a query with a term filter for a specific value. This is the query I am attempting to run:
{
  "query": {
    "filtered": {
      "filter": {
        "term": {
          "tags": "sports"
        }
      }
    }
  },
  "sort": {
    "timestamp": "desc"
  }
}
When I run the same query with a different field (e.g. "type": "blog_post") it works, so I am confident in the syntax.
I checked that tags is properly mapped (at "http://server_name/index/_mapping") and it is.
I also checked that there are documents with "tags" : "sports" in Elasticsearch.
Any ideas what the issue could be? It is only this field; all others work, and "tags" is indexed.

What mapping/analyzer have you defined for the field "tags"? If you have not defined any, the field is analyzed with the standard analyzer, which lowercases and tokenizes the value (and a stemming analyzer such as english would additionally reduce "sports" to the stemmed token "sport").
A term search or term filter does not analyze its input; it looks for an exact match against the stored tokens. So a term search for "sports" won't match if the token actually stored for the field differs from it.
You should either change the mapping for tags to not_analyzed, or change the search query to something other than term, like a query string query.
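A minimal sketch of that mapping change (pre-5.x syntax, matching the filtered query used in the question; the index and type names are placeholders):

```
PUT my_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "tags": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
```

After reindexing with this mapping, the original term filter for "sports" matches the stored value exactly.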

Based on the use case you've described, I assume tags is mapped as an array of values. That said, the term filter can only be used for exact matches.
What I would try is a terms filter, which accepts an array of values:
"terms" : { "tags" : ["sports"] }
or, if you only need documents where the field has any value at all, an exists filter:
"exists" : { "field" : "tags" }

Related

is there match phrase any query in elasticsearch?

In Elasticsearch, the match_phrase query matches the full phrase.
The match_phrase_prefix query matches the phrase as a prefix.
For example:
"my_field": "confidence ab"
will match "confidence above" and "confidence about".
Is there a query for "match phrase any", like the example below:
"my_field": "dence ab"
that would fetch both "confidence above" and "confidence about"?
Thanks
There are 2 ways you can do this:
Store the field values as-is in ES by applying the keyword analyzer type in the mapping => do a wildcard search
(OR)
Store the field using the ngram tokenizer => then search your data according to your requirement, with or without the standard or keyword search analyzers
Note that wildcard searches are usually inefficient, especially with a leading wildcard.
Please let me know your progress with the above suggestions so that I can help you further if needed.
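A sketch of the ngram option (index name, analyzer names, and gram sizes are placeholder choices; typeless mapping syntax as used elsewhere in this answer thread):

```
PUT test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "ngram_analyzer": {
          "tokenizer": "ngram_tokenizer",
          "filter": ["lowercase"]
        }
      },
      "tokenizer": {
        "ngram_tokenizer": {
          "type": "ngram",
          "min_gram": 3,
          "max_gram": 4,
          "token_chars": ["letter", "digit"]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "my_field": {
        "type": "text",
        "analyzer": "ngram_analyzer"
      }
    }
  }
}
```

Here the same ngram analyzer runs at index and search time, so a match query for "dence ab" shares grams (e.g. "denc", "ence") with the grams stored for "confidence above" and matches without a wildcard.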
You need to define the mapping of your field to keyword like below:
PUT test
{
  "mappings": {
    "properties": {
      "name": {
        "type": "keyword"
      }
    }
  }
}
Then search over this field using wildcard like below:
GET test/_search
{
  "query": {
    "wildcard": {
      "name": {
        "value": "*dence ab*"
      }
    }
  }
}
Please let me know if you have any problem with this.
In your case, the simplest solution is using a Query string query or a Simple query string query. The latter is more lenient about query syntax errors.
First, make sure that your field is mapped with type text. The example below shows the mapping for a field named my_field under test-index.
{
  "test-index": {
    "mappings": {
      "properties": {
        "my_field": {
          "type": "text"
        }
      }
    }
  }
}
Then, for searching, use a query string query with wildcards.
{
  "query": {
    "query_string": {
      "fields": ["my_field"],
      "query": "*dence ab*"
    }
  }
}

Elastic search REST query returning more than expected

In order to search for the exact string "AGA>23/180#20210212" I've tried the match queries below:
{"query": { "match" : {"mid": "AGA>23/180#20210212"}}}
{"query": {"bool": { "must" : [ { "match" : { "mid": "AGA>23/180#23221"}}]}}}
Elasticsearch also matches "AGA>135/880#20210212" and "AGA>212/880#20210212".
So it seems the values 135 and 212 are treated like wildcards.
If I use a term query instead: {"query": { "term" : {"mid": "AGA>23/180#20210212"}}}
then 0 results are returned.
How do I search for the value "AGA>23/180#20210212" only?
The term query returns documents that contain an exact term in a provided field.
By default, the standard analyzer is used. It provides grammar-based tokenization, so for AGA>23/180#20210212 it generates the following tokens:
aga, 23, 180, 20210212
Due to this, the match query matches on "AGA>135/880#20210212" & "AGA>212/880#20210212"
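You can verify this tokenization yourself with the _analyze API:

```
GET _analyze
{
  "analyzer": "standard",
  "text": "AGA>23/180#20210212"
}
```

The response lists each token the match query will search for.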
To search for the exact term, query the mid.keyword sub-field instead (assuming you have not explicitly defined any mapping for the field, so dynamic mapping created a keyword sub-field for you). A keyword field stores the whole value as a single token instead of running it through the standard analyzer (notice the ".keyword" after the mid field). Try the query below:
{
  "query": {
    "term": {
      "mid.keyword": "AGA>23/180#20210212"
    }
  }
}
OR you can change your index mapping to
{
  "mappings": {
    "properties": {
      "mid": {
        "type": "keyword"
      }
    }
  }
}

Elasticsearch - match not_analyzed field with partial search term

I have a "name" field that is not_analyzed in my Elasticsearch index.
Let's say the value of the "name" field is "some name". My question is: if I search for "some name some_more_name someother name", will the not_analyzed field produce a match, since the search term contains "some name" in it? If not, how can I get a match for the proposed search term?
During indexing, the text of the name field is stored in the inverted index. If this field were analyzed, two terms would go into the inverted index: some and name. But as it is not analyzed, only one term is stored: some name.
During search (with a match query), your search query is analyzed and tokenized by default, so it yields several terms: some, name, some_more_name and someother. Elasticsearch then checks the inverted index for at least one term from the search query. But the only stored term is some name, so you won't see this document in the result set.
You can play with analyzers using the _analyze endpoint.
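For example, to see how the search text above is tokenized (assuming the default standard analyzer):

```
GET _analyze
{
  "analyzer": "standard",
  "text": "some name some_more_name someother name"
}
```

The response shows the individual tokens the match query will look up in the inverted index.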
Returning to your question: if you want a match for the proposed search query, your field must be analyzed.
If you need to keep a non-analyzed version as well, you should use multi-fields:
PUT my_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "name": {
          "type": "keyword",
          "fields": {
            "analyzed": {
              "type": "text"
            }
          }
        }
      }
    }
  }
}
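With that multi-field mapping in place, a match query against the analyzed sub-field finds the document (the sub-field name name.analyzed follows from the mapping above):

```
GET my_index/_search
{
  "query": {
    "match": {
      "name.analyzed": "some name some_more_name someother name"
    }
  }
}
```

The name field itself remains available for exact term lookups.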
Taras has explained it clearly, and the issue might already be resolved, but if you still can't change the mapping of your index, you can use this query (tested on ES 5.4):
GET test/_search
{
  "query": {
    "query_string": {
      "default_field": "namekey",
      "query": "*some* *name*",
      "default_operator": "OR"
    }
  }
}

ElasticSearch look through multiple fields as a fuzzy query

{
  "title": "That Uselessly Amazing Title",
  "author": "Someone you have never heard of",
  "url": "http://www.theuselessweb.com",
  "summary": "a collection of useless websites",
  "tag": ["useless", "maybe useful"]
}
Say I have a schema that looks like the one shown above. The user asks the application to show something "useless".
How do I write a query that will look through the title, summary, and tags for the word "useless" as a fuzzy search?
From the docs on the fuzzy match query:
GET /my_index/my_type/_search
{
  "query": {
    "multi_match": {
      "fields": ["summary", "title", "tag"],
      "query": "useless",
      "fuzziness": "AUTO"
    }
  }
}
This query works because it's using a multi_match query
Fuzziness works only with the basic match and multi_match queries. It
doesn’t work with phrase matching, common terms, or cross_fields
matches.
Otherwise you'll have to combine several fuzzy queries inside a bool query.
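A sketch of that bool alternative, using the field names from the schema above:

```
GET /my_index/my_type/_search
{
  "query": {
    "bool": {
      "should": [
        { "fuzzy": { "title":   { "value": "useless", "fuzziness": "AUTO" } } },
        { "fuzzy": { "summary": { "value": "useless", "fuzziness": "AUTO" } } },
        { "fuzzy": { "tag":     { "value": "useless", "fuzziness": "AUTO" } } }
      ],
      "minimum_should_match": 1
    }
  }
}
```

Note that fuzzy is a term-level query, so unlike match it does not analyze its input.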

ElasticSearch - Search for complete phrase only

I am trying to create a search that returns exactly what I requested.
For instance, let's say I have 2 documents with a field named Val.
The first doc has the value "a - Copy"; the second has "a - Copy (2)".
My goal is to search for exactly the value "a - Copy" and get back only the first document, not both of them with different similarity rankings.
When I try most of the usual queries, like:
GET test/_search
{
  "query": {
    "match": {
      "Val": {
        "query": "a - copy",
        "type": "phrase"
      }
    }
  }
}
or:
GET /test/doc/_search
{
  "query": {
    "query_string": {
      "default_field": "Val",
      "query": "a - copy"
    }
  }
}
I get both documents every time.
There is very good documentation on finding exact values in ES:
https://www.elastic.co/guide/en/elasticsearch/guide/current/_finding_exact_values.html
It shows how to use the term filter, and it mentions the problems with analyzed fields, too.
In a nutshell, you need to run a term filter like this (I've put in your values):
GET /test/doc/_search
{
  "query": {
    "filtered": {
      "query": {
        "match_all": {}
      },
      "filter": {
        "term": {
          "Val": "a - copy"
        }
      }
    }
  }
}
However, this doesn't work with analyzed fields. You won't get any results.
To prevent this from happening, we need to tell Elasticsearch that
this field contains an exact value by setting it to be not_analyzed.
There are multiple ways to achieve that, e.g. custom field mappings.
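One way to set that up (a sketch in pre-5.x syntax, matching the filtered query above; the type name doc is taken from the /test/doc/_search path):

```
PUT /test
{
  "mappings": {
    "doc": {
      "properties": {
        "Val": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
```

Note that a not_analyzed field keeps the original case, so the term filter would then have to use "a - Copy" exactly.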
Yes, you are getting both documents because your field is, most likely, analyzed and split into tokens.
You need an analyzer similar to this one:
"custom_keyword_analyzer": {
  "type": "custom",
  "tokenizer": "keyword",
  "filter": "lowercase"
}
which uses the keyword tokenizer and the lowercase filter (I noticed you indexed uppercase letters, but expect to search with lowercase letters).
Then use a term filter to search your documents.
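A sketch of how that analyzer could be wired into the index settings and applied to the field (pre-5.x syntax to match the filtered query above; index and type names are taken from the /test/doc/_search path):

```
PUT /test
{
  "settings": {
    "analysis": {
      "analyzer": {
        "custom_keyword_analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": "lowercase"
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "Val": {
          "type": "string",
          "analyzer": "custom_keyword_analyzer"
        }
      }
    }
  }
}
```

With this mapping, "a - Copy" is indexed as the single token "a - copy", so a term filter for "a - copy" matches only the first document.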
