How to search in Kibana (Lucene syntax) for values containing "?"? - elasticsearch

I am using ELK and I need to filter all the documents with an unmatched COUNTRY (from geoip).
These properties look like:
'IPCOUNTRY': '??'
But I just can't filter on this special value...
I tried:
IPCOUNTRY:?? => ? is evaluated as a wildcard > returns all records > expected behaviour
IPCOUNTRY:\?\? => doesn't return any document... even though the Lucene documentation says this should be the right way to do it
IPCOUNTRY:"??" => doesn't work
IPCOUNTRY:'??' => doesn't work
EDIT:
This doesn't work either:
- IPCOUNTRY:/[^A-Z]{2}/
Simple but annoying issue ^^
Thanks!

You could try:
!IPCOUNTRY:"?"
-IPCOUNTRY:"?"
NOT IPCOUNTRY:"?"
If you have an unanalyzed IPCOUNTRY field, you can do something like:
!IPCOUNTRY.raw:"??"
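For reference, the same exclusion in the query DSL would be a term query on the unanalyzed sub-field, since term queries bypass analysis entirely. A minimal sketch, assuming a Logstash-style index pattern and a keyword-mapped IPCOUNTRY.raw sub-field (both illustrative):

# term queries are not analyzed, so the literal ?? is matched as-is
GET logstash-*/_search
{
  "query": {
    "bool": {
      "must_not": {
        "term": { "IPCOUNTRY.raw": "??" }
      }
    }
  }
}

Drop the must_not wrapper to match the documents with '??' instead of excluding them.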

This is an Elasticsearch mapping issue. Punctuation is dropped by the default analyzer. You'll need to set your field to an analyzer that keeps ?. Maybe keyword? Or not_analyzed?
Extract from https://github.com/elastic/kibana/issues/6561#issuecomment-197951710
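A minimal sketch of such a mapping (the index name is illustrative; "type": "keyword" is the Elasticsearch 5+ way, while older versions used a string field with "index": "not_analyzed"):

# keyword fields are indexed verbatim, punctuation included
PUT my-index
{
  "mappings": {
    "properties": {
      "IPCOUNTRY": { "type": "keyword" }
    }
  }
}

With the value indexed verbatim, a query like IPCOUNTRY:"??" can then match the literal punctuation.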

If all of your documents have this field populated like 'IPCOUNTRY': '??', then you can directly add a filter on this field to include or exclude those documents from the matches.
To add a filter directly, you can do it in the following two ways:
In the Discover page, expand a document and find the field, then click the + magnifier to add its value as a filter.
In the Discover page, in the field list on the left side, click on the field name and select the value displayed as ?? to add it as a filter.
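For reference, the filter Discover builds this way boils down to a phrase filter on the field. A sketch of roughly what gets sent (only the IPCOUNTRY name comes from the question, and this only matches the literal ?? if the field preserves punctuation, e.g. keyword/not_analyzed):

GET _search
{
  "query": {
    "bool": {
      "filter": {
        "match_phrase": { "IPCOUNTRY": "??" }
      }
    }
  }
}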

Related

Control which multi fields are queried by default

I have a preexisting index that contains field mappings and is currently being queried by many applications. I would like to add additional ways for the data to be queried, specifically, support full text search via analysis. Multi-fields seemed like the obvious way to do this, but I found that adding new multi-fields actually changes the existing query behavior.
For example, I have an "id" field that is a keyword. Applications are already using this field to query on. After I add a new multi-field, like "txt" (using the standard analyzer), new documents can be found by querying with just a partial value match. Values for "id" look like this: "123-abc", so now a query for just "abc" will match when querying against the "id" field. This is not how it worked previously (the keyword-only field would require the entire value "123-abc").
Ideally, the top-level "id" field would be keyword only, and if a "full text" search was required, the query would need to specify "id.txt". So my question is... is there a way to disable multi-fields and require that the query explicitly set a sub-field when needed?
My only other thought on how to solve this was to use copy_to so that these fields are completely distinct... but that is a bit more work, and there are many, many fields that would require this.
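For concreteness, here is a sketch of the multi-field mapping being described, with names taken from the question (the syntax assumes a recent Elasticsearch version):

# "id" stays keyword; "id.txt" is the analyzed multi-field
PUT my-index
{
  "mappings": {
    "properties": {
      "id": {
        "type": "keyword",
        "fields": {
          "txt": { "type": "text", "analyzer": "standard" }
        }
      }
    }
  }
}

The intent is that a term query on "id" still requires the full "123-abc", while analyzed full-text queries have to name "id.txt" explicitly; the problem described above is that some query types expand to the multi-field automatically.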

Elasticsearch doesn't find values containing "-"

I ran into an issue where Elasticsearch is not able to find indexed categories that contain a "-" in their slug. The category given is "wc-sitze". When I query for "wc", the category is found, and when I query for "sitze", it is also found; however, when I query for the whole thing, "wc-sitze", it is not found. I have checked with several other categories, and it seems that it is always the "-" that is messing it up.
Any ideas?
Thanks in advance
Looks like you are using a text field with the default standard analyzer, which strips out special chars like -. You need to change the analyzer to make it work.
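One hedged way to get exact matches without changing the analyzed field is a keyword sub-field queried with a term query. A sketch, where the index and field names (categories, slug, raw) are assumptions; only the "wc-sitze" value comes from the question:

PUT categories
{
  "mappings": {
    "properties": {
      "slug": {
        "type": "text",
        "fields": {
          "raw": { "type": "keyword" }
        }
      }
    }
  }
}

# term queries are not analyzed, so the - survives intact
GET categories/_search
{
  "query": {
    "term": { "slug.raw": "wc-sitze" }
  }
}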

Multiple Field search in Elasticsearch

How can we do a multiple-field search in Elasticsearch?
For example, I want to search on subcategory and region; it is working for one field, but how do we do a multiple-field search?
The link below works fine, since I am searching on only one field:
http://34c512ba34534fffdfd12abfd69f2458.us-east-1.aws.found.io:9200/episodes/episode/_search?q=sub_cat_seo_url:english-news&sort=pubdate_timestamp:desc
But when I try to search on multiple fields, for example sub_cat_seo_url and region, it is not working.
See these links (not working):
http://34c512ba34534fffdfd12abfd69f2458.us-east-1.aws.found.io:9200/episodes/episode/_search?q=sub_cat_seo_url:english-news,region:1&sort=pubdate_timestamp:desc
http://34c512ba34534fffdfd12abfd69f2458.us-east-1.aws.found.io:9200/episodes/episode/_search?q=sub_cat_seo_url:english-news&region:1&sort=pubdate_timestamp:desc
According to the documentation, it should work.
See http://www.elasticsearch.org/guide/reference/query-dsl/query-string-query.html
That being said, you can also use the following:
http://34c512ba34534fffdfd12abfd69f2458.us-east-1.aws.found.io:9200/episodes/episode/_search?q=%2Bsub_cat_seo_url%3Aenglish-news+%2Bregion%3A1&sort=pubdate_timestamp:desc
NOTE:
The existing mapping makes your field "sub_cat_seo_url" analyzed, using the standard analyzer. Hence, when you search for "english-news" it gets tokenized into "english" and "news", which means any document matching either "english" or "news" is a valid match. For example, "telugu-news" is a valid match for your query. Not sure if that is intentional.
In your mapping you need to mark it as "not_analyzed" for an exact match.
Note: %2B is decoded as '+' whereas '+' is decoded as ' ' (a space).
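The same multi-field search expressed in the query DSL, as a sketch (field names come from the question; the match on the analyzed sub_cat_seo_url keeps the tokenization caveat described above):

# bool/must requires both conditions, like the + operator in the query string
GET episodes/episode/_search
{
  "query": {
    "bool": {
      "must": [
        { "match": { "sub_cat_seo_url": "english-news" } },
        { "term": { "region": 1 } }
      ]
    }
  },
  "sort": [
    { "pubdate_timestamp": { "order": "desc" } }
  ]
}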

How to add a numeric filter on a Kibana dashboard?

I have a field that contains numbers. I want a filter that shows all logs that are less than a constant value.
When I try to add a new query filter, all I can see is a query string option.
If you are talking about the query field, a syntax like this works:
field:<10
This will find just the records with a field value less than 10. I found this by experimentation one day -- I don't know if it's documented anywhere.
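Under the hood, that shorthand corresponds to a range query. A minimal sketch of the same filter in the query DSL (the field name is a placeholder):

GET _search
{
  "query": {
    "range": {
      "field": { "lt": 10 }
    }
  }
}

In the Lucene-style query bar, field:[* TO 10} should be an equivalent spelling of the same exclusive upper bound.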

Elasticsearch not searching some fields

I have just updated a website; the update adds new fields to Elasticsearch.
In my dev environment it all works fine, but on the live site the new fields are not being found.
E.g. I have added a new field with the value 1.
However, when adding a filtered query of
{"field":1}
it does not find any matching results.
When I look in the documents, I can see docs with the field set to 1.
Would the reason for this be that the new field was added after the mapping was set? I am not all that familiar with Elasticsearch, so I am not really sure where to start looking to fix it.
Any help would be appreciated.
Update:
Querying from the URL shows nothing either:
_search/?pretty=true&size=50&q=field1:*
However, there is another field that was added at the same time which I can search on.
I can see field1 in the result set, but it just won't allow me to search on it.
The only difference I see in the mapping is that the field that is working is set to type:long, whereas the one not working is set as type:string.
Is it a length issue on the ngram? What was your "min_gram" setting?
You can check your index settings like this:
GET <host>/<index_name>/_settings
Does it work when you filter for a two-digit field value?
Are all the field values one digit?
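To see what tokens a one-digit value actually produces, the _analyze API can help. A sketch in the same placeholder style (the analyzer name is whatever your mapping assigns to the field; on older versions these parameters may need to be passed as query-string arguments instead of a body):

# shows the tokens the analyzer emits for the value "1"
GET <host>/<index_name>/_analyze
{
  "analyzer": "<your_ngram_analyzer>",
  "text": "1"
}

If min_gram is 2, a single-character value emits no tokens at all, so it could never be matched.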
It's OK to add a field after the mapping was set. Elasticsearch will guess the mapping for you (in fact, it's one of its selling features --- no need to define the mapping, just throw the data at it).
There are a few things that can go wrong:
Verify that the data is actually in the index. To do that, just navigate to the _search URL with no parameters; you should see the field if it is indexed.
Look at your mapping. Could it be that the field is explicitly set not to be indexed?
Another possibility is that your query is wrong (but that is unlikely, since you say it works in the development environment).
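Both checks can be done from the URL, in the same placeholder style as above:

# confirm documents (and the field) are actually in the index
GET <host>/<index_name>/_search?pretty=true

# inspect the mapping; e.g. "index": "no" on field1 would make it unsearchable
GET <host>/<index_name>/_mapping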
