Searching for error codes with regex pattern in Kibana - elasticsearch

I am trying to search my logs in the Kibana UI search bar for error codes that consist of:
a fixed 3-digit string
a minus sign
a 5-digit number
e.g. TED-12345. The error codes can be located anywhere inside the message field.
I tried the following regex, message: /.*TED-[0-9]{5}.*/, but it did not return the expected results. I probably mixed up query and "search bar" syntax. Can anybody suggest the correct regex?

Please make sure you have the Lucene syntax enabled for your queries, because Kibana Query Language does not support regular expressions.
From docs: https://www.elastic.co/guide/en/kibana/master/kuery-query.html
KQL has a different set of features than the Lucene query syntax. KQL is able to query nested fields and scripted fields. KQL does not support regular expressions or searching with fuzzy terms. To use the legacy Lucene syntax, click KQL next to the Search field, and then turn off KQL.
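For reference, with Lucene syntax selected you can also run the equivalent regexp query through the Query DSL. The sketch below assumes a hypothetical index name (my-logs) and that the message field also has an unanalyzed message.keyword sub-field, neither of which is given in the question:

GET /my-logs/_search
{
  "query": {
    "regexp": {
      "message.keyword": ".*TED-[0-9]{5}.*"
    }
  }
}

The leading and trailing .* are needed because Lucene regular expressions are anchored to the entire field value.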

Related

How to search for an exact word in a text in Elasticsearch

Let's say I have two texts:
Text 1 - "The fox has been living in the wood cabin for days."
Text 2 - "The wooden hammer is a dangerous weapon."
And I would like to search for the word "wood" without it matching "wooden hammer". How would I do that in Elasticsearch or NEST?
The term query is used for exact-match searches. However, it's not recommended against text fields; the following quote is from the term query documentation:
To better search text fields, the match query also analyzes your provided search term before performing a search. This means the match query can search text fields for analyzed tokens rather than an exact term.
The term query does not analyze the search term. The term query only searches for the exact term you provide. This means the term query may return poor or no results when searching text fields.
The problem with exact matches on text, as described in the term query documentation:
By default, Elasticsearch changes the values of text fields as part of analysis. This can make finding exact matches for text field values difficult.
So the document data is modified (i.e., analyzed) before indexing. How it is analyzed depends on the index mapping definition for each field; if no analyzer is specified, the default index analyzer is used, which falls back to the standard analyzer.
But the standard analyzer will not change the token "wooden" to "wood"; that would only happen if you used stemming for this field.
This means that, if you don't use a different analyzer or stemming, querying for "wood" shouldn't match the "wooden" token.
To summarize: indexed data is modified/analyzed before indexing (based on the field mapping definition). The match query analyzes the search query, while the term query does not. So you have to choose the field mapping and the search query that best suit your use case.
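As a minimal illustration of the difference (the index and field names here are made up, not from the question), a match query for "wood" is analyzed into the token wood and, with the standard analyzer, will match Text 1 but not the token wooden from Text 2:

GET /articles/_search
{
  "query": {
    "match": {
      "body": "wood"
    }
  }
}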
For some use cases, like storing email addresses, phone numbers, or other fields that always hold the same literal value, consider using the keyword type, which is suitable for exact matches. However, ES recommends:
Avoid using keyword fields for full-text search. Use the text field type instead.
So, for a practical solution to your use case, it would help to share the field mapping you use and what you want to achieve.
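If you need both full-text search and exact matching, one common pattern is a text field with a keyword sub-field. This is only a sketch with made-up index and field names:

PUT /articles
{
  "mappings": {
    "properties": {
      "body": {
        "type": "text",
        "fields": {
          "exact": { "type": "keyword" }
        }
      }
    }
  }
}

A term query against body.exact then matches the whole original string verbatim, while a match query against body still works on the analyzed tokens.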

How can I use standard SQL on text fields in Elasticsearch without using the special Elasticsearch SQL operators?

I would like to create a SQL query on some text field (not keyword), for example the "name" field, and send that query to the Elasticsearch server.
My problem is that I need to use standard SQL (not the MATCH and QUERY operators, which are specific to Elasticsearch SQL) on text fields.
When I tried to use the JDBC driver, or the high-level Java client with the LIKE operator, I got the following error:
"No keyword/multi-field defined exact matches for [name]; define one or use MATCH/QUERY instead"
I also tried to use the translate API of Elasticsearch, but even there I couldn't use the LIKE operator on text fields, only on keyword fields.
Does anyone have a solution for me? I want to use the LIKE operator on text fields instead of the full-text operators that are unique to Elasticsearch SQL.
Please check this documentation. It clearly states that this is not possible:
One significant difference between LIKE/RLIKE and the full-text search predicates is that the former act on exact fields while the latter also work on analyzed fields. If the field used with LIKE/RLIKE doesn't have an exact not-normalized sub-field (of keyword type) Elasticsearch SQL will not be able to run the query. If the field is either exact or has an exact sub-field, it will use it as is, or it will automatically use the exact sub-field even if it wasn't explicitly specified in the statement.
If you still want to use a text field, then you need to enable a multi-field as mentioned here. Or you can try enabling fielddata on the text field, but I am not sure whether that will work with SQL or not.
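A rough sketch of the multi-field approach (the index name, field value, and query below are made up for illustration): give name a keyword sub-field, and, per the documentation quoted above, Elasticsearch SQL will then pick that exact sub-field automatically when it sees LIKE:

PUT /people
{
  "mappings": {
    "properties": {
      "name": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword" }
        }
      }
    }
  }
}

POST /_sql?format=txt
{
  "query": "SELECT name FROM people WHERE name LIKE 'John%'"
}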

How to get unanalyzed text in ES for regex or wildcard queries?

With a text field, doc['my_text_field'] shows the analyzed tokens. As a result, regex and wildcard queries cannot be formulated in a reasonable way...
However, the text is available in its original form when adding a scripted field via params._source.my_text_field.
How do I get the unanalyzed text, like params._source.my_text_field, for a normal Elasticsearch query?
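For context, the scripted-field access the question refers to looks roughly like this as a search request (a sketch; the index name is a placeholder and my_text_field is the field name used in the question):

GET /my-index/_search
{
  "script_fields": {
    "raw_text": {
      "script": {
        "lang": "painless",
        "source": "params._source.my_text_field"
      }
    }
  }
}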

How to do "where not exists" type filtering in Kibana/ELK?

I am using ELK to create dashboards from my log files. I have a log file with entries that contain an id value and a "success"/"failure" value, displaying whether an operation with a given id succeeded or failed. Each operation/id can fail an unlimited number of times and succeed at most once. In my Kibana dashboard I want to display the count of log entries with a "failure" value for each operation id, but I want to filter out cases where a "success" log entry for the id exists. i.e. I am only interested in operations that never succeeded. Any hints for tricks that would achieve this?
This is easy in the Kibana 5 search bar. Just add a filter:
!(_exists_:"your_variable")
You can toggle the filter or write the inverse query as
_exists_:"your_variable"
In Kibana 4 and Kibana 3 you can use this query, which is now deprecated:
_missing_:"your_variable"
NOTE: In Elasticsearch 7.x, Kibana now has a pull down to select KQL or Lucene style queries in the search bar. Be mindful that syntax such as _exists_:FIELD is a Lucene syntax and you need to set the pulldown accordingly.
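For completeness, the same "field does not exist" filter expressed directly in the Query DSL looks roughly like this (a sketch; the index name is a placeholder and your_variable stands for your field):

GET /my-logs/_search
{
  "query": {
    "bool": {
      "must_not": {
        "exists": { "field": "your_variable" }
      }
    }
  }
}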
In newer ELK versions (I think after Elasticsearch 6) you should use field:* to check whether the field exists and NOT field:* to check whether it's missing.
Elasticsearch reference:
https://www.elastic.co/guide/en/elasticsearch/reference/6.5/query-dsl-query-string-query.html#_wildcards
! (_exists_:NAME) was not working for me. I used the suggestion from:
https://discuss.elastic.co/t/kibana-5-0-0--missing--is-not-working-anymore/64336
NOT _exists_:NAME
UPDATE: The problem I faced is that the ES syntax forbids spaces after negation operators. Use one of:
NOT _exists_:FIELD
!_exists_:FIELD
-_exists_:FIELD
Check tutorial: https://www.timroes.de/2016/05/29/elasticsearch-kibana-queries-in-depth-tutorial/
In newer versions of Kibana the default language is now KQL (Kibana Query Language), not Lucene anymore, so most answers here are outdated. The query to check whether a field exists is the following:
your_variable:*
and to answer your question you can just negate that:
not your_variable:*
You can find more documentation here: https://www.elastic.co/guide/en/kibana/7.15/kuery-query.html
You can also toggle back to Lucene by clicking the button inside the search field, but in my opinion the new language is way easier to use.
One option would be to create your own query for this criterion in Kibana, then have the panel that does the counting use this query:
value:failure
More information here:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html#query-string-syntax

Elasticsearch autocomplete and searching against multiple term fields

I'm integrating Elasticsearch into an asset tracking application. When I set up the mapping initially, I envisioned the 'brand' field being a single-term field like 'Hitachi' or 'Ford'. Instead, I'm finding that the brand field in the actual data contains multiple terms like "MB 7 A/B", "B-7" or even "Brush Bull BB72X".
I have an autocomplete component set up now that I configured to do autocomplete against an edgeNGram field and perform the actual search against an nGram field. It's completely useless the way I set it up, because users expect the search results to be restricted to what the autocomplete matches.
Any suggestions on the best way to set up my mapping to support autocomplete and subsequent searches against a multi-term field like this? I'm considering a terms query against a keyword field, or possibly a match query with 'and' as the operator. I also have to deal with hyphens like "B-7".
You can use the phrase suggester; the suggesters guide is here:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/search-suggesters.html
The phrase suggester guide is here:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/search-suggesters-phrase.html
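A minimal phrase-suggest request might look like the sketch below; the assets index name is made up and brand is the field from the question, and how useful the suggestions are will depend on how that field is analyzed:

GET /assets/_search
{
  "suggest": {
    "brand_suggestion": {
      "text": "brush bull bb72",
      "phrase": {
        "field": "brand"
      }
    }
  }
}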
