Search for a value in all fields while having an existing filter in Cloud Logging - google-cloud-logging

I have the following filter in Cloud Logging that shows me all logs from a particular instance:
(resource.type="gce_instance" AND resource.labels.instance_id="***") OR (resource.type="global" AND jsonPayload.instance.id="***")
Within this set, I want to search for a value across all fields. According to the documentation https://cloud.google.com/logging/docs/view/advanced-queries#searching-examples, I can type a bare word like unicorn into the query field and it will be matched against all fields. That works, but it searches all of my logs. I want to search only within the filtered set, not across all logs in Cloud Logging. To get all rows containing the word failed, I tried this:
((resource.type="gce_instance" AND resource.labels.instance_id="***") OR (resource.type="global" AND jsonPayload.instance.id="***")) and failed
But it doesn't work. How can I search across all fields while keeping an existing filter?

Try running the query with the last part formatted this way:
((resource.type="gce_instance" AND resource.labels.instance_id="***") OR
(resource.type="global" AND jsonPayload.instance.id="***")) AND "failed"
Cheers,
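If you are querying from code rather than the console, the same combined filter works with the google-cloud-logging Python client. A minimal sketch, assuming a configured project and with *** standing in for the real instance ID as in the question:

from google.cloud import logging

client = logging.Client()

# The same filter as above: restrict to the instance, then match the bare
# term "failed" against all fields of the remaining entries.
combined_filter = (
    '((resource.type="gce_instance" AND resource.labels.instance_id="***") OR '
    '(resource.type="global" AND jsonPayload.instance.id="***")) AND "failed"'
)

for entry in client.list_entries(filter_=combined_filter):
    print(entry.timestamp, entry.payload)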

Related

Cannot use "OR" with "NOT _exists_" in Kibana 6.8.0 search bar

I am trying to create one query in the Kibana search bar to retrieve some specific documents.
The goal is to get the documents that either have the field "myDate" before 2019-10-08 or "myDate" does not exist.
I have documents that meet one or the other condition.
I started by creating this query:
myDate:<=2019-10-08 OR NOT _exists_:myDate
But no documents were returned.
Since it did not work, I tried some other forms I found online:
myDate:<=2019-10-08 OR NOT (_exists_:myDate)
myDate:<=2019-10-08 OR !(_exists_:myDate)
myDate:<=2019-10-08 OR NOT (myDate:*)
But still, no results.
When I use either part of the "OR" condition on its own, it works perfectly: I get either the documents that have myDate<=2019-10-08 or the ones that have no "myDate" field.
But when I try with both conditions, I get no documents.
I have to find these documents using only the search bar, not an Elasticsearch REST query or Kibana filters.
Thank you for your help :)
The query below works. Use the Inspect button in Kibana to see which query is actually being fired, and make sure you are using the correct index pattern as well.
(myDate:<=2019-12-31) OR (NOT _exists_:myDate)
Take a look at the Query DSL documentation on Boolean operators for a better understanding of the different use cases.
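For comparison, a minimal sketch of the equivalent bool query in the Query DSL, sent with the elasticsearch Python client (the index name my-index is an assumption):

from elasticsearch import Elasticsearch

es = Elasticsearch()

# Equivalent of: (myDate:<=2019-10-08) OR (NOT _exists_:myDate)
query = {
    "bool": {
        "should": [
            {"range": {"myDate": {"lte": "2019-10-08"}}},
            {"bool": {"must_not": {"exists": {"field": "myDate"}}}},
        ],
        "minimum_should_match": 1,
    }
}

# Recent clients accept query= directly; older ones take body={"query": query}.
result = es.search(index="my-index", query=query)
print(result["hits"]["total"])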

Can I write a query that will bring me logs with unique identifier (Graylog)?

I just started working with graylog and I have some issues.
Can I write a query that will bring me logs with unique identifier?
For example, I have logs with op_id and loan_amt fields, and I want to get the sum of loan_amt from all logs. Here comes the problem: some logs may share the same op_id, so my sum will not be correct because it will add the loan_amt from logs with the same op_id multiple times.
Can you help me, please?
If I understand correctly, you will need to narrow down your search criteria further to filter out duplicate log entries.
You can use the Graylog search query language to do this.
Try to find fields in which duplicate logs differ from each other, and then create a filter to exclude the duplicates from your results.
For example, something like this:
source:hostname.that.logs.loans_amt AND LoggerName:your.logger.that.logs.loan_amt
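If the search alone can't remove the duplicates, the deduplication can also be done after exporting the results. A minimal sketch in Python, assuming each exported entry is a dict with op_id and loan_amt keys:

def sum_unique_loans(entries):
    """Sum loan_amt once per op_id, ignoring duplicate log entries."""
    seen = set()
    total = 0.0
    for entry in entries:
        if entry["op_id"] not in seen:
            seen.add(entry["op_id"])
            total += entry["loan_amt"]
    return total

# Usage:
logs = [
    {"op_id": "a1", "loan_amt": 100.0},
    {"op_id": "a1", "loan_amt": 100.0},  # duplicate op_id, counted once
    {"op_id": "b2", "loan_amt": 250.0},
]
print(sum_unique_loans(logs))  # 350.0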

Is there a way to define a dynamic query in Kibana dashboard?

A somewhat similar question has been asked here but there's no answer for that yet. That question relates to an older version of Kibana so I hope you can help me.
I'm trying to setup some predefined queries in the Kibana dashboard. I'm using Kibana 5.1. The purpose of those queries is filtering some logs based on multiple different parameters.
Let's see a query I'd like to execute:
{
  "index": "${index_name}",
  "query": {
    "query_string": {
      "query": "message:(+\"${LOG_LEVEL}\")",
      "analyze_wildcard": true
    }
  }
}
I know I can query directly in the dashboard with something like "message:(+"ERROR")" and manually change ERROR to WARN, for example, but I don't want that: imagine that this query might be more complex and contain multiple fields.
Note that the data stored in the message is not structured - think of the message as a whole log line. This means I don't have fields like LOG_LEVEL which I could filter directly.
Is there any way I can set the index_name and LOG_LEVEL dynamically from the Kibana Discover dashboard?
You should go to Discover, open one document, and click the filter button on any of the fields. After that, a filter will appear under the search bar, and you can edit it and put any custom query in it. If you want to add more filters with more custom queries, you can repeat the same action with a different document or field, or you can go to Settings (or Management), Saved Objects, open the Search you saved, go to its JSON representation, and copy and paste the elements inside the filter array as many times as you want.
And remember that in order to apply only one of the filters, you should probably disable the other enabled ones (otherwise the dashboard will filter by all of the enabled filters at once).
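Kibana itself has no variable substitution in the search bar, so another workaround is to build the query outside Kibana and send it directly. A minimal sketch of that templating in Python, with index_name and log_level as the dynamic parameters (the function name and example index are hypothetical):

def build_log_query(index_name, log_level):
    """Fill the dynamic parts of the saved query before sending it."""
    return {
        "index": index_name,
        "query": {
            "query_string": {
                "query": f'message:(+"{log_level}")',
                "analyze_wildcard": True,
            }
        },
    }

# Usage (hypothetical index name):
print(build_log_query("logstash-2017.01", "ERROR"))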

How to do "where not exists" type filtering in Kibana/ELK?

I am using ELK to create dashboards from my log files. I have a log file with entries that contain an id value and a "success"/"failure" value, displaying whether an operation with a given id succeeded or failed. Each operation/id can fail an unlimited number of times and succeed at most once. In my Kibana dashboard I want to display the count of log entries with a "failure" value for each operation id, but I want to filter out cases where a "success" log entry for the id exists. i.e. I am only interested in operations that never succeeded. Any hints for tricks that would achieve this?
This is easy in the Kibana 5 search bar. Just add a filter:
!(_exists_:"your_variable")
You can toggle the filter, or write the inverse query as
_exists_:"your_variable"
In Kibana 4 and Kibana 3 you can use this query, which is now deprecated:
_missing_:"your_variable"
NOTE: In Elasticsearch 7.x, Kibana now has a pull down to select KQL or Lucene style queries in the search bar. Be mindful that syntax such as _exists_:FIELD is a Lucene syntax and you need to set the pulldown accordingly.
In newer ELK versions (I think after Elasticsearch 6) you should use field:* to check whether the field exists, and not field:* to check whether it is missing.
elastic search reference:
https://www.elastic.co/guide/en/elasticsearch/reference/6.5/query-dsl-query-string-query.html#_wildcards
! (_exists_:NAME) is not working for me. I used the suggestion from:
https://discuss.elastic.co/t/kibana-5-0-0--missing--is-not-working-anymore/64336
NOT _exists_:NAME
UPDATE: The problem I faced is that ES syntax forbids spaces after negation operators. Use one of:
NOT _exists_:FIELD
!_exists_:FIELD
-_exists_:FIELD
Check this tutorial: https://www.timroes.de/2016/05/29/elasticsearch-kibana-queries-in-depth-tutorial/
In newer versions of Kibana the default language is now KQL (Kibana Query Language), not Lucene, so most answers here are outdated. The query to check whether a field exists is the following:
your_variable:*
and to answer your question you can just negate that:
not your_variable:*
You can find more documentation here: https://www.elastic.co/guide/en/kibana/7.15/kuery-query.html
You can also toggle back to Lucene by clicking the button inside the search field, but in my opinion the new language is way easier to use.
One option would be to create your own query for this criterion in Kibana, then have the panel that does the counting use that query:
value:failure
More information here:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/query-dsl-query-string-query.html#query-string-syntax
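Note that none of the _exists_ variants above express the original requirement by themselves, since "never succeeded" depends on other log entries for the same id. If the search bar alone isn't enough, here is a minimal client-side sketch of the logic in Python, assuming each entry is a dict with id and result keys (both names are assumptions):

from collections import Counter

def failures_never_succeeded(entries):
    """Count failure entries per id, keeping only ids with no success entry."""
    succeeded = {e["id"] for e in entries if e["result"] == "success"}
    return Counter(
        e["id"]
        for e in entries
        if e["result"] == "failure" and e["id"] not in succeeded
    )

# Usage:
logs = [
    {"id": 1, "result": "failure"},
    {"id": 1, "result": "success"},  # id 1 eventually succeeded -> excluded
    {"id": 2, "result": "failure"},
    {"id": 2, "result": "failure"},
]
print(failures_never_succeeded(logs))  # Counter({2: 2})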

Elasticsearch not searching some fields

I have just updated a website; the update adds new fields to Elasticsearch.
In my dev environment it all works fine, but on the live site the new fields are not being found.
E.g. I have added a new field with the value 1.
However, when adding a filtered query of
{"field":1}
It does not find any matching results.
When I look in the documents, I can see docs with the field set to 1
Could the reason for this be that the new field was added after the mapping was set? I am not all that familiar with Elasticsearch, so I am not really sure where to start looking to fix it.
Any help would be appreciated.
Update:
Querying from the URL shows nothing either:
_search/?pretty=true&size=50&q=field1:*
however there is another field that was added at the same time which I can search on.
I can see field1 in the result set but it just won't allow me to search on it.
The only difference I see in the mapping is that the field that is working is set to type:long, whereas the one that is not working is set to type:string.
Is it a length issue with the ngram? What are your "min_gram" settings?
When you check on your index settings like this:
GET <host>/<index_name>/_settings
Does it work when you filter for a two-digit value?
Are all the field values one digit?
It's OK to add a field after the mapping was set; Elasticsearch will guess the mapping for you. (In fact, it's one of its selling features: no need to define the mapping, just throw the data at it.)
There are a few things that can go wrong:
Verify that the data is actually in the index. To do that, just navigate to the _search URL with no parameters; you should see the field if it is indexed.
Look at your mapping. Could it be that the field is explicitly set not to be indexed?
Another possibility is that your query is wrong (but that is unlikely, since you say it works in the development environment).
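A quick way to narrow this down is to pull the mapping and settings for the index in both environments and diff them. A minimal sketch with the elasticsearch Python client (the index name my-index is an assumption); note that changing an existing field's type requires reindexing:

from elasticsearch import Elasticsearch

es = Elasticsearch()

# Compare these outputs between dev and live: look for the field's "type",
# for any "index": false setting, and for the ngram analyzer's min_gram.
print(es.indices.get_mapping(index="my-index"))
print(es.indices.get_settings(index="my-index"))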
