Logstash filter: how to get designated fields from log data

The log looks like this:
{
  "playerId": 2,
  "args": {
    "uid": 2024657127,
    "__route__": "userCenter.playerHandler.getOnLineUids"
  },
  "time": "03122053",
  "timeUsed": 8,
  "resp": {
    "code": 200,
    "uidState": {
      "imId": 2024657127,
      "uid": 0,
      "state": 0
    }
  }
}
I just need the "__route__" and "timeUsed" fields. My filter is:
filter {
  if "__route__" in [message] {
    json {
      source => "message"
      remove_field => ["args.uid", "playerId", "time", "resp"]
    }
  }
}
The result in Kibana looks like this:
[image of the result]
We can see that the field "args.uid" is still there. How can I delete a field like that? Or is there a better way to get just "__route__" and "timeUsed"?

Just replace args.uid with [args][uid]; it should work after that, because in Logstash every subfield is accessed using the [parent][child] notation.
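A minimal sketch of the corrected filter from the question, with only the remove_field entry changed:

filter {
  if "__route__" in [message] {
    json {
      source => "message"
      # field reference syntax, so the nested args.uid is actually removed
      remove_field => ["[args][uid]", "playerId", "time", "resp"]
    }
  }
}

With the bracket syntax, Logstash resolves the nested field instead of looking for a literal top-level field named "args.uid".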

Related

What is the proper way to use GraphQL query variables to search for values matching two different searches?

I have a "users" table that is connected to "interestTags" table. I would like to be able to search users interestTags and return all users that match one or more tags, In this example I would like to be able to return all users that has interestTags of either "dog" or "apple.
The code below is only showing matches for "apple" and leaving out the "dog" interestTag users. I would like to get both "dog" users and "apple" users returned instead of one or the other. How would I go about doing this? Here is my code:
  users(offset: $offset, limit: 30, order_by: {lastRequest: asc}, where: {dob: {_gte: $fromDate, _lte: $toDate}, interestTagsFromSenderId: {_or: [{tag: $tagList}]}}) {
    id
    displayName
    profilePhotoUrl
    dob
    bio
    location
    interestTags: interestTagsFromSenderId {
      tag
    }
    created_at
  }
}
GraphQL query variables:
{
  "offset": 0,
  "fromDate": "1999-07-01",
  "toDate": "2024-01-01",
  "tagList": {
    "_eq": "dog", "_eq": "apple"
  }
}
This is what GraphQL returns:
{
  "data": {
    "users": [
      {
        "id": 31,
        "displayName": "n00b account",
        "profilePhotoUrl": "default.jpg",
        "dob": "2021-07-15",
        "bio": null,
        "location": null,
        "interestTags": [
          {
            "tag": "apple"
          }
        ],
        "created_at": "2021-07-15T06:57:23.068243+00:00"
      }
    ]
  }
}
To fix the issue:
I added $tagList: [interestTags_bool_exp!] to the query function.
I changed the query to interestTagsFromSenderId: {_or: $tagList}.
And I changed the query variables to { "tagList": [{"tag": {"_eq": "dog"}}, {"tag": {"_eq": "apple"}}] }.
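Assembled, the fixed query and variables look roughly like this (the operation name and the scalar types of $offset, $fromDate, and $toDate are assumptions, since the query header isn't shown above):

query SearchUsers($offset: Int!, $fromDate: date, $toDate: date, $tagList: [interestTags_bool_exp!]) {
  users(offset: $offset, limit: 30, order_by: {lastRequest: asc}, where: {dob: {_gte: $fromDate, _lte: $toDate}, interestTagsFromSenderId: {_or: $tagList}}) {
    id
    displayName
    interestTags: interestTagsFromSenderId {
      tag
    }
  }
}

with:

{
  "offset": 0,
  "fromDate": "1999-07-01",
  "toDate": "2024-01-01",
  "tagList": [{ "tag": { "_eq": "dog" } }, { "tag": { "_eq": "apple" } }]
}

Because _or now receives a list of boolean expressions, users matching either tag are returned. The original variables put two "_eq" keys in one JSON object, so the second silently overwrote the first.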

How should I extract the largest value or latest-timestamp data in a GraphQL query?

When I execute the following GraphQL query, which has only one function, I get the output shown below.
I want output that contains only the largest ID, or the latest timestamp.
It would be possible by making a change in the API, but my constraint is not to change the API and to enhance the query only. Please help me with how I can achieve my goal / desired output.
Input
query getAllCriticalevent {
  getAllCriticalevent(patientId: 95) {
    id
    startTime
  }
}
Output
{
  "data": {
    "getAllCriticalevent": [
      { "id": "107",  "startTime": "2019-06-14 12:47:57.0" },
      { "id": "1464", "startTime": "2019-10-10 16:08:35.0" },
      { "id": "1465", "startTime": "2019-10-10 16:09:09.0" },
      { "id": "1466", "startTime": "2019-10-10 16:09:44.0" },
      { "id": "1469", "startTime": "2019-10-10 16:11:28.0" },
      { "id": "1470", "startTime": "2019-10-10 16:12:03.0" },
      { "id": "1484", "startTime": "2019-10-10 16:20:09.0" }
    ]
  }
}
My expected output is this
{
  "startTime": "2019-10-10 16:20:09.0"
}
or
{
  "id": "1484",
  "startTime": "2019-10-10 16:20:09.0"
}
One way to do this is to add a column to the Type definition, then return it from your resolver.
In Laravel (not Java), the definition:
'max' => [
    'type' => Type::int(),
    'description' => 'The highest score achieved'
],
and a separate query in the ORM resolver (getMaxAttribute() is referenced as simply .max()):
public function getMaxAttribute() {
    return DB::table('players')->max('score');
}
will return the max for a desired column. You request the column by name in GraphQL, just like normal (eg. "{ ... max }").
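For illustration, the client then asks for it like any other field (the surrounding type and field names here are hypothetical, not from the question's schema):

{
  player(id: 1) {
    name
    max
  }
}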

Separate multiple events in logstash input into separate documents in elasticsearch index

Input in Logstash:
{
  "Teacher": {
    "Name": "Mary",
    "age": 20
  },
  "Student": [
    {
      "Name": "Tim",
      "age": 12
    },
    {
      "Name": "Eric",
      "age": 13
    }
  ]
}
I need to filter this input using Logstash to send three separate documents to Elasticsearch.
doc1:
{
  "Name": "Mary",
  "age": 20
}
doc2:
{
  "Name": "Tim",
  "age": 12
}
doc3:
{
  "Name": "Eric",
  "age": 13
}
I tried the split, mutate, and ruby filters but did not get the desired result. Could someone help me separate these into separate documents in the Elasticsearch index?
Since you want a separate event for 'Mary', use the clone filter to create two events. Delete the 'Student' array from one copy to be left with just 'Mary'.
In the second copy, the split filter will give you separate events for 'Tim' and 'Eric'.
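A minimal sketch of that approach, assuming the input above arrives as a single event (the clone type name "students" is arbitrary; the clone filter sets the type field of each copy to the name listed in clones):

filter {
  clone {
    clones => ["students"]
  }

  if [type] == "students" {
    # cloned copy: drop the teacher, then emit one event per element of Student
    mutate { remove_field => ["Teacher"] }
    split { field => "Student" }
    # lift each student's fields to the top level to match the desired docs
    mutate {
      rename => { "[Student][Name]" => "Name" "[Student][age]" => "age" }
      remove_field => ["Student"]
    }
  } else {
    # original event: drop the Student array and flatten the teacher
    mutate { remove_field => ["Student"] }
    mutate {
      rename => { "[Teacher][Name]" => "Name" "[Teacher][age]" => "age" }
      remove_field => ["Teacher"]
    }
  }
}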

How to store my JSON log file in Logstash with the json filter

This is my JSON log file. I'm trying to store the file in Elasticsearch through Logstash.
{ "id": "135569", "title" : "Star Trek Beyond", "year":2016 , "genre":
["Action", "Adventure", "Sci-Fi"] }
After storing the data in Elasticsearch, my result is as follows:
{
  "_index": "filebeat-6.2.4-2018.11.09",
  "_type": "doc",
  "_id": "n-J39mYB6zb53NvEugMO",
  "_score": 1,
  "_source": {
    "@timestamp": "2018-11-09T03:15:32.262Z",
    "source": "/Users/jinwoopark/Jin/json_files/testJson.log",
    "offset": 106,
    "message": """{ "id": "135569", "title" : "Star Trek Beyond", "year":2016 , "genre":["Action", "Adventure", "Sci-Fi"] }""",
    "id": "%{id}",
    "@version": "1",
    "host": "Jinui-MacBook-Pro.local",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "prospector": {
      "type": "log"
    },
    "title": "%{title}",
    "beat": {
      "name": "Jinui-MacBook-Pro.local",
      "hostname": "Jinui-MacBook-Pro.local",
      "version": "6.2.4"
    }
  }
}
What I'm trying to do is this:
I want to store only the "genre" value in the message field, and store the other values (e.g. id, title) in extra fields (the created id and title fields). But the extra fields were stored with unresolved placeholders (%{id}, %{title}). It seems like I need to modify my Logstash json filter, and this is where I need your help.
My current Logstash configuration is as follows:
input {
  beats {
    port => 5044
  }
}
filter {
  json {
    source => "genre"   # want to store only genre (from the json log) in the message field
  }
  mutate {
    add_field => {
      "id" => "%{id}"         # want to create an extra field for the id value from the log file
      "title" => "%{title}"   # want to create an extra field for the title value from the log file
    }
  }
  date {
    match => [ "timestamp", "dd/MM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}
When you tell the json filter that the source is genre, it ignores the rest of the document, which explains why you don't get an id or title.
It seems like you should parse the entire JSON document instead, and use the mutate filter's replace option to move the contents of genre into message, as sketched below.
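A minimal sketch of that suggestion (note that genre is an array, so %{genre} will be rendered as a string when written into message):

filter {
  json {
    # parse the whole log line: id, title, year, and genre become real fields
    source => "message"
  }
  mutate {
    # overwrite message with the parsed genre value
    replace => { "message" => "%{genre}" }
  }
}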

logstash - geoip in Kibana cannot show any information using the IP addresses

I want to display the number of users accessing my app in a World Map using ElasticSearch, Kibana and Logstash.
Here is my log (JSON format):
{
  "device": "",
  "public_ip": "70.90.17.210",
  "mac": "00:01:02:03:04:05",
  "ip": "192.16.1.10",
  "event": {
    "timestamp": "2014-08-15T00:00:00.000Z",
    "source": "system",
    "name": "status"
  },
  "status": {
    "channel": "channelname",
    "section": "pictures",
    "downlink": 1362930,
    "network": "Wi-Fi"
  }
}
And this is my config file:
input {
  file {
    path => ["/mnt/logs/stb.events"]
    codec => "json"
    type => "event"
  }
}
filter {
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "ISO8601" ]
  }
}
filter {
  mutate {
    convert => [ "downlink", "integer" ]
  }
}
filter {
  geoip {
    add_tag => [ "geoip" ]
    database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
    source => "public_ip"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}
output {
  elasticsearch {
    host => localhost
  }
}
In the end, in Kibana, I see only an empty geoip tag.
Can someone help me and point out where my mistake is?
Since Logstash 1.3.0 you can use the geoip.location field that is created automatically instead of creating the coordinates field and converting it to float manually.
One curly bracket seems to be missing from your log; I guess this is the correct format:
{
  "device": {
    "public_ip": "70.90.17.210",
    "mac": "00:01:02:03:04:05",
    "ip": "192.16.1.10"
  },
  "event": {
    "timestamp": "2014-08-15T00:00:00.000Z",
    "source": "system",
    "name": "status"
  },
  "status": {
    "channel": "channelname",
    "section": "pictures",
    "downlink": 1362930,
    "network": "Wi-Fi"
  }
}
In this case I would suggest you try the following configuration for the filter (without mutate):
filter {
  geoip {
    source => "[device][public_ip]"
  }
}
Then you should be able to use "geoip.location" in your map. It took quite some research and debugging to find out that, in order to be resolved correctly, nested fields must be surrounded by [ ] when used as the source.