Logstash data showing up as "message" field in elasticsearch

I am trying to send some raw data to elasticsearch through logstash. I am doing this through the udp plugin, but I don't think that is relevant for now.
Basically, I wish to send key/value pairs, and I wish for them to show up as:
{
  "key_1": "value_1",
  ...
}
instead of:
{
  "message": "{\"key_1\": \"value_1\"}"
}
Is there any way for logstash to somehow "decode" the message as JSON and insert the keys as top-level fields?
Thanks

I just needed to use a "json" codec on the input like so:
input {
  udp {
    port => 3425
    codec => "json"
  }
}
Thanks to Val for pointing this out.
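If changing the input codec is not an option, a json filter should achieve the same result; this is a minimal sketch, assuming the raw JSON string arrives in the default message field:
filter {
  # parse the JSON string held in "message" and promote its
  # keys to top-level fields on the event
  json {
    source => "message"
  }
}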

Related

Logstash - Can I choose not to forward a log based on data?

Just learning how to use Logstash - goddamn there's a lot to learn on this :D
In my setup, I have CEF data being sent to my logstash.
Some CEF events are just "statistic" information about the tool that is sending them.
I want logstash to NOT send on these events. Is that possible?
Here is some pseudocode of what I think it would look like:
input {
  udp {
    port => 9001
    codec => cef
  }
}
filter {
  # if 'stat_heading' contains "Statistic Information", do not forward to elasticsearch
}
output {
  elasticsearch {
    hosts => ["192.168.0.20:9200"]
  }
}
Could someone point me in the correct direction?
Edit
Okay - so I see the filter does support conditional if statements. I'm going to read into this more, and when I get a working solution I'll post it.
Edit
Got it working. Added the solution below.
I think you can try the drop plugin to skip some data if it gets to the filter:
https://www.elastic.co/guide/en/logstash/current/plugins-filters-drop.html
Okay, I've found my own answer to this.
You need to add conditional if statements, and if an event value matches some string, drop the event.
input {
  udp {
    port => 9001
    codec => cef
  }
}
filter {
  if "Some string here" in [myheader] {
    drop {}
  }
}
output {
  elasticsearch {
    hosts => ["192.168.0.20:9200"]
  }
}
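If you are unsure which field names the cef codec actually produces ([myheader] above stands in for whatever field your events really carry), a temporary stdout output prints each decoded event; this is just a debugging sketch, not part of the final pipeline:
output {
  # temporary: dump decoded events to the console so you can see
  # the exact field names to use in the conditional above
  stdout { codec => rubydebug }
}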

sending different outputs using the same logstash file

I need my logstash conf file to send a message to a kafka topic to indicate that the processed document has been sent to Elasticsearch. I have my logstash file ready to structure the data for Elasticsearch, but I need to post a 'yes' or 'no' message to a kafka topic through the same logstash file.
You can use multiple outputs, like:
output {
  # output to console
  stdout {
    codec => rubydebug
  }
  # output to elasticsearch
  elasticsearch {
    hosts => [ "192.168.1.245:9201" ]
  }
  # output to kafka
  kafka {
    codec => json
    topic_id => "mytopic"
  }
}
First you need to have the yes/no value in a field; let's call it value.
Then add a kafka output with the plain codec, using the format option to emit just that value:
output {
  # rest of your output configuration
  kafka {
    ...
    codec => plain { format => "%{[value]}" }
  }
}
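How the value field gets populated depends on your pipeline. As a hypothetical sketch (the [status] field and its "ok" value are assumptions, not something from the question), a conditional with a mutate filter could set it:
filter {
  # hypothetical: derive the yes/no flag from a field on the event
  if [status] == "ok" {
    mutate { add_field => { "value" => "yes" } }
  } else {
    mutate { add_field => { "value" => "no" } }
  }
}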

how can I create an elasticsearch index using the tcp input protocol?

I have configured Logstash 5.5 to receive JSON messages over the tcp protocol:
input {
  tcp {
    port => 9001
    codec => json
    type => "test-tcp-1"
  }
}
filter {
  json { source => "message" }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
  }
}
The message is received by logstash successfully, but elasticsearch does not create an index. Why?
If I use the same configuration with the stdin input plugin, it works fine.
Many thanks.
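One way to narrow this down is a temporary stdout output that confirms events actually leave the pipeline; this is only a debugging sketch, not a confirmed fix for the missing index. Note also that with codec => json on the input, the json filter on message is usually redundant, since the codec has already parsed the payload.
output {
  # temporary: print each event so you can verify that documents
  # are being emitted before they reach elasticsearch
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash-%{type}-%{+YYYY.MM.dd}"
  }
}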

data from rabbitmq not being read into kibana dashboard

I just altered my logstash-elasticsearch setup to include rabbitmq, since I wasn't able to get messages into logstash fast enough over a tcp connection. Now it is blazing fast as logstash reads from the queue, but I do not see the messages coming through into kibana. One error shows the timestamp field missing. I used the plugin/head to view the data and it is odd:
_index    _type  _id                     _score  @version  @timestamp
pt-index  logs   Bv4Kp7tbSuy8YyNi7NEEdg  1       1         2014-03-27T12:37:29.641Z
this is what my conf file looks like now, and below that is what it looked like before:
input {
  rabbitmq {
    queue => "logstash_queueII"
    host => "xxx.xxx.x.xxx"
    exchange => "logstash.dataII"
    vhost => "/myhost"
  }
}
output {
  elasticsearch {
    host => "xxx.xxx.xx.xxx"
    index => "pt-index"
    codec => "json_lines"
  }
}
this is what it was before rabbitmq:
input {
  tcp {
    codec => "json_lines"
    port => "1516"
  }
}
output {
  elasticsearch {
    embedded => "true"
  }
}
Now the only change I made was to create a specific index in elasticsearch and have the data indexed there, but it seems the format of the message has changed. It is still json messages with 2/3 fields, but I am not sure what logstash is reading or changing from rabbitmq. I can see data flowing into the histogram, but the fields are gone.
"2014-03-18T14:32:02" "2014-03-18T14:36:24" "166" "google"
These are the fields I would expect. Like I said, all of this worked before I made the change.
I have seen examples of similar configurations, but they do not use the output codec "json_lines" going into Elasticsearch. The output codec adjusts the formatting of the data as it leaves logstash, which I do not believe is necessary here. Try deleting the codec, and see what logstash is outputting by adding a file output as a log; be sure to capture only a short sample...
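A sketch of that suggestion (the file path is an assumption; point it anywhere writable):
output {
  elasticsearch {
    host => "xxx.xxx.xx.xxx"
    index => "pt-index"
  }
  # temporary debug output: inspect the raw events logstash emits
  file {
    path => "/tmp/logstash-debug.log"
  }
}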

elasticsearch message field value

I am sending json messages to logstash, getting them indexed by elasticsearch, and I have managed to set up the UI dashboard in Kibana. I would like to filter the data by the message fields and cannot figure out how or where to do this. An example of my message:
{"message":"{"pubDate":"2014-02-25T13:09:14",
"scrapeDate":"2014-02-5T13:09:26",
"Id":"78967",
"query":"samsung S5",
"lang":"en"}
Right now it counts all these messages coming in, but I need to be able to filter by the fields themselves, for example Id or lang or query.
Does this have to be done in the config file, or can it be done in the Kibana interface?
First, I assume your JSON message is:
{
  "pubDate": "2014-02-25T13:09:14",
  "scrapeDate": "2014-02-5T13:09:26",
  "Id": "78967",
  "query": "samsung S5",
  "lang": "en"
}
When you send your message to logstash, you need to set the codec to json, as shown in the configuration below:
input {
  stdin {
    codec => json
  }
}
output {
  elasticsearch {
    cluster => "abc"
  }
}
Logstash will parse your message into separate fields, like this output:
{
  "pubDate" => "2014-02-25T13:09:14",
  "scrapeDate" => "2014-02-5T13:09:26",
  "Id" => "78967",
  "query" => "samsung S5",
  "lang" => "en",
  "@version" => "1",
  "@timestamp" => "2014-02-26T01:36:15.336Z",
  "host" => "AAAAAAAAAA"
}
When you view this data in Kibana, you can use fieldname:value to query and filter what you need. For example, you can query all messages with lang:en.
