I have configured my Logstash file like this:
input {
  kafka {
    topics => [
      ...
    ]
    bootstrap_servers => "${KAFKA_URL}"
    codec => "json"
  }
}
filter {
  ...
}
output {
  elasticsearch {
    index => "logstash-%{organizationId}"
    hosts => ["${ELASTICSEARCH_URL}"]
    codec => "json"
  }
  stdout { codec => json }
}
The Elasticsearch output URL comes from an environment variable.
I want to improve this behavior and change the output server URL dynamically, based on some info that comes in the Kafka message.
Is it possible to do this?
Thanks in advance.
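One commonly suggested workaround, since the elasticsearch output's hosts option is fixed at pipeline startup and does not resolve %{field} references, is to route events to different clusters with conditionals. A minimal sketch, assuming a hypothetical [targetCluster] field and a placeholder URL:
output {
  if [targetCluster] == "eu" {
    elasticsearch {
      # placeholder URL for the alternative cluster
      hosts => ["https://es-eu.example.com:9200"]
      index => "logstash-%{organizationId}"
    }
  } else {
    elasticsearch {
      # default cluster from the environment
      hosts => ["${ELASTICSEARCH_URL}"]
      index => "logstash-%{organizationId}"
    }
  }
}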
Related
I need my Logstash conf file to send a message to a Kafka topic to indicate that the processed document has been sent to Elasticsearch. My Logstash file is ready to structure the data for Elasticsearch, but I also need to post a 'yes' or 'no' message to a Kafka topic from the same Logstash file.
You can use multiple outputs, like this:
output {
  # output to console
  stdout {
    codec => rubydebug
  }
  # output to elasticsearch
  elasticsearch {
    hosts => [ "192.168.1.245:9201" ]
  }
  # output to kafka
  kafka {
    codec => json
    topic_id => "mytopic"
  }
}
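If the Kafka message should only be produced for certain events, the outputs can also be wrapped in conditionals. A sketch, assuming a hypothetical [sent] field:
output {
  # only post to Kafka when the event was marked as sent
  if [sent] == "yes" {
    kafka {
      codec => json
      topic_id => "mytopic"
    }
  }
}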
First you need to have the yes/no value in a field, let's call it value.
Then add a kafka output with the plain codec, using its format option to emit the yes/no value:
output {
  # rest of your output configuration
  kafka {
    ...
    codec => plain { format => "%{[value]}" }
  }
}
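For completeness, the value field could be populated in the filter stage, for example with mutate inside a conditional. A sketch, assuming a hypothetical [status] field as the success indicator:
filter {
  # hypothetical success condition
  if [status] == "ok" {
    mutate { add_field => { "value" => "yes" } }
  } else {
    mutate { add_field => { "value" => "no" } }
  }
}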
I have set up a local ELK stack. Everything works fine, but before trying to write my own grok pattern I wonder whether there is already one for Winston-style logs.
That works great for Apache-style logs.
I would need something that works for Winston-style logs. I think the json filter would do the trick, but I am not sure.
This is my Winston JSON:
{"level":"warn","message":"my message","timestamp":"2017-03-31T11:00:27.347Z"}
This is my Logstash configuration file example:
input {
  beats {
    port => "5043"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}
For some reason it is not getting parsed, and there is no error.
Try it like this instead:
input {
  beats {
    port => "5043"
    codec => json
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}
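If it still does not parse, adding a stdout output with the rubydebug codec is an easy way to inspect the event and confirm whether the message field actually contains the raw JSON string. A debugging sketch:
output {
  # print the full event structure to the console
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}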
I'm sending data in a URL to Logstash 5.2, and I would like to parse it in Logstash so that every URL parameter becomes a field and I can visualize it properly in Kibana.
http://127.0.0.1:31311/?id=ID-XXXXXXXX&uid=1-37zbcuvs-izotbvbe&ev=pageload&ed=&v=1&dl=http://127.0.0.1/openpixel/&rl=&ts=1488314512294&de=windows-1252&sr=1600x900&vp=1600x303&cd=24&dt=&bn=Chrome%2056&md=false&ua=Mozilla/5.0%20(Macintosh;%20Intel%20Mac%20OS%20X%2010_11_3)%20AppleWebKit/537.36%20(KHTML,%20like%20Gecko)%20Chrome/56.0.2924.87%20Safari/537.36&utm_source=&utm_medium=&utm_term=&utm_content=&utm_campaign=
This is my logstash conf file:
input {
  http {
    host => "127.0.0.1"
    port => 31311
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
You could use the grok filter to match the params in your URL like this:
filter {
  grok {
    match => [ "message", "%{URIPARAM:url}" ]
  }
}
Then you might have to use the kv filter to split the data into separate fields:
kv {
  source => "url"
  field_split => "&"
}
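Putting it together, one possible complete filter block; the mutate/gsub step is an assumption to strip the leading "?" that URIPARAM captures, so the first key comes out as id rather than ?id:
filter {
  grok {
    match => [ "message", "%{URIPARAM:url}" ]
  }
  # strip the leading "?" captured by URIPARAM so kv produces clean keys
  mutate {
    gsub => [ "url", "\?", "" ]
  }
  kv {
    source => "url"
    field_split => "&"
  }
}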
This SO post might come in handy. Hope this helps!
I am trying to get data from Kafka and push it to Elasticsearch.
Here is the logstash configuration I am using:
input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "beats"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "elasticse"
  }
}
Can anyone help with the Logstash configuration? When I run this I get an invalid configuration error:
D:\logstash-5.0.0\bin>logstash -f log-uf.conf
Sending Logstash logs to D:\logstash-5.0.0\logs\logstash-plain.txt which is now configured via log4j2.properties.
[2016-11-11T16:31:32,429][ERROR][logstash.inputs.kafka ] Unknown setting 'zk_connect' for kafka
[2016-11-11T16:31:32,438][ERROR][logstash.inputs.kafka ] Unknown setting 'topic_id' for kafka
[2016-11-11T16:31:32,452][ERROR][logstash.agent ] fetched an invalid config {:config=>"input {\n kafka {\n zk_connect => \"localhost:2181\"\n topic_id => \"beats\"\n consumer_threads => 16\n }\n}\noutput {\nelasticsearch {\nhosts => [\"localhost:9200\"]\nindex => \"elasticse\"\n}\n}\n", :reason=>"Something is wrong with your configuration."}
You're running Logstash 5 with a config for Logstash 2.4.
In 5.0, zk_connect (the Zookeeper host) was replaced by bootstrap_servers (the Kafka broker), and topic_id by topics. Note that bootstrap_servers points at a Kafka broker (port 9092 by default) rather than at Zookeeper on 2181.
Try this config instead:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["beats"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "elasticse"
  }
}
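As a side note, Logstash 5 can validate a config file without starting the pipeline, which makes errors like this quicker to catch:
D:\logstash-5.0.0\bin>logstash -f log-uf.conf --config.test_and_exit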
I am getting started with Logstash and Elasticsearch, and I would like to index .pdf or .doc files in Elasticsearch via Logstash.
I configured Logstash with the multiline codec to get each file into a single message in Elasticsearch. Below is my configuration file:
input {
  file {
    path => "D:/BaseCV/*"
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => ""
      what => "previous"
    }
  }
}
output {
  stdout {
    codec => "rubydebug"
  }
  elasticsearch {
    hosts => "localhost"
    index => "cvindex"
    document_type => "file"
  }
}
When Logstash starts, the first file I add arrives in Elasticsearch as a single message, but the following files are spread over several messages. I would like a one-to-one correspondence: 1 file = 1 message.
Is this possible? What should I change in my setup to solve the problem?
Thank you for your feedback.
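For reference, one common way to force "1 file = 1 message" with the multiline codec is a pattern that never matches combined with negate, plus auto_flush_interval so the last buffered file is eventually emitted. A sketch under those assumptions, untested against this setup; note that binary formats such as .pdf or .doc will still arrive as raw bytes rather than extracted text:
input {
  file {
    path => "D:/BaseCV/*"
    codec => multiline {
      # hypothetical sentinel that no real line ever matches
      pattern => "^NEVER_MATCHES_ANYTHING$"
      # negate => true: every line fails the pattern, so each one...
      negate => true
      # ...is appended to the previous lines of the same file
      what => "previous"
      # flush the buffered event after 5 seconds of inactivity
      auto_flush_interval => 5
    }
  }
}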