logstash elasticsearch output configuration based on inputs

Is there any way I can use the Logstash configuration file to route output to different types/indexes accordingly? For example (pseudocode, since conditionals are not actually allowed inside a plugin block):
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "index_resources"
    if (%{some_field_id} == kb) {
      document_type => "document_type"
      document_id => "%{some_id}"
    }
    else {
      document_type => "other_document_type"
      document_id => "%{some_other_id}"
    }
  }
}

Yes, you can route your documents to multiple indexes within Logstash itself. The output could look something like this (note that in a conditional, a field is referenced as [some_field_id], not %{some_field_id}):
output {
  stdout { codec => rubydebug }
  if [some_field_id] == "kb" {   # <-- insert your condition here
    elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "index1"
      document_type => "document_type"
      document_id => "%{some_id}"
    }
  } else {
    elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "index2"
      document_type => "other_document_type"
      document_id => "%{some_other_id}"
    }
  }
}
This thread might help you as well.
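On more recent Logstash versions, the same routing can be collapsed into a single elasticsearch output by computing the target index in a filter and passing it through @metadata, which is never indexed into Elasticsearch. A minimal sketch, reusing the field and index names from above ([@metadata][target_index] is an arbitrary name chosen here):
filter {
  if [some_field_id] == "kb" {
    mutate { add_field => { "[@metadata][target_index]" => "index1" } }
  } else {
    mutate { add_field => { "[@metadata][target_index]" => "index2" } }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][target_index]}"
  }
}
This keeps a single connection pool to Elasticsearch instead of one per output block.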

Related

logstash creates strange index name

I use Logstash 7.9.3, and with this version I have problems creating the right index name, like logstash-2021.01.01. I need the first 9 days of the month padded with a 0.
With the config logstash-%{+yyyy.MM.dd} the result is => logstash-2021.01.01-000001
With the config logstash-%{+yyyy.MM.d} the result is => logstash-2021.01.1
input {
  redis {
    host => "someip_of_redis"
    data_type => "list"
    key => "logstash"
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["http://someip_of_elastic:9200"]
    index => "logstash-%{+yyyy.MM.dd}"
  }
}
Thank you in advance
The -000001 suffix is appended by index lifecycle management (ILM), which this version enables by default when the cluster supports it. To disable it, I added ilm_enabled => false to the config:
input {
  redis {
    host => "someip_of_redis"
    data_type => "list"
    key => "logstash"
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["http://someip_of_elastic:9200"]
    ilm_enabled => false
    index => "logstash-%{+yyyy.MM.dd}"
  }
}
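Alternatively, if you want to keep ILM rather than disable it, the elasticsearch output in this version exposes options to control the rollover alias and pattern. A sketch, assuming an ILM policy named logstash-policy already exists in Elasticsearch:
output {
  elasticsearch {
    hosts => ["http://someip_of_elastic:9200"]
    ilm_rollover_alias => "logstash"   # writes go through this alias
    ilm_pattern => "{now/d}-000001"    # daily rollover index naming
    ilm_policy => "logstash-policy"    # assumed pre-existing policy
  }
}
With ILM enabled, the index option is ignored in favor of the rollover alias, which is why the suffix appeared in the first place.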

How to logstash from elasticsearch with specific _type

I have a Logstash (6.8.1) configuration file like this:
input {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    index => "myindex"
    codec => "json"
    docinfo => true
  }
}
filter {
  mutate {
    remove_field => ["@timestamp", "@version"]
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9300" ]
    index => "%{[@metadata][_index]}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}
However, the ES on port 9200 is 5.x, and the ES on port 9300 is 6.x. Moreover, the ES on port 9200 has multiple "types". Since ES 6.x allows only one "type" per index, I need something like this:
input {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    user => ""
    password => ""
    index => "myindex"
    document_type => "mytype"   # <-- like this
    codec => "json"
    docinfo => true
  }
}
How can I achieve this?
Thanks a lot for your help.
Here is an example of a Jenkins Logstash configuration. Did you mean something like this:
input {
  file {
    path => "/var/log/jenkins/*"
    type => "jenkins-server"
    start_position => "beginning"
  }
}
filter {
  if [type] == "jenkins-server" {
    mutate {
      add_field => ["@message_type", "jenkins"]
      add_field => ["@message", "%{message}"]
    }
  }
}
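As an aside: the elasticsearch input plugin has no document_type option, but since the source cluster here is 5.x, you can restrict the input to a single type with a query. A minimal sketch, reusing the index and type names from the question:
input {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
    index => "myindex"
    query => '{ "query": { "term": { "_type": { "value": "mytype" } } } }'
    codec => "json"
    docinfo => true
  }
}
The term query on the _type metadata field pulls only documents of that type from the 5.x cluster.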
In the end, I used elasticsearch-dump to accomplish this:
docker run --net=host --rm -it docker.io/taskrabbit/elasticsearch-dump \
  --input=http://user:password@input-es:9200 \
  --input-index=my-user-index/my-user-type \
  --output=http://user:password@output-es:9200 \
  --output-index=my-user-index \
  --type=mapping
PS: to migrate the data as well, use the option --type=data

logstash 5.0.1: setup elasticsearch multiple indexes output for multiple kafka input topics

I have a Logstash input set up as
input {
  kafka {
    bootstrap_servers => "zookeper_address"
    topics => ["topic1","topic2"]
  }
}
I need to feed the topics into two different indexes in Elasticsearch. Can anyone help me with how the output should be set up for such a task? At this time I am only able to set up
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
    codec => "json"
    document_id => "%{id}"
  }
}
I need two indexes on the same Elasticsearch instance, say index1 and index2, which will be fed by messages coming in on topic1 and topic2.
First, you need to add decorate_events to your kafka input in order to know which topic each message came from:
input {
  kafka {
    bootstrap_servers => "zookeper_address"
    topics => ["topic1","topic2"]
    decorate_events => true
  }
}
Then you have two options, both involving conditional logic. The first is to introduce a filter that adds the correct index name depending on the topic name. For this you need to add:
filter {
  if [kafka][topic] == "topic1" {
    mutate {
      add_field => { "[@metadata][index]" => "index1" }
    }
  } else {
    mutate {
      add_field => { "[@metadata][index]" => "index2" }
    }
  }
  # remove the field containing the decorations, unless you want them to land in ES
  mutate {
    remove_field => ["kafka"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][index]}"
    codec => "json"
    document_id => "%{id}"
  }
}
The second option is to do the if/else directly in the output section, like this (but then the additional kafka field will land in ES):
output {
  if [kafka][topic] == "topic1" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "index1"
      codec => "json"
      document_id => "%{id}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "index2"
      codec => "json"
      document_id => "%{id}"
    }
  }
}
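For what it's worth, in more recent versions of the kafka input plugin (check this against the plugin version you actually run), the decorations are written under [@metadata][kafka] rather than a top-level kafka field, so they never land in ES and no remove_field cleanup is needed. A sketch under that assumption (the broker address is a placeholder):
input {
  kafka {
    bootstrap_servers => "kafka_broker:9092"   # hypothetical Kafka broker address
    topics => ["topic1","topic2"]
    decorate_events => true
  }
}
output {
  if [@metadata][kafka][topic] == "topic1" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "index1"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "index2"
    }
  }
}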

Elasticsearch - Duplicating Types?

I created an index in Elasticsearch, with a type t1 and documents doc1-docN. Is there a way, via an API call, to create a new type, t2, that contains the same documents as t1 (doc1 - docN)?
No magic API call for this; you need to reindex those documents. I suggest this blog post from one of the Elastic developers: http://david.pilato.fr/blog/2015/05/20/reindex-elasticsearch-with-logstash/
You'd need something along these lines:
input {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "test_index"
    size => 500
    scroll => "5m"
    docinfo => true
    query => '{"query":{"term":{"_type":{"value":"test_type_1"}}}}'
  }
}
filter {
  mutate {
    remove_field => [ "@timestamp", "@version" ]
  }
}
output {
  elasticsearch {
    host => "localhost"
    port => "9200"
    protocol => "http"
    index => "test_index"
    document_type => "test_type_2"
    document_id => "%{[@metadata][_id]}"
  }
}

How to create multiple indexes in logstash.conf file?

I used the following configuration to create an index in logstash.conf:
output {
  stdout { codec => rubydebug }
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "trial_indexer"
  }
}
To create another index, I generally replace the index name in the code above with a different one. Is there a way to create multiple indexes in the same file? I'm new to ELK.
You can use a pattern in your index name based on the value of one of your fields. Here we use the value of the type field in order to name the index:
output {
  stdout { codec => rubydebug }
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "%{type}_indexer"
  }
}
You can also use several elasticsearch outputs, either to the same ES host or to different ES hosts:
output {
  stdout { codec => rubydebug }
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "trial_indexer"
  }
  elasticsearch {
    host => "localhost"
    protocol => "http"
    index => "movie_indexer"
  }
}
Or maybe you want to route your documents to different indices based on some variable:
output {
  stdout { codec => rubydebug }
  if [type] == "trial" {
    elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "trial_indexer"
    }
  } else {
    elasticsearch {
      host => "localhost"
      protocol => "http"
      index => "movie_indexer"
    }
  }
}
UPDATE
The syntax has changed a little bit in Logstash 2 and 5:
output {
  stdout { codec => rubydebug }
  if [type] == "trial" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "trial_indexer"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "movie_indexer"
    }
  }
}
