Can Logstash send data simultaneously to multiple locations along with Elasticsearch - elasticsearch

Normally in ELK, Logstash parses data and sends it to Elasticsearch.
I want to know whether it is possible for Logstash to send the same data to different locations in real time.
If it is possible, please let me know how to do it.

Create several outputs that match on type and send to the different hosts.
output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "logstash-%{+YYYY.MM.dd}"
      codec => "plain"
      workers => 1
      manage_template => true
      template_name => "logstash"
      template_overwrite => false
      flush_size => 100
      idle_flush_time => 1
    }
  }
}
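More generally, every block listed in the output section receives each event, so sending the same data to several destinations at once is just a matter of listing them. A minimal sketch (the second host name and the file path are placeholders, not from the question):

```
output {
  # every event is sent to all three destinations
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
  elasticsearch {
    hosts => ["other-host:9200"]    # e.g. a second cluster
    index => "logstash-%{+YYYY.MM.dd}"
  }
  file {
    path => "/var/log/logstash/copy-%{+YYYY.MM.dd}.log"
  }
}
```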


How to split a large json file input into different elastic search index?

The input to logstash is
input {
  file {
    path => "/tmp/very-large.json"
    type => "json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
and a sample of the JSON file:
{"type":"type1", "msg":"..."}
{"type":"type2", "msg":"..."}
{"type":"type1", "msg":"..."}
{"type":"type3", "msg":"..."}
Is it possible to feed them into different Elasticsearch indices, so I can process them more easily in the future?
I know that if I assign them a tag, I can do something like
if "type1" in [tags] {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "index"
    index => "logstash-type1%{+YYYY.MM.dd}"
    flush_size => 50
  }
}
How can I do something similar by looking at a specific JSON field value, e.g. type in my example above?
Even simpler: just use the type field to build the index name, like this:
elasticsearch {
  hosts => ["localhost:9200"]
  action => "index"
  index => "logstash-%{type}%{+YYYY.MM.dd}"
  flush_size => 50
}
You can compare on any field. You'll first have to parse your JSON with the json filter or codec; then you'll have a type field to work with, like this:
if [type] == "type1" {
  elasticsearch {
    ...
    index => "logstash-type1%{+YYYY.MM.dd}"
  }
} else if [type] == "type2" {
  elasticsearch {
    ...
    index => "logstash-type2%{+YYYY.MM.dd}"
  }
} ...
Or, as in Val's answer:
elasticsearch {
  hosts => ["localhost:9200"]
  action => "index"
  index => "logstash-%{type}%{+YYYY.MM.dd}"
  flush_size => 50
}
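Putting the pieces together, a complete pipeline for the sample file above might look like this (a sketch; it assumes the file contains one JSON object per line, as in the sample, so the json codec yields a type field on each event):

```
input {
  file {
    path => "/tmp/very-large.json"
    codec => "json"             # parses each line into fields, including "type"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{type}%{+YYYY.MM.dd}"
  }
}
```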

I want to delete documents via Logstash, but it throws an exception

I have run into a problem. My Logstash configuration file is as follows:
input {
  redis {
    host => "127.0.0.1"
    port => 6379
    db => 10
    data_type => "list"
    key => "local_tag_del"
  }
}
filter {
}
output {
  elasticsearch {
    action => "delete"
    hosts => ["127.0.0.1:9200"]
    codec => "json"
    index => "mbd-data"
    document_type => "localtag"
    document_id => "%{album_id}"
  }
  file {
    path => "/data/elasticsearch/result.json"
  }
  stdout {}
}
I want Logstash to read the IDs from Redis and tell Elasticsearch to delete the corresponding documents.
Excuse me, my English is poor; I hope someone can help me.
Thanks.
I can't help you in detail, because your problem is spelled out in your error message: Logstash couldn't connect to your Elasticsearch instance.
That usually means one of:
Elasticsearch isn't running
Elasticsearch isn't bound to localhost
That has nothing to do with your Logstash config. Using Logstash to delete documents is a bit unusual though, so I'm not entirely sure this isn't an XY problem.
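To rule out the first two causes before touching the Logstash config, it is enough to check whether anything is listening on the Elasticsearch port. A small sketch (the host and port are the defaults from the question; the function name is illustrative):

```python
import socket

def es_reachable(host="127.0.0.1", port=9200, timeout=2.0):
    """Return True if a TCP listener answers on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # connection refused, timed out, or host unresolvable
        return False
```

If this returns False on the Logstash machine, the problem is on the Elasticsearch side (not running, or bound to a different interface), not in the pipeline configuration.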

How to move data from one Elasticsearch index to another using the Bulk API

I am new to Elasticsearch. How do I move data from one Elasticsearch index to another using the Bulk API?
I'd suggest using Logstash for this, i.e. use one elasticsearch input plugin to retrieve the data from your index and another elasticsearch output plugin to push the data to your other index.
The Logstash config file would look like this:
input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "source_index"        # the name of your source index
  }
}
filter {
  mutate {
    remove_field => [ "@version", "@timestamp" ]
  }
}
output {
  elasticsearch {
    host => "localhost"
    port => 9200
    protocol => "http"
    manage_template => false
    index => "target_index"            # the name of your target index
    document_type => "your_doc_type"   # make sure to set the appropriate type
    document_id => "%{id}"
    workers => 5
  }
}
After installing Logstash, you can run it like this:
bin/logstash -f logstash.conf
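For completeness, since the question asks about the Bulk API itself: a bulk request body is just newline-delimited JSON, one action line plus one source line per document. A minimal sketch that builds such a body (the function name and document shape are illustrative; in practice the documents would come from a scroll/search over the source index):

```python
import json

def bulk_index_body(docs, target_index, doc_type="your_doc_type"):
    """Build an Elasticsearch Bulk API body: an action metadata line and a
    source line per document, newline-delimited, ending in a newline."""
    lines = []
    for doc_id, source in docs:
        # action line: tells ES which index/type/id to write to
        lines.append(json.dumps(
            {"index": {"_index": target_index, "_type": doc_type, "_id": doc_id}}))
        # source line: the document itself
        lines.append(json.dumps(source))
    return "\n".join(lines) + "\n"
```

The resulting body is POSTed to the `_bulk` endpoint; the trailing newline is required by the API.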

separate indexes on logstash

Currently I have a Logstash configuration that pushes data to Redis, and an Elasticsearch server that pulls the data using the default index 'logstash'.
I've added another shipper and successfully moved its data using the default index as well. My goal is to move and restore that data in a separate index; what is the best way to achieve this?
This is my current configuration, using the default index:
shipper output:
output {
  redis {
    host => "my-host"
    data_type => "list"
    key => "logstash"
    codec => json
  }
}
ELK input:
input {
  redis {
    host => "my-host"
    data_type => "list"
    key => "logstash"
    codec => json
  }
}
Try setting the index field in the output. Give it the name you want and run it; a separate index will then be created.
input {
  redis {
    host => "my-host"
    data_type => "list"
    key => "logstash"
    codec => json
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "redis-logs"
    cluster => "cluster name"
  }
}
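If both shippers write to the same Redis key, one way to keep their data apart is to mark the events on the shipper side and use that field in the index name on the indexer side. A sketch (the field name and values here are assumptions, not from the question):

```
# on each shipper, before the redis output:
filter {
  mutate { add_field => { "shipper" => "shipper-a" } }   # "shipper-b" on the other box
}

# on the indexer:
output {
  elasticsearch {
    index => "%{shipper}-%{+YYYY.MM.dd}"
  }
}
```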

logstash not fast enough output to elasticsearch

I use Logstash as an indexer to push data from Redis into Elasticsearch, but it is not fast enough given the volume of data. I then tried multiple workers, but that led to various problems. Are there better ways to speed up the output? Thanks.
Here is my configuration:
input {
  redis {
    host => "10.240.93.41"
    data_type => "list"
    key => "tcpflow"
  }
}
filter {
  csv {
    columns => ["ts","node","clientip","vip","rtt","city","isp","asn","province"]
    separator => "|"
  }
}
output {
  elasticsearch {
    index => "tcpflow-%{+YYYY.MM.dd}"
    index_type => "tcpflow"
    cluster => "elasticsearch"
    host => ["10.240.93.41", "10.240.129.32"]
    #protocol => "node"
    #protocol => "http"
    #port => 8200
    protocol => "transport"
    manage_template => false
    workers => 30
  }
}
The redis{} input in Logstash defaults to reading one document at a time. Try setting batch_count to something in the 100-1000 range, depending on the size of your documents.
Having multiple worker threads ("-w") is ideal, unless you're using the multiline{} filter, which is not thread-safe.
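For example, the input from the question with a larger batch size (250 is an arbitrary starting point for tuning, not a recommendation):

```
input {
  redis {
    host => "10.240.93.41"
    data_type => "list"
    key => "tcpflow"
    batch_count => 250   # fetch up to 250 events per Redis round trip
  }
}
```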
