Serilog - logstash index - elasticsearch

I am sending log messages over UDP to Logstash via Serilog.
var logger = new LoggerConfiguration()
    .WriteTo.Console()
    .WriteTo.UDPSink("host", port)
    .MinimumLevel.Is(LogEventLevel.Verbose)
    .CreateLogger();
But I would like to specify a name for the Logstash index. Any idea how?

I don't know what your Logstash config looks like, so I can't give you a full answer.
But in general, your Logstash config file should look like this:
input {
  udp {
    port => ...
    id => "my_plugin_id"
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "%{your_defined_index}"
  }
}
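If you just want a fixed or date-based index name rather than one taken from an event field, the index option of the elasticsearch output also accepts sprintf date references. A minimal sketch, where "serilog" is only a placeholder prefix:

output {
  elasticsearch {
    hosts => ["127.0.0.1"]
    # "serilog" is a placeholder prefix; %{+YYYY.MM.dd} expands to the event's date
    index => "serilog-%{+YYYY.MM.dd}"
  }
}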

Related

Logstash creating pipelines from Kafka not working

I am trying to get data from a Kafka topic into the ELK stack with Logstash, but can't get the data moving.
I edited logstash.conf to the following:
input {
  tcp {
    port => 5000
  }
  kafka {
    bootstrap_servers => "broker:29092"
    topics => ["PLACES_ROWKEY"]
  }
}
## Add your filters / logstash plugins configuration here
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    user => "elastic"
    password => "changeme"
    index => "from_logstash"
  }
}
I'm running this setup in Docker, if that matters (broker is the hostname of the Kafka broker container). I restarted Logstash but can't see any new indices in Elasticsearch.
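One quick way to narrow this down (assuming Elasticsearch is reachable from the Docker host on port 9200, and using the same credentials as in the output section) is to list the indices directly and see whether from_logstash is ever created:

curl -u elastic:changeme "http://localhost:9200/_cat/indices?v"

If the index never appears, adding a temporary stdout { codec => rubydebug } output next to the elasticsearch one will show whether events are arriving from Kafka at all.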

Kafka (Confluent Platform) input for Logstash - broken message encoding

I have a Confluent Platform (version 4.1.1). It is configured to read data from the database. The configuration for this is:
name = source-mysql-requests
connection.url = jdbc:mysql://localhost:3306/Requests
connector.class = io.confluent.connect.jdbc.JdbcSourceConnector
connection.user = ***
connection.password = ***
mode = incrementing
incrementing.column.name = ID
tasks.max = 5
topic.prefix = requests_
poll.interval.ms = 1000
batch.max.rows = 100
table.poll.interval.ms = 1000
I also have a Logstash (version 6.2.4) for reading the relevant Kafka topic. Here is its configuration:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["requests_Operation"]
    add_field => { "[@metadata][flag]" => "operation" }
  }
}
output {
  if [@metadata][flag] == "operation" {
    stdout {
      codec => rubydebug
    }
  }
}
When I run "kafka-avro-console-consumer" as a test, I get messages of this type:
{"ID":388625154,"ISSUER_ID":"8e427b6b-1176-4d4a-8090-915fedcef870","SERVICE_ID":"mercury-g2b.service:1.4","OPERATION":"prepareOutcomingConsignmentRequest","STATUS":"COMPLETED","RECEIVE_REQUEST_DATE":1525381951000,"PRODUCE_RESULT_DATE":1525381951000}
But in Logstash I have something terrible and unreadable:
"\u0000\u0000\u0000\u0000\u0001����\u0002Hfdebfb95-218a-11e2-a69b-b499babae7ea.mercury-g2b.service:1.4DprepareOutcomingConsignmentRequest\u0012COMPLETED���X���X"
What could be going wrong?
You can change Kafka Connect to not use Avro by changing the value.converter and key.converter configurations to use JSON instead, for example.
Otherwise, you would need Logstash to know how to interpret the Schema Registry-encoded Avro data and convert it into a human-readable format.
Alternatively, you could use Connect's Elasticsearch or Console sink and skip Logstash entirely, assuming that is the goal.
You can also use a Connect SMT to replace the Logstash add_field: operation config.
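For example, switching the connector to plain JSON output is a matter of overriding the converters; a minimal sketch of the relevant worker/connector properties (the schemas.enable values assume you want schemaless JSON):

key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

With that in place, the Logstash kafka input can decode the messages with codec => json instead of receiving the Avro wire format.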

Unable to view Apache log in elasticsearch

I have installed the ELK stack on Windows and configured Logstash to read an Apache log file. I can't seem to see the output in Elasticsearch. I am very new to the ELK stack.
Environment Setup
Elasticsearch: http://localhost:9200/
Logstash :
Kibana : http://localhost:5601/
All 3 applications above are running as a service.
I have created a file called "logstash.conf" at "C:\Elk\logstash\conf\logstash.conf" to read the Apache logs, with the following:
input {
  file {
    path => "C:\Elk\apache.log"
    start_position => "beginning"
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
I then restarted my Logstash service and now wish to see whether Elasticsearch is indexing the content of my log. How do I go about doing this?
Try adding the following lines to your Logstash config and let us know if there are any grok parsing failures, which would mean the pattern used in your filter section is not correct:
output {
  stdout { codec => json }
  file { path => "C:/POC/output3.txt" }
}
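To check from the Elasticsearch side, you can also query it directly; with no index set in the output, this kind of setup typically writes to the default logstash-YYYY.MM.DD index pattern (the command below assumes the local defaults from the question):

curl "http://localhost:9200/logstash-*/_search?pretty&size=1"

Also note that the file input path in the question uses backslashes; on Windows, forward slashes (path => "C:/Elk/apache.log") are generally the safer choice, as the forward-slash path in the file output above suggests.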

logstash and x-forwarded-for on IIS

I just built an ELK server on Windows, so I'm new to the process. I've read through the docs but am having trouble parsing out my IIS advanced logs, especially the X-Forwarded-For data, as we're behind a load balancer.
My advanced logging is set up to output the data like this:
$date, $time, $s-ip, $cs-uri-stem, $cs-uri-query, $s-port, $cs-username, $c-ip, $X-Forwarded-For, $csUser-Agent, $cs-Referer, $sc-status, $sc-substatus, $sc-win32-status, $time-taken
I set up my logstash.conf like this:
input {
  tcp {
    host => "localhost"
    type => "iis"
    port => 5044
  }
}
filter {
  if [type] == "iis" {
    grok {
      match => {"message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{URIPATH:page} %{NOTSPACE:query_string} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:client_host} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{GREEDYDATA:response} %{NUMBER:httpStatusCode:int} %{NUMBER:scSubstatus:int} %{NUMBER:scwin32status:int} %{NUMBER:timeTakenMS:int}"}
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "iis"
    document_type => "main"
  }
}
I don't think this is correct as I'm not getting data. I've scoured the docs but am still having issues and am not sure if there are other steps I need to take, like mapping the fields.
I'm currently using Filebeat on one server to push data to my ELK server. I'm not sure if this is the best way either (maybe NXLog?). We don't want to install Logstash on the client machines.
Can someone lend me a hand? It would be GREATLY appreciated!!
Thanks,
George
Since you are using Filebeat, you need to use the beats input, not the tcp input. See the documentation on how to set up Logstash for Beats.
Essentially you need to replace your tcp input with:
input {
  beats {
    port => 5044
  }
}
And inside your Filebeat configuration file, set the document_type to iis so that your filter condition will match.
filebeat:
  prospectors:
    - paths:
        - 'C:\path\to\your\iis\logs\*.log'
      document_type: iis
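One more thing worth noting: the grok pattern in the question never actually captures the X-Forwarded-For column, even though it is in the advanced logging field list. A hedged sketch of a match line that does, with x_forwarded_for as an illustrative field name and the stray GREEDYDATA dropped so the remaining captures line up with the sc-status through time-taken columns (you will still need to verify it against your real column order):

match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:site} %{URIPATH:page} %{NOTSPACE:query_string} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:client_host} %{NOTSPACE:x_forwarded_for} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:httpStatusCode:int} %{NUMBER:scSubstatus:int} %{NUMBER:scwin32status:int} %{NUMBER:timeTakenMS:int}" }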

Where does logstash /elasticsearch write data?

In the input section of my Logstash config file, I have created a configuration for reading a RabbitMQ queue. Using the RabbitMQ console, I can see Logstash drain the queue. However, I have no idea what Logstash is doing with the message. Is it discarding it? Is it forwarding it to Elasticsearch?
Here's the logstash configuration
input {
  rabbitmq {
    host => "192.168.34.151"
    exchange => an_exchange
    key => a_key
    queue => a_queue
  }
}
output {
  elasticsearch {
    embedded => true
    protocol => http
  }
}
Edit: removed the bogus comma from the config.
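With that output, events drained from the queue are being sent to the embedded Elasticsearch instance rather than discarded. One way to confirm this, as a sketch (assuming the embedded instance listens on the default port 9200, which you can also query directly), is to add a temporary stdout output next to the elasticsearch one and watch the console:

output {
  elasticsearch {
    embedded => true
    protocol => http
  }
  # temporary debug output; prints every event Logstash emits
  stdout {
    codec => rubydebug
  }
}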
