Update from Logstash to Elasticsearch failed

I want to parse a simple logfile with Logstash and post the results to Elasticsearch. I've configured Logstash according to the Logstash documentation.
But Logstash reports this error:
Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200/"]',
but Elasticsearch appears to be unreachable or down!
{:client_config=>{:hosts=>["http://localhost:9200/"], :ssl=>nil,
:transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil,
:ssl=>{}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore,
:logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false,
:reload_on_failure=>false, :randomize_hosts=>false}, :error_message=>"Connection refused",
:level=>:error}
My configuration looks like this:
input { stdin {} }
filter {
  grok {
    match => { "message" => "%{NOTSPACE:demo}" }
  }
}
output {
  elasticsearch { hosts => "localhost:9200" }
}
Of course Elasticsearch is available when calling http://localhost:9200/.
Versions: logstash-2.0.0, elasticsearch-2.0.0
OS X
I've found a thread with a similar issue, but that seems to be a bug in an older Logstash version.

I changed localhost to 127.0.0.1
This works:
output {
  elasticsearch { hosts => "127.0.0.1:9200" }
}
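One possible explanation (an assumption on my part, not something I verified on your machine) is that localhost resolves to the IPv6 loopback ::1 while Elasticsearch is only listening on IPv4. If that is the case, you can also pin Elasticsearch itself to an explicit address in elasticsearch.yml; a minimal sketch, assuming a default install:
# elasticsearch.yml - bind the HTTP listener explicitly to the IPv4 loopback
network.host: 127.0.0.1
http.port: 9200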

Related

Logstash with Elastic index only 10,000 documents

I am working with Filebeat and Logstash to upload logs to Elastic (all are the 7.3-oss version).
My log file contains billions of rows, yet Elastic only shows 10,000 documents.
I added another output,
stdout { codec => rubydebug }
to print events to the screen, and it looks like all the data is coming in from Filebeat, but for some reason Logstash only uploads 10,000 docs.
I also tried removing the JSON filter in Logstash, but the issue still occurs.
Filebeat config
filebeat.inputs:
- type: log
  paths:
    - \\some-path\my.json
output.logstash:
  hosts: ["localhost:5044"]
Logstash pipeline
input {
  beats {
    port => 5044
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "machine-name:9200" ]
  }
}
logstash.yml
is empty, as in the default installation.
I found that it was my search that caused the confusion.
According to
https://www.elastic.co/guide/en/elasticsearch/reference/7.3/search-request-body.html#request-body-search-track-total-hits,
Elasticsearch simply didn't return the accurate hit count (it just stated that it's greater than 10,000).
Changing my search query
GET logstash-*/_search
{
  "track_total_hits": true
}
returned the right size.
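As an aside, a quick way to cross-check the real document count (assuming the default logstash-* index pattern) is the count API, which is not affected by the 10,000-hit reporting cap:
GET logstash-*/_count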

How to extract service name from document field in Logstash

I am stuck in the middle of an ELK Stack configuration; any lead will be highly appreciated.
Case Study:
I am able to see the logs (parsed through Logstash without any filter), but I want to apply filters while parsing the logs.
For example:
system.process.cmdline: "C:\example1\example.exe" -displayname "example.run" -servicename "example.run"
I can see the above logs in the Kibana dashboard, but I want only the -servicename key's value.
Expected output in Kibana, where servicename is the key and example.run is the associated value:
servicename "example.run"
I am a newbie in ELK, so please help me out.
My environment:
Elasticsearch- 6.6
Kibana- 6.6
Logstash- 6.6
Filebeat- 6.6
Metricbeat- 6.6
Logs coming from- Windows Server 2016
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => "%{NOTSPACE:hostname} " }
  }
}
output {
  file {
    path => "/var/log/logstash/out.log"
  }
}
I have tried with the above Logstash pipeline, but I am not successful in getting the required result. I assume I have to add more lines to the filter, but I don't know what exactly.
Use this in your filter:
grok {
  match => { "message" => "%{GREEDYDATA:ignore}-servicename \"%{DATA:serviceName}\"" }
}
Your service name should now be in the serviceName key.
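For context, a minimal pipeline sketch showing where that filter would sit; the ignore field is just a throwaway name for the leading text, and the mutate step that drops it is optional:
input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    # capture everything before -servicename into a throwaway field
    match => { "message" => "%{GREEDYDATA:ignore}-servicename \"%{DATA:serviceName}\"" }
  }
  mutate {
    # optional: drop the throwaway field so only serviceName is kept
    remove_field => ["ignore"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}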

Unknown setting 'protocol' for elasticsearch 5.1.1

So I've been looking for a way to solve this issue all day long, but all I've found is for older versions of Elasticsearch.
FYI, I use the latest version of the ELK stack:
elasticsearch version: 5.1.1
kibana version: 5.1.1
logstash version: 5.1.1
This is my apache.conf:
input {
  file {
    path => '/Applications/XAMPP/xamppfiles/logs/access_log'
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch { protocol => "http" }
}
That file is used to read the access log data from Apache.
But when I run Logstash with:
logstash -f apache.conf
I got an error message telling me that something is wrong with my configuration; I guess the http protocol setting doesn't exist anymore.
Can you tell me how to fix it?
Many thanks.
There is no protocol setting in the elasticsearch output anymore. Simply modify your output to this:
output {
  elasticsearch {
    hosts => "localhost:9200"
  }
}
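If you later need more than the defaults, the 5.x output also accepts a list of hosts and an explicit index name; a small sketch (the index name below is just a placeholder, the default is logstash-%{+YYYY.MM.dd}):
output {
  elasticsearch {
    # hosts accepts a list if you run more than one node
    hosts => ["localhost:9200"]
    # placeholder index name; omit it to keep the default
    index => "apache-%{+YYYY.MM.dd}"
  }
}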

How to generate reports on existing dump of logs using ELK?

Using ELK stack, is it possible to generate reports on existing dump of logs?
For example:
I have some 2 GB of Apache access logs and I want to have the dashboard reports showing:
All requests, with status code 400
All requests, with pattern like "GET http://example.com/abc/.*"
I'd appreciate any example links.
Yes, it is possible. You should:
Install and set up the ELK stack.
Install Filebeat, configure it to harvest your logs and forward the data to Logstash (see the Filebeat sketch at the end of this answer).
In Logstash, listen for the Filebeat input, use grok to process/break up your data, and forward it to Elasticsearch, something like:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filebeat-logstash-%{+YYYY.MM.dd}"
  }
}
In Kibana, set up your index pattern and query for the data, e.g.
response: 400
verb: GET AND message: "http://example.com/abc/"
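For step 2 above, a minimal Filebeat configuration could look like the sketch below, assuming a recent Filebeat (6.3 or later uses filebeat.inputs); the path is a placeholder for wherever your 2 GB dump lives, and Filebeat reads existing files from the beginning by default, so a static dump is fine:
filebeat.inputs:
- type: log
  paths:
    - /var/log/apache2/access_log*
output.logstash:
  hosts: ["localhost:5044"]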

how to mention a json file for elasticsearch which can be used by kibana?

I have to show my log file (a JSON file) on a Kibana dashboard. I configured Elasticsearch and Kibana.
I tried setting path.data: C:\Users\Rajesh\Desktop\temp (where my logs are) in elasticsearch.yml, but when searching for any string in the dashboard, I get 0 results.
Could anyone please guide me? Thanks in advance.
You can use Logstash to read your logfile and then output to Elasticsearch, then use Kibana to view it.
Logstash has a lot of plugins to help you do this.
Here is an example for your reference. This Logstash configuration reads all the JSON data from a file and then outputs it to Elasticsearch.
input {
  file {
    path => "/path/to/your/json/file"
    codec => json_lines {}
  }
}
output {
  elasticsearch {
    cluster => "abc"
  }
}
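One caveat if you are on a recent Logstash: the cluster option has been removed from the elasticsearch output, so on current versions you would point the output at the HTTP endpoint instead, roughly like this:
output {
  elasticsearch {
    # cluster => "..." no longer exists in recent versions of the output plugin;
    # use hosts to point at the HTTP endpoint instead
    hosts => ["localhost:9200"]
  }
}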
