Elasticsearch : Encountered a retryable error - elasticsearch

I have written a Logstash config file that reads log messages from a file and then transfers the data to Elasticsearch.
Location of the config file pipe.conf: /etc/logstash/conf.d
pipe.conf has the following contents:
input {
  file {
    path => "/var/log/elasticsearch/file.log"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "doc"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    action => "create"
    index => ["logs"]
  }
}
When Logstash runs, the following error occurs:
"[Ruby-0-Thread-10#[main]>worker3: :1] elasticsearch - Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"http://localhost:9200/_bulk"}"

The default action is index, so there is no need to add action => "create". Use:
elasticsearch {
hosts => ["http://localhost:9200"]
index => "logs"
}
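To verify the fix, you can count the documents in the logs index directly (a quick check, assuming Elasticsearch is still on localhost:9200; the index name comes from the config above):
curl -s "http://localhost:9200/logs/_count?pretty"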

Related

Generating Logs from multiple directories in Logstash. Logs not appearing in ElasticSearch

I am trying to collect logs from multiple directories through Logstash and send them to Elasticsearch.
This is my configuration:
input{
file {
path => ["/XXX/XXX/results/**/log_file.txt"]
start_position => "beginning"
sincedb_path => "/dev/null"
}
}
filter {
grok {
pattern_definitions => { "THREAD_NAME" => "%{WORD}?%{NUMBER}?" }
match => { "message" => "%{SPACE}?%{TIMESTAMP_ISO8601:asctime}?%{SPACE}?\|%{SPACE}?%{THREAD_NAME:thread_name}"}
}
}
output{
elasticsearch {
hosts => ["x.x.x.x:9200"]
}
stdout { codec => rubydebug }
}
The file path is a relative path.
The logs are placed in different directories inside the results directory, e.g. results/dir/log_file.txt.
I have tried this configuration with stdin and logs appeared inside Kibana, but Logstash doesn't pick up the logs in the directories. Please advise.
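For reference, a minimal sketch of the stdin variant mentioned above (my assumption of what that test looked like; keep the same filter and output and pipe the file in):
input {
  stdin { }
}
# fed with something like: cat results/dir/log_file.txt | bin/logstash -f stdin-test.conf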

Unable to sync data from MySQL to Logstash

My logstash.conf file is
input {
jdbc {
jdbc_connection_string => "jdbc:mysql://localhost:8889/optjobs"
# The user we wish to execute our statement as
jdbc_user => "root"
jdbc_password => "root"
# The path to our downloaded jdbc driver
jdbc_driver_library => "/Users/ajoshi31/mysql-connector-java-5.1.17.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
# our query
statement => "SELECT * FROM candidates INNER JOIN candidate_skills ON candidate_skills.candidate_id = candidates.id"
}
}
output {
stdout { codec => json_lines }
elasticsearch {
"hosts" => "localhost:9200"
"index" => "optjobsprd"
"document_type" => "data"
}
}
With this config file, when I run Logstash
$ logstash -f logstash.conf
I get the below error
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to /usr/local/Cellar/logstash/7.5.2/libexec/logs which is now configured via log4j2.properties
[2020-04-09T11:47:14,307][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-04-09T11:47:14,549][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.5.2"}
[2020-04-09T11:47:17,657][INFO ][org.reflections.Reflections] Reflections took 122 ms to scan 1 urls, producing 20 keys and 40 values
[2020-04-09T11:47:18,565][ERROR][logstash.outputs.elasticsearch] Unknown setting '"document_type"' for elasticsearch
[2020-04-09T11:47:18,568][ERROR][logstash.outputs.elasticsearch] Unknown setting '"hosts"' for elasticsearch
[2020-04-09T11:47:18,572][ERROR][logstash.outputs.elasticsearch] Unknown setting '"index"' for elasticsearch
[2020-04-09T11:47:18,590][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:mai
Once I remove hosts, document_type and index, Logstash runs, connects to MySQL and executes the query, but I cannot create the index or update the data from MySQL.
The option names must not be quoted. Remove the quotes from the keys, like this:
output {
stdout { codec => json_lines }
elasticsearch {
hosts => "localhost:9200"
index => "optjobsprd"
document_type => "data"
}
}
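If the goal is also to keep the index in sync as new rows arrive, the jdbc input supports a schedule option (a sketch, not part of the original answer; the cron expression below re-runs the query every minute):
jdbc {
  jdbc_connection_string => "jdbc:mysql://localhost:8889/optjobs"
  jdbc_user => "root"
  jdbc_password => "root"
  jdbc_driver_library => "/Users/ajoshi31/mysql-connector-java-5.1.17.jar"
  jdbc_driver_class => "com.mysql.jdbc.Driver"
  statement => "SELECT * FROM candidates INNER JOIN candidate_skills ON candidate_skills.candidate_id = candidates.id"
  schedule => "* * * * *" # re-run the query every minute
}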

Elasticsearch not receiving input from logstash

I'm running logstash where the output is set to elasticsearch on my localhost. However, when I open up elasticsearch, it appears that it did not receive any data from logstash. Logstash parses the csv file correctly, as I can see by the output in the terminal.
I've tried modifying the conf file, but the problem remains. The conf file is below
input {
file {
path => "/Users/kevinliu/Desktop/logstash_tutorial/gg.csv"
start_position => "beginning"
sincedb_path => "/dev/null"
}
}
filter {
csv {
separator => ","
columns => ["name","price","unit","url"]
}
}
output {
elasticsearch {
hosts => "localhost"
index => "gg-prices"
}
stdout {}
}
When I access localhost:9200/ I just see the default "You Know, for Search" message from Elasticsearch.
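As an aside (an assumption, not something from the original post), switching the plain stdout {} to the rubydebug codec makes the parsed events easier to inspect in the terminal while debugging:
stdout { codec => rubydebug }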

Logstash multiple logs

I am following an online tutorial and have been provided with a cars.csv file and the following Logstash config file. My logstash is running perfectly well and is indexing the CSV as we speak.
The question is, I have another log file (entirely different data) which I need to parse and index into a different index.
1. How do I add this configuration without restarting Logstash?
2. If the above isn't possible and I edit the config file then restart Logstash, it won't reindex the entire cars file, will it?
3. If I do 2, how do I format the config for multiple styles of log file?
e.g. my new log file looks like this:
01-01-2017 ORDER FAILED: £12.11 Somewhere : Fraud
Existing Config File:
input {
file {
path => "/opt/cars.csv"
start_position => "beginning"
sincedb_path => "/dev/null"
}
}
filter {
csv {
separator => ","
columns =>
[
"maker",
"model",
"mileage",
"manufacture_year",
"engine_displacement",
"engine_power",
"body_type",
"color_slug",
"stk_year",
"transmission",
"door_count",
"seat_count",
"fuel_type",
"date_last_seen",
"date_created",
"price_eur"
]
}
mutate {
convert => ["mileage", "integer"]
}
mutate {
convert => ["price_eur", "float"]
}
mutate {
convert => ["engine_power", "integer"]
}
mutate {
convert => ["door_count", "integer"]
}
mutate {
convert => ["seat_count", "integer"]
}
}
output {
elasticsearch {
hosts => "localhost"
index => "cars"
document_type => "sold_cars"
}
stdout {}
}
Config file for orders.log
input {
file {
path => "/opt/logs/orders.log"
start_position => "beginning"
sincedb_path => "/dev/null"
}
}
filter {
grok {
match => { "message" => "(?<date>[0-9-]+) (?<order_status>ORDER [a-zA-Z]+): (?<order_amount>£[0-9.]+) (?<order_location>[a-zA-Z]+)( : (?<order_failure_reason>[A-Za-z ]+))?"}
}
mutate {
convert => ["order_amount", "float"]
}
}
output {
elasticsearch {
hosts => "localhost"
index => "sales"
document_type => "order"
}
stdout {}
}
Disclaimer: I'm a complete newbie. Second day using ELK.
For point 1, you can either set this in your logstash.yml file:
config.reload.automatic: true
Or, when running Logstash with a config file, pass the flag:
bin/logstash -f conf-file-name.conf --config.reload.automatic
With either of these settings in place, once you start Logstash, any change you make to the conf file will be picked up automatically.
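In logstash.yml this would look like the following; config.reload.interval is an optional related setting (not mentioned in the original answer) that controls how often Logstash checks the config for changes:
config.reload.automatic: true
config.reload.interval: 3s # optional; how often to check for changes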
2. If above isn't possible and I edit the config file then restart logstash - it won't reindex the entire cars file will it?
If you use sincedb_path => "/dev/null", Logstash won't remember where it has stopped reading a file and will reindex it at each restart. You'll have to remove this line if you want Logstash to remember where it left off.
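For example, pointing sincedb_path at a real file lets Logstash persist its read position across restarts (the path below is only an illustration; any writable file works):
file {
  path => "/opt/cars.csv"
  start_position => "beginning"
  sincedb_path => "/var/lib/logstash/sincedb_cars" # hypothetical path
}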
3. How do I format the config for multiple styles of log file?
To support multiple styles of log files, you can put tags on the file inputs (see https://www.elastic.co/guide/en/logstash/5.5/plugins-inputs-file.html#plugins-inputs-file-tags) and then use conditionals (see https://www.elastic.co/guide/en/logstash/5.5/event-dependent-configuration.html#conditionals) in your config.
Like this:
file {
path => "/opt/cars.csv"
start_position => "beginning"
sincedb_path => "/dev/null"
tags => [ "csv" ]
}
file {
path => "/opt/logs/orders.log"
start_position => "beginning"
sincedb_path => "/dev/null"
tags => [ "log" ]
}
if "csv" in [tags] {
...
} else if "log" in [tags] {
...
}
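Putting it together, a minimal sketch of how the output section could route events to the two indices (my assumption of the wiring; index names are taken from the two configs above):
output {
  if "csv" in [tags] {
    elasticsearch {
      hosts => "localhost"
      index => "cars"
    }
  } else if "log" in [tags] {
    elasticsearch {
      hosts => "localhost"
      index => "sales"
    }
  }
  stdout {}
}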

Multiple logs in single config file to elasticsearch

I want to send logs from different locations to Elasticsearch using a Logstash conf file.
input {
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/ce.log"
    type => "CE"
    start_position => "beginning"
  }
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/spovp.log"
    type => "SP"
    start_position => "beginning"
  }
  file {
    path => "C:/Users/611166850/Desktop/logstash-5.0.2/logstash-5.0.2/logs/logs/ovpportal_log"
    type => "OVP"
    start_position => "beginning"
  }
}
output {
elasticsearch {
action => "index"
hosts => "localhost:9200"
index => "multiple"
codec => json
workers => 1
}
}
This is the config file I use, but Kibana is not recognising this index. Can someone help with this?
Thanks in advance, Rashmi
Check Logstash's log file for errors (maybe your config file is not correct).
Also query ES directly for the expected index; maybe the problem is not Kibana, and you simply don't have any index with this name.
Try starting Logstash in debug mode to see what it is actually logging.
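For example (assuming a standard install; the config file name below is just a placeholder):
bin/logstash -f your-pipeline.conf --log.level=debug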
You can also try writing the Logstash output to a file on the local system rather than sending it directly to Elasticsearch. Uncomment a block below as per your requirement:
# block-1
# if "_grokparsefailure" not in [tags] {
# stdout {
# codec => rubydebug { metadata => true }
# }
# }
# block-2
# if "_grokparsefailure" not in [tags] {
# file {
# path => "/tmp/out-try1.logstash"
# }
# }
By either of these methods you can get the output to the console or to a file. Comment out the _grokparsefailure condition in case you don't see any output in the file.
Note: in Kibana, default indices have @timestamp in their fields, so check:
1. whether Kibana is able to recognize the index if you uncheck the checkbox on the page where you create the new index pattern
2. whether your logs are properly parsed; if not, you need to work on grok filters with patterns matching your logs
All Elasticsearch indices are visible at http://elasticsearch-ip:9200/_cat/indices?v (your Elasticsearch IP), so try that too and share what you find.
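For example (the index name multiple is taken from the config above; replace elasticsearch-ip with your Elasticsearch host):
curl -s "http://elasticsearch-ip:9200/_cat/indices?v"
curl -s "http://elasticsearch-ip:9200/multiple/_count?pretty"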
