Logstash: issue starting pipeline on Windows

I have installed Elasticsearch, Kibana, Logstash, and Beats on a Windows 7 64-bit system.
I am getting the error below after running the command 'logstash -f first-pipeline.conf --config.reload.automatic':
Could not find log4j2 configuration at path /logstash-5.1.2/logstash -5.1.2/config/log4j2.properties. Using default config which logs to console
12:21:15.654 [[main]-pipeline-manager] INFO logstash.inputs.beats - Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
12:21:15.766 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
12:21:15.839 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
12:21:15.926 [[main]<beats] INFO org.logstash.beats.Server - Starting server on port: 5043
12:21:16.544 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9601}
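For reference, a minimal first-pipeline.conf consistent with this log would look roughly like the sketch below. Only the Beats input on port 5043 is visible in the output above; the Elasticsearch output is an assumption based on the stack described in the question:

input {
  beats {
    port => 5043
  }
}
output {
  # assumption: output goes to a local Elasticsearch instance
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}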

Try adding this line to setup.bat in your bin folder:
SET JAVA_OPTS=%JAVA_OPTS% -Dlog4j.configurationFile=%LS_HOME%\config\log4j2.properties
That said, I don't see how this warning could stop Logstash from starting the pipeline, unless you really need Log4j to do the logging for you. A similar Stack Overflow post elaborates on much the same point. Hope it helps!

There is a stray space in the path where Log4j2 is looking for the configuration file: /logstash-5.1.2/logstash -5.1.2/config/log4j2.properties. The path also starts with a slash, so Log4j2 interprets it as an absolute path.
You need to find where this path is configured and point it at the actual location of the configuration file. The Logstash docs have a section on logging configuration.
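One way to do that on Windows (assuming Logstash is unpacked under C:\logstash-5.1.2; adjust the path to your install) is to pass the settings directory explicitly, since Log4j2 resolves log4j2.properties relative to it:

cd C:\logstash-5.1.2
bin\logstash -f first-pipeline.conf --config.reload.automatic --path.settings C:\logstash-5.1.2\config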

Related

Filebeat not picking up some files

I am trying to send Tomcat logs to ELK, using Filebeat to scan the files.
My log file names look like "project_err.DD-MM-YYYY". In the Filebeat configuration, I am giving the path as foldername\project_err*.
But Filebeat is ignoring the files. Is this configuration correct?
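For reference, a Filebeat 5.x prospector along these lines would look like the sketch below. The folder is a placeholder; note that Filebeat generally expects absolute paths, so a relative foldername\ prefix may be why the files are ignored:

filebeat.prospectors:
- input_type: log
  paths:
    - C:\tomcat\logs\project_err*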

How to stop Logstash from writing its logs to syslog?

I have my Logstash configuration on my Ubuntu server; it reads data from a Postgres database and sends the data to Elasticsearch. I have configured a schedule so that every 15 minutes Logstash checks the Postgres table; if there is any change in the table, it sends the data to Elasticsearch.
But Logstash is also sending its own logs to syslog, which I do not need. Because of Logstash, my syslog file keeps growing.
How do I stop Logstash from sending its logs to syslog? Is there any setting in logstash.yml to avoid this?
Many sites online say to remove the line below from the configuration:
stdout { codec => rubydebug }
But I don't have this line.
In my output I just send my data to the Elasticsearch instance I got from AWS.
Is there a way to stop Logstash from sending its logs to syslog?
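For context, a scheduled Postgres-to-Elasticsearch pipeline of the kind described would look roughly like this; the connection string, credentials, query, and AWS endpoint are all placeholder assumptions:

input {
  jdbc {
    # assumptions: connection details and query are placeholders
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_driver_class => "org.postgresql.Driver"
    schedule => "*/15 * * * *"   # run every 15 minutes
    statement => "SELECT * FROM mytable"
  }
}
output {
  elasticsearch {
    hosts => ["https://my-aws-es-endpoint:443"]   # assumption
  }
}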
Disable the rootLogger.appenderRef.console in log4j
The log files that Logstash itself produces are created through Log4j, and by default one stream goes to the console. When Logstash runs as a service, syslog picks up that console output and writes it to the syslog file. In the Ubuntu version of Logstash this is configured in the file /etc/logstash/log4j2.properties.
In the default configuration there is a line that starts with
rootLogger.appenderRef.console
If you add a # in front of that line and restart Logstash, the log files that Logstash creates will stop going to syslog:
service logstash restart
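For reference, in a typical Logstash 5.x log4j2.properties the commented-out line would look something like this (the exact right-hand side varies by version, so treat it as an illustration):

# rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console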
The other rootLogger appender reference, which uses the RollingFileAppender, will still write log messages from Logstash itself (so not the messages that are being processed by your pipeline) to
/var/log/logstash/logstash-plain.log
It's easy to confuse the log files that Logstash creates with the messages that you process, especially if they get mixed together by the logstash-output-stdout or logstash-output-syslog plugins. That is not applicable to you, because you use the logstash-output-elasticsearch plugin, which writes to Elasticsearch.
On Ubuntu, the log4j2.properties file is skipped if you run Logstash from the command line. That makes the command line a nice way of testing your pipeline in a terminal, and you can run multiple Logstash instances in parallel (e.g. the service and a command-line test pipeline):
/usr/share/logstash/bin/logstash -f your_pipeline.conf
To avoid writing to syslog, check your pipeline and log4j2.properties files.
In your pipeline files, remove all occurrences of this:
stdout { codec => rubydebug }
And in your log4j2.properties file, comment out this line:
rootLogger.appenderRef.console

ELK and Filebeat configuration

I am working with the Elastic Stack (https://www.elastic.co/): I installed Elasticsearch, Logstash, Kibana, and Filebeat. In filebeat.yml I take /var/log/*.log as input and have configured output to Elasticsearch, and I enabled some modules, but I am not getting the modules' logs: the input is loaded but the harvester is not started.
Please help me.
I think you might have a permission issue there.
If you see an error like this:
[WARN ][logstash.inputs.file ] failed to open /var/log/yum.log: Permission denied - /var/log/yum.log
you should consider starting Filebeat with root privileges.
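For a quick test (the config path is an assumption for a typical package install), you could run Filebeat in the foreground as root and watch whether the harvesters start:

sudo filebeat -e -c /etc/filebeat/filebeat.yml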

Logstash debug to log files

Is it possible to create log files with Logstash?
I know there is a -l/--log switch, but is it possible to do it through the config file?
Thank you.
Open
/etc/logstash/logstash.yml
and add/change the following lines:
log.level: debug
path.logs: /var/log/logstash
Restart logstash and this will generate logs in:
/var/log/logstash/logstash-plain.log
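If you prefer the switch style mentioned in the question, the same settings can also be passed on the command line (the install path is an assumption for a typical Ubuntu package install):

/usr/share/logstash/bin/logstash -f your_pipeline.conf --log.level debug --path.logs /var/log/logstash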

How to print logs directly onto the console in yarn-cluster mode using Spark

I am new to Spark and I want to print logs to the console using Apache Spark in yarn-cluster mode.
You need to check the value in the log4j.properties file. In my case this file is in the /etc/spark/conf.dist directory:
log4j.rootCategory=INFO,console
INFO prints all the logs to the console. You can change the value to ERROR or WARN to limit the information you see on the console, as Spark's logs can be overwhelming.
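For example, to show only warnings and errors on the console, the same line would read:

log4j.rootCategory=WARN,console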
