How to filter logs by severity in rsyslog?

I am a newbie with rsyslog. I am able to get the logs from the client to the server, but I need to split them by log severity (INFO, ERROR, WARN).

Try adding this to your rsyslog.conf file on the server side:
module(load="imuxsock") # provides support for local system logging
#module(load="immark") # provides --MARK-- message capability
# provides UDP syslog reception
module(load="imudp")
input(type="imudp" port="514")
# provides TCP syslog reception
module(load="imtcp")
input(type="imtcp" port="50514" ruleset="remote")
ruleset(name="remote") {
    # action(type="omfile" file="/var/log/jvh.log")
    if $msg contains 'ERROR' then {
        action(type="omfile" file="/var/log/jvhErr.log")
    } else if $msg contains 'INFO' then {
        action(type="omfile" file="/var/log/jvhInfo.log")
    } else {
        action(type="omfile" file="/var/log/jvhOther.log")
    }
}
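Note that matching on $msg only works if the applications literally write "ERROR"/"INFO" into the message text. If your clients set the real syslog severity field, you can filter on that instead; a minimal sketch, assuming the standard numeric severities (0-3 = emergency through error, 6 = informational):
ruleset(name="remote") {
    if $syslogseverity <= 3 then {
        # emergency (0) through error (3)
        action(type="omfile" file="/var/log/jvhErr.log")
    } else if $syslogseverity == 6 then {
        # informational
        action(type="omfile" file="/var/log/jvhInfo.log")
    } else {
        action(type="omfile" file="/var/log/jvhOther.log")
    }
}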

Related

Kafka Connect Elasticsearch sink: Could not connect to Elasticsearch. General SSLEngine problem

I'm trying to deploy the Confluent Kafka Connect Elasticsearch sink. My Elastic stack is deployed on Kubernetes with HTTP encryption and authentication. I'm port-forwarding Elasticsearch from Kubernetes to localhost.
Caused by: org.apache.kafka.connect.runtime.rest.errors.BadRequestException: Connector configuration is invalid and contains the following 3 error(s):
Could not connect to Elasticsearch. Error message: General SSLEngine problem
Could not authenticate the user. Check the 'connection.username' and 'connection.password'. Error message: General SSLEngine problem
Could not authenticate the user. Check the 'connection.username' and 'connection.password'. Error message: General SSLEngine problem
I'm sure that the username and password are right. The connector properties file looks like:
name=elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
topics=pwp-alerts
key.ignore=true
connection.url=https://localhost:9200
type.name=kafka-connect
errors.tolerance = all
behavior.on.malformed.documents=warn
schema.ignore = true
connection.username ="elastic"
connection.password ="my_password"
Does anyone know what can cause the problem?
I guess the failure is caused by an unsuccessful connection to your Elastic engine; it may happen for many reasons, for example a wrong port, or the listener type (it may be an advertised listener instead of a simple consumer). I recommend using Logstash and adding a Kafka input to your Logstash configuration: you can simply set your Kafka consumer properties and bootstrap servers in the input, and your Elastic index, port, and authorization in the output.
Your Logstash configuration file with a Kafka input may look like the one below:
input {
  kafka {
    group_id => "Your consumer group id"
    topics => ["Your topic name"]
    bootstrap_servers => "Your broker host:port, default port is 9092"
    codec => json
  }
}
filter {
}
output {
  file {
    path => "Some path"
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "_doc"
    index => "Your index name"
    user => "username"
    password => "password"
  }
  stdout { codec => rubydebug }
}
You can remove the file output if you don't want to store your data in a file in addition to sending it to Elasticsearch.
Find out more about the Logstash Kafka input properties here.
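If you would rather keep the Kafka Connect sink, a "General SSLEngine problem" usually means the connector's JVM does not trust the Elasticsearch certificate. A hedged sketch, assuming a recent version of the Confluent Elasticsearch sink that supports the elastic.https.ssl.* settings and that you have imported the cluster's CA certificate into a truststore (paths and passwords below are placeholders):
# assumes a connector version that supports elastic.https.ssl.*
elastic.security.protocol=SSL
elastic.https.ssl.truststore.location=/path/to/truststore.jks
elastic.https.ssl.truststore.password=truststore_password
# note: in a Java properties file, quotes and surrounding spaces become part
# of the value, so write the credentials without them
connection.username=elastic
connection.password=my_password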

Why is rsyslog unable to parse incoming syslogs with a JSON template when they are forwarded over TCP to some port (say 10514)?

I am currently forwarding the incoming syslogs via rsyslog to a local Logstash port, using the template below, which resides in /etc/rsyslog.d/json-template.conf.
The contents of json-template.conf are:
template(name="json-template" type="list") {
    constant(value="{")
    constant(value="\"#timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
    constant(value="\",\"#version\":\"1")
    constant(value="\",\"message\":\"")     property(name="msg" format="json")
    constant(value="\",\"sysloghost\":\"")  property(name="hostname")
    constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
    constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
    constant(value="\",\"programname\":\"") property(name="programname")
    constant(value="\",\"procid\":\"")      property(name="procid")
    constant(value="\"}\n")
}
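For reference, a message rendered through this template arrives at Logstash as a single JSON line along these lines (hypothetical values):
{"#timestamp":"2019-03-01T12:00:00+00:00","#version":"1","message":" some log text","sysloghost":"myhost","severity":"info","facility":"daemon","programname":"myapp","procid":"1234"}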
Forwarding configuration in /etc/rsyslog.conf (the double @@ means TCP):
*.* @@127.0.0.1:10514;json-template
rsyslog is able to send the incoming syslogs to port 10514, but the meaningful information is not being parsed out of them.
NOTE: I have the same setup for UDP, and there rsyslog is able to parse all the messages as per the JSON template.
I tried the same rsyslog configuration with UDP.
Forwarding configuration in /etc/rsyslog.conf (a single @ means UDP):
*.* @127.0.0.1:10514;json-template
and rsyslog is able to parse everything from the syslog (timestamp, message, sysloghost).
All the necessary firewall ports for TCP and UDP forwarding have been opened:
for tcp:
sudo firewall-cmd --zone=public --add-port=10514/tcp
for udp:
sudo firewall-cmd --zone=public --add-port=10514/udp
The only thing I am not able to figure out is what I am missing with respect to parsing syslogs forwarded over TCP.
Expected outcome: the syslogs should be parsed as per the JSON template.
I found the problem: the json-template emits JSON instead of the RFC 3164 or RFC 5424 syslog format, so we have to add a json filter to the Logstash configuration file to parse the message as-is.
My Logstash configuration file looks like this:
input {
  tcp {
    host => "127.0.0.1"
    port => 10514
    type => "rsyslog"
  }
}
# The json filter parses the JSON emitted by the rsyslog template;
# events that fail to parse are dropped.
filter {
  json {
    source => "message"
  }
  if "_jsonparsefailure" in [tags] {
    drop {}
  }
}
# This output block sends all events of type "rsyslog" to Elasticsearch at the
# configured host and port, into daily indices of the pattern "logstash-YYYY.MM.DD".
output {
  if [type] == "rsyslog" {
    elasticsearch {
      hosts => [ "localhost:9200" ]
    }
  }
}
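To check the pipeline end to end, you can push a JSON line at the TCP port yourself; a quick test, assuming netcat (nc) is installed and Logstash is listening on 10514:
# send one JSON line to the Logstash TCP input
echo '{"message":"test over tcp","severity":"info","sysloghost":"myhost"}' | nc 127.0.0.1 10514
If the event then shows up in Elasticsearch with those fields, the Logstash side is fine and any remaining issue is in the rsyslog forwarding line.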

Not getting an email alert for each error from Logstash 1.5.4

I have my ELK setup like below:
HOST1: Component(which generates log) + Logstash (To send logs to redis)
HOST2: Redis + Elasticsearch + Logstash ( To parse data based on grok and send it to elasticsearch on same setup)
HOST3: Redis + Elasticsearch + Logstash ( To parse data based on grok and send it to elasticsearch on same setup)
HOST4: nginx + Kibana 4
Now when I send one error log line from Logstash to Redis, I get a duplicate entry in Kibana 4.
Also, I didn't get any email alert from Logstash, although it is configured to send an alert when severity == "Erro".
This is part of the Logstash conf file:
output {
  elasticsearch { host => ["<ELK IP>"] port => "9200" protocol => "http" }
  if [severity] =~ /Erro/ {
    email {
      from => "someone@somedomain.com"
      subject => "Error Alert"
      to => "someone@somedomain.com"
      via => "smtp"
      htmlbody => "<h2>Error Alert1</h2><br/><br/><div align='center'>%{message}</div>"
      options => [
        "smtpIporHost", "smtp.office365.com",
        "port", "587",
        "domain", "smtp.office365.com",
        "userName", "someone@somedomain.com",
        "password", "somepasswd",
        "authenticationType", "login",
        "starttls", "true"
      ]
    }
  }
  stdout { codec => rubydebug }
}
I am using the following custom grok patterns to parse the log line:
ABTIMESTAMP %{YEAR}%{MONTHNUM2}%{MONTHDAY} %{USERNAME}
ABLOGLEVEL (Note|Erro|Fatl|Warn|Urgt)
ABLOG %{ABTIMESTAMP:timestamp} %{HOST:hostname} %{WORD:servername} %{INT:pid} %{INT:lwp} %{INT:thread} %{ABLOGLEVEL:severity};%{USERNAME:event}\(%{NUMBER:msgcat}/%{NUMBER:msgnum}\)%{GREEDYDATA:greedydata}
Any help on how to get an email alert for every error log line?
Thanks in advance!
Resolved it. Actually, I had multiple conf files in the logstash/conf.d folder; I removed all the unnecessary files, kept only my conf file, and now it's working. :) Thank you Val for your help. (Logstash merges every file in conf.d into one pipeline, so each event was passing through the outputs of every file, which is what produced the duplicate entries.)
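If you do need several files in conf.d, a common pattern is to guard each output with a condition so an event is handled only by the outputs meant for it; a minimal sketch, assuming a hypothetical type tag set on the input side (file names and the "app_log" type are placeholders):
# 01-input.conf (hypothetical): tag events on the way in
input {
  redis {
    host => "<redis IP>"
    data_type => "list"
    key => "logstash"
    type => "app_log"
  }
}
# 90-output.conf (hypothetical): only handle events of that type
output {
  if [type] == "app_log" {
    elasticsearch { host => ["<ELK IP>"] port => "9200" protocol => "http" }
  }
}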

Where does Logstash/Elasticsearch write data?

In the input section of my Logstash config file, I have created a configuration for reading a RabbitMQ queue. Using the RabbitMQ console, I can see Logstash drain the queue. However, I have no idea what Logstash is doing with the messages. Is it discarding them? Is it forwarding them to Elasticsearch?
Here's the logstash configuration
input {
  rabbitmq {
    host => "192.168.34.151"
    exchange => "an_exchange"
    key => "a_key"
    queue => "a_queue"
  }
}
output {
  elasticsearch {
    embedded => true
    protocol => "http"
  }
}
Edit: removed the bogus comma from the config.
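With this output, Logstash should be writing the events to the embedded Elasticsearch instance it starts itself, not discarding them. One way to verify, assuming Elasticsearch (embedded or standalone) is listening on the default HTTP port 9200:
# list indices and their document counts
curl 'http://localhost:9200/_cat/indices?v'
# fetch a few stored events
curl 'http://localhost:9200/_search?q=*&pretty'
If logstash-YYYY.MM.DD indices appear and grow as the queue drains, the messages are landing in Elasticsearch.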

Changing the "Upload failed" Message

I would like to change the "Upload failed" message to the one returned from my server-side processing.
I can see the message I want in the onError callback, but I'm not sure how to use that instead of the default message.
Thoughts, examples, or further-reading advice welcome (new here).
The implementation of what you're trying to do depends on whether you are using Fine Uploader Basic/Core or Regular/UI. This is because UI mode offers some extra goodies for displaying error messages and such.
A few properties/options that may benefit you:
Fine Uploader Basic/Core mode
text.defaultResponseError
Message sent to the onError callback if no specific information about the error can be determined. This is used if the server indicates failure in the response but does not include an “error” property in the response and the error code is 200 (XHR only)
var uploader = new qq.FineUploaderBasic({
  /* ... */
  text: {
    defaultResponseError: "Oh noes! Upload fail."
  }
});
The documentation on 'text'
Fine Uploader Regular/UI mode
failedUploadTextDisplay.mode option
Valid values are “default” (display the text defined in failUploadText next to each failed file), “none” (don’t display any text next to a failed file), and “custom” (display error response text from the server next to the failed file or Blob).
failedUploadTextDisplay.responseProperty option
The property from the server response containing the error text to display next to the failed file or Blob. This is ignored unless mode is “custom”.
var uploader = new qq.FineUploader({
  /* ... */
  text: {
    defaultResponseError: "Oh noes! Upload fail."
  },
  failedUploadTextDisplay: {
    mode: 'custom', // Display error responses from the server.
    responseProperty: 'errorMsg' // Default is 'error'; change this to match the
                                 // property that contains the error message from
                                 // your server.
  }
});
The documentation on failedUploadTextDisplay
For people who still use Fine Uploader and for whom the above does not work: that is because the key has since changed to failUpload.
Usage for a custom message on the UI end would be:
text: {
  failUpload: 'Your upload failed message goes here'
},
More details can be found here - https://docs.fineuploader.com/branch/master/upgrading-to-4.html
If you want to display the server-side message, you can do it the following way:
failedUploadTextDisplay: {
  mode: 'custom',
  responseProperty: 'server side error key goes here'
}
If you wish to remove it completely, i.e. not show any message below the file when the upload has failed, use:
failedUploadTextDisplay: {
  mode: 'none'
}
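For mode: 'custom' to work, the JSON your server returns for a failed upload must contain the property named by responseProperty; a hypothetical response body matching the responseProperty: 'errorMsg' example above:
{
  "success": false,
  "errorMsg": "File type not allowed"
}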
