How can I send logs to Elasticsearch through an nginx reverse proxy with Logstash?

My question is simple but I could not find a solution here. I have a Logstash and an Elasticsearch server, and normally I can send logs to Elasticsearch via Logstash.
Here is my logstash.conf file that ships logs to Elasticsearch:
input {
  file {
    type => "json"
    path => ["C:/logs/*.json"]
    start_position => "beginning"
    codec => "json"
  }
}
filter {
  mutate {
    remove_field => [ "path" ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}
But when I put nginx between Logstash and Elasticsearch, it does not work. My configuration and the resulting error are below.
input {
  file {
    type => "json"
    path => ["C:/logs/*.json"]
    start_position => "beginning"
    codec => "json"
  }
}
filter {
  mutate {
    remove_field => [ "path" ]
  }
  ruby {
    code => "
      require 'base64'
      event.set('password', 'bG9ndXNlcjpidXJnYW5iYW5rXzIwMTc=')
    "
  }
}
output {
  stdout {
    codec => rubydebug
  }
  http {
    http_method => "post"
    url => "http://localhost:8080;"
    format => "message"
    headers => {"Authorization" => "Basic %{password}"}
    content_type => "application/json"
    message => '{"whatever": 1 }'
  }
}
Error: [2017-09-18T11:07:30,235][ERROR][logstash.outputs.http ] [HTTP Output Failure] Encountered non-2xx HTTP code 401 {:response_code=>401, :url=>"http://localhost:8080;", :event=>2017-09-18T07:05:42.797Z test_y HTTP "EJT\ANYCUSTOMER" "" "GET" "/api/v1.0/xxxxxxx/xxxx" responded 200 in 0.0000 ms, :will_retry=>false}
A second, simpler attempt is below, but it does not work either:
input {
  file {
    type => "json"
    path => ["C:/logs/*.json"]
    start_position => "beginning"
    codec => "json"
  }
}
filter {
  mutate {
    remove_field => [ "path" ]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  http {
    http_method => "post"
    url => "http://127.0.0.1:8080"
    headers => ["Authorization", "Basic dXNlcjE6JGFwcjEkNGdTY3dwMkckSlBLOXNVRmJGbzZ4UjhnSUVqYXo2Lg=="]
  }
}
ERROR : [ERROR][logstash.outputs.http ] [HTTP Output Failure] Encountered non-2xx HTTP code 401 {:response_code=>401, :url=>"http://127.0.0.1:8080",
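For reference, a minimal sketch of an alternative approach, assuming the nginx proxy at localhost:8080 simply forwards requests to Elasticsearch's REST API and enforces HTTP basic auth (the credentials below are placeholders): the elasticsearch output accepts user and password options directly, so the credentials do not have to be injected through a ruby filter or an http output.
output {
  elasticsearch {
    # point Logstash at the nginx proxy instead of Elasticsearch itself
    hosts => [ "localhost:8080" ]
    # placeholder credentials; they must match nginx's auth_basic_user_file entry
    user => "loguser"
    password => "changeme"
  }
}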

Related

Create index based on message field - appname

Logstash.conf code:
input {
  stdin {
    type => "stdin-type"
  }
  file {
    type => "json"
    path => [ "C:/prod/*.log", "C:/prod/*/**.log" ]
    start_position => "beginning"
    tags => "prod"
  }
  file {
    type => "json"
    path => [ "C:/dev/*.log", "C:/dev/*/**.log" ]
    start_position => "beginning"
    tags => "dev"
  }
}
filter {
  grok {
    match => {
      "message" => [ "%{JSON:payload_raw} " ]
    }
    pattern_definitions => {
      "JSON" => "{.*$"
    }
  }
  json {
    source => "payload_raw"
    target => "payload"
  }
  mutate {
    remove_field => [ "payload_raw", "message" ]
  }
  date {
    match => [ "[payload][datetime]", "yyyy-MM-dd HH:mm:ss,SSS" ]
    target => "@timestamp"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{tags}-logs"
  }
}
Sample log
{datetime":"2021-08-10 04:11:37,825","servername":"VM-0001","serverip":"(null)","process":"2404","thread":"4","level":"DEBUG","appname":"Dev-Email","page":"Program.cs"}
Given the sample document you shared, your elasticsearch output needs to look like this:
elasticsearch {
  hosts => ["localhost:9200"]
  index => "%{appname}-logs"
}
Also know that index names are not allowed to contain uppercase letters, so Dev-Email will need to be lowercased (using the mutate/lowercase filter) before being used as the index name.
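For illustration, a minimal sketch of that lowercasing step, assuming the field names from the sample document above (copying into @metadata keeps the original appname value intact on the indexed document):
filter {
  mutate {
    # work on a copy so the stored appname keeps its original casing
    copy => { "appname" => "[@metadata][index_prefix]" }
  }
  mutate {
    lowercase => [ "[@metadata][index_prefix]" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][index_prefix]}-logs"
  }
}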

How to parse the field value in logstash?

I have Logstash pulling data from an HTTP API, but there is a field whose value I need to parse from this
"ServiceProvider": "T:ISP | CIR:450Mbps BR:1Gbps | VD:Beq | CID:124"
to this
"ServiceProvider": "T:ISP"
"CIR": "450Mbps"
"BR": "1Gbps"
"VD": "Beq"
"CID": "124"
My config file for Logstash is:
input {
  http_poller {
    urls => {
      "ISP" => {
        method => get
        url => "http://xyz:8080/api"
        headers => {
          Accept => "application/json"
        }
      }
    }
    request_timeout => 60
    tags => "hourly"
    schedule => { cron => "30 * * * *" }
    codec => "json"
    metadata_target => "meta"
  }
}
filter {
  mutate {
    remove_field => [ "[meta][request][auth][user]", "[meta][request][auth][pass]", "[meta][request][headers][Accept]" ]
  }
}
output {
  elasticsearch {
    hosts => ["http://xyz:9100"]
    index => "xyz"
  }
}
Thanks in advance!!
I would leverage the dissect filter and kv filter like this:
filter {
  dissect {
    mapping => {
      "message" => "%{ServiceProvider} | %{[@metadata][kv]}"
    }
  }
  kv {
    source => "[@metadata][kv]"
    field_split => "| "
    value_split => ":"
  }
  mutate {
    remove_field => [ "[meta][request][auth][user]", "[meta][request][auth][pass]", "[meta][request][headers][Accept]" ]
  }
}
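Roughly, for the sample value shown above, those two filters should leave the event with fields along these lines (illustrative output, assuming the raw string arrives in the message field; if it is already in ServiceProvider, point the dissect mapping at that field instead):
"ServiceProvider" => "T:ISP"
            "CIR" => "450Mbps"
             "BR" => "1Gbps"
             "VD" => "Beq"
            "CID" => "124"
The [@metadata][kv] scratch field is not sent to the output, since @metadata fields are excluded by default.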

Logstash: Multiline Log messages transform into single line log message

I am printing the log messages below:
{"timestamp":"15-06-2020 22:12:35","level":"INFO","thread":"http-nio-8080-exec-2","mdc":{"Z-Request-Id":"20200615101234-2c078173-66c2-49ce-93ec-40dfab2a7312","destination":"backendorg"},"logger":"com.AbcHandler","message":"host: localhost, port: 9200, index: zindex and protocol: http","context":"ZPlatform"}
{"timestamp":"15-06-2020 22:12:35","level":"INFO","thread":"http-nio-8080-exec-2","mdc":{"Z-Request-Id":"20200615101234-2c078173-66c2-49ce-93ec-40dfab2a7312","destination":"backendorg"},"logger":"com.AbcHandler","message":"batchNumber: 1 and batchSize: 50","context":"ZPlatform"}
I am parsing the above messages using the multiline codec. Below is my Logstash config file:
input {
  file {
    start_position => "end"
    sincedb_path => "/tmp/sincedb_file"
    codec => multiline {
      pattern => "^Spalanzani"
      negate => true
      what => previous
    }
  }
}
filter {
  if [type] == "app" {
    grok {
      match => [ "message", "%{GREEDYDATA:jsonstring}" ]
    }
    json {
      source => "jsonstring"
      target => "parsedJson"
      remove_field => ["jsonstring"]
    }
    mutate {
      add_field => {
        "frontendDateTime" => "%{[parsedJson][timestamp]}"
        "logMessage" => "%{[parsedJson][message]}"
      }
    }
    mutate {
      remove_field => [ "parsedJson" ]
    }
  }
}
But what I am seeing is that all the above messages are clubbed together into a single event. I don't know why this is happening; it should give me a separate event per log message:
{
          "tags" => [
        [0] "multiline"
    ],
       "message" => "{\"timestamp\":\"15-06-2020 22:12:35\",\"level\":\"INFO\",\"thread\":\"http-nio-8080-exec-2\",\"mdc\":{\"Z-Request-Id\":\"20200615101234-2c078173-66c2-49ce-93ec-40dfab2a7312\",\"destination\":\"backendorg\"},\"logger\":\"com.AbcHandler\",\"message\":\"host: localhost, port: 9200, index: zindex and protocol: http\",\"context\":\"ZPlatform\"}\n{\"timestamp\":\"15-06-2020 22:12:35\",\"level\":\"INFO\",\"thread\":\"http-nio-8080-exec-2\",\"mdc\":{\"Z-Request-Id\":\"20200615101234-2c078173-66c2-49ce-93ec-40dfab2a7312\",\"destination\":\"backendorg\"},\"logger\":\"com.AbcHandler\",\"message\":\"batchNumber: 1 and batchSize: 50\",\"context\":\"ZPlatform\"}",
    "logMessage" => "search string: ",
    "@timestamp" => 2020-06-15T16:42:38.256Z
}
Could someone help me?
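One possible explanation: with pattern => "^Spalanzani" and negate => true, every line that does not start with "Spalanzani" — which is every line — is appended to the previous event, so everything gets clubbed into one message. Since each log entry here is already a complete single-line JSON document, a plain json codec may be enough. A minimal sketch, with a placeholder path:
input {
  file {
    path => ["/var/log/app/app.log"]   # placeholder path
    start_position => "end"
    sincedb_path => "/tmp/sincedb_file"
    # each line is one complete JSON document, so no multiline stitching is required
    codec => "json"
  }
}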

Logstash doesn't report all the events

I can see that some events are missing when reporting logs to Elasticsearch. For example, if I send 5 log events, only 3 or 4 are reported.
Basically I am using Logstash 7.4 to read my log messages and store the information in Elasticsearch 7.4. Below is my Logstash configuration:
input {
  file {
    type => "web"
    path => ["/Users/a0053/Downloads/logs/**/*-web.log"]
    start_position => "beginning"
    sincedb_path => "/tmp/sincedb_file"
    codec => multiline {
      pattern => "^(%{MONTHDAY}-%{MONTHNUM}-%{YEAR} %{TIME}) "
      negate => true
      what => previous
    }
  }
}
filter {
  if [type] == "web" {
    grok {
      match => [ "message", "(?<frontendDateTime>%{MONTHDAY}-%{MONTHNUM}-%{YEAR} %{TIME})%{SPACE}(\[%{DATA:thread}\])?( )?%{LOGLEVEL:level}%{SPACE}%{USERNAME:zhost}%{SPACE}%{JAVAFILE:javaClass} %{USERNAME:orgId} (?<loginId>[\w.+=:-]+#[0-9A-Za-z][0-9A-Za-z-]{0,62}(?:[.](?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*) %{GREEDYDATA:jsonstring}" ]
    }
    json {
      source => "jsonstring"
      target => "parsedJson"
      remove_field => ["jsonstring"]
    }
    mutate {
      add_field => {
        "actionType" => "%{[parsedJson][actionType]}"
        "errorMessage" => "%{[parsedJson][errorMessage]}"
        "actionName" => "%{[parsedJson][actionName]}"
        "Payload" => "%{[parsedJson][Payload]}"
        "pageInfo" => "%{[parsedJson][pageInfo]}"
        "browserInfo" => "%{[parsedJson][browserInfo]}"
        "dateTime" => "%{[parsedJson][dateTime]}"
      }
    }
  }
}
output {
  if "_grokparsefailure" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "grokparsefailure-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "zindex"
    }
  }
  stdout { codec => rubydebug }
}
As new logs keep being written to the log files, I can see a difference in the log counts.
Any suggestions would be appreciated.
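One possible contributor, offered as an assumption rather than a confirmed diagnosis: with the multiline codec, the most recent event in a file is held back until a line arrives that starts a new event, so the trailing event can appear to be missing while the file is still being written. A minimal sketch of the same input with the codec's auto_flush_interval option set:
input {
  file {
    type => "web"
    path => ["/Users/a0053/Downloads/logs/**/*-web.log"]
    start_position => "beginning"
    sincedb_path => "/tmp/sincedb_file"
    codec => multiline {
      pattern => "^(%{MONTHDAY}-%{MONTHNUM}-%{YEAR} %{TIME}) "
      negate => true
      what => previous
      # flush a pending event if no new line arrives within 5 seconds
      auto_flush_interval => 5
    }
  }
}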

How can I index a JSON file using Logstash?

I am trying to index my JSON file, shown below. I think I have to write a grok expression, but I could not manage it. Can you help me?
{"level":"Information","ClientIP":"10.201.21.188","Test":"10.210.21.188"}
{"level":"Information","ClientIP":"10.202.21.187","Test":"10.220.21.188"}
{"level":"Information","ClientIP":"10.203.21.186","Test":"10.230.21.188"}
{"level":"Information","ClientIP":"10.204.21.185","Test":"10.240.21.188"}
My logstash.conf is below:
input {
  file {
    type => "json"
    path => ["C:/logs/test-20170933.json"]
    start_position => "beginning"
  }
}
filter {
  grok {
    match => [ "message", "%{WORD:level} I HAVE TO WRITE OTHER ELEMENTS BUT HOW????" ]
  }
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
I guess that we need a grok expression to achieve that, but I am also open to other creative solutions.
You don't need to grok anything, your file input simply needs a JSON codec and you're good to go:
input {
  file {
    type => "json"
    path => ["C:/logs/test-20170933.json"]
    start_position => "beginning"
    codec => "json"    # <-- add this
  }
}
filter {
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
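With the json codec in place, the rubydebug output for one of the sample lines should look roughly like this (illustrative; @timestamp and the other metadata fields will differ on your machine):
{
        "level" => "Information",
     "ClientIP" => "10.201.21.188",
         "Test" => "10.210.21.188",
         "type" => "json",
     "@version" => "1",
   "@timestamp" => 2017-09-20T09:00:00.000Z
}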
