Logback: how to log JSON structure for Splunk and default format for app - Spring

I have a Spring Boot app with logback running on Kubernetes that structures the logs in a JSON format with other metadata.
The log output on the Kubernetes pod is in the exact same format.
Is it possible to have an appender that logs in the default format on the Kubernetes pod while continuing to log JSON for Splunk?
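
One way to get this is to attach two appenders to the root logger, each with its own encoder. A minimal logback.xml sketch, assuming Splunk picks the JSON up from a file (e.g., via a forwarder) and that the logstash-logback-encoder library supplies the JSON encoder; the file paths and appender names are placeholders:

<configuration>
  <!-- Human-readable pattern output for kubectl logs / the pod console -->
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>

  <!-- One JSON object per line, written to the file Splunk monitors -->
  <appender name="SPLUNK_JSON" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/app/app.json.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
      <fileNamePattern>/var/log/app/app.json.%d{yyyy-MM-dd}.log</fileNamePattern>
      <maxHistory>7</maxHistory>
    </rollingPolicy>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="CONSOLE"/>
    <appender-ref ref="SPLUNK_JSON"/>
  </root>
</configuration>

Because the root logger references both appenders, every event is written twice, once in each format.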

Related

Is there any Elasticsearch appender for directly sending (storing) Spring Boot application logs to Elasticsearch without using the ELK stack?

We are planning to store our (Spring Boot) application logs in Elasticsearch. I am aware of the ELK stack, which uses Filebeat + Logstash to collect and process the logs.
What is desired: an appender in logback.xml that sends the logs directly to Elasticsearch. The basic idea is an appender like the file appenders, with the difference that the target for storing logs is Elasticsearch. At the same time, we want to do it asynchronously. FYI, we are using slf4j with the logback implementation for logging.
More specifically: we want to remove the intermediaries, Logstash or Beats, as they need more infrastructure and may bring unwanted overhead. And sending logs to Elasticsearch asynchronously would be really great (so that the application does not suffer latency due to logging).
What I have already tried:
Sending Spring Boot logs directly to Logstash. But it seems of little use, since it internally uses file appenders and the logs are then sent to Logstash.
Is there any such appender available? Or maybe there is some workaround.
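
Logback itself ships no Elasticsearch appender, but community ones exist. A sketch assuming the internetitem logback-elasticsearch-appender, which batches events and sends them from a background thread (so logging calls do not block); property names follow its README and should be verified against the version you use:

<appender name="ELASTIC" class="com.internetitem.logback.elasticsearch.ElasticsearchAppender">
  <!-- placeholder host; the appender posts batches to the _bulk API -->
  <url>http://elasticsearch-host:9200/_bulk</url>
  <index>app-logs-%date{yyyy-MM-dd}</index>
  <errorsToStderr>true</errorsToStderr>
</appender>

<root level="INFO">
  <appender-ref ref="ELASTIC"/>
</root>

The trade-off versus Filebeat/Logstash is durability: if the application dies with events still buffered in memory, those events are lost.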

Spring Actuator Metrics generate logs

I'm trying to get micrometer metrics data into Splunk. Each metric endpoint gives the current value of a metric, so I would need Splunk to send an HTTP request to my application periodically, or I can write the metric values to a log file periodically.
So how do I get my application to write the metric values to logs?
If you are on Spring Boot 2.x and Micrometer 1.1.0+, you can create a bean for a special logging registry that periodically (every 1 minute) writes metrics via the logging system; see https://github.com/micrometer-metrics/micrometer/issues/605:
@Bean
LoggingMeterRegistry loggingMeterRegistry() {
    return new LoggingMeterRegistry();
}
This is by far the easiest way to log everything via the logging system.
Another alternative is to create a scheduled job that runs a method on a bean with an injected meter registry, iterates over all the metrics (possibly filtering out the ones you don't need), and prepares the log in your format.
If you think about it, this is exactly what the metrics endpoint of the Spring Boot Actuator does, except that it returns the data via HTTP instead of writing it to a log.
Here is an up-to-date implementation of the relevant endpoint from the Spring Boot Actuator source.

Stackdriver - All logs are mapped as INFO

All my ERROR/WARNING logs are mapped as INFO in Stackdriver.
I'm using logback and I'm running my application in a Kubernetes cluster.
How can I set up my logback for Stackdriver?
Thanks
The Stackdriver logging agent configuration for Kubernetes defaults to INFO for any logs written to the container's stdout and ERROR for logs written to stderr. If you want finer-grained control over severity, you can configure Spring to log as single-line JSON (e.g., via JsonLayout [1]) and let the logging agent pick up the severity from the JSON object (see https://cloud.google.com/logging/docs/agent/configuration#process-payload).
[1] By default, JsonLayout will use "level" for the log level, while the Stackdriver logging agent recognizes "severity", so you may have to override addCustomDataToJsonMap.
See also GKE & Stackdriver: Java logback logging format?
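
A sketch of the logback side, assuming the logback-contrib JsonLayout wrapped in a LayoutWrappingEncoder; your.package.StackdriverJsonLayout is a hypothetical subclass that overrides addCustomDataToJsonMap to emit "severity" instead of "level", as the footnote above describes:

<appender name="CONSOLE_JSON" class="ch.qos.logback.core.ConsoleAppender">
  <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
    <!-- hypothetical subclass of ch.qos.logback.contrib.json.classic.JsonLayout -->
    <layout class="your.package.StackdriverJsonLayout">
      <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter"/>
      <!-- one JSON object per line, as the logging agent expects -->
      <appendLineSeparator>true</appendLineSeparator>
    </layout>
  </encoder>
</appender>

<root level="INFO">
  <appender-ref ref="CONSOLE_JSON"/>
</root>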

Integration of fluentd with Java application

I am working on integrating a Spring Boot application with fluentd.
When I use FluentdLogger in the application and a forward source in the fluentd config file, I am able to log the entries to a log file.
So how do I provide a log level for the log statements in the Spring Boot application? By default, every statement is printed to the file with the tag.
Do I have to use in_tail with a logging framework that logs to a file, so that fluentd tails that file into another file?
Or is there a way, or an appender, in logback or other Java logging frameworks?
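
There is an appender route: keep slf4j/logback in the application and point an appender at fluentd's forward input, so the usual log levels apply and logback filters by level before anything leaves the JVM. A sketch assuming the community logback-more-appenders library (class and property names per its README; verify against your version):

<appender name="FLUENT" class="ch.qos.logback.more.appenders.DataFluentAppender">
  <tag>myapp</tag>
  <label>logback</label>
  <!-- fluentd forward input; host and port are placeholders -->
  <remoteHost>localhost</remoteHost>
  <port>24224</port>
</appender>

<!-- level filtering happens here, in logback, not in fluentd -->
<root level="INFO">
  <appender-ref ref="FLUENT"/>
</root>

With this setup no in_tail source is needed; fluentd receives the events directly over the forward protocol.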

Logstash - prevent losing log data

I have log data from a Tomcat server (logback) and would like to analyze it.
So, this is my plan:
logback log data -> Logstash -> Elasticsearch -> run Elasticsearch queries.
In some other architectures there is Redis in front of Logstash, because logback has no buffer.
If I want to do this in real time, is Redis required?
Or is it not needed because all the data is already stored in Elasticsearch? Then why are they using Redis for it?
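
Redis in that architecture is a durability buffer between the shipper and Logstash, not a hard requirement. If a bounded in-memory buffer is acceptable, the logstash-logback-encoder library provides an async TCP appender with its own ring buffer; a sketch (host and port are placeholders, and ringBufferSize must be a power of two):

<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <!-- Logstash tcp input -->
  <destination>logstash-host:5000</destination>
  <!-- events held in memory while Logstash is slow or briefly down -->
  <ringBufferSize>16384</ringBufferSize>
  <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
</appender>

<root level="INFO">
  <appender-ref ref="LOGSTASH"/>
</root>

If Logstash stays down longer than the buffer can absorb, events are dropped; closing exactly that gap is why some architectures put Redis (or Kafka) in front.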
