I have this configuration in my logback.xml in a Spring web application (NOT Spring Boot).
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="Console" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
      <layout class="ch.qos.logback.contrib.json.classic.JsonLayout">
        <timestampFormat>yyyy-MM-dd'T'HH:mm:ss.SSSX</timestampFormat>
        <timestampFormatTimezoneId>Etc/UTC</timestampFormatTimezoneId>
        <jsonFormatter class="ch.qos.logback.contrib.jackson.JacksonJsonFormatter">
          <prettyPrint>true</prettyPrint>
        </jsonFormatter>
      </layout>
      <customFields>{"appname":"foobar"}</customFields>
    </encoder>
  </appender>
  <!-- LOG everything at INFO level -->
  <root level="INFO">
    <appender-ref ref="Console" />
  </root>
</configuration>
The JSON layout works fine, but custom fields such as "appname": "foobar" are not printed:
{
"timestamp" : "2020-06-10T14:55:25.534Z",
"level" : "INFO",
"thread" : "Catalina-utility-1",
"logger" : "org.springframework.web.servlet.DispatcherServlet",
"message" : "FrameworkServlet 'dispatcher': initialization completed in 72 ms",
"context" : "default"
}
What am I doing wrong?
SOLUTION
I was using the wrong libraries for my needs:
logback-jackson
logback-json-classic
Because I need to process logs through Logstash, I corrected my configuration like this:
pom.xml
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.2.3</version>
</dependency>
<dependency>
<groupId>net.logstash.logback</groupId>
<artifactId>logstash-logback-encoder</artifactId>
<version>6.4</version>
</dependency>
logback.xml
<appender name="Console" class="ch.qos.logback.core.ConsoleAppender">
<encoder class="net.logstash.logback.encoder.LogstashEncoder">
<customFields>{"customer":"X", "appname":"Y", "environment":"dev"}</customFields>
</encoder>
</appender>
and now it works fine.
I just stumbled upon this question because I had the same problem, and I found a solution with logback-jackson and logback-json-classic.
Option 1: Per-Thread via Mapped Diagnostic Context (MDC)
SLF4J's Mapped Diagnostic Context is a per-thread key-value store that we can use to write custom structured data to the log output.
MDC.put("customKey", "customValue");
Logback's JsonLayout will automatically print this value under a special mdc JSON object without any further configuration.
{ [...], "mdc": {"customKey": "customValue"}}
Note that the MDC is constructed per thread and if it is empty, no mdc field is printed to the log output.
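For illustration, here is a minimal sketch of how a value ends up in that mdc object (the class and key names are made up for the example):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class OrderService {

    private static final Logger log = LoggerFactory.getLogger(OrderService.class);

    public void processOrder(String orderId) {
        // Value is stored for the current thread only; JsonLayout renders it under "mdc"
        MDC.put("orderId", orderId);
        try {
            log.info("processing order");
        } finally {
            // Clean up so the value does not leak to the next task on a pooled thread
            MDC.remove("orderId");
        }
    }
}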
Option 2: Global (for all threads)
If you want custom fields to appear at the root of the JSON output, you need to create a custom, but simple, layout class that extends JsonLayout. JsonLayout provides an addCustomDataToJsonMap method we can override.
package com.mypackage;

import java.util.Map;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.contrib.json.classic.JsonLayout;

public class CustomJsonLayout extends JsonLayout {

    @Override
    protected void addCustomDataToJsonMap(Map<String, Object> map, ILoggingEvent event) {
        map.put("customKey", "customValue");
    }
}
Now, you just need to tell Logback to use CustomJsonLayout instead of JsonLayout in your logback.xml file and keep the rest the same.
<layout class="com.mypackage.CustomJsonLayout">
...
</layout>
Now, any log message will have the following output:
{ ..., "customKey": "customValue"}
Related
Based on this guide about logging the AWSRequestId, I can add %X{AWSRequestId} to my logback.xml to print the request id in my Lambda function.
In the example, the method that adds this key to the MDC is populateMappingDiagnosticContextValues of the class MicronautRequestHandler. But since my Lambda function has two endpoints, I created it as an Application type, and it says it uses MicronautLambdaHandler as the handler class.
What I did was create a custom handler that extends MicronautLambdaHandler and add the populateMappingDiagnosticContextValues method.
@Override
public AwsProxyResponse handleRequest(AwsProxyRequest input, Context context) {
    if (context != null) {
        populateMappingDiagnosticContextValues(context);
    }
    return handler.proxy(input, context);
}
logback.xml
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <withJansi>false</withJansi>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss} %X{AWSRequestId} %-5p %c{1} - %m%n</pattern>
    </encoder>
  </appender>
  <root level="info">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
In the Lambda setup I pointed the handler at my custom handler and the app is working.
Except that the logs are not showing the AWSRequestId.
Does this functionality only work with the MicronautRequestHandler class and not with MicronautLambdaHandler?
I also tried the AWS guide here that uses Log4j2 for logging the request id, but Log4j doesn't work when building the GraalVM native image, which is why I stuck with Logback.
If you copied populateMappingDiagnosticContextValues and mdcput into your handler (I assume you did), it should work without any additional changes.
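For completeness, a minimal version of those two methods inside your custom handler could look like the sketch below (this is an assumption based on the guide, not the exact Micronaut code; the MDC key must match the %X{AWSRequestId} pattern in logback.xml):

import com.amazonaws.services.lambda.runtime.Context;
import org.slf4j.MDC;

protected void populateMappingDiagnosticContextValues(Context context) {
    // Key read by %X{AWSRequestId} in the pattern layout
    mdcput("AWSRequestId", context.getAwsRequestId());
    mdcput("AWSFunctionName", context.getFunctionName());
}

protected void mdcput(String key, String value) {
    if (value != null) {
        MDC.put(key, value);
    }
}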
I am trying to configure Log4j2 in Spring Boot. I have already removed (excluded) the Logback dependency from pom.xml. I am using this XML, named log4j2.xml, under the resources folder:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="DEBUG">
  <Appenders>
    <Console name="LogToConsole" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
    </Console>
    <File name="LogToFile" fileName="logs/app.log">
      <PatternLayout>
        <Pattern>%d %p %c{1.} [%t] %m%n</Pattern>
      </PatternLayout>
    </File>
  </Appenders>
  <Loggers>
    <Logger name="com.ashish" level="debug" additivity="false">
      <AppenderRef ref="LogToFile"/>
      <AppenderRef ref="LogToConsole"/>
    </Logger>
    <Logger name="org.springframework.boot" level="error" additivity="false">
      <AppenderRef ref="LogToConsole"/>
    </Logger>
    <Root level="error">
      <AppenderRef ref="LogToFile"/>
      <AppenderRef ref="LogToConsole"/>
    </Root>
  </Loggers>
</Configuration>
This is my controller class.
package com.ashish;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.stereotype.Controller;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.GetMapping;

import java.util.Arrays;
import java.util.List;

@Controller
public class HelloController {

    private static final Logger logger = LogManager.getLogger(HelloController.class);

    private List<Integer> num = Arrays.asList(1, 2, 3, 4, 5);

    @GetMapping("/")
    public String main(Model model) {
        if (logger.isDebugEnabled()) {
            logger.debug("Hello from Log4j 2 - num : {}", num);
        }
        logger.debug("Hello from Log4j 2 - num : {}", () -> num);
        model.addAttribute("tasks", num);
        return "welcome";
    }

    private int getNum() {
        return 100;
    }
}
Am I missing anything here? I tried to set it using application.properties too, using the latest version of Log4j2, but the log file is still not getting created. When I run the application I can't see any log file being dynamically created at the path specified in the XML.
First, you have status="DEBUG" specified in your configuration, so you will see Log4j configure itself on the console (or wherever System.out is being routed). If you do not, then you aren't really using Log4j.
If you do see the output then check the debug lines. I have a suspicion your log file is either not getting created due to a permissions problem or it isn't being written where you expect it.
Your configuration specifies a relative directory named "logs". Whatever directory is the working directory when the app is started should contain your logs directory. Frequently on Linux that will end up being "/". You almost certainly won't have permission to create a logs directory there so configuration will fail.
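One way to rule that out is to point the file appender at an absolute location, or at a path you control via a system property. A sketch (the LOG_DIR property name and the /tmp fallback are assumptions):

<File name="LogToFile" fileName="${sys:LOG_DIR:-/tmp/myapp}/app.log">
  <PatternLayout>
    <Pattern>%d %p %c{1.} [%t] %m%n</Pattern>
  </PatternLayout>
</File>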
I can't comment on the content of the log4j2 config file itself; it probably makes sense to print to the console first and make sure that logging is indeed driven by Log4j2.
However, I'll refer to the beginning of the question:
I am trying to configure Log4j2 in Spring Boot. I have already removed (excluded) the Logback dependency from pom.xml.
You don't show your pom.xml, but in general, in order to switch Spring Boot to Log4j2 you should:
"Exclude" the default logging mechanism of Spring Boot:
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-logging</artifactId>
    </exclusion>
  </exclusions>
</dependency>
Add the Log4j2 starter, which will in turn (transitively) pull in Log4j2 dependencies in versions compatible with your Spring Boot version:
<dependency>
  <groupId>org.springframework.boot</groupId>
  <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
A quick search turned up this tutorial, which contains all the steps, including an end-to-end example of such an integration.
I am using Spring Boot (1.5.4). I wish to send (logback) logs from my services to Logstash via RabbitMQ in a JSON format rather than plain text. This will save me from having to set up a filter on the Logstash side so that formatting can be controlled on the application side (using a Logback Encoder).
I am aware of the Spring logback AMQP Appender for RabbitMQ, org.springframework.amqp.rabbit.logback.AmqpAppender, however this uses a Layout (plain text) rather than formatted JSON. I would like to use the Logstash Encoder net.logstash.logback.encoder.LogstashEncoder. I would like to use the Appender with the Encoder (I want it all :)).
I first extended the AMQPAppender to add the Encoder like so:-
package nz.govt.mpi.util;

import org.springframework.amqp.core.Message;
import org.springframework.amqp.rabbit.logback.AmqpAppender;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.encoder.Encoder;
import lombok.Getter;
import lombok.Setter;

public class AmqpLogbackAppender extends AmqpAppender {

    @Getter
    @Setter
    private Encoder<ILoggingEvent> encoder;

    /**
     * We remove the default message layout and replace it with the JSON {@link Encoder}.
     */
    @Override
    public Message postProcessMessageBeforeSend(Message message, Event event) {
        return new Message(this.encoder.encode(event.getEvent()), message.getMessageProperties());
    }

    @Override
    public void start() {
        super.start();
        encoder.setContext(getContext());
        if (!encoder.isStarted()) {
            encoder.start();
        }
    }

    @Override
    public void stop() {
        super.stop();
        encoder.stop();
    }
}
And then I set up the logback-spring.xml configuration file like so:-
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <springProperty scope="context" name="rabbitMQHost" source="logback.amqp.host" defaultValue="localhost"/>
  <springProperty scope="context" name="rabbitMQPort" source="logback.amqp.port" defaultValue="5672"/>
  <springProperty scope="context" name="rabbitMQUsername" source="spring.rabbitmq.username" />
  <springProperty scope="context" name="rabbitMQPassword" source="spring.rabbitmq.password" />
  <springProperty scope="context" name="rabbitMQExchangeName" source="logback.amqp.exchange.name" defaultValue="mpi.tradedev"/>
  <springProperty scope="context" name="rabbitMQRoutingKey" source="logback.amqp.routing.key" defaultValue="mpi.tradedev.logging"/>
  <springProperty scope="context" name="serviceName" source="spring.application.name" />
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread, %X{X-B3-TraceId:-},%X{X-B3-SpanId:-}] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <appender name="AMQP" class="nz.govt.mpi.util.AmqpLogbackAppender">
    <!-- layout is required but ignored as using the encoder for the AMQP message body -->
    <layout><pattern><![CDATA[ %level ]]></pattern></layout>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder">
      <customFields>{"serviceName": "${serviceName}"}</customFields>
    </encoder>
    <!-- RabbitMQ connection -->
    <host>${rabbitMQHost}</host>
    <port>${rabbitMQPort}</port>
    <username>${rabbitMQUsername}</username>
    <password>${rabbitMQPassword}</password>
    <exchangeName>${rabbitMQExchangeName}</exchangeName>
    <routingKeyPattern>${rabbitMQRoutingKey}</routingKeyPattern>
    <declareExchange>true</declareExchange>
    <exchangeType>topic</exchangeType>
    <generateId>true</generateId>
    <charset>UTF-8</charset>
    <durable>true</durable>
    <deliveryMode>PERSISTENT</deliveryMode>
  </appender>
  <root level="info">
    <appender-ref ref="STDOUT" />
    <appender-ref ref="AMQP" />
  </root>
</configuration>
I lastly added the required properties to the application.properties file like so:-
spring.application.name=my-app
logback.amqp.host=localhost
logback.amqp.port=5672
logback.amqp.exchange.name=ex_logstash
logback.amqp.routing.key=my-app.logging
spring.rabbitmq.username=rquser
spring.rabbitmq.password=rqpass
I also had to set up the necessary user account in RabbitMQ. When the application runs it creates the topic exchange (ex_logstash), but you must create a queue (qu_logstash) that is bound to that exchange with a matching routing key (my-app.*).
You then create a logstash configuration to match the queue name.
ex_logstash -> qu_logstash
The logstash.json configuration file example:-
input {
  rabbitmq {
    host => "localhost"
    queue => "qu_logstash"
    durable => true
    exchange => "ex_logstash"
    key => "my-app.*"
    threads => 10
    type => "topic"
    prefetch_count => 200
    port => 5672
    user => "rquser"
    password => "rqpass"
  }
}
On the application side you will need the required dependencies in your pom.xml. These are the ones I am using that cover the required classes YMMV:-
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-starter-stream-rabbit</artifactId>
</dependency>
<dependency>
  <groupId>net.logstash.logback</groupId>
  <artifactId>logstash-logback-encoder</artifactId>
  <version>4.9</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.2.3</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-core</artifactId>
  <version>1.2.3</version>
</dependency>
<dependency>
  <groupId>org.projectlombok</groupId>
  <artifactId>lombok</artifactId>
  <scope>provided</scope>
</dependency>
I have a scenario where I send some sensitive data in the ResponseBody of a Spring controller. Spring logs the entire response body in DEBUG mode, where I can see the sensitive data as well. Now if I configure my app to print only INFO logs or higher, this won't be displayed, but that is not a foolproof method, since it's possible that DEBUG mode could be turned on accidentally.
Is there anyway I can disable certain fields in the response body from being logged by Spring?
Thanks
My suggestion would be that you implement a custom converter for logback (if that is your logging library)
http://logback.qos.ch/manual/layouts.html#customConversionSpecifier
It enables you to convert your logging data and blur out any data that should be removed from the message.
From the documentation it is pretty straightforward:
import ch.qos.logback.classic.pattern.ClassicConverter;
import ch.qos.logback.classic.spi.ILoggingEvent;

public class MySampleConverter extends ClassicConverter {

    long start = System.nanoTime();

    @Override
    public String convert(ILoggingEvent event) {
        long nowInNanos = System.nanoTime();
        return Long.toString(nowInNanos - start);
    }
}
And the config
<configuration>
  <conversionRule conversionWord="nanos"
                  converterClass="chapters.layouts.MySampleConverter" />
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%-6nanos [%thread] - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="STDOUT" />
  </root>
</configuration>
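Applied to the masking use case, the converter can rewrite the formatted message before it is written out. A sketch (the regex and conversion word are assumptions, adjust them to your payload):

import ch.qos.logback.classic.pattern.ClassicConverter;
import ch.qos.logback.classic.spi.ILoggingEvent;

public class MaskingConverter extends ClassicConverter {

    @Override
    public String convert(ILoggingEvent event) {
        // Replace anything that looks like a 13-16 digit card number with a fixed mask
        return event.getFormattedMessage().replaceAll("\\b\\d{13,16}\\b", "****");
    }
}

Register it with a conversionRule (e.g. conversionWord="maskedMsg") and use %maskedMsg instead of %msg in the pattern.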
If you're using Log4j you can implement a custom filter which would mask the data per your layout
http://vozis.blogspot.sg/2012/02/log4j-filter-to-mask-payment-card.html
Edit
The final, easier solution turned out to be to override toString of the return object, since Spring uses toString to log the return message.
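A minimal sketch of that approach (the class and field names are made up):

public class PaymentResponse {

    private String accountNumber;
    private String status;

    @Override
    public String toString() {
        // Mask the sensitive field so Spring's DEBUG logging never prints the raw value
        return "PaymentResponse[accountNumber=****, status=" + status + "]";
    }
}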
logQuery is called in the prepareStatementAndSetParameters method of the SQLInsertClause class:
protected void logQuery(Logger logger, String queryString, Collection<Object> parameters) {
    String normalizedQuery = queryString.replace('\n', ' ');
    MDC.put(QueryBase.MDC_QUERY, normalizedQuery);
    MDC.put(QueryBase.MDC_PARAMETERS, String.valueOf(parameters));
    if (logger.isDebugEnabled()) {
        logger.debug(normalizedQuery);
    }
}
How can I set the debug level for this logger?
That logger is from the SLF4J API. Depending on the logging implementation you have behind the API, you use the facilities of that underlying implementation.
For instance, we use Logback Classic (dependency ch.qos.logback:logback-classic), and I can explicitly override which configuration file to use with -Dlogback.configurationFile=devel-logback.xml in the JVM parameters. The default mechanism is documented here. My file looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date %level [%.60thread] %logger{1} %msg%n</pattern>
    </encoder>
  </appender>
  <logger name="com.mysema.query.jpa.impl.JPAQuery" level="DEBUG"/>
  <!-- more loggers -->
  <root level="DEBUG">
    <appender-ref ref="CONSOLE"/>
  </root>
</configuration>
Also, adding -Dlogback.debug=true to the JVM arguments adds some debug output when Logback is being initialized.
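If you need to change a level at runtime instead of via the configuration file, you can also do it programmatically through the Logback implementation class. A sketch, assuming logback-classic is on the classpath:

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import org.slf4j.LoggerFactory;

// Cast the SLF4J logger to Logback's Logger to gain access to setLevel
Logger queryLogger = (Logger) LoggerFactory.getLogger("com.mysema.query.jpa.impl.JPAQuery");
queryLogger.setLevel(Level.DEBUG);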