log4j2 monitorInterval springboot not working - spring-boot

I'm trying to get hot reloading of my logging level to work. The monitorInterval attribute should do the trick, but for some reason it doesn't work.
My log4j2.xml file looks like this:
<Configuration monitorInterval="10">
    <Appenders>
        <Console name="STDOUT" target="SYSTEM_OUT">
            <PatternLayout
                pattern="%d{ISO8601} [%-12.-12t] %-5p [%12.12X{CorrelationId}] [%-30.-30X{Path}] %logger{36}:%L - %msg%n"/>
        </Console>
        <File name="anywhere" fileName="anywhere.log" append="false">
            <PatternLayout>
                <Pattern>%d{ISO8601} [%-12.-12t] %-5p [%12.12X{CorrelationId}] [%-30.-30X{Path}] %logger{36}:%L - %msg%n</Pattern>
            </PatternLayout>
        </File>
    </Appenders>
    <Loggers>
        <Logger name="com.cetrea" level="info"/>
        <Root level="warn">
            <AppenderRef ref="anywhere"/>
            <AppenderRef ref="STDOUT"/>
        </Root>
    </Loggers>
</Configuration>
I'm testing it with my REST API; when I hit this route, it should print out just the LOG.info line, which it does.
private static final Logger LOG = LogManager.getLogger(TokenController.class);

@RequestMapping(value = "/sensitive-data/{token}", method = RequestMethod.GET, produces = "text/plain;charset=UTF-8")
public ResponseEntity getData(@PathVariable("token") String token) {
    if (tokenMap.containsKey(token)) {
        return ResponseEntity.ok(tokenMap.get(token));
    } else {
        Timestamp timestamp = new Timestamp(System.currentTimeMillis());
        LOG.info("Hit with wrong or expired token at " + timestamp);
        LOG.debug("debug thing");
        return new ResponseEntity("Token not found, or has expired", HttpStatus.NOT_FOUND);
    }
}
Now if I change the level to debug, I would expect it to also print the LOG.debug line, but it doesn't. The change doesn't take effect until I restart the program, instead of hot reloading 10 seconds later.

As it turns out, the build packages the log4j2.xml file and Log4j2 reads from that copy, so the file I was editing was never read. I added
-Dlog4j.configurationFile="Path to the actual file"
to the runtime settings and then it worked.
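If editing JVM arguments isn't convenient, a hedged alternative is to set the same system property programmatically before Spring Boot (and therefore Log4j2) starts; the class name and path below are placeholders, and Spring Boot's own logging.config property can serve the same purpose.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        // Placeholder path: point this at the external log4j2.xml you actually edit,
        // not the copy packaged into the jar. Must run before logging initializes.
        System.setProperty("log4j.configurationFile", "/path/to/log4j2.xml");
        SpringApplication.run(Application.class, args);
    }
}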

Related

How to load application.properties into Log4j2 LogEventPatternConverter class?

I am working on a task where I want to mask sensitive data using the Log4j2 LogEventPatternConverter class.
@Plugin(name = "SensitiveDataLog", category = "Converter")
@ConverterKeys({"sense"})
public class SensitiveDataLog extends LogEventPatternConverter {

    // SSN_PATTERN is defined elsewhere in the class (not shown in the question).

    @Value("${ssn}")
    private String ssn;

    public SensitiveDataLog(String name, String style) {
        super(name, style);
    }

    public static SensitiveDataLog newInstance(String[] options) {
        return new SensitiveDataLog("sense", "sense");
    }

    @Override
    public void format(LogEvent logEvent, StringBuilder outputMsg) {
        String message = logEvent.getMessage().getFormattedMessage();
        Matcher matcher = SSN_PATTERN.matcher(message);
        if (matcher.find()) {
            String maskedMessage = matcher.replaceAll("***-**-****");
            outputMsg.append(maskedMessage);
        } else {
            outputMsg.append(message);
        }
    }
}
Suppose I want to keep the pattern in application.properties. The problem is that we cannot load the property value ssn; it is always null.
Here is my log4j2.xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="info" monitorInterval="30"
               packages="com.virtusa.xlab.fw.logging.component"
               xmlns="http://logging.apache.org/log4j/2.0/config">
    <Properties>
        <Property name="basePath">logs/log4j2</Property>
    </Properties>
    <Appenders>
        <!-- File Appender -->
        <RollingFile name="FILE"
                     fileName="${basePath}/logfile.log"
                     filePattern="${basePath}/logfile.%d{yyyy-MM-dd}-%i.log"
                     append="true">
            <PatternLayout
                pattern="%-5p | %d{yyyy-MM-dd HH:mm:ss} | [%t] %C{2} (%F:%L) - %sense%n" />
            <Policies>
                <SizeBasedTriggeringPolicy size="1 KB" />
            </Policies>
            <DefaultRolloverStrategy max="4" />
        </RollingFile>
        <!-- Console Appender -->
        <Console name="STDOUT" target="SYSTEM_OUT">
            <PatternLayout
                pattern="%-5p | %d{yyyy-MM-dd HH:mm:ss} | [%t] %C{2} (%F:%L) - %sense%n" />
        </Console>
    </Appenders>
    <Loggers>
        <Logger name="com.virtusa.xlab.fw" level="info" />
        <Root level="info">
            <AppenderRef ref="STDOUT" />
            <AppenderRef ref="FILE" />
        </Root>
    </Loggers>
</Configuration>
Can anyone help me out here?
Thanks.
The problem is that SensitiveDataLog is created by Log4j2 via the static method newInstance(), not by Spring, so the ssn field is not injected at that moment. What you can do is initialize the field later, e.g. when the Spring context is refreshed.
Here is my snippet:
private static XmlMaskPatternConverter INSTANCE = new XmlMaskPatternConverter();

public XmlMaskPatternConverter() {
    super(NAME, NAME);
}

public static XmlMaskPatternConverter newInstance() {
    return INSTANCE;
}
Now you can call the static method newInstance() somewhere in your Spring configuration (I do it in a @Bean method) and set the ssn value there. Of course, you need to create a setter for this field.
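For reference, a minimal sketch of that wiring, assuming a setSsn(..) setter has been added to the converter (the configuration class, bean method, and property names are illustrative):
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoggingMaskConfig {

    // Illustrative sketch: Log4j2 creates the converter outside the Spring context,
    // so this bean pushes the Spring-managed property into the shared instance.
    @Bean
    public XmlMaskPatternConverter maskPatternConverter(@Value("${ssn}") String ssn) {
        XmlMaskPatternConverter converter = XmlMaskPatternConverter.newInstance();
        converter.setSsn(ssn); // setter you add yourself, as noted above
        return converter;
    }
}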
P.S. Hope it helps. I faced this problem too, so I decided to leave my solution here. My first post on SO, btw.

Spring boot using log4j2 write log too slow

I am using Log4j2 in a Spring Boot application for async logging.
Here is my config, log4j2-dev.xml:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" monitorInterval="30">
    <Properties>
        <Property name="LOG_PATTERN">%d{yyyy-MM-dd HH:mm:ss.SSS} %highlight{%5p}--[%T-%-15.15t] [%-20X{serviceMessageId}]%-40.40c{1.} :%m%n%ex</Property>
    </Properties>
    <Appenders>
        <Console name="ConsoleAppender" target="SYSTEM_OUT" follow="true">
            <PatternLayout pattern="${LOG_PATTERN}" />
        </Console>
        <!-- Rolling File Appender -->
        <RollingFile name="FileAppender" fileName="logs/app.log"
                     filePattern="logs/app-%d{yyyy-MM-dd}-%i.log">
            <PatternLayout>
                <Pattern>${LOG_PATTERN}</Pattern>
            </PatternLayout>
            <Policies>
                <SizeBasedTriggeringPolicy size="100MB" />
            </Policies>
            <DefaultRolloverStrategy max="10" />
        </RollingFile>
        <Kafka name="KafkaAppender" topic="ServiceCentrallog">
            <Property name="bootstrap.servers">10.2.16.2:9092,10.2.16.3:9092,10.2.16.4:9092</Property>
            <JSONLayout compact="true" properties="true">
                <KeyValuePair key="application"
                              value="${bundle:application-dev:spring.application.name}" />
            </JSONLayout>
        </Kafka>
    </Appenders>
    <Loggers>
        <AsyncRoot level="info">
            <AppenderRef ref="ConsoleAppender" />
            <AppenderRef ref="FileAppender" />
            <AppenderRef ref="KafkaAppender" />
        </AsyncRoot>
    </Loggers>
</Configuration>
My base class in the project:
public abstract class BaseObject {

    protected final org.apache.logging.log4j.Logger logger = LogManager.getLogger(getClass());

    @Override
    public String toString() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
        String jsonString = "";
        try {
            jsonString = mapper.writeValueAsString(this);
        } catch (JsonProcessingException e) {
            logger.error("BaseObject: ", e);
            jsonString = "Can't build json from object";
        }
        return jsonString;
    }
}
Here is how I write logs:
logger.info("Input: " + input.toString());
....
logger.info("output: " + Utils.toJson(restRes));
It works fine in the normal case.
But if I use JMeter to send a lot of requests (TOTAL: 7996, AVG: 98 messages/s), I see that the logging is too slow: about 1.5 minutes after I stop sending requests, logging still continues and the log files keep growing.
I have searched a lot but still do not know how to speed up logging, or find out what is unreasonable in my configuration.
But if I use JMeter to send a lot of requests (TOTAL: 7996, AVG: 98 messages/s), I see that the logging is too slow: about 1.5 minutes after I stop sending requests, logging still continues and the log files keep growing.
You are using Log4j2's asynchronous logging. Its intention is to not block the executing threads during logging operations. So if your application logs many things (7996 requests at about 98 messages/s) in a couple of minutes, this behavior makes complete sense: messages keep being queued, and working through them all after the load stops takes time.
I have searched a lot but still do not know how to speed up logging, or find out what is unreasonable in my configuration.
1) Using synchronous logging will speed up your logging in the sense that it uses a blocking approach: the logging invocation returns only when the message has effectively been written by the appender(s), but it will also slow down your processing.
2) Don't use 3 appenders for this scenario (that is, logging request/response):
<AsyncRoot level="info">
    <AppenderRef ref="ConsoleAppender" />
    <AppenderRef ref="FileAppender" />
    <AppenderRef ref="KafkaAppender" />
</AsyncRoot>
It performs the logging three times, which is a lot.
If you really need to log this information, log it to a single appender. You can achieve that easily with the filter feature (the MarkerFilter should be fine).
For example, add the marker JSON_REQUEST_RESPONSE when you log, and specify that only one of the appenders accepts it while the others deny it:
<RollingFile name="FileAppender" fileName="logs/app.log"
             filePattern="logs/app-%d{yyyy-MM-dd}-%i.log">
    <!-- ACCEPT the marker -->
    <MarkerFilter marker="JSON_REQUEST_RESPONSE" onMatch="ACCEPT" />
    <...>
</RollingFile>
<Console name="ConsoleAppender" target="SYSTEM_OUT" follow="true">
    <!-- DENY the marker -->
    <MarkerFilter marker="JSON_REQUEST_RESPONSE" onMatch="DENY" />
    <...>
</Console>
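On the logging side, the marker can be created once with Log4j2's MarkerManager and passed on the calls that should only reach the file appender; a small sketch (the class and method names are illustrative, the marker name matches the filters above):
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.apache.logging.log4j.Marker;
import org.apache.logging.log4j.MarkerManager;

public class RequestResponseLogger {

    private static final Logger LOG = LogManager.getLogger(RequestResponseLogger.class);

    // Same name as in the MarkerFilter entries of the appenders above.
    private static final Marker JSON_REQUEST_RESPONSE = MarkerManager.getMarker("JSON_REQUEST_RESPONSE");

    void logExchange(Object input, Object output) {
        // Only the appender that ACCEPTs the marker writes these events;
        // the one that DENYs it skips them entirely.
        LOG.info(JSON_REQUEST_RESPONSE, "Input: {}", input);
        LOG.info(JSON_REQUEST_RESPONSE, "output: {}", output);
    }
}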
3) Don't log so much at the info() level:
logger.info("Input: " + input.toString());
....
logger.info("output: " + Utils.toJson(restRes));
As a side note, don't use string concatenation for logging, because it can be needlessly expensive when the logger level doesn't match and nothing ends up being logged.
The lazily evaluated methods that take a Supplier are better in this case:
logger.info("Input: {}", () -> input.toString());
....
logger.info("output: {}", () -> Utils.toJson(restRes));

How to enable Logback-access in Spring Boot?

I'm trying to enable logback-access in a Spring Boot app to log all HTTP requests that hit the application.
I've tried implementing this using https://github.com/akihyro/logback-access-spring-boot-starter.
Adding the XML file shown in the example doesn't do anything; is there anything more that needs to be added to enable it?
Any other suggestions to achieve the same result would be welcome :)
You still need to wire a bean into your application. For example, this code snippet wires a request-logging filter into your application:
@SpringBootApplication
public class MyApp {

    public static void main(String[] args) {
        SpringApplication.run(MyApp.class, args);
    }

    // ... your other methods here

    @Bean
    public CommonsRequestLoggingFilter logFilter() {
        CommonsRequestLoggingFilter filter = new CommonsRequestLoggingFilter();
        filter.setIncludeQueryString(true);
        return filter;
    }
}
I am pretty sure you are talking about the Logback logger for Spring Boot. If I am not wrong, this is how you can do it:
a. Add the dependency to your POM:
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
</dependency>
b. Now there are many ways you can ask Spring to configure Logback. For example:
In the application.properties file
In the logback.xml file
The advantage of using a logback.xml file is that you can have a different XML file for each build profile, whereas with application.properties you don't have that freedom.
Sample entries in the application.properties file from one of my projects:
logging.level.org.springframework.web = INFO
logging.level.com.company.app = DEBUG
#logging.level.org.hibernate=ERROR
logging.file=logs/spring-boot-logging.log
## Hibernate Logging
logging.level.org.hibernate.SQL = DEBUG
If you are using XML, the configuration will probably look like this:
<configuration scan="true">
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <Pattern>%d{yyyy-MM-dd HH:mm:ss} [%thread] %-5level %logger{36} - %msg%n</Pattern>
        </layout>
    </appender>
    <logger name="org.springframework" level="error" additivity="false">
        <appender-ref ref="STDOUT" />
    </logger>
    <logger name="org.springframework" level="info" additivity="false">
        <appender-ref ref="STDOUT" />
    </logger>
    <logger name="org.springframework" level="warn" additivity="false">
        <appender-ref ref="STDOUT" />
    </logger>
    <logger name="com.memorynotfound" level="debug" additivity="false">
        <appender-ref ref="STDOUT" />
    </logger>
</configuration>
I recommend doing some googling for a better understanding.
Good luck!

Springboot Event Logging

I have a scheduled task at a fixed rate that reads a queue.
Each message that comes from the queue has an ID.
I want to know if it's possible to split the log by ID, appending to a different file per ID.
I was thinking about using aspects or a custom appender; can either of these do the job for me?
Thanks.
Well, after some searching I remembered MDC (Mapped Diagnostic Context), which can do what I want with almost no workarounds.
I just need to add a SiftingAppender to logback-spring.xml, like this:
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">
    <include resource="org/springframework/boot/logging/logback/base.xml"/>

    <appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
        <discriminator>
            <key>checkoutId</key>
            <defaultValue>system</defaultValue>
        </discriminator>
        <sift>
            <appender name="${checkoutId}" class="ch.qos.logback.core.FileAppender">
                <file>${checkoutId}.log</file>
                <layout class="ch.qos.logback.classic.PatternLayout">
                    <pattern>%d{HH:mm:ss:SSS} | %-5level | %thread | %logger{20} | %msg%n%rEx</pattern>
                </layout>
            </appender>
        </sift>
    </appender>

    <root level="INFO">
        <appender-ref ref="SIFT" />
    </root>
</configuration>
Then I use it like this:
@Scheduled(initialDelayString = "${consumeStart:10000}", fixedRateString = "${consumeRate:5000}")
private void task() {
    try {
        val message = queue.get(timeout);
        if (message != null) {
            MDC.put("checkoutId", message.toString());
            . . .
        }
    } finally {
        MDC.remove("checkoutId");
    }
}

Querydsl logging queries with bindings

logQuery is called in the prepareStatementAndSetParameters method of the SQLInsertClause class:
protected void logQuery(Logger logger, String queryString, Collection<Object> parameters) {
    String normalizedQuery = queryString.replace('\n', ' ');
    MDC.put(QueryBase.MDC_QUERY, normalizedQuery);
    MDC.put(QueryBase.MDC_PARAMETERS, String.valueOf(parameters));
    if (logger.isDebugEnabled()) {
        logger.debug(normalizedQuery);
    }
}
How can I set the debug level for this logger?
That logger is from the SLF4J API. Depending on which logging implementation you have behind the API, you use the facilities of that underlying implementation.
For instance, we use Logback Classic (dependency ch.qos.logback:logback-classic), and I can explicitly override which configuration file to use with -Dlogback.configurationFile=devel-logback.xml in the JVM parameters. The default lookup mechanism is documented in the Logback documentation. My file looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%date %level [%.60thread] %logger{1} %msg%n</pattern>
        </encoder>
    </appender>

    <logger name="com.mysema.query.jpa.impl.JPAQuery" level="DEBUG"/>
    <!-- more loggers -->

    <root level="DEBUG">
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>
Also adding -Dlogback.debug=true to JVM arguments adds some debug output when logback is being initialized.
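If you'd rather not touch the XML at all, the level can also be raised programmatically when Logback Classic backs the SLF4J API; a hedged sketch (the logger name is taken from the configuration above, adjust it to the Querydsl class that actually logs in your version):
import ch.qos.logback.classic.Level;
import org.slf4j.LoggerFactory;

public final class QueryLogging {

    private QueryLogging() {
    }

    public static void enableQueryDebugLogging() {
        // Valid only when Logback Classic is the SLF4J binding: the SLF4J logger
        // can then be cast to Logback's Logger, whose level is mutable at runtime.
        ch.qos.logback.classic.Logger queryLogger =
                (ch.qos.logback.classic.Logger) LoggerFactory.getLogger("com.mysema.query.jpa.impl.JPAQuery");
        queryLogger.setLevel(Level.DEBUG);
    }
}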
