I am working on a task where I want to mask sensitive data using the Log4j2 LogEventPatternConverter class.
#Plugin(name="SensitiveDataLog", category = "Converter")
#ConverterKeys({"sense"})
public class SensitiveDataLog extends LogEventPatternConverter {
#Value("${ssn}")
private String ssn;
public SensitiveDataLog(String name, String style) {
super(name, style);
}
public static SensitiveDataLog newInstance(String[] options) {
return new SensitiveDataLog("sense","sense");
}
#Override
public void format(LogEvent logEvent, StringBuilder outputMsg) {
String message = logEvent.getMessage().getFormattedMessage();
Matcher matcher = SSN_PATTERN.matcher(message);
if (matcher.find()) {
String maskedMessage = matcher.replaceAll("***-**-****");
outputMsg.append(maskedMessage);
} else {
outputMsg.append(message);
}
}
}
I want to keep the pattern in application.properties, but the problem is that the property value ssn is never loaded; it is always null.
Here is my log4j2.xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="info" monitorInterval="30"
packages="com.virtusa.xlab.fw.logging.component"
xmlns="http://logging.apache.org/log4j/2.0/config">
<Properties>
<Property name="basePath">logs/log4j2</Property>
</Properties>
<Appenders>
<!-- File Appender -->
<RollingFile name="FILE"
fileName="${basePath}/logfile.log" filePattern="${basePath}/logfile.%d{yyyy-MM-dd}-%i.log" append="true">
<PatternLayout
pattern="%-5p | %d{yyyy-MM-dd HH:mm:ss} | [%t] %C{2} (%F:%L) - %sense%n" />
<Policies>
<SizeBasedTriggeringPolicy size="1 KB" />
</Policies>
<DefaultRolloverStrategy max="4" />
</RollingFile>
<!-- Console Appender -->
<Console name="STDOUT" target="SYSTEM_OUT">
<PatternLayout
pattern="%-5p | %d{yyyy-MM-dd HH:mm:ss} | [%t] %C{2} (%F:%L) - %sense%n" />
</Console>
</Appenders>
<Loggers>
<Logger name="com.virtusa.xlab.fw" level="info" />
<Root level="info">
<AppenderRef ref="STDOUT" />
<AppenderRef ref="FILE" />
</Root>
</Loggers>
</Configuration>
Can anyone help me out here?
Thanks.
The problem is that SensitiveDataLog is created via the static method newInstance(), so the field ssn is not initialized at that moment. What you can do is initialize the field later, e.g. when the Spring context is refreshed.
Here is my snippet:
private static final XmlMaskPatternConverter INSTANCE = new XmlMaskPatternConverter();

public XmlMaskPatternConverter() {
    super(NAME, NAME);
}

public static XmlMaskPatternConverter newInstance() {
    return INSTANCE;
}
Now you can call the static newInstance() method somewhere in your Spring configuration (I do it in a @Bean method) and set the ssn value there. Of course, you need to create a setter for this field.
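For example, a minimal sketch of that wiring, assuming the same singleton approach is applied to SensitiveDataLog and a setSsn(...) setter is added (both are assumptions, not part of the original code):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LoggingMaskConfig {

    // Push the property into the converter instance Log4j2 already uses.
    // Assumes SensitiveDataLog.newInstance() returns a shared singleton and exposes setSsn().
    @Bean
    public SensitiveDataLog sensitiveDataLogConverter(@Value("${ssn}") String ssnPattern) {
        SensitiveDataLog converter = SensitiveDataLog.newInstance(new String[0]);
        converter.setSsn(ssnPattern);
        return converter;
    }
}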
P.S. Hope it helps. I faced this problem too, so I decided to leave my solution here. My first post on SO, btw.
While my Spring 4 REST application is loading, I get the logger information (log file name / path / archival days) from the database and pass these values to a Log4j plugin, so that I can retrieve them from log4j2.xml.
My application log file is not created and the plugin is not called! I am not getting any error in the console either.
What should I do to get the Log4j plugin to load and the log file to be created?
Spring:4.3.15.RELEASE
log4j:2.4.1
java:1.8
SpringWebInitializer.java
public class SpringWebInitializer implements WebApplicationInitializer {

    @Override
    public void onStartup(ServletContext ctx) throws ServletException {
        AnnotationConfigWebApplicationContext webCtx = new AnnotationConfigWebApplicationContext();
        webCtx.register(WebConfiguration.class);
        webCtx.setServletContext(ctx);

        ServletRegistration.Dynamic servlet = ctx.addServlet("dispatcher", new DispatcherServlet(webCtx));
        servlet.setLoadOnStartup(1);
        servlet.addMapping("/");

        // Add listener.
        ctx.addListener(new MyAppContextListner());
    }
}
WebConfiguration.java
@EnableWebMvc
@Configuration
@ComponentScan(basePackages = "org.vasa.ws.myapp")
public class WebConfiguration {
}
MyAppContextListner.java
public class MyAppContextListner implements ServletContextListener {

    private static final Logger LOGGER = LogManager.getLogger(MyAppContextListner.class);

    private WebApplicationContext webApplicationContext = null;

    @Override
    public void contextInitialized(ServletContextEvent event) {
        // Database connectivity: read the logger information from the DB here.
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        // TODO Auto-generated method stub
    }
}
The Log4j plugin I am using is given below.
MyLog4JConfigDatabaseLookup.java
#Plugin(name = "MyAppdbLookup", category = StrLookup.CATEGORY)
public class MyLog4JConfigDatabaseLookup extends AbstractLookup {
public String lookup(final LogEvent event, final String key) {
System.out.println("Lookup......");
}
}
log4j2.xml
<Configuration packages="org.vasa.ws.myapp">
<Properties>
<Property name="app-name">MyAppdbLookup</Property>
<Property name="file-level">${MyAppdbLookup:logLevel}</Property>
<Property name="log-file">${MyAppdbLookup:logFile}</Property>
<Property name="log-file-level">${MyAppdbLookup:logLevel}</Property>
<Property name="log-path">${MyAppdbLookup:logPath}</Property>
<Property name="archive-days">${MyAppdbLookup:archive-days}</Property>
</Properties>
<Appenders>
<Routing name="route-log">
<Routes pattern="framework">
<Route key="benefitCompositeWS">
<RollingFile name="message-log" fileName="${log-path}/myapp.log"
filePattern="${log-path}/$${date:yyyy-MM}/myapp.%d{MM-dd-yyyy}-%i.log.gz" append="true">
<PatternLayout
pattern="%d{MM/dd/yyyy HH:mm:ss.SSS z} [%t] %-5level %logger{36} - %msg%n" />
<Policies>
<TimeBasedTriggeringPolicy />
<SizeBasedTriggeringPolicy size="150 MB" />
</Policies>
<DefaultRolloverStrategy max="1000">
<Delete basePath="${log-path}" maxDepth="2">
<IfFileName glob="*/myapp*.log.gz" />
<IfLastModified age="${archive-days}" />
</Delete>
</DefaultRolloverStrategy>
</RollingFile>
</Route>
</Routes>
</Routing>
</Appenders>
<Loggers>
<Root level="${file-level}" additivity="false">
<AppenderRef ref="route-log" />
<AppenderRef ref="STDOUT" />
</Root>
</Loggers>
</Configuration>
I changed my plugin name to lower case and now it's working fine, as expected:
@Plugin(name = "myappdblookup")
I have a multi-tenant application built with microservices on Spring Boot 1.4.3, and I generate tenant-wise logs in different folders inside Tomcat's logs folder.
The problem I am facing is that after some time (hours/days/weeks) my logs stop, although the application keeps working in the background. I have tried to find the root cause but failed.
In the Java code, some classes use SLF4J and some use Log4j to print logs.
As mentioned, I am using a SiftingAppender and changing the logFolder variable from a Filter/Interceptor based on the tenant.
Previously I was doing MDC.clear() to remove logFolder from the MDC; then I moved to MDC.remove("logFolder"), but that did not work either.
logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/defaults.xml" />
<property name="LOG_FILE" value="${LOG_FILE:-${LOG_PATH:-${LOG_TEMP:-${java.io.tmpdir:-/tmp}}/}}" />
<include resource="org/springframework/boot/logging/logback/console-appender.xml" />
<!-- This is MDC value -->
<!-- We will assign a value to 'logFileName' via Java code -->
<appender name="FILE-THREAD" class="ch.qos.logback.classic.sift.SiftingAppender">
<!-- This is MDC value -->
<!-- We will assign a value to 'logFileName' via Java code -->
<discriminator>
<key>logFolder</key>
<defaultValue>main</defaultValue>
</discriminator>
<sift>
<appender name="FILE-${logFileFolder}" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${LOG_FILE}/${logFolder}/enquiryengine.log</file>
<encoder>
<pattern>${FILE_LOG_PATTERN}</pattern>
</encoder>
<!-- <file>${LOG_FILE}</file> -->
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${LOG_FILE}/${logFolder}/enquiryengine.log.%d{yyyy-MM-dd}.%i.gz</fileNamePattern>
<timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
<maxFileSize>100MB</maxFileSize>
<maxHistory>15</maxHistory>
</timeBasedFileNamingAndTriggeringPolicy>
</rollingPolicy>
</appender>
</sift>
</appender>
<springProfile name="development">
<root level="INFO">
<appender-ref ref="CONSOLE" />
<appender-ref ref="FILE-THREAD" />
</root>
</springProfile>
<springProfile name="production">
<root level="INFO">
<appender-ref ref="FILE-THREAD" />
</root>
</springProfile>
<springProfile name="staging">
<root level="INFO">
<appender-ref ref="FILE-THREAD" />
</root>
</springProfile>
<springProfile name="testing3">
<root level="INFO">
<appender-ref ref="FILE-THREAD" />
</root>
</springProfile>
<springProfile name="testing">
<root level="INFO">
<appender-ref ref="FILE-THREAD" />
</root>
</springProfile>
<springProfile name="testing2">
<root level="INFO">
<appender-ref ref="FILE-THREAD" />
</root>
</springProfile>
<jmxConfigurator />
</configuration>
public class MyClass extends HandlerInterceptorAdapter {

    @Override
    public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler)
            throws Exception {
        MDC.put(UtilConstant.LOG_FOLDER_NAME, "dynamicTenantFolderName");
        return true;
    }

    @Override
    public void afterConcurrentHandlingStarted(HttpServletRequest request, HttpServletResponse response, Object handler)
            throws Exception {
        MDC.remove(UtilConstant.LOG_FOLDER_NAME);
    }

    @Override
    public void afterCompletion(HttpServletRequest request, HttpServletResponse response, Object handler, Exception ex)
            throws Exception {
        MDC.remove(UtilConstant.LOG_FOLDER_NAME);
    }
}
public class TenantRequestFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        MDC.remove(UtilConstant.LOG_FOLDER_NAME);
        MDC.put(UtilConstant.LOG_FOLDER_NAME, "dynamicLogFolderName");
        // Goes to the default servlet
        chain.doFilter(request, response);
    }

    @Override
    public void init(FilterConfig filterConfig) throws ServletException {
        // TODO Auto-generated method stub
    }

    @Override
    public void destroy() {
        MDC.remove(UtilConstant.LOG_FOLDER_NAME);
    }
}
Since logging is not working, I cannot get any stack trace. The expected result is that log generation should not stop.
The default timeout for the SiftingAppender is 30 minutes.
If no logs are written within 30 minutes, the SiftingAppender's nested appender times out and no further logs for that MDC value are processed.
To overcome this:
1. You can increase the time limit as per the application's requirements (see the sketch after the link below).
2. Have a scheduler print a log line at a regular interval so that the appender never times out.
Refer to the documentation: http://logback.qos.ch/manual/appenders.html
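For option 1, a sketch of raising the limits on the SiftingAppender itself; the values are illustrative, and timeout / maxAppenderCount are the standard Logback SiftingAppender settings:

<appender name="FILE-THREAD" class="ch.qos.logback.classic.sift.SiftingAppender">
    <!-- Keep per-tenant nested appenders alive longer before they are discarded. -->
    <timeout>4 hours</timeout>
    <!-- Upper bound on the number of nested appenders kept in memory. -->
    <maxAppenderCount>100</maxAppenderCount>
    <discriminator>
        <key>logFolder</key>
        <defaultValue>main</defaultValue>
    </discriminator>
    <sift>
        <!-- same RollingFileAppender as in the question -->
    </sift>
</appender>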
I'm trying to get hot reloading to work for my logging level. The monitorInterval attribute should do the trick for me, but for some reason it doesn't work.
My log4j2.xml file looks like this:
<Configuration monitorInterval="10">
<Appenders>
<Console name="STDOUT" target="SYSTEM_OUT">
<PatternLayout
pattern="%d{ISO8601} [%-12.-12t] %-5p [%12.12X{CorrelationId}] [%-30.-30X{Path}] %logger{36}:%L - %msg%n"/>
</Console>
<File name="anywhere" fileName="anywhere.log" append="false">
<PatternLayout>
<Pattern>%d{ISO8601} [%-12.-12t] %-5p [%12.12X{CorrelationId}] [%-30.-30X{Path}] %logger{36}:%L - %msg%n</Pattern>
</PatternLayout>
</File>
</Appenders>
<Loggers>
<logger name="com.cetrea" level="info"/>
<Root level="warn">
<AppenderRef ref="anywhere"/>
<AppenderRef ref="STDOUT"/>
</Root>
</Loggers>
</Configuration>
I'm testing it with my REST API: when I hit this route, it should print out just the LOG.info line, which it does.
private static final Logger LOG = LogManager.getLogger(TokenController.class);

@RequestMapping(value = "/sensitive-data/{token}", method = RequestMethod.GET, produces = "text/plain;charset=UTF-8")
public ResponseEntity getData(@PathVariable("token") String token) {
    if (tokenMap.containsKey(token)) {
        return ResponseEntity.ok(tokenMap.get(token));
    } else {
        Timestamp timestamp = new Timestamp(System.currentTimeMillis());
        LOG.info("Hit with wrong or expired token at " + timestamp);
        LOG.debug("debug thing");
        return new ResponseEntity("Token not found, or has expired", HttpStatus.NOT_FOUND);
    }
}
Now if I change the level to debug, I would expect it to also print the LOG.debug line, but it doesn't. The change doesn't take effect until I restart the program, instead of being hot reloaded 10 seconds later.
As it turns out, the build includes its own copy of the log4j2 config file and reads from that, so the file I was editing wasn't the one being read. I added
-Dlog4j.configurationFile="Path to the actual file"
to the runtime settings and then it worked.
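For example, as a JVM argument (the path below is only a hypothetical location of the file being edited):

java -Dlog4j.configurationFile=/opt/myapp/config/log4j2.xml -jar myapp.jar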
I have a scheduled task, running at a fixed rate, that reads a queue.
Each message that comes from the queue has an ID.
I want to know if it's possible to split the log by ID, appending to a different file per ID.
I was thinking about using aspects or a custom appender; can one of these do the job for me?
Thanks.
Well, after some searching I remembered MDC (Mapped Diagnostic Context), which can do what I want with almost no workarounds.
I just need to add a SiftingAppender to the logback-spring.xml like this:
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">
<include resource="org/springframework/boot/logging/logback/base.xml"/>
<appender name="SIFT" class="ch.qos.logback.classic.sift.SiftingAppender">
<discriminator>
<key>checkoutId</key>
<defaultValue>system</defaultValue>
</discriminator>
<sift>
<appender name="${checkoutId}" class="ch.qos.logback.core.FileAppender">
<file>${checkoutId}.log</file>
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%d{HH:mm:ss:SSS} | %-5level | %thread | %logger{20} | %msg%n%rEx</pattern>
</layout>
</appender>
</sift>
</appender>
<root level="INFO">
<appender-ref ref="SIFT" />
</root>
</configuration>
Then I call it like this:
@Scheduled(initialDelayString = "${consumeStart:10000}", fixedRateString = "${consumeRate:5000}")
private void task() {
    try {
        val message = queue.get(timeout);
        if (message != null) {
            MDC.put("checkoutId", message.toString());
            . . .
        }
    } finally {
        MDC.remove("checkoutId");
    }
}
logQuery is called in the prepareStatementAndSetParameters method of the SQLInsertClause class:
protected void logQuery(Logger logger, String queryString, Collection<Object> parameters) {
    String normalizedQuery = queryString.replace('\n', ' ');
    MDC.put(QueryBase.MDC_QUERY, normalizedQuery);
    MDC.put(QueryBase.MDC_PARAMETERS, String.valueOf(parameters));
    if (logger.isDebugEnabled()) {
        logger.debug(normalizedQuery);
    }
}
How can I set the debug level for this logger?
That logger is from the SLF4J API. Depending on the logging implementation you have behind the API, you use the facilities of that underlying implementation.
For instance, we use Logback Classic (dependency ch.qos.logback:logback-classic), and I can explicitly override which configuration file to use with -Dlogback.configurationFile=devel-logback.xml in the JVM parameters. The default configuration lookup mechanism is documented in the Logback manual. My file looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%date %level [%.60thread] %logger{1} %msg%n</pattern>
</encoder>
</appender>
<logger name="com.mysema.query.jpa.impl.JPAQuery" level="DEBUG"/>
<!-- more loggers -->
<root level="DEBUG">
<appender-ref ref="CONSOLE"/>
</root>
</configuration>
Also adding -Dlogback.debug=true to JVM arguments adds some debug output when logback is being initialized.
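If you need to change the level at runtime rather than via the configuration file, here is a sketch of doing it programmatically with Logback (assuming Logback Classic is the backend, as above):

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import org.slf4j.LoggerFactory;

public class EnableQueryDebug {
    public static void main(String[] args) {
        // Under Logback, the SLF4J logger is backed by ch.qos.logback.classic.Logger,
        // so it can be cast and its level changed at runtime.
        Logger queryLogger = (Logger) LoggerFactory.getLogger("com.mysema.query.jpa.impl.JPAQuery");
        queryLogger.setLevel(Level.DEBUG);
    }
}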