I need to keep separate log4j log files in a separate folder, with a separate log file for each URL. My project is a Spring MVC project.
For example,
www.xyz.com/test/url1
www.xyz.com/test/url2
www.xyz.com/test/url3
How can I configure my log4j?
Is there a way to keep separate log4j files at the method level?
It's possible. You need to create different logger instances in a class, say logger, logger1, logger2, and point each of them to a different file. They will all cover the same base package, but you can send logs from different methods to different files.
Here I am posting sample code for configuring multiple loggers.
I have configured my log4j file in the following way:
log4j.rootLogger = INFO , stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.appender.pageOneLog=org.apache.log4j.FileAppender
log4j.appender.pageOneLog.File=C:/work/logs/pageOneLog.log
log4j.appender.pageOneLog.layout=org.apache.log4j.PatternLayout
log4j.appender.pageOneLog.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.appender.pageTwoLog=org.apache.log4j.FileAppender
log4j.appender.pageTwoLog.File=C:/work/logs/pageTwoLog.log
log4j.appender.pageTwoLog.layout=org.apache.log4j.PatternLayout
log4j.appender.pageTwoLog.layout.ConversionPattern=%d [%24F:%t:%L] - %m%n
log4j.category.pageOneLogger=INFO, pageOneLog
log4j.additivity.pageOneLogger=false
log4j.category.pageTwoLogger=INFO, pageTwoLog
log4j.additivity.pageTwoLogger=false
Then use these loggers in your Java code in the following manner:
Logger log1 = Logger.getLogger("pageOneLogger");
Logger log2 = Logger.getLogger("pageTwoLogger");
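For the per-URL requirement in the question, one option is to pick a different logger in each handler method. This is only a minimal sketch: the controller class and view names are made up, while the @RequestMapping paths come from the example URLs above and the logger names from the log4j.properties shown here.

import org.apache.log4j.Logger;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;

@Controller
public class TestController {

    // Each logger name matches a category defined in log4j.properties,
    // so messages for each URL end up in their own file.
    private static final Logger pageOneLog = Logger.getLogger("pageOneLogger");
    private static final Logger pageTwoLog = Logger.getLogger("pageTwoLogger");

    @RequestMapping("/test/url1")
    public String urlOne() {
        pageOneLog.info("Handling /test/url1");
        return "url1View"; // hypothetical view name
    }

    @RequestMapping("/test/url2")
    public String urlTwo() {
        pageTwoLog.info("Handling /test/url2");
        return "url2View"; // hypothetical view name
    }
}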
We are running a JMeter (JMX) test using a Taurus setup. To run a lightweight test we need to disable all logging.
I tried by adding:
modules:
  jmeter:
    properties:
      log_level.jmeter: 'FATAL_ERROR'
      logging:
        level:
          ROOT: 'ERROR'
          Log_Level: 'DEBUG'
We still see the trace.jtl file with log information. Can anyone help us with how to disable logging to the trace.jtl file?
Do we have any specific command in Taurus to do this?
Thanks.
I don't think trace.jtl is controllable by these properties. What you changed controls the verbosity of the JMeter logging subsystem, i.e. the contents of the jmeter.log file under the artifacts directory.
trace.jtl is a different beast: it contains request/response details produced by an extra Simple Data Writer that Taurus adds to the test plan.
So if you want to get rid of it, locate the write-xml-jtl line in your configuration script and comment it out (or remove it, or set it to something other than full or error); on the next execution you won't see this trace.jtl file.
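If I remember the Taurus JMeter module settings correctly, the same thing can be expressed in the YAML config rather than by editing the generated script; please verify the option name against your Taurus version:

modules:
  jmeter:
    # 'full' and 'error' enable the XML JTL writer; any other value (e.g. 'none') disables trace.jtl
    write-xml-jtl: none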
We would like our application logs to be printed to files on the local nodes. We're using Log4j's RollingFileAppender.
Our log4j.properties file is as follows:
ODS.LOG.DIR=/var/log/appLogs
ODS.LOG.INFO.FILE=application.log
ODS.LOG.ERROR.FILE=application_error.log
# Root logger option
log4j.rootLogger=ERROR, console
log4j.logger.com.ournamespace=ERROR, APP_APPENDER, ERROR_APPENDER
#
# console
# Add "console" to rootlogger above if you want to use this
#
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss}-%r %p %c{5}: %m%n
# Direct log messages to a log file
log4j.appender.APP_APPENDER=org.apache.log4j.RollingFileAppender
log4j.appender.APP_APPENDER.File=${ODS.LOG.DIR}/${ODS.LOG.INFO.FILE}
log4j.appender.APP_APPENDER.MaxFileSize=200MB
log4j.appender.APP_APPENDER.MaxBackupIndex=30
log4j.appender.APP_APPENDER.layout=org.apache.log4j.PatternLayout
log4j.appender.APP_APPENDER.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss}-%r %p %c{10}: %m%n
log4j.appender.ERROR_APPENDER=org.apache.log4j.RollingFileAppender
log4j.appender.ERROR_APPENDER.Threshold=ERROR
log4j.appender.ERROR_APPENDER.File=${ODS.LOG.DIR}/${ODS.LOG.ERROR.FILE}
log4j.appender.ERROR_APPENDER.MaxFileSize=200MB
log4j.appender.ERROR_APPENDER.MaxBackupIndex=90
log4j.appender.ERROR_APPENDER.layout=org.apache.log4j.PatternLayout
log4j.appender.ERROR_APPENDER.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss}-%r %p %c{10}: %m%n
Our MR driver logs are appearing in the /var/log/appLogs directory, but our mapper and reducer logs don't show up in this directory.
We have copied our log4j.properties snippet to hdfs-log4j, yarn-log4j, hbase-log4j and zookeeper-log4j using Ambari (Hortonworks Data Platform). Our MR jobs typically use HBase input and output format classes.
What changes do we need to make to have the mapper and reducer logs appear in the /var/log/appLogs directory as well?
Edit: The logs are appearing in the JobHistory UI syslog but they aren't being added to the log file. What are we missing?
For example, to configure log4j you can call PropertyConfigurator.configure(properties); from your code, e.g. in the mapper/reducer setup method.
This is an example with the properties stored on HDFS:
InputStream is = fs.open(log4jPropertiesPath);
Properties properties = new Properties();
properties.load(is);
PropertyConfigurator.configure(properties);
where fs is a FileSystem object and log4jPropertiesPath is a path on HDFS.
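Putting it together, here is a minimal sketch of a mapper that configures log4j in setup(); the HDFS path and the mapper's key/value types are assumptions for illustration only:

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.log4j.PropertyConfigurator;

public class LoggingMapper extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        FileSystem fs = FileSystem.get(conf);
        // Hypothetical HDFS location; adjust to wherever the properties file is stored.
        Path log4jPropertiesPath = new Path("/apps/ods/conf/log4j.properties");
        try (InputStream is = fs.open(log4jPropertiesPath)) {
            Properties properties = new Properties();
            properties.load(is);
            PropertyConfigurator.configure(properties);
        }
    }
}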
My logging was running correctly with RollingFileAppender, but I need to compress the generated files and move them to the folder "${app.log}\Backup". Here are my log4j properties:
log4j.appender.appDebug=org.apache.log4j.RollingFileAppender
log4j.appender.appDebug.file=${app.log}\\app_exe.log
log4j.appender.appDebug.MaxFileSize=100MB
log4j.appender.appDebug.maxBackupIndex=10
log4j.appender.appDebug.layout=org.apache.log4j.PatternLayout
log4j.appender.appDebug.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} {%-15.15t} [%-5p] %m %n
log4j.appender.appDebug.Threshold = DEBUG
Add the following:
log4j.appender.appDebug.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.appDebug.rollingPolicy.maxIndex=5
log4j.appender.appDebug.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.appDebug.triggeringPolicy.MaxFileSize=10000
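Note that, as far as I know, the standard org.apache.log4j.RollingFileAppender ignores rollingPolicy/triggeringPolicy; those properties are honoured by the companion org.apache.log4j.rolling.RollingFileAppender from the apache-log4j-extras jar. Below is a sketch under that assumption, which should also gzip each rolled file into the Backup folder via the FileNamePattern suffix; the index range and size threshold are illustrative:

# requires apache-log4j-extras on the classpath
log4j.appender.appDebug=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.appDebug.rollingPolicy=org.apache.log4j.rolling.FixedWindowRollingPolicy
log4j.appender.appDebug.rollingPolicy.ActiveFileName=${app.log}/app_exe.log
# a .gz suffix makes the policy compress each rolled file
log4j.appender.appDebug.rollingPolicy.FileNamePattern=${app.log}/Backup/app_exe.%i.log.gz
log4j.appender.appDebug.rollingPolicy.minIndex=1
log4j.appender.appDebug.rollingPolicy.maxIndex=5
log4j.appender.appDebug.triggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
# MaxFileSize is in bytes here (about 100MB)
log4j.appender.appDebug.triggeringPolicy.MaxFileSize=104857600
log4j.appender.appDebug.layout=org.apache.log4j.PatternLayout
log4j.appender.appDebug.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} {%-15.15t} [%-5p] %m %n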
How do I override the default log4j.properties in Hadoop? If I set hadoop.root.logger=WARN,console, it does not print the logs on the console, whereas what I want is that it shouldn't print INFO entries in the log file. I added a log4j.properties file to my jar, but I am unable to override the default one. In short, I want the log file to print only the errors and warnings.
# Define some default values that can be overridden by system properties
hadoop.root.logger=INFO,console
hadoop.log.dir=.
hadoop.log.file=hadoop.log
#
# Job Summary Appender
#
# Use following logger to send summary to separate file defined by
# hadoop.mapreduce.jobsummary.log.file rolled daily:
# hadoop.mapreduce.jobsummary.logger=INFO,JSA
#
hadoop.mapreduce.jobsummary.logger=${hadoop.root.logger}
hadoop.mapreduce.jobsummary.log.file=hadoop-mapreduce.jobsummary.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hadoop.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=ALL
#
# Daily Rolling File Appender
#
log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
# Rollover at midnight
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
# 30-day backup
#log4j.appender.DRFA.MaxBackupIndex=30
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
#
# console
# Add "console" to rootlogger above if you want to use this
#
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
#
# TaskLog Appender
#
#Default values
hadoop.tasklog.taskid=null
hadoop.tasklog.iscleanup=false
hadoop.tasklog.noKeepSplits=4
hadoop.tasklog.totalLogFileSize=100
hadoop.tasklog.purgeLogSplits=true
hadoop.tasklog.logsRetainHours=12
log4j.appender.TLA=org.apache.hadoop.mapred.TaskLogAppender
log4j.appender.TLA.taskId=${hadoop.tasklog.taskid}
log4j.appender.TLA.isCleanup=${hadoop.tasklog.iscleanup}
log4j.appender.TLA.totalLogFileSize=${hadoop.tasklog.totalLogFileSize}
log4j.appender.TLA.layout=org.apache.log4j.PatternLayout
log4j.appender.TLA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
#
#Security appender
#
hadoop.security.log.file=SecurityAuth.audit
log4j.appender.DRFAS=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFAS.File=${hadoop.log.dir}/${hadoop.security.log.file}
log4j.appender.DRFAS.layout=org.apache.log4j.PatternLayout
log4j.appender.DRFAS.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
#new logger
# Define some default values that can be overridden by system properties
hadoop.security.logger=INFO,console
log4j.category.SecurityLogger=${hadoop.security.logger}
#
# Rolling File Appender
#
#log4j.appender.RFA=org.apache.log4j.RollingFileAppender
#log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file}
# Logfile size and 30-day backups
#log4j.appender.RFA.MaxFileSize=1MB
#log4j.appender.RFA.MaxBackupIndex=30
#log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
#log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} - %m%n
#log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
#
# FSNamesystem Audit logging
# All audit events are logged at INFO level
#
log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=WARN
# Custom Logging levels
#log4j.logger.org.apache.hadoop.mapred.JobTracker=DEBUG
#log4j.logger.org.apache.hadoop.mapred.TaskTracker=DEBUG
#log4j.logger.org.apache.hadoop.hdfs.server.namenode.FSNamesystem.audit=DEBUG
# Jets3t library
log4j.logger.org.jets3t.service.impl.rest.httpclient.RestS3Service=ERROR
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.metrics.jvm.EventCounter
#
# Job Summary Appender
#
log4j.appender.JSA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.JSA.File=${hadoop.log.dir}/${hadoop.mapreduce.jobsummary.log.file}
log4j.appender.JSA.layout=org.apache.log4j.PatternLayout
log4j.appender.JSA.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
log4j.appender.JSA.DatePattern=.yyyy-MM-dd
log4j.logger.org.apache.hadoop.mapred.JobInProgress$JobSummary=${hadoop.mapreduce.jobsummary.logger}
log4j.additivity.org.apache.hadoop.mapred.JobInProgress$JobSummary=false
#
# MapReduce Audit Log Appender
#
# Set the MapReduce audit log filename
#hadoop.mapreduce.audit.log.file=hadoop-mapreduce.audit.log
# Appender for AuditLogger.
# Requires the following system properties to be set
# - hadoop.log.dir (Hadoop Log directory)
# - hadoop.mapreduce.audit.log.file (MapReduce audit log filename)
#log4j.logger.org.apache.hadoop.mapred.AuditLogger=INFO,MRAUDIT
#log4j.additivity.org.apache.hadoop.mapred.AuditLogger=false
#log4j.appender.MRAUDIT=org.apache.log4j.DailyRollingFileAppender
#log4j.appender.MRAUDIT.File=${hadoop.log.dir}/${hadoop.mapreduce.audit.log.file}
#log4j.appender.MRAUDIT.DatePattern=.yyyy-MM-dd
#log4j.appender.MRAUDIT.layout=org.apache.log4j.PatternLayout
#log4j.appender.MRAUDIT.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
Modify the log4j file inside HADOOP_CONF_DIR. Note that a Hadoop job won't consider the log4j file of your application; it will consider the one inside HADOOP_CONF_DIR.
If you want to force hadoop to use some other log4j file, try one of these:
You can try what @Patrice said, i.e.:
-Dlog4j.configuration=file:/path/to/user_specific/log4j.xml
Customize HADOOP_CONF_DIR/log4j.xml and set the logger level for "your" classes as you wish. Other users won't be affected unless they have classes with the same package structure. This won't work for core Hadoop classes, as all users would be affected.
Create your own customized log4j file. Replicate the HADOOP_CONF_DIR directory and put your log4j file inside it, then export HADOOP_CONF_DIR to point to your conf directory; other users will still point to the default one.
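A sketch of that last option in shell form; the paths are examples only:

cp -r $HADOOP_CONF_DIR ~/my-hadoop-conf      # replicate the existing conf directory
vi ~/my-hadoop-conf/log4j.properties         # customize your copy of the log4j config
export HADOOP_CONF_DIR=~/my-hadoop-conf      # point your session at the copied directory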
If you use the default log4j.properties file, the logging settings get overridden by environment variables from the startup script. If you want to use the default log4j and simply want to change the logging level, use $HADOOP_CONF_DIR/hadoop-env.sh.
For example, to switch your root logger to the DEBUG level with the DRFA appender, use:
export HADOOP_ROOT_LOGGER="DEBUG,DRFA"
You could remove the log4j.properties from your hadoop jar
OR make sure that your jar/log4j.properties is first in the classpath (log4j picks the first log4j.properties from the classpath that it finds)
OR specify the system variable: -Dlog4j.configuration=PATH_TO_FILE
See the documentation to learn how log4j finds the configuration.
I faced the same problem (CDH3U3, Hadoop 0.20.2). I finally found a solution with (note the file: prefix in the path):
-Dlog4j.configuration=file:/path/to/user_specific/log4j.xml
Maven Packaging:
Once I realized I needed to add my custom debug-log.properties file to src/main/java/resources, Maven added it to the application.jar root directory, and then it was just a matter of referring to it (or not) with -Dlog4j.configuration=debug-log.properties on the command line.
Oozie <java> Action:
In regard to Oozie, use <java-opts>-Dlog4j.configuration=${log4jConfig}</java-opts> in the workflow.xml actions and define the following in a job.properties file.
#one of the following log4j.config parameters must be defined
#log4jConfig=log4j.properties
log4jConfig=debug-log.properties
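As a sketch of where the <java-opts> line sits in a workflow.xml <java> action; every value except the java-opts line is a placeholder:

<action name="my-java-action">
    <java>
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <main-class>com.example.MyMain</main-class>
        <java-opts>-Dlog4j.configuration=${log4jConfig}</java-opts>
    </java>
    <ok to="end"/>
    <error to="fail"/>
</action>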
Oozie <map-reduce> Action:
<property>
    <name>mapred.child.java.opts</name>
    <value>-Dlog4j.configuration=${log4jConfig}</value>
</property>
As mentioned by Sulpha, for Hadoop 1.2.1 it is important to override the task-log4j.properties that is present inside hadoop-core.jar.
In my pseudo-distributed mode, I was unable to print the debug messages of my Pig UDFs and had to delete the task-log4j.properties from hadoop-core.jar and replace it with a copy of $HADOOP_INSTALL/conf/log4j.properties.
I used
zip -d hadoop-core-1.2.1.jar task-log4j.properties #to delete
and
zip -g hadoop-core-1.2.1.jar task-log4j.properties #to add back
If there is already a configured log4j properties file inside the jar file, you can override it by simply putting -Dlog4j.configuration= before the -classpath option.
here is the sample:
java -Dlog4j.configuration=..\conf\log4j.properties -classpath %CLASSPATH%
Put the log4j.configuration option in the child Java options, i.e.:
hadoop jar ... -Dmapred.child.java.opts=-Dlog4j.configuration=file:/...../log4j_debug.properties
You must put the log4j_debug.properties file on all slave servers at the same directory path, like /home/yourname/log4j_debug.properties or /tmp/log4j_debug.properties.
This setting overwrites the mapred.child.java.opts settings.
If you want to combine it with other options like -Xmx32m, which means a 32MB heap size, then do the following:
hadoop jar ... -Dmapred.child.java.opts='-Xmx32m -Dlog4j.configuration=file:/...../log4j_debug.properties'
In Hadoop 1.2.1 there are 2 config files: log4j.properties and task-log4j.properties.
So to make the example above work, the change has to be made in task-log4j.properties, not in log4j.properties.
You can add the following line to your task-log4j.properties:
log4j.logger.org.xxx=WARN
I'm trying to get started with Log4j in a Spring MVC application, but I can't find out what's wrong. Every blog post says the same thing: it's really easy, just put log4j.properties into the /WEB-INF/classes directory. But for me it does not work. The problem is that there is no place to look for an error message. All I know is that the expected log file was not created. Is there some way to debug this? Is putting the log4j.properties file in /WEB-INF/classes really enough?
The above-mentioned log4j.properties file follows:
#Direct log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=D:\\workspace-trainee-actual\\0pokusy\\Sprung\\logik.txt
log4j.appender.file.MaxFileSize=1MB
log4j.appender.file.MaxBackupIndex=1
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ABSOLUTE} %5p %c{1}:%L - %m%n
# Root logger option
log4j.rootLogger=trace, file
Controller using Log4j:
@Controller
public class HelloWorldController {

    private Logger log = Logger.getRootLogger();

    @RequestMapping("/")
    public ModelAndView base() {
        log.debug("base URI");
        ModelAndView mv = new ModelAndView();
        mv.setViewName("index");
        return mv;
    }
}
The only sure fact is that it runs, so log is not null and the Log4j library is available.
Try adding
<context-param>
    <param-name>log4jConfigLocation</param-name>
    <param-value>/WEB-INF/classes/log4j.properties</param-value>
</context-param>
to your web.xml file
I have changed the logging threshold of Geronimo to debug and I found '2011-06-24 07:33:18,353 DEBUG [root] base URI'. I think there is some conflict. The application is one process, one JVM instance with thousands of classes, but with only one root logger for all of them.
It seems that you are right:
From the Geronimo Documentation:
Note that in any case, unless you use hidden-classes or inverse-classloading to load your own copy of log4j separate from the geronimo copy, log4j will not automatically read any log4j.properties files you may have included in your classpath.
I found the last "solution" from the documentation very interesting:
Copy the log4j.properties file by hand to the appropriate location such as var/my-app/log4j.properties. There is no need to include this file in your app.
That is interesting because it allows you to externalize the log4j configuration: the operations team can manage and change it, and you do not need to build and deploy a new version if, for example, the directory where the files are stored changes.
Workaround
The system always works properly. The Log4j system is configured for our instance of the Java Virtual Machine; Geronimo has already done it. We cannot reconfigure the root logger, but we can use it. The default threshold is INFO and the application uses the root logger for a debug message, so we cannot see it anywhere.
If the threshold is decreased to DEBUG, the message appears in the Geronimo log. I changed a line at the beginning of the file $GERONIMO_HOME/var/log/server-log4j.properties:
log4j.rootLogger=DEBUG, CONSOLE, FILE
And in $GERONIMO_HOME/var/log/geronimo.log I can then read:
2011-06-24 20:02:21,375 DEBUG [root] base URI
For some unknown reason, no separate output file is created under either Linux or Windows. We can find the message only in the server log, but it does not matter; we can overcome it. Let's rename the logger in the Log4j configuration:
#Root logger for application
log4j.logger.springTestLogger=TRACE, APLOK
And in the code accordingly:
private Logger log = Logger.getLogger("springTestLogger");
We can then create the separate log file under Linux easily:
cat $GERONIMO_HOME/var/log/geronimo.log | grep springTestLogger > separe.log
Try with
<context-param>
    <param-name>log4jConfigLocation</param-name>
    <param-value>/WEB-INF/properties/log4j.properties</param-value>
</context-param>
<listener>
    <listener-class>org.springframework.web.util.Log4jConfigListener</listener-class>
</listener>