I get these messages involving Flume, Hive, and Hadoop every time I start Flume. What is the best way to avoid them? I was thinking of removing one jar from Flume's lib directory, but I'm not sure whether that would affect the others (Hive, Hadoop) or not.
Info: Sourcing environment configuration script /usr/local/flume/conf/flume-env.sh
Info: Including Hadoop libraries found via (/usr/local/hadoop/bin/hadoop) for HDFS access
+ exec /usr/java/jdk1.7.0_79/bin/java -Xms100m -Xmx200m -Dcom.sun.management.jmxremote -cp '/usr/local/flume/conf:/usr/local/flume/lib/*:/usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/yarn/lib/*:/usr/local/hadoop/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/contrib/capacity-scheduler/*.jar' -Djava.library.path=:/usr/local/hadoop/lib/native org.apache.flume.node.Application --conf-file /usr/local/flume/conf/spooling3.properties --name agent1
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/flume/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
The log messages you mention can be treated as ordinary warning messages; they are not errors.
If you take a look at https://issues.apache.org/jira/browse/FLUME-2913, you can see some further explanation.
The way the classpath is constructed for Apache Flume is that the bin/flume-ng bash script collects the classpath entries from HBase and HDFS and combines them with Flume's own classpath.
If a different slf4j jar appears anywhere along that combined path, you will see this warning.
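If the warning bothers you, a common workaround (a sketch, using the paths from your startup log) is to move Flume's own binding aside so only Hadoop's copy is found; this touches nothing under the Hadoop or Hive installations:

# back up Flume's duplicate SLF4J binding (path taken from the log above)
mv /usr/local/flume/lib/slf4j-log4j12-1.6.1.jar /usr/local/flume/lib/slf4j-log4j12-1.6.1.jar.bak

Keeping the renamed file around lets you restore it if anything else breaks.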
Whenever I run the "hive" command in a terminal, it prints a few errors before starting. I've seen some of these errors on other people's machines, but not this particular one that says there is no hbase. Queries don't run because of these errors.
/usr/bin/which: no hbase in (/home/evirac/.local/bin:/home/evirac/bin:/usr/lib64/ccache:/usr/local/bin:/usr/local/sbin:/usr/bin:/usr/sbin:/var/lib/snapd/snap/bin:/home/evirac/hadoop/hadoop/sbin:/home/evirac/hadoop/hadoop/bin:/home/evirac/hadoop/hadoop/sbin:/home/evirac/hadoop/hadoop/bin:/home/evirac/hadoop/hive/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/evirac/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/evirac/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 601ecfde-b638-4970-bd2d-0287e5414201
Logging initialized using configuration in jar:file:/home/evirac/hadoop/hive/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> SHOW DATABASES;
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
hive>
Hive shouldn't have a direct dependency on HBase, but you could install it and add its bin directory to your PATH - https://hbase.apache.org
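For example, after unpacking an HBase release you would add something like this to ~/.bashrc (the install location below is hypothetical; adjust it to wherever you unpack HBase):

# hypothetical HBase install location
export HBASE_HOME=/home/evirac/hadoop/hbase
export PATH=$PATH:$HBASE_HOME/bin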
However, your error is related to the Hive metastore, not HBase. The work on the HBase metastore was abandoned in Hive 3 - https://cwiki.apache.org/confluence/display/Hive/HBaseMetastoreDevelopmentGuide
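A common cause of that SessionHiveMetaStoreClient failure is an uninitialized metastore schema. A minimal check, assuming the default embedded Derby metastore and the Hive path shown in your log:

# initialize the embedded Derby metastore schema
/home/evirac/hadoop/hive/bin/schematool -dbType derby -initSchema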
As asked before: what happens when you use beeline, the preferred Hive client?
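For reference, beeline can run in embedded mode without a HiveServer2 process, or connect to one (the host, port, and username below assume the HiveServer2 defaults and the username from your PATH output):

# embedded mode, no server required
beeline -u jdbc:hive2://
# against a running HiveServer2 on the default port
beeline -u jdbc:hive2://localhost:10000 -n evirac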
I downloaded the latest JMeter version (5.1.1) from https://jmeter.apache.org/download_jmeter.cgi. After extracting it to my primary drive, I ran jmeter.bat from the command prompt, which produces the error below.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/JMeter/lib/ApacheJMeter_slf4j_logkit.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/JMeter/lib/log4j-slf4j-impl-2.11.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.jmeter.logging.LogkitLoggerFactory]
java.lang.InstantiationError: org.apache.log.Logger
The java -version command shows JRE 12.0.1 (64-bit).
JAVA_HOME is set properly on my PC.
Can anyone tell me what is missing to resolve this error?
Your classpath has two SLF4J bindings:
C:/JMeter/lib/ApacheJMeter_slf4j_logkit.jar!/org/slf4j/impl/StaticLoggerBinder.class
C:/JMeter/lib/log4j-slf4j-impl-2.11.1.jar!/org/slf4j/impl/StaticLoggerBinder.class
Try running jmeter.bat again after removing the first one.
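For example, renaming the jar instead of deleting it lets you roll back if needed (the path is taken from your log):

rem back up the conflicting SLF4J binding, then start JMeter again
ren C:\JMeter\lib\ApacheJMeter_slf4j_logkit.jar ApacheJMeter_slf4j_logkit.jar.bak
C:\JMeter\bin\jmeter.bat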
You can refer to this link for JMeter installation and setup.
English is not my native language; please excuse typing errors.
I tried to install Hive with Hadoop in a Linux environment following this tutorial. Hadoop is installed correctly, but when I try to install Hive I get the following output in my shell:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/phd2014/hive/lib/hive-jdbc-2.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/phd2014/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/phd2014/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/home/phd2014/hive/lib/hive-common-2.0.0.jar!/hive-log4j2.properties
Java HotSpot(TM) Client VM warning: You have loaded library /home/phd2014/hadoop/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)
In my ~/.bashrc file I put the following:
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_PREFIX=/home/phd2014/hadoop
export HADOOP_HOME=/home/phd2014/hadoop
export HADOOP_MAPRED_HOME=/home/phd2014/hadoop
export HADOOP_COMMON_HOME=/home/phd2014/hadoop
export HADOOP_HDFS_HOME=/home/phd2014/hadoop
export YARN_HOME=/home/phd2014/hadoop
export HADOOP_CONF_DIR=/home/phd2014/hadoop/etc/hadoop
export HIVE_HOME=/home/phd2014/hive
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin:$HIVE_HOME/bin
I also export the HADOOP_HOME and HIVE_HOME variables in my .profile file.
The answers to this question didn't work for me; I also ran the command to create the schema, but it failed: schematool -dbType derby -initSchema
One more thing I think could help is modifying the pom.xml file to avoid the multiple SLF4J bindings, but I can't find it. I tried this, but I didn't find the file.
Thanks in advance
SLF4J is a logging API. It binds dynamically to an implementation, but it expects only one to be present. In your case it appears that you have three jars providing an SLF4J implementation: hive-jdbc-2.0.0-standalone.jar, log4j-slf4j-impl-2.4.1.jar, and slf4j-log4j12-1.7.10.jar.
hive-jdbc-2.0.0-standalone.jar appears to be a "shaded" jar - it includes the classes from multiple third-party jars, including the contents of log4j-slf4j-impl. I am guessing that this is what SLF4J actually selected, since it was the first one found.
The issue is that you are somehow including jars that the standalone jar has already incorporated. Normally with a standalone jar, everything you need should already be in that jar.
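If you want to silence the warning while you debug, a sketch (paths taken from your output) is to move one of the duplicate bindings aside:

# keep one binding; back up the duplicate in Hive's lib directory
mv /home/phd2014/hive/lib/log4j-slf4j-impl-2.4.1.jar /home/phd2014/hive/lib/log4j-slf4j-impl-2.4.1.jar.bak

Note that the standalone jar shades its own copy as well, so the warning may not disappear entirely. Either way, the binding warning is separate from the metastore error at the end of your output.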
When I try to install Hive 2.0.0 I get the error I posted, but if I install version 1.2.1 instead it works fine, just by setting the environment variables and creating the /user/hive/warehouse directory in HDFS. It must be a bug in the new version. My recommendation is to install version 1.2.1 instead of 2.0.0.
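For reference, the warehouse directory mentioned above is created like this (assuming HDFS is running and the hadoop binaries are on your PATH, as set in the question's .bashrc):

# create the Hive scratch and warehouse directories in HDFS
hdfs dfs -mkdir -p /tmp
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod g+w /tmp
hdfs dfs -chmod g+w /user/hive/warehouse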
I'm new to HBase. I'm running an HBase cluster on 2 machines (the master on one machine and a regionserver on the second).
When I start the hbase shell using:
bin/hbase shell
and I create a table using this syntax:
create 't1', 'f1'
I get the following errors:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/hbase-0.98.8-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-1.0.4/lib/slf4j-log4j12-1.4.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
ERROR: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
I'm using HBase version 0.98.8-hadoop2 and my Hadoop version is Hadoop 1.0.4. And I'm running this on an Ubuntu Virtual Machine.
I think the recent HBase 0.98.8 won't work on Hadoop 1.x: your build is 0.98.8-hadoop2, which is compiled against Hadoop 2, while your cluster runs Hadoop 1.0.4. If you have time, check the requirements, which are explained in the book HBase: The Definitive Guide by Lars George, or have a look at this site.
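To confirm the mismatch, compare what each installation reports (both commands exist in the respective distributions; paths are taken from your log):

# print the Hadoop build the cluster is running
/usr/local/hadoop/hadoop-1.0.4/bin/hadoop version
# print the HBase build and the Hadoop version it was compiled against
/home/hduser/hbase-0.98.8-hadoop2/bin/hbase version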
I have a Hadoop streaming job which fails for some reason. To find out why, I looked at the stderr of the corresponding failed task, but there is only a message about log4j not being initialized:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.ipc.Server).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
The referenced website says this means the default configuration files log4j.properties and log4j.xml cannot be found and the application performs no explicit configuration.
On my system, the log4j.properties file is located in the usual ${HADOOP_HOME}/etc/hadoop/ directory. Why can't Hadoop find it? Is it because a streaming job is not supposed to log via log4j anyway? Is it possible to see the stdout/stderr of a streaming job written in, e.g., Perl?
Thanks!
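For what it's worth, on YARN the per-task stdout/stderr (including whatever your Perl script prints) can usually be retrieved with the yarn CLI after the job finishes, assuming log aggregation is enabled; the application id below is a placeholder:

# fetch aggregated container logs (stdout, stderr, and syslog per container)
yarn logs -applicationId application_1400000000000_0001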