No appenders could be found for logger - Hadoop 3.2.1

I am running multi-node Hadoop (3.2.1) on virtual machines: one master node and one slave node.
I am getting this error in my datanode's logs/userlogs:
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapred.YarnChild).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
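These WARN lines mean that the task JVM (org.apache.hadoop.mapred.YarnChild) started log4j 1.2 without finding any configuration on its classpath, so no appender was attached to the root logger. As a minimal sketch, a log4j.properties like the one below silences the warning once it is visible on the task classpath; where to put it (for example under $HADOOP_CONF_DIR) depends on how your container classpath is assembled, so treat the location as an assumption:

# Minimal log4j 1.2 configuration: send everything at INFO and above to the console.
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} - %m%n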

Related

Error while starting thrift server for kerberos enabled hbase

I have enabled Kerberos on the HDP cluster and am trying to connect to HiveServer2 using Python, but I am getting an error from the Thrift server.
I tried starting the Thrift server but got the error below. Please help.
backtrace:
hdp296m1:~ # hbase thrift start
2017-01-20 02:06:43,953 INFO [main] util.VersionInfo: HBase 1.1.2.2.4.2.0-258
2017-01-20 02:06:43,955 INFO [main] util.VersionInfo: Source code repository file:///grid/0/jenkins/workspace/HDP-build-suse11sp3/bigtop/build/hbase/rpm/BUILD/hbase-1.1.2.2.4.2.0 revision=Unknown
2017-01-20 02:06:43,955 INFO [main] util.VersionInfo: Compiled by jenkins on Sun Apr 24 16:30:34 UTC 2016
2017-01-20 02:06:43,955 INFO [main] util.VersionInfo: From source with checksum 4f661ee4f9f148ce7bfcad5b0d667c27
2017-01-20 02:06:44,677 INFO [main] thrift.ThriftServerRunner: Using default thrift server type
2017-01-20 02:06:44,677 INFO [main] thrift.ThriftServerRunner: Using thrift server type threadpool
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.4.2.0-258/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Exception in thread "main" java.io.IOException: Running in secure mode, but config doesn't have a keytab
at org.apache.hadoop.security.SecurityUtil.login(SecurityUtil.java:273)
at org.apache.hadoop.hbase.security.User$SecureHadoopUser.login(User.java:386)
at org.apache.hadoop.hbase.security.User.login(User.java:253)
at org.apache.hadoop.hbase.security.UserProvider.login(UserProvider.java:115)
at org.apache.hadoop.hbase.thrift.ThriftServerRunner.<init>(ThriftServerRunner.java:303)
at org.apache.hadoop.hbase.thrift.ThriftServer.doMain(ThriftServer.java:93)
at org.apache.hadoop.hbase.thrift.ThriftServer.main(ThriftServer.java:232)
HBase config:
Kerberos-related configuration properties for HBase (not reproduced here).
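The exception points at the fix: in secure mode the Thrift server needs its own keytab and principal configured. A hedged sketch of the hbase-site.xml entries, where the keytab path and realm are placeholders for your environment, might look like:

<!-- Hedged sketch: the keytab path and realm below are placeholders. -->
<property>
  <name>hbase.thrift.keytab.file</name>
  <value>/etc/security/keytabs/hbase.service.keytab</value>
</property>
<property>
  <name>hbase.thrift.kerberos.principal</name>
  <value>hbase/_HOST@YOUR.REALM</value>
</property>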

Oozie sharelib creation in HDFS (root is not able to impersonate root)

I am following http://hadooptutorial.info/apache-oozie-installation-on-ubuntu-14-04/ to install Oozie 4.1.0 with Hadoop 2.7.2.
The build is successful and I am able to create the Oozie WAR by issuing this command:
hduser@master:~/oozie/oozie-bin$ sudo bin/oozie-setup.sh prepare-war
New Oozie WAR file with added 'ExtJS library, JARs' at /home/hduser/oozie/oozie-bin/oozie-server/webapps/oozie.war
INFO: Oozie is ready to be started
But when I issue the command for creating the sharelib, I get an error:
hduser@master:~/oozie/oozie-bin$ sudo bin/oozie-setup.sh sharelib create -fs hdfs://master:9000
Output:
setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/oozie/oozie-bin/libtools/slf4j-simple-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/oozie/oozie-bin/libtools/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/oozie/oozie-bin/libext/slf4j-log4j12-1.6.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
the destination path for sharelib is: /user/root/share/lib/lib_20160614094056
Error: User: root is not allowed to impersonate root
Stack trace for the error was (for debug purposes):
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate root
at org.apache.hadoop.ipc.Client.call(Client.java:1406)
at org.apache.hadoop.ipc.Client.call(Client.java:1359)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy7.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy7.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:671)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1746)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1399)
at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:496)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1904)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1872)
at org.apache.oozie.tools.OozieSharelibCLI.run(OozieSharelibCLI.java:165)
at org.apache.oozie.tools.OozieSharelibCLI.main(OozieSharelibCLI.java:56)
I also restarted my Hadoop cluster, but with no success.
Here is my core-site.xml:
<property>
  <name>hadoop.proxyuser.hduser.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hduser.groups</name>
  <value>*</value>
</property>
Can anyone help?
Do not use sudo when creating the sharelib and it will work. sudo runs the command as root, but the core-site.xml above only configures hduser as a proxy user, which is exactly why the call fails with "User: root is not allowed to impersonate root".
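If you really do need to run the setup as root, a hedged alternative is to whitelist root as a proxy user in core-site.xml, following the same hadoop.proxyuser.<user>.* pattern already shown above, and restart HDFS afterwards:

<!-- Assumption: only needed if you insist on running oozie-setup.sh via sudo/root. -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>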

log4j config issue with socket appender

I get an error when I try to run the project using a socket appender, but it works fine when I print logs to a file.
INFO: Initializing Spring root WebApplicationContext
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
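The WARN lines mean log4j found no configuration at all on the classpath, regardless of appender type. As a hedged sketch, a log4j 1.2 socket appender configuration could look like the following; the remote host and port are assumptions, and a receiver (for example org.apache.log4j.net.SimpleSocketServer) must be listening there, because a SocketAppender ships serialized logging events rather than formatted text:

# Hedged sketch: RemoteHost and Port are placeholders for your log server.
log4j.rootLogger=INFO, socket
log4j.appender.socket=org.apache.log4j.net.SocketAppender
log4j.appender.socket.RemoteHost=localhost
log4j.appender.socket.Port=4712
log4j.appender.socket.ReconnectionDelay=10000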

How to solve "log4j:WARN No appenders could be found for logger" error on Twenty Newsgroups Classification Example

I am trying to run the 20newsgroups classification example in Mahout. I have set MAHOUT_LOCAL=true; the classifier doesn't display the confusion matrix and gives the following warnings:
ok. You chose 1 and we'll use cnaivebayes
creating work directory at /tmp/mahout-work-cloudera
+ echo 'Preparing 20newsgroups data'
Preparing 20newsgroups data
+ rm -rf /tmp/mahout-work-cloudera/20news-all
+ mkdir /tmp/mahout-work-cloudera/20news-all
+ cp -R /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/alt.atheism /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/comp.graphics /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/comp.os.ms-windows.misc /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/comp.sys.ibm.pc.hardware /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/comp.sys.mac.hardware /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/comp.windows.x /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/misc.forsale /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/rec.autos /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/rec.motorcycles /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/rec.sport.baseball /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/rec.sport.hockey /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/sci.crypt /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/sci.electronics /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/sci.med /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/sci.space /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/soc.religion.christian /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/talk.politics.guns /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/talk.politics.mideast /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/talk.politics.misc /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-test/talk.religion.misc /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/alt.atheism /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/comp.graphics /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/comp.os.ms-windows.misc /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/comp.sys.ibm.pc.hardware /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/comp.sys.mac.hardware /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/comp.windows.x /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/misc.forsale /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/rec.autos /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/rec.motorcycles /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/rec.sport.baseball /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/rec.sport.hockey /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/sci.crypt /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/sci.electronics /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/sci.med /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/sci.space /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/soc.religion.christian /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/talk.politics.guns /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/talk.politics.mideast /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/talk.politics.misc /tmp/mahout-work-cloudera/20news-bydate/20news-bydate-train/talk.religion.misc /tmp/mahout-work-cloudera/20news-all
+ '[' '' '!=' '' ']'
+ echo 'Creating sequence files from 20newsgroups data'
Creating sequence files from 20newsgroups data
+ ./bin/mahout seqdirectory -i /tmp/mahout-work-cloudera/20news-all -o /tmp/mahout-work-cloudera/20news-seq -ow
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/mahout-examples-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/dependency/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.mahout.common.AbstractJob).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
+ echo 'Converting sequence files to vectors'
Converting sequence files to vectors
+ ./bin/mahout seq2sparse -i /tmp/mahout-work-cloudera/20news-seq -o /tmp/mahout-work-cloudera/20news-vectors -lnorm -nv -wt tfidf
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/mahout-examples-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/dependency/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
+ echo 'Creating training and holdout set with a random 80-20 split of the generated vector dataset'
Creating training and holdout set with a random 80-20 split of the generated vector dataset
+ ./bin/mahout split -i /tmp/mahout-work-cloudera/20news-vectors/tfidf-vectors --trainingOutput /tmp/mahout-work-cloudera/20news-train-vectors --testOutput /tmp/mahout-work-cloudera/20news-test-vectors --randomSelectionPct 40 --overwrite --sequenceFiles -xm sequential
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/mahout-examples-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/dependency/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.mahout.driver.MahoutDriver).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
+ echo 'Training Naive Bayes model'
Training Naive Bayes model
+ ./bin/mahout trainnb -i /tmp/mahout-work-cloudera/20news-train-vectors -el -o /tmp/mahout-work-cloudera/model -li /tmp/mahout-work-cloudera/labelindex -ow -c
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/mahout-examples-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/dependency/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.mahout.driver.MahoutDriver).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
+ echo 'Self testing on training set'
Self testing on training set
+ ./bin/mahout testnb -i /tmp/mahout-work-cloudera/20news-train-vectors -m /tmp/mahout-work-cloudera/model -l /tmp/mahout-work-cloudera/labelindex -ow -o /tmp/mahout-work-cloudera/20news-testing -c
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/mahout-examples-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/dependency/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.mahout.driver.MahoutDriver).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
+ echo 'Testing on holdout set'
Testing on holdout set
+ ./bin/mahout testnb -i /tmp/mahout-work-cloudera/20news-test-vectors -m /tmp/mahout-work-cloudera/model -l /tmp/mahout-work-cloudera/labelindex -ow -o /tmp/mahout-work-cloudera/20news-testing -c
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
MAHOUT_LOCAL is set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/mahout-examples-1.0-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/cloudera/mahout-master/examples/target/dependency/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.mahout.driver.MahoutDriver).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Please give me a solution. Thanks.
After many hours of searching I found this solution:
Create a log4j.properties file.
This is the main properties file holding all the runtime configuration used by log4j: the appender definitions, log levels, and output file names for file appenders.
# Root logger: DEBUG and above, sent to both the console and a rolling file.
log4j.rootLogger=DEBUG, consoleAppender, fileAppender

# Console appender.
log4j.appender.consoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.consoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.consoleAppender.layout.ConversionPattern=[%t] %-5p %c %x - %m%n

# Rolling file appender writing to demoApplication.log.
log4j.appender.fileAppender=org.apache.log4j.RollingFileAppender
log4j.appender.fileAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.fileAppender.layout.ConversionPattern=[%t] %-5p %c %x - %m%n
log4j.appender.fileAppender.File=demoApplication.log
This error was due to the "log4j.properties" file missing. You can find more info about this at http://www.tutorialspoint.com/log4j/log4j_configuration.htm
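The file just needs to be on the classpath that bin/mahout builds, or you can point log4j at it explicitly. A hedged sketch (the path below is a placeholder, and this assumes your bin/mahout script forwards MAHOUT_OPTS to the JVM):

# Assumption: bin/mahout passes MAHOUT_OPTS through to the JVM; the path is a placeholder.
export MAHOUT_OPTS="-Dlog4j.configuration=file:///home/cloudera/log4j.properties"
./bin/mahout seqdirectory -i /tmp/mahout-work-cloudera/20news-all -o /tmp/mahout-work-cloudera/20news-seq -ow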

hdfs get fails on a particular client node

I have a strange problem with HDFS. While get operations on an existing file work like a charm on all clients accessing the HDFS cluster, they fail on one client:
Working host:
[user@host1]$ hadoop fs -ls /path/to/file.csv
Found 1 items
-rw-r--r-- 3 compute supergroup 1628 2013-12-10 12:22 /path/to/file.csv
[user@host1]$ hadoop fs -get /path/to/file.csv /tmp/test.csv
[user@host1]$ cat /tmp/test.csv
48991,24768,2013-12-10 00:00:00,1,0.0001,0.0001
Not working host:
[user@host2]$ hadoop fs -ls /path/to/file.csv
Found 1 items
-rw-r--r-- 3 compute supergroup 1628 2013-12-10 12:22 /path/to/file.csv
[user@host2]$ hadoop fs -get /path/to/file.csv /tmp/test.csv
get: java.lang.NullPointerException
[user@host2]$ cat /tmp/test.csv
cat: /tmp/test.csv: No such file or directory
Using a Java HDFS client on the working host:
[user@host1]$ java -jar hadoop_get-1.0-SNAPSHOT-jar-with-dependencies.jar hdfs://my.namenode:port /path/to/file.csv
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
48991,24768,2013-12-10 00:00:00,1,0.0001,0.0001
Using a Java HDFS client on the non-working host:
[user@host2]$ java -jar hadoop_get-1.0-SNAPSHOT-jar-with-dependencies.jar hdfs://my.namenode:port /path/to/file.csv
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException): java.lang.NullPointerException
at org.apache.hadoop.ipc.Client.call(Client.java:1225)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
at com.sun.proxy.$Proxy9.getBlockLocations(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:154)
at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:957)
at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:947)
at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:171)
at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:138)
at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:131)
at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1104)
at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:246)
at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:79)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:711)
at my.namespace.client.Client.main(Client.java:34)
This was resolved for us by deploying the client configurations, refreshing the cluster, and restarting HDFS.
Are you using CDH4? We got the same problem after upgrading from CDH3.
Try checking the reverse DNS lookup for the problem host - the only difference we found between the problem host and the healthy hosts was in DNS resolution. After fixing it, everything was OK.
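To compare resolution between a healthy host and the problem host, a quick sketch like this can help; the hostname and IP below are placeholders:

# Run on both hosts and compare the answers; my.namenode and 10.0.0.5 are placeholders.
getent hosts my.namenode    # forward lookup, as the JVM's resolver sees it
host 10.0.0.5               # reverse lookup of a cluster node's address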
