NoClassDefFoundError hadoop/yarn/client/YarnClientImpl - hadoop

I installed Giraph 1.0.0 on my 2-node cluster, built against Hadoop 2.0.4-alpha, whereas my cluster actually runs Hadoop 2.7.1. When I tried to run the Giraph SimpleShortestPathsVertex example, I got the following NoClassDefFoundError.
hadoop jar $GIRAPH_HOME/giraph-examples/target/giraph-examples-1.0.0-for-hadoop-2.0.4-alpha-jar-with-dependencies.jar org.apache.giraph.GiraphRunner org.apache.giraph.examples.SimpleShortestPathsVertex -vif org.apache.giraph.io.formats.JsonLongDoubleFloatDoubleVertexInputFormat -vip /user/hduser/input/tinygraph.txt -of org.apache.giraph.io.formats.IdWithValueTextOutputFormat -op /user/hduser/output/shortestpaths -w 1
The error is:
16/08/05 14:26:09 INFO utils.ConfigurationUtils: No edge input format specified. Ensure your InputFormat does not require one.
16/08/05 14:26:09 WARN job.GiraphConfigurationValidator: Output format vertex index type is not known
16/08/05 14:26:09 WARN job.GiraphConfigurationValidator: Output format vertex value type is not known
16/08/05 14:26:09 WARN job.GiraphConfigurationValidator: Output format edge value type is not known
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/client/YarnClientImpl
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:803)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.giraph.GiraphRunner.run(GiraphRunner.java:83)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.giraph.GiraphRunner.main(GiraphRunner.java:126)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.client.YarnClientImpl
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 21 more
The command I used to build Giraph is:
mvn -Phadoop_yarn -DskipTests -Dhadoop.version=2.0.4-alpha clean package
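If the fix is to rebuild against the Hadoop version actually installed on the cluster, I assume (untested, and assuming the hadoop_yarn profile accepts that version) the command would be the same with only the version changed:
# rebuild the examples against the cluster's Hadoop version instead of 2.0.4-alpha
mvn -Phadoop_yarn -DskipTests -Dhadoop.version=2.7.1 clean package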
I am not sure whether I need to rebuild Apache Giraph or whether the problem lies elsewhere. Thanks in advance.

Related

SPARK SUBMIT: class not found exception

I have a main class XML.scala in project XMLHDFS. I am trying to run a spark-submit job with the command
spark-submit --class com.abc.Import.XMLHDFS.XML /some-path/XMLHDFS.jar
but it gives me a ClassNotFoundException. Please suggest where I am going wrong.
java.lang.ClassNotFoundException: com.abc.Import.XMLHDFS.XML
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.spark.util.Utils$.classForName(Utils.scala:172)
at org.apache.spark.util.Utils$.classForName(Utils.scala:172)
......................
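One sanity check that might narrow this down (using the jar path from the command above) is to list the jar's contents and confirm the class really is packaged under that fully-qualified name:
# the output should contain com/abc/Import/XMLHDFS/XML.class;
# if it does not, the --class argument does not match what is in the jar
jar tf /some-path/XMLHDFS.jar | grep -i xmlhdfs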

Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.InputFormat

I am doing a Sqoop import on Mac OS X 10.9.4 and getting this error:
14/10/24 11:51:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
14/10/24 11:51:41 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
14/10/24 11:51:41 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
14/10/24 11:51:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
14/10/24 11:51:41 INFO tool.CodeGenTool: Beginning code generation
14/10/24 11:51:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_log` AS t LIMIT 1
14/10/24 11:51:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `test_log` AS t LIMIT 1
14/10/24 11:51:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /Users/hadoop/hadoop-2.5.1/share/hadoop/mapreduce
Note: /tmp/sqoop-hadoop/compile/2398ed5be0489f1c76d6fb556f3e0e9d/test_log.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/10/24 11:51:45 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/2398ed5be0489f1c76d6fb556f3e0e9d/test_log.jar
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/InputFormat
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.sqoop.manager.ImportJobContext.<init>(ImportJobContext.java:51)
at com.cloudera.sqoop.manager.ImportJobContext.<init>(ImportJobContext.java:33)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:483)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.InputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 58 more
Hadoop version: 2.5.1
Sqoop version: 1.4.5
OS: OS X 10.9.4
Environment variables are set:
export HADOOP_COMMON_HOME=/Users/hadoop/hadoop-2.5.1/
export HADOOP_MAPRED_HOME=/Users/hadoop/hadoop-2.5.1/share/hadoop/mapreduce/
The Sqoop jar (sqoop-1.4.5.jar) is placed in /Users/hadoop/hadoop-2.5.1/lib/ and the MySQL connector in /Users/hadoop/sqoop/lib. The core jar also exists at
/Users/hadoop/hadoop-2.5.1/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.5.1.jar
Add this to bin/configure-sqoop:
for f in ${HADOOP_MAPRED_HOME}/*.jar; do
  HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f;
done
Somehow the jars were not accessible. After putting the Sqoop jars in HDFS, the Sqoop import went fine.
This link helped
Sqoop jar files not found
cd $SQOOP_HOME/bin
vim configure-sqoop
Add the following line after "add_to_classpath ${SQOOP_JAR_DIR}":
add_to_classpath ${HADOOP_MAPRED_HOME}
Check whether the library can be found in your $HADOOP_COMMON_LIB_NATIVE_DIR. If not, you can put it in the same directory as your MapReduce implementation:
/Users/hadoop/hadoop-2.5.1/share/hadoop/mapreduce/
I had the same problem! Just copy hadoop-mapreduce-client-core-3.0.0-SNAPSHOT.jar into Sqoop's lib folder. Dirty, but it works if you just want to run some tests.
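Whichever workaround you use, a quick way to verify that the missing class is now reachable (as far as I know, configure-sqoop builds its classpath from the same HADOOP_* variables the hadoop script uses) is:
# hadoop-mapreduce-client-core is the jar that provides org.apache.hadoop.mapreduce.InputFormat
hadoop classpath | tr ':' '\n' | grep mapreduce-client-core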

"java.lang.ClassNotFoundException: org.apache.http.impl.client.HttpClientBuilder" when running program in Hadoop

I have written code to expand shortened URLs. It works fine as a standalone program, but when I run it in Hadoop, extracting the URL in the map function, I get the following error. I included all the HttpClient dependencies on the classpath while compiling the code. Please help me.
14/07/28 08:16:40 INFO mapreduce.Job: Task Id : attempt_1405534657345_0008_m_000001_2, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.http.impl.client.HttpClientBuilder
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.myorg.ExtractUrl.unshortenSingleLevel(ExtractUrl.java:80)
at org.myorg.ExtractUrl.unshorten(ExtractUrl.java:118)
at org.myorg.ExtractUrl$Map.map(ExtractUrl.java:40)
at org.myorg.ExtractUrl$Map.map(ExtractUrl.java:27)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
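The stack trace shows the class going missing inside the map task, not in the driver, so having the HttpClient jars on the compile-time classpath is not enough; they also need to reach the task JVMs. One common way to do that (the jar name and paths below are illustrative, and -libjars is only honored if the driver goes through ToolRunner/GenericOptionsParser) is:
# ship the HttpClient dependencies with the job so the map tasks can load them
hadoop jar extracturl.jar org.myorg.ExtractUrl \
  -libjars /path/to/httpclient-4.x.jar,/path/to/httpcore-4.x.jar \
  input_path output_path
Alternatively, packaging the same jars inside a lib/ directory of the job jar puts them on the task classpath as well.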

Error in copying jars to /home/hadoop/lib on EMR

I am copying my external jars to the /home/hadoop/lib directory in EMR as a bootstrap action, but it shows the following error during the bootstrap process:
Exception in thread "main" java.lang.IncompatibleClassChangeError: class com.google.common.cache.CacheBuilder$3 has interface com.google.common.base.Ticker as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at com.google.common.cache.CacheBuilder.<clinit>(CacheBuilder.java:207)
at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.<clinit>(ShellBasedUnixGroupsMapping.java:46)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:861)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:906)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:932)
at org.apache.hadoop.security.Groups.<init>(Groups.java:48)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:140)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:205)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:477)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1521)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1422)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:256)
at com.amazon.elasticmapreduce.scriptrunner.ScriptRunner.fetchFile(ScriptRunner.java:39)
at com.amazon.elasticmapreduce.scriptrunner.ScriptRunner.main(ScriptRunner.java:56)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:187)
Can anyone help me understand why this is happening?
I found the answer. The problem was that Mahout 0.7 uses an older version of Guava. I replaced Mahout 0.7 with 0.8 and it worked.
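For anyone debugging the same IncompatibleClassChangeError, a quick way to see which Guava copies the bootstrap action actually put on the node (path taken from the question) is:
# list every Guava jar that was copied into the lib directory
ls -l /home/hadoop/lib | grep -i guava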

Shutting down NameNode when I tried to start the NameNode service

When I run start-all.sh, the NameNode does not start, and the log shows the following. Please help.
ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.lang.ClassFormatError: Unknown constant tag 59 in class file org/mortbay/util/DateCache
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at org.mortbay.log.StdErrLog.<clinit>(StdErrLog.java:38)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:188)
at org.mortbay.log.Log.class$(Log.java:67)
at org.mortbay.log.Log.<clinit>(Log.java:72)
at org.mortbay.component.Container.add(Container.java:200)
at org.mortbay.component.Container.update(Container.java:164)
at org.mortbay.component.Container.update(Container.java:106)
at org.mortbay.jetty.Server.setConnectors(Server.java:160)
at org.mortbay.jetty.Server.addConnector(Server.java:134)
at org.apache.hadoop.http.HttpServer.<init>(HttpServer.java:158)
at org.apache.hadoop.http.HttpServer.<init>(HttpServer.java:137)
at org.apache.hadoop.hdfs.server.namenode.NameNode$1.run(NameNode.java:363)
at org.apache.hadoop.hdfs.server.namenode.NameNode$1.run(NameNode.java:358)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:358)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:306)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:461)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1208)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1217)
2014-02-21 15:13:07,336 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at DWNPCPU229/127.0.1.1
************************************************************/
UPDATE
Hadoop version is 0.20.2-cdh3u0
and Java version is "1.7.0_15".
Could it be an incompatibility problem?
I was right. The problem was version incompatibility. I installed Hadoop 0.20.2-cdh3u4 and now it starts fine.
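For reference, the two versions mentioned in the UPDATE can be re-checked directly on the node with the standard commands:
# confirm which Java and Hadoop versions the start scripts are actually picking up
java -version
hadoop version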
