I have a MapReduce job which takes an Avro file as input. I export it, along with all the required libraries (jar files), into a single jar. I have two different clusters: one is the HDInsight simulator and the other is the HDP sandbox. The job works fine on the HDP sandbox, but on the HDInsight simulator it fails because it cannot find the AvroInputFormat class. I tried running the job with the -libjars option, but it didn't help. Here is the error message:
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.avro.mapred.AvroInputFormat not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1927)
at org.apache.hadoop.mapred.JobConf.getInputFormat(JobConf.java:686)
at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:168)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:409)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.avro.mapred.AvroInputFormat not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1895)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1919)
... 9 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.avro.mapred.AvroInputFormat not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1893)
... 10 more
This looks weird because it runs fine on one cluster! Does anyone know what the problem could be?
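For context, -libjars is only applied when the driver lets GenericOptionsParser handle the arguments, which is what ToolRunner does; a driver that builds its JobConf directly in main() will silently ignore the option. A minimal sketch of the ToolRunner pattern, with a hypothetical driver class (mapper/reducer wiring omitted; the old mapred API is used to match the stack trace):

import org.apache.avro.mapred.AvroInputFormat;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver name, for illustration only
public class AvroJobDriver extends Configured implements Tool {
    @Override
    public int run(String[] args) throws Exception {
        // getConf() already reflects -libjars and other generic options
        JobConf conf = new JobConf(getConf(), AvroJobDriver.class);
        conf.setInputFormat(AvroInputFormat.class);
        FileInputFormat.addInputPath(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));
        JobClient.runJob(conf);
        return 0;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses -libjars before handing the remaining args to run()
        System.exit(ToolRunner.run(new Configuration(), new AvroJobDriver(), args));
    }
}

Invoked as, say, hadoop jar myjob.jar AvroJobDriver -libjars /path/to/avro-mapred.jar <in> <out> (jar names and paths are placeholders), this ships the Avro jars to the task nodes so AvroInputFormat can be loaded there.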
Related
I'm trying to execute a simple Pig script through an Oozie workflow which imports a Python jar as well as some other jars, and I eventually get an error like:
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.PigMain], exception invoking main(), java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.PigMain not found
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.PigMain not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1895)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:224)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.ClassNotFoundException: Class org.apache.oozie.action.hadoop.PigMain not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1893)
... 9 more
Oozie Launcher failed, finishing Hadoop job gracefully
For this workflow I added all the jars to the lib directory, including pig.jar.
Please check that the Pig jar is present at a physical location on the node where the Oozie workflow runs.
Alternatively, you can place the Pig jar in the Oozie shared lib location on HDFS and pass the parameter
oozie.use.system.libpath = true
which makes the launcher read the jar from the shared lib location.
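For example, a job.properties along these lines enables the shared lib for a workflow (host names and paths are placeholders, not taken from the question):

nameNode=hdfs://namenode-host:8020
jobTracker=jobtracker-host:8032
oozie.wf.application.path=${nameNode}/user/${user.name}/pig-workflow
# tell the Oozie launcher to also load jars from the shared lib
oozie.use.system.libpath=true

The Pig jar itself would then live under the shared lib on HDFS (conventionally somewhere like /user/oozie/share/lib/pig/, though the exact path depends on the Oozie version and configuration).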
I got the pre-built Spark 1.4.1 and I'm running HDP 2.6. When I try to run spark-shell, it gives me the following error message.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:111)
at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:111)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:111)
at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:97)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:107)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
What is the issue?
A ClassNotFoundException occurs when the class loader cannot find the required class on the classpath. So, basically, you should check your classpath and add the jar containing the missing class to it.
Check whether hadoop-common-0.21.0.jar is on your classpath.
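If this Spark download is one of the "Hadoop free" builds, a common way to do that is to hand Spark the cluster's Hadoop classpath before launching (a sketch, assuming the hadoop command is on the PATH):

# SPARK_DIST_CLASSPATH is the hook Spark's launch scripts use to pick
# up an external Hadoop installation's jars, including hadoop-common
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
./bin/spark-shell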
Is it possible that your Hadoop home is not set, as in here?
Cannot find hadoop installation: $HADOOP_HOME must be set or hadoop must be in the path
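If that is the case, something like the following in your shell profile should fix it (the install path is illustrative; on HDP clusters it is commonly /usr/hdp/current/hadoop-client):

export HADOOP_HOME=/usr/hdp/current/hadoop-client
export PATH=$PATH:$HADOOP_HOME/bin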
I want to use HDFS.jl in Julia. But every time I enter the command
hdfs_connect("localhost",9000)
this error occurs:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR: hdfs connect failed
in hdfs_connect at /home/gxx/.julia/v0.3/HDFS/src/hdfs_dfs.jl:35
in hdfs_connect at /home/gxx/.julia/v0.3/HDFS/src/hdfs_dfs.jl:30
My Hadoop version is 1.2.1, and my classpath is:
export HADOOP_HOME=/home/gxx/usr/hadoop/hadoop-1.2.1
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/*.jar:$HADOOP_HOME/lib/*.jar:$HADOOP_HOME/hadoop-core-1.2.1.jar
Here is the link to HDFS.jl.
How can I fix it?
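One thing to check: the java launcher only expands bare * wildcards (such as $HADOOP_HOME/lib/*), never *.jar patterns, and a JVM embedded through JNI, which is how libhdfs (and therefore HDFS.jl) starts Java, does not expand wildcards at all. A sketch that enumerates the jars explicitly, reusing the paths from the question:

export HADOOP_HOME=/home/gxx/usr/hadoop/hadoop-1.2.1
# hadoop-core-1.2.1.jar is the jar that contains org.apache.hadoop.conf.Configuration
CLASSPATH=$HADOOP_HOME/hadoop-core-1.2.1.jar
for j in $HADOOP_HOME/lib/*.jar; do
    CLASSPATH=$CLASSPATH:$j
done
export CLASSPATH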
I am new to Hadoop and have just set up Hadoop 1.2.1 on my Mac laptop (Mavericks). I then created a simple WordCount project in IntelliJ IDEA and was able to run the code on a dummy text file. However, I am having trouble creating a jar file that replicates my execution through the IDE. I get the following error:
java -jar ./out/artifacts/WordCount_jar/WordCount.jar test.txt out
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:146)
at neu.cs.parallelprogramming.WordCount.main(WordCount.java:48)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
FAIL: 1
Could anyone let me know what I am missing?
I guess you have to specify your class (which implements the Map/Reduce function).
E.g., $ java -jar ./WordCount.jar classWordCount input.txt output
or $ hadoop jar yourprogram.jar yourclass inputpath outputpath
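Note that the missing class here, org.apache.commons.logging.LogFactory, ships with Hadoop rather than with the WordCount jar, so launching through hadoop jar instead of plain java -jar puts it on the classpath automatically. A hedged example reusing the names from the question (the main class comes from the stack trace):

hadoop jar ./out/artifacts/WordCount_jar/WordCount.jar neu.cs.parallelprogramming.WordCount test.txt out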
I am very new to Hadoop and MapReduce programming.
I downloaded version 1.2.1 and was trying to run some of the examples with the command
bin/hadoop jar hadoop*example*.jar
With this command I am getting an exception. What is wrong here? Is there a problem with my installation?
Exception in thread "main" java.lang.NoClassDefFoundError: 1/2/1/hadoop-1/2/1/libexec////logs
Caused by: java.lang.ClassNotFoundException: 1.2.1.hadoop-1.2.1.libexec....logs
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:315)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:330)
at java.lang.ClassLoader.loadClass(ClassLoader.java:250)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:398)
The right command is:
bin/hadoop jar hadoop-*-examples.jar <program name>
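For instance, in the 1.2.1 tarball the examples jar is named hadoop-examples-1.2.1.jar, so a concrete invocation would be (input and output paths assumed):

bin/hadoop jar hadoop-examples-1.2.1.jar wordcount input output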
If you are using your own custom MapReduce class, try the following configuration in your main method:
job.setJarByClass(WordCount.class);
Reference: http://mydailylearningblog.blogspot.com.br/2011/06/javalangclassnotfoundexception.html
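For context, here is a minimal driver sketch showing where that call fits; the mapper and reducer are the usual word-count boilerplate, not code from the question:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "word count");
        // Without setJarByClass the framework may not know which jar to
        // ship to the task nodes, and the mapper/reducer classes then
        // fail to load there with ClassNotFoundException.
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}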