ClassNotFoundException while running a Hadoop example job (again)

I saw the post below and followed its process, but it didn't work:
ClassNotFoundException, while running example job of Hadoop
Please help me.
created mapreduce-0.1-tests.jar (it does contain MapReduceTest.class)
copied the input file from the local filesystem:
hadoop dfs -copyFromLocal /Users/test/Documents/movie_titles_only.csv input
hadoop jar /Users/test/project/mapReduce/target/mapreduce-0.1-tests.jar MapReduceTest -D mapred.reduce.tasks=0 input/movie_titles_only.csv movie_output
But I got the same message... What should I do? Please help!
Exception in thread "main" java.lang.ClassNotFoundException: MapReduceTest
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)

You need to use job.setJarByClass to specify your jar. Add the following statement to your code:
job.setJarByClass(MapReduceTest.class);
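For context, a minimal driver sketch showing where that statement goes; the mapper and reducer setup is omitted, and the argument handling here is illustrative rather than taken from the original post:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class MapReduceTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Strips generic options such as -D mapred.reduce.tasks=0 from the args.
        String[] remaining = new GenericOptionsParser(conf, args).getRemainingArgs();

        Job job = new Job(conf, "MapReduceTest");
        // Tells Hadoop which jar to ship to the cluster: the jar containing this class.
        job.setJarByClass(MapReduceTest.class);
        // job.setMapperClass(...) / job.setReducerClass(...) go here.

        FileInputFormat.addInputPath(job, new Path(remaining[0]));
        FileOutputFormat.setOutputPath(job, new Path(remaining[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Note that setJarByClass only helps the tasks find the class on the cluster. If RunJar itself cannot find MapReduceTest (as in the trace above), also double-check that the class really sits at the root of the jar, i.e. that it has no package, or else pass the fully qualified class name.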

Related

Flume: Could not find the main class: org.apache.flume.tools.GetJavaProperty

I am using Cloudera CDH 4.4. When I ran this Flume command:
bin/flume-ng agent -n agentA -f conf/MultipleFlumes.properties -Dflume.root.logger=INFO,console
I got an error:
[cloudera@localhost Flume]$ bin/flume-ng agent -n agentA -f conf/MultipleFlumes.properties -Dflume.root.logger=INFO,console
Warning: No configuration directory set! Use --conf <dir> to override.
Info: Including Hadoop libraries found via (/usr/bin/hadoop) for HDFS access
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flume/tools/GetJavaProperty
Caused by: java.lang.ClassNotFoundException: org.apache.flume.tools.GetJavaProperty
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.flume.tools.GetJavaProperty. Program will exit.
Info: Excluding /usr/lib/hadoop/lib/slf4j-api-1.6.1.jar from classpath
Info: Excluding /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar from classpath
Info: Excluding /usr/lib/hadoop-0.20-mapreduce/lib/slf4j-api-1.6.1.jar from classpath
Info: Including HBASE libraries found via (/usr/bin/hbase) for HBASE access
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flume/tools/GetJavaProperty
Caused by: java.lang.ClassNotFoundException: org.apache.flume.tools.GetJavaProperty
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.flume.tools.GetJavaProperty. Program will exit.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flume/node/Application
Caused by: java.lang.ClassNotFoundException: org.apache.flume.node.Application
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.flume.node.Application. Program will exit.
I tried to echo HADOOP_HOME but it returned blank. What is the problem with the above command?
Please advise.
First of all, add the -c parameter to the command like this:
bin/flume-ng agent -n agentA -c conf -f conf/MultipleFlumes.properties -Dflume.root.logger=INFO,console
Adding that parameter does not resolve the issue by itself, but if you don't include it you get another error because of the log4j configuration file.
As for your problem, check whether FLUME_HOME is defined, and if it is, unset it with:
unset FLUME_HOME
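To see whether it is defined in the first place, you can run:
echo $FLUME_HOME
If that prints nothing, FLUME_HOME is not set and the problem likely lies elsewhere in the classpath setup.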

Cascading + libjars = ClassNotFoundException. Sometimes

I am running a Cascading (actually Scalding) Hadoop job that uses the DistributedCache for dependent jars.
The first time it works fine (meaning the classpath is set up correctly), but then it starts failing with a ClassNotFoundException:
java.io.IOException: Split class cascading.tap.hadoop.io.MultiInputSplit not found
at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:387)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.ClassNotFoundException: cascading.tap.hadoop.io.MultiInputSplit
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:385)
...
Did anybody else have success with Cascading and jars in the DistributedCache?
This message seems to imply that Cascading has some internal handling of the distributed cache jars. Any light you can shed on this?
Edit: I am using Cascading 2.1.6 on Hadoop 1.0.3
Which version of Hadoop are you using? There are some problems with the distributed cache in 0.20.2. Can you try switching to a newer version?
Chris K Wensel, the author of Cascading, responded on the mailing list that Cascading does not do anything with the DistributedCache.
I looked further and it was a problem in my own code: I did not add the files to the DistributedCache properly.
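For anyone hitting the same thing, here is a minimal sketch of what adding a dependent jar properly can look like on Hadoop 1.x; the HDFS path is illustrative, and the jar must already have been uploaded to HDFS:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;

public class AddJarsToCache {
    public static void addDependentJar(Configuration conf) throws IOException {
        // Illustrative HDFS location; upload the dependent jar there first.
        Path jarOnHdfs = new Path("/user/me/libs/cascading-core-2.1.6.jar");
        // Registers the jar in the distributed cache and on the task classpath,
        // so task JVMs can resolve classes such as cascading.tap.hadoop.io.MultiInputSplit.
        DistributedCache.addFileToClassPath(jarOnHdfs, conf);
    }
}

The easy mistake is calling DistributedCache.addCacheFile instead: that only localizes the file on each task node and does not put it on the task classpath.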

HBase completebulkload returns exception

I am trying to bulk-populate an HBase table quickly from a text file (several GB) by using the bulk loading method described in the Hadoop docs.
I have created an HFile which I now want to push to my HBase table.
When I use this command:
hadoop jar /home/hxcaine/hadoop/lib/hbase.jar completebulkload /user/hxcaine/dbpopulate/output/cf1 my_hbase_table
The job starts and then I get this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/util/concurrent/ThreadFactoryBuilder
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:195)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:696)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:701)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
Caused by: java.lang.ClassNotFoundException: com.google.common.util.concurrent.ThreadFactoryBuilder
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
... 17 more
However, I can see that the Guava jar is in my classpath and when I check inside the jar I can see ThreadFactoryBuilder.class.
I am using these versions (and am stuck with them):
Hadoop 0.20.2-cdh3u3
HBase 0.90.4-cdh3u3
Guava jar: /usr/lib/hadoop-0.20/lib/guava-r09-jarjar.jar
I do have an older Guava jar in my classpath, but I don't know where it came from; I don't suppose it should have an effect.
Any ideas?
What happens if you run:
export HADOOP_CLASSPATH=`hbase classpath`
before running the load? From the stack trace, it looks like the jar is needed by one of the actual tasks, though I am surprised to see that this actually kicks off an M/R job.
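In other words, run both commands in the same shell session, for example:
export HADOOP_CLASSPATH=`hbase classpath`
hadoop jar /home/hxcaine/hadoop/lib/hbase.jar completebulkload /user/hxcaine/dbpopulate/output/cf1 my_hbase_table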

Errors when running accumulo init

I have Hadoop and ZooKeeper running without a problem, but when I go to run $ACCUMULO_HOME/bin/accumulo init, this happens:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/start/Platform
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.start.Platform
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/start/Main
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.start.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
I can't seem to find anything helpful.
This error occurs when trying to run Accumulo from source code that has not been compiled with Maven.
I assume this may have been resolved by now? If not, it looks like Accumulo has not been compiled correctly.
If you're building from source rather than using a binary distribution, run the following to compile it:
mvn package -P assemble

ClassNotFoundException thrown by RecommenderJob (Apache Mahout on Hadoop)

I am using the org.apache.mahout.cf.taste.hadoop.pseudo.RecommenderJob class to run a pseudo-distributed recommender. I am using it to run the GenericItemBasedRecommender class.
The command I am using is
bin/hadoop jar mahout-core-0.7-SNAPSHOT-job org.apache.mahout.cf.taste.hadoop.pesudo.RecommenderJob -Dmapred.input.dir=./ratingsLess.txt -Dmapred.output.dir=/input/output --tempDir /input/tmp --recommenderClassName org.apache.mahout.cf.taste.impl.recommender.GenericItemBasedRecommender
When I run it I get an Exception saying :
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.mahout.cf.taste.hadoop.pesudo.RecommenderJob
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Could you please let me know why I am getting this error?
