I'm trying to build a simple WordCount jar project that uses the Hadoop-LZO library, but I cannot get the following command to work, even though the class I'm referencing is on the hadoop classpath:
$ javac -cp `hadoop classpath` *.java
LzoWordCount.java:76: cannot find symbol
symbol : class LzoTextInputFormat
location: class LzoWordCount
job.setInputFormatClass(LzoTextInputFormat.class);
^
1 error
Any ideas?
I assume you have correctly installed your LZO libraries (you should have libgplcompression.so in your lib/native/Linux*-*/ directory and the hadoop-lzo jar file in your lib/ folder).
Since you have them, the correct class should be LzoDeprecatedTextInputFormat.class or LzoTextInputFormat.class, depending on which API you use (according to your post you are using it correctly: the new Job API with LzoTextInputFormat).
So your problem is probably that the hadoop-lzo jar is not on your compile classpath; java.library.path only covers the native library, while the jar itself has to be on the classpath. You can set that up in your .bash_profile or in your bin/hadoop file.
Hope that helps.
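A minimal sketch of what the compile and run setup might look like with the LZO jar added explicitly; the jar path and file name below are assumptions, so adjust them to your installation:
# assumed location/name of the hadoop-lzo jar; adjust to your install
LZO_JAR=/path/to/lib/hadoop-lzo.jar
# compile with the LZO jar on the classpath in addition to `hadoop classpath`
javac -cp "$(hadoop classpath):$LZO_JAR" *.java
# at runtime, make the jar visible to the hadoop command as well
export HADOOP_CLASSPATH=$LZO_JAR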
I'm following this tutorial: https://learn.microsoft.com/en-us/azure/hdinsight/storm/apache-storm-develop-java-topology
What I've done so far is
Maven setup
vi *.java files (in src/main/java/com/microsoft/example directory)
RandomSentenceSpout.java
SplitSentence.java
WordCount.java
WordCountTopology.java
mvn compile
jar cf storm.jar *.class (in target/classes/com/microsoft/example directory)
RandomSentenceSpout.class SplitSentence.class WordCount.class WordCountTopology.class
The above 4 class files were used to make the storm.jar file.
Then, I tried
storm jar ./storm.jar com.microsoft.example.WordCountTopology WordCountTopology
and
storm jar ./storm.jar WordCountTopology
but both of these failed, saying:
Error: Could not find or load main class com.microsoft.example.WordCountTopology
or
Error: Could not find or load main class WordCountTopology
According to the documentation:
Syntax: storm jar topology-jar-path class ...
Runs the main method of class with the specified arguments. The storm
jars and configs in ~/.storm are put on the classpath. The process is
configured so that StormSubmitter will upload the jar at
topology-jar-path when the topology is submitted.
I cannot figure out what needs fixing.
How can I resolve this?
I think your jar file does not contain the class under its package path. Because you ran jar cf from inside target/classes/com/microsoft/example, the .class files end up at the root of the jar instead of under com/microsoft/example/, so com.microsoft.example.WordCountTopology cannot be found. You can check it with jar tf storm.jar | grep WordCountTopology.
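A sketch of rebuilding the jar so the package directories are preserved, assuming the tutorial's layout (run from the target/classes directory):
cd target/classes
jar cf storm.jar com/
jar tf storm.jar | grep WordCountTopology
# should now list com/microsoft/example/WordCountTopology.class
storm jar storm.jar com.microsoft.example.WordCountTopology WordCountTopology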
Looks like your jar does not contain a manifest entry (Main-Class) recording the main class.
Try adding the Main-Class entry to the manifest, or create the jar with the e option so the entry point is recorded for you:
jar cvfe storm.jar mainClassNameWithoutDotClassExtn *.class
Hope this works!
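For this question's package, a hedged sketch of that command (run it from target/classes so the package directories end up inside the jar):
cd target/classes
jar cvfe storm.jar com.microsoft.example.WordCountTopology com/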
I know that a java.lang.NoSuchMethodError means that the version of a class used for compiling is different from the version used at runtime. Usually, when I see this issue, I start the app server with java -verbose, which tells me the jar file from which a class is loaded. If that jar file is not the one I intended to use, I know I'm using an incorrect version of the jar file.
Another approach I use, is to use javap to look at the method signatures of the class in the jar file I am using at runtime, to confirm that the jar does indeed contain the class with a different method signature.
I am seeing this error now in Karaf, an OSGi container, and none of the above approaches is helping. java -verbose shows me the jar, javap shows me the method signature, and the method signature is the same as the one in the error stack trace. In other words, I can see that the class from the jar being used at runtime does have the very method signature that the JVM says it cannot find.
Here is the exact stack trace, if it helps:
java.lang.NoSuchMethodError: org.apache.axiom.om.OMXMLBuilderFactory.createSOAPModelBuilder(Ljava/io/InputStream;Ljava/lang/String;)Lorg/apache/axiom/soap/SOAPModelBuilder;
at org.apache.axis2.builder.SOAPBuilder.processDocument(SOAPBuilder.java:55)
at org.apache.axis2.transport.TransportUtils.createDocumentElement(TransportUtils.java:179)
at org.apache.axis2.transport.TransportUtils.createSOAPMessage(TransportUtils.java:145)
at org.apache.axis2.transport.TransportUtils.createSOAPMessage(TransportUtils.java:108)
at org.apache.axis2.transport.TransportUtils.createSOAPMessage(TransportUtils.java:67)
at org.apache.axis2.description.OutInAxisOperationClient.handleResponse(OutInAxisOperation.java:354)
at org.apache.axis2.description.OutInAxisOperationClient.send(OutInAxisOperation.java:421)
at org.apache.axis2.description.OutInAxisOperationClient.executeImpl(OutInAxisOperation.java:229)
at org.apache.axis2.client.OperationClient.execute(OperationClient.java:165)
at org.wso2.carbon.authenticator.stub.AuthenticationAdminStub.login(AuthenticationAdminStub.java:659)
Are there any other approaches I can/should use? Thanks for your help.
The Karaf commands exports [ids], imports [ids] and classes [ids] can be used in combination with grep (each command has a --help option).
From the bundle throwing the error (with id N), imports N | grep org.apache.axiom.om will tell you which bundle it is actually importing that package from.
And approaching from the other side, exports | grep org.apache.axiom.om will list the bundles that export that package.
I'd expect you'll see more than one line from the exports command, and the imports command will show that an incorrect version is being used.
You could also use java -verbose:class to see where classes are loaded from, which might show that the problematic class is loaded from a different bundle than you expected.
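For example, in the Karaf shell (42 stands in for the id of the bundle that throws the error; it is only a placeholder here):
karaf@root> imports 42 | grep org.apache.axiom.om
karaf@root> exports | grep org.apache.axiom.om
If the exports listing shows the package coming from more than one bundle, compare those versions with the one your Axis2 bundle expects.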
I have a piece of Ruby code which depends on a binary built from C. I generally call the binary through backticks. But now when I package the Ruby code into a jar using Warbler, I'm not sure on how I'd be able to access the binary.
My code structure looks like this:
root/
|--bin/
|--exec.rb #This is the executable when I call java -jar example.jar
|--lib/
|--Module1.rb #This dir contains all the ruby modules my code requires
|--ext/
|--a.out #A binary compiled with gcc
|--.gemspec #A file to guide warbler into building this structure into a jar
I used warble to build this entire structure into a jar. In Ruby, I can access my a.out through the following statement in exec.rb.
exec = "#{File.expand_path(File.join(File.dirname(File.dirname(__FILE__)), 'ext'))}/a.out}";
`exec`
But when I try this code packaged as a jar I get the following error:
/bin/sh: file:/path/to/my/jar/example.jar!/root/ext/a.out: not found
So, how do I access an executable packaged inside a jar?
Put the jar in the lib folder.
Require it in the code
require 'java'
Dir["#{File.expand_path(File.join(Rails.root, 'lib'))}/\*.jar"].each { |jar| require jar }
# A war is treated as a directory. If that is not successful add the lib folder to the CLASSPATH environment variable
Then it should be available to be used.
Edit:
Maybe this is what you are looking for: https://stackoverflow.com/a/600198/643500. You can implement the same idea with JRuby.
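The underlying issue is that /bin/sh cannot execute a path that points inside a jar; the binary has to exist on the real filesystem first. As a rough shell illustration of that workaround (the paths are assumptions based on the layout above; in exec.rb you would do the equivalent extraction with Ruby's Tempfile before shelling out):
# extract the binary from the jar to a real file, then run it
unzip -p example.jar ext/a.out > /tmp/a.out
chmod +x /tmp/a.out
/tmp/a.out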
The JRE contains the .class files for the library classes.
When will these .class files in the JRE folder be used?
Does the JRE contain the .class files for all library classes?
Let us take an example:
When we import a library class, e.g. import java.net.InetAddress, aren't we using the definition from src.zip rather than the .class file? We are just inheriting from the library class, so its code gets compiled along with our own code, doesn't it?
We are not using the .class files directly, so what is the use of having .class files for those library classes in the JRE folder?
I searched the JRE folder, but I found no .class file for InetAddress.
But the command javap java.net.InetAddress still works.
Someone please help.
Thanks in Advance.
javap runs on the JRE, so it has all the classes that come with the JRE on its class path. Most of the classes you will be interested in are in the rt.jar archive.
If you want to decompile your own class, you have to add its base directory to the class path
e.g.
$ ls
HelloWorld.class
$ javap -c -classpath . HelloWorld
Yes, the JRE contains the class files; they are packaged inside archives such as rt.jar rather than sitting in the JRE folder as loose .class files, which is why you did not find one for InetAddress.
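A quick way to confirm this yourself (the rt.jar path assumes a typical JDK layout where the JRE lives under $JAVA_HOME/jre; adjust it if JAVA_HOME points directly at a JRE):
# list the archived class file, then disassemble it
unzip -l "$JAVA_HOME/jre/lib/rt.jar" | grep 'java/net/InetAddress.class'
javap java.net.InetAddress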
I have set up Hadoop on my laptop and ran the example program given in the installation guide successfully. But I am not able to run my own program.
rohit#renaissance1:~/hadoop/ch2$ hadoop MaxTemperature input/ncdc/sample.txt output
Exception in thread "main" java.lang.NoClassDefFoundError: MaxTemperature
Caused by: java.lang.ClassNotFoundException: MaxTemperature
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
Could not find the main class: MaxTemperature. Program will exit.
The book says that we should set a Hadoop classpath by writing:
rohit#renaissance1:~/hadoop/ch2$ export HADOOP_CLASSPATH=build/classes
The main class is defined in the MaxTemperature.java file that I am executing. How do we set the Hadoop classpath? Do we have to do it for every program execution or only once? Where should I put the input folder? My code is at /home/rohit/hadoop/ch2 and my Hadoop installation is at /home/hadoop.
You should package your application into a JAR file, that's much easier and less error-prone than fiddling with classpath folders.
In your case, you must also compile the .java file. You said it's MaxTemperature.java, but there must also be a MaxTemperature.class before you can run it.
First compile the Java files as told by walid:
javac -classpath path-to-hadoop-0.19.2-core.jar -d folder-to-contain-classes *.java
Create jar file of application classes using the following command:
jar cf filename.jar *.class
In either case, whether you package the classes into a jar file or keep them in a specific folder, you should define HADOOP_CLASSPATH pointing to that jar file or to the folder containing the class files, so that the hadoop command knows where to look for the main class.
Set HADOOP_CLASSPATH:
export HADOOP_CLASSPATH=path-to-filename.jar
or
export HADOOP_CLASSPATH=path-to-folder-containing-classes
Run using the hadoop command:
hadoop main-class args
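Put together for this question's layout, a sketch of the whole sequence (the Hadoop core jar location and version are assumptions; point it at the jar shipped with your installation under /home/hadoop):
cd ~/hadoop/ch2
mkdir -p build/classes
# compile against the Hadoop core jar (assumed path and version)
javac -classpath /home/hadoop/hadoop-0.19.2-core.jar -d build/classes *.java
# package the classes and tell hadoop where to find them
jar cf maxtemperature.jar -C build/classes .
export HADOOP_CLASSPATH=~/hadoop/ch2/maxtemperature.jar
hadoop MaxTemperature input/ncdc/sample.txt output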
I found this problem as well when going through the Hadoop book (O'Reilly). I fixed it by setting the HADOOP_CLASSPATH variable in the hadoop-env.sh file in the configuration directory.
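For example, the line added to conf/hadoop-env.sh might look like this (the path is this question's layout, not a required location):
export HADOOP_CLASSPATH=/home/rohit/hadoop/ch2/build/classes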
Here is the answer in 3 steps:
1:
javac -verbose -classpath C:\hadoop\hadoop-0.19.2-core.jar MaxTemperature*.java -d build/classes
2:
make sure the compiled *.class files are in build/classes
3:
export HADOOP_CLASSPATH=${HADOOP_HOME}/path/to/build/classes
(you have to create the build/classes directory)
Best Regards
walid
You do not necessarily need a jar file, but did you put MaxTemperature in a package?
If so, say your MaxTemperature.class file is in yourdir/bin/yourpackage/, all you need to do is:
export HADOOP_CLASSPATH=yourdir/bin
hadoop yourpackage.MaxTemperature
After you make your classes into a jar file:
hadoop jar MaxTemperature.jar MaxTemperature
Basically:
hadoop jar jarfile main [args]
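A combined sketch of the packaged case, reusing this answer's placeholder names yourdir/bin and yourpackage (they are placeholders, not paths from the question):
jar cf MaxTemperature.jar -C yourdir/bin .
hadoop jar MaxTemperature.jar yourpackage.MaxTemperature input/ncdc/sample.txt output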