Error: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.SecureRpcEngine - hadoop

I am trying to create an HBase table in Java. The code works perfectly fine on localhost, but when I try to run it on Cloudera, it throws the following error:
Exception in thread "main" java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.SecureRpcEngine
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:867)
at org.apache.hadoop.hbase.ipc.HBaseRPC.getProtocolEngine(HBaseRPC.java:114)
at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:335)
at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:312)
at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:364)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:665)
at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:109)
at com.ab.academy.THBaseAdmin.createTable(THBaseAdmin.java:97)
at com.ab.academy.THBaseAdmin.main(THBaseAdmin.java:71)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.SecureRpcEngine
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:266)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:820)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:865)
... 8 more
Can anyone please help me with this error?
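For context, here is a minimal sketch of the kind of table-creation code involved, assuming the HBase 0.94-era client API that matches this stack trace (the class, table, and column-family names are illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateTable {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml / hbase-default.xml from the classpath
        Configuration conf = HBaseConfiguration.create();
        // The stack trace above fails inside this constructor, while connecting to the master
        HBaseAdmin admin = new HBaseAdmin(conf);
        HTableDescriptor desc = new HTableDescriptor("mytable"); // illustrative table name
        desc.addFamily(new HColumnDescriptor("cf"));             // illustrative column family
        admin.createTable(desc);
        admin.close();
    }
}

Note that org.apache.hadoop.hbase.ipc.SecureRpcEngine ships only with security-enabled HBase builds, so a likely suspect is an hbase-site.xml on the cluster that sets hbase.rpc.engine to SecureRpcEngine while the client classpath carries a non-secure HBase jar.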

Related

Exception in thread "main" java.lang.NoClassDefFoundError: com/github/tomakehurst/wiremock/core/Options

I get this error whenever I try to run the jar file. How do I fix it?
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: com/github/tomakehurst/wiremock/core/Options
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: com.github.tomakehurst.wiremock.core.Options
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
The commands that I am entering are:
mvn clean compile package
java -jar target/filename-1.0-SNAPSHOT.jar
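One thing worth knowing here: java -jar ignores any external classpath and uses only the jar itself (plus its Class-Path manifest entry), so a plain jar built by mvn package will not see Maven dependencies such as WireMock at runtime. As a hedged sketch of one way around this (the main class name is hypothetical, and the Unix path separator is shown):

mvn dependency:copy-dependencies
java -cp "target/filename-1.0-SNAPSHOT.jar:target/dependency/*" com.example.Main

Alternatively, building a self-contained "fat" jar with the Maven Shade or Assembly plugin keeps the java -jar invocation working.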

Spark SQL on Google Compute Engine issue

We are using bdutil 1.1 to deploy a Spark (1.2.0) cluster. However, we are having an issue when we launch our Spark script:
py4j.protocol.Py4JJavaError: An error occurred while calling o70.registerTempTable.
: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
at scala.Option.orElse(Option.scala:257)
at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:54)
at org.apache.spark.sql.hive.HiveContext$$anon$1.<init>(HiveContext.scala:253)
at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:253)
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:253)
at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:72)
at org.apache.spark.sql.SQLContext.registerRDDAsTable(SQLContext.scala:279)
at org.apache.spark.sql.SchemaRDDLike$class.registerTempTable(SchemaRDDLike.scala:86)
at org.apache.spark.sql.api.java.JavaSchemaRDD.registerTempTable(JavaSchemaRDD.scala:42)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 26 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 31 more
Caused by: javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
The script works on my laptop. I have datanucleus-api-jdo-3.2.6.jar in the /home/hadoop/spark-install/lib path.
Any ideas of what could be wrong?
I needed to add:
SPARK_CLASSPATH=/home/hadoop/spark-install/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark-install/lib/datanucleus-core-3.2.10.jar:/home/hadoop/spark-install/lib/datanucleus-rdbms-3.2.9.jar
before my spark-submit command:
SPARK_CLASSPATH=/home/hadoop/spark-install/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark-install/lib/datanucleus-core-3.2.10.jar:/home/hadoop/spark-install/lib/datanucleus-rdbms-3.2.9.jar ../hadoop/spark-install/bin/spark-submit main.py --master spark://spark-m:7077
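As a hedged alternative, spark-submit in Spark 1.2 also accepts a --driver-class-path option, so the same DataNucleus jars can be put on the driver classpath without an environment variable:

../hadoop/spark-install/bin/spark-submit --driver-class-path /home/hadoop/spark-install/lib/datanucleus-api-jdo-3.2.6.jar:/home/hadoop/spark-install/lib/datanucleus-core-3.2.10.jar:/home/hadoop/spark-install/lib/datanucleus-rdbms-3.2.9.jar main.py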

hdfs_connect("localhost",9000) leads to "java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration" error

I want to use HDFS.jl in Julia, but every time I enter the command
hdfs_connect("localhost", 9000), this error occurs:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR: hdfs connect failed
in hdfs_connect at /home/gxx/.julia/v0.3/HDFS/src/hdfs_dfs.jl:35
in hdfs_connect at /home/gxx/.julia/v0.3/HDFS/src/hdfs_dfs.jl:30
My Hadoop version is 1.2.1, and my classpath is:
export HADOOP_HOME=/home/gxx/usr/hadoop/hadoop-1.2.1
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/*.jar:$HADOOP_HOME/lib/*.jar:$HADOOP_HOME/hadoop-core-1.2.1.jar
This is the link to HDFS.jl.
How can I fix this?
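One detail worth checking, offered as an observation rather than a confirmed fix: the JVM's classpath wildcard syntax is dir/* (with no .jar suffix), so entries like $HADOOP_HOME/*.jar match nothing. Moreover, wildcard expansion is performed by the java launcher, not by a JVM embedded through JNI, which is how libhdfs (and therefore HDFS.jl) starts its JVM, so each jar may need to be listed explicitly, for example:

export CLASSPATH=$(echo $HADOOP_HOME/hadoop-core-1.2.1.jar $HADOOP_HOME/lib/*.jar | tr ' ' ':')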

pig: java.lang.NoClassDefFoundError: org/jruby/embed/ScriptingContainer

Pig 0.10.0 supports Ruby UDFs, so I am trying a very simple example, but I got the following error. Do you know why?
Pig Stack Trace
---------------
ERROR 2998: Unhandled internal error. org/jruby/embed/ScriptingContainer
java.lang.NoClassDefFoundError: org/jruby/embed/ScriptingContainer
at org.apache.pig.scripting.jruby.JrubyScriptEngine.<clinit>(JrubyScriptEngine.java:65)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at org.apache.pig.scripting.ScriptEngine.getInstance(ScriptEngine.java:254)
at org.apache.pig.PigServer.registerCode(PigServer.java:523)
at org.apache.pig.tools.grunt.GruntParser.processRegister(GruntParser.java:422)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:419)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:189)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:165)
at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
at org.apache.pig.Main.run(Main.java:555)
at org.apache.pig.Main.main(Main.java:111)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException: org.jruby.embed.ScriptingContainer
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 17 more
I had the same problem. You should check whether you have a jruby.jar installed with Pig.
It seems the jython.jar was there, so maybe that's a friendly nudge for people to use Python.
I had to explicitly put jruby.jar on the classpath:
java -cp $PIG_HOME/pig-0.11.1.jar:$PIG_HOME/lib/jruby.jar org.apache.pig.Main -x local myscript.pig
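With jruby.jar on the classpath, the register statement the trace points at (GruntParser.processRegister) should then succeed; it would look something like this, with illustrative file and namespace names:

register 'myudfs.rb' using jruby as myfuncs;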

Errors when running accumulo init

I have Hadoop and ZooKeeper running without a problem, but when I go to run $ACCUMULO_HOME/bin/accumulo init, this happens:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/start/Platform
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.start.Platform
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/start/Main
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.start.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
I can't seem to find anything helpful.
This error occurs when trying to run Accumulo from the source code without first compiling it with Maven.
I assume this may have been resolved by now?
If not, it looks like Accumulo has not been compiled correctly.
If you're using the binary distribution, run the following to compile:
mvn package -P assemble
