When I run Hive from the command prompt, it works absolutely fine. But when I try to start the server with the "hive --service hiveserver" command, I get the following exception:
Starting Hive Thrift Server
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hadoop.hive.service.HiveServer
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
So I then tried the command "hive --service hiveserver2", but that did not help either.
Can anybody please suggest a solution for this problem?
Maybe another process (another HiveServer) is already listening on port 10000.
You can check with:
netstat -ntulp | grep ':10000'
If a process is found, kill it; otherwise start the server on another port.
By the way, which version are you using?
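For example, a sketch of the check and the port override (HiveServer2 shown; hive.server2.thrift.port is the property that controls its Thrift port, and the PID and alternate port below are illustrative):
# See whether anything is already bound to port 10000
netstat -ntulp | grep ':10000'
# If a stale server shows up, stop it (PID taken from the netstat output)
kill <pid>
# Or start HiveServer2 on a different port instead
hive --service hiveserver2 --hiveconf hive.server2.thrift.port=10001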
This error occurred for me when Hadoop couldn't find hive-service-*.jar on its classpath. Just copy hive-service-*.jar to your Hadoop lib folder, or export the classpath in hadoop-env.sh. I have described how to add the classpath below.
Add this line in hadoop-env.sh:
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/local/hive/lib/*
I have used /usr/local/hive as the Hive path since I have Hive installed at that location. Change it to point to your own Hive installation. (Note that a Java classpath wildcard must be a bare *, so the whole lib directory is added rather than a hive-*.jar glob.)
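Alternatively, a minimal sketch of the copy approach (assuming Hive in /usr/local/hive and Hadoop in /usr/local/hadoop; adjust both paths to your layout):
# Copy the Hive service jar into Hadoop's lib directory
cp /usr/local/hive/lib/hive-service-*.jar /usr/local/hadoop/lib/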
First of all: I am new to Hive.
I just installed Hive, and when I run "hive" it starts up and brings me into the CLI. But when I try to start it as a service/server with "hive --service hiveserver" I get:
Starting Hive Thrift Server
Exception in thread "main" java.lang.ClassNotFoundException: org.apache.hadoop.hive.service.HiveServer
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Two questions:
Is that the right way to start up a Hive server? (My understanding is that HCat and WebHCat are included automatically!?)
Why does this problem show up and how can I resolve it?
Thanks and regards!
Try running bin/hive --service hiveserver2 instead of hive --service hiveserver for this version of Apache Hive; the original HiveServer class has been removed from recent releases, which is why it cannot be found.
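A minimal sketch, assuming a recent Hive release and the default Thrift port of 10000:
# Start HiveServer2
bin/hive --service hiveserver2
# Connect from another terminal with Beeline
bin/beeline -u jdbc:hive2://localhost:10000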
I am new to Hadoop and have just set up Hadoop 1.2.1 on my Mac laptop (Mavericks). I then created a simple WordCount project in IntelliJ IDEA and was able to run the code on a dummy text file. I am having trouble creating a jar file that replicates my execution through the IDE. I get the following error:
$ java -jar ./out/artifacts/WordCount_jar/WordCount.jar test.txt out
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:146)
at neu.cs.parallelprogramming.WordCount.main(WordCount.java:48)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 2 more
FAIL: 1
Could anyone let me know what I am missing?
I guess you have to specify your class (the one that implements the map/reduce functions).
E.g., $ java -jar ./WordCount.jar classWordCount input.txt output
or $ hadoop jar yourprogram.jar yourclass inputpath outputpath
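The hadoop launcher also puts the Hadoop dependencies (including commons-logging, the class that was not found) on the classpath, which plain java -jar does not. A sketch using the class name from the stack trace above:
$ hadoop jar ./out/artifacts/WordCount_jar/WordCount.jar neu.cs.parallelprogramming.WordCount test.txt out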
I am trying to use a MySQL DB with Apache Oozie.
My $OOZIE_HOME is:
-bash: /opt/oozie_install/oozie-3.3.0-cdh4.2.2: Is a directory
But I have copied mysql-connector-java-5.1.29-bin.jar to almost every possible place.
For example, I copied it into:
/opt/oozie_install/oozie-3.3.0-cdh4.2.2
/opt/oozie_install/oozie-3.3.0-cdh4.2.2/libs
/opt/oozie_install/oozie-3.3.0-cdh4.2.2/libtools
/usr/lib/jvm/jdk/libs
/user/home/hadoop/
But I am still getting a ClassNotFoundException.
java.lang.Exception: Could not connect to the database: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at org.apache.oozie.tools.OozieDBCLI.validateConnection(OozieDBCLI.java:473)
at org.apache.oozie.tools.OozieDBCLI.createDB(OozieDBCLI.java:179)
at org.apache.oozie.tools.OozieDBCLI.run(OozieDBCLI.java:118)
at org.apache.oozie.tools.OozieDBCLI.main(OozieDBCLI.java:64)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at org.apache.oozie.tools.OozieDBCLI.createConnection(OozieDBCLI.java:462)
at org.apache.oozie.tools.OozieDBCLI.validateConnection(OozieDBCLI.java:469)
Where exactly am I supposed to copy the MySQL connector?
I have verified my oozie-site.xml, followed the usual steps to set up MySQL in Oozie, and checked my Oozie directory layout.
You have to copy mysql-connector-java-5.1.29-bin.jar to the /opt/oozie_install/oozie-3.3.0-cdh4.2.2/libext directory and then restart the Oozie instance. Make sure that the MySQL user oozie has sufficient privileges on the database oozie; if not, grant them with the GRANT command in the MySQL server.
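A sketch of both steps, assuming the user and database names above (the password is illustrative; pre-MySQL-5.7 GRANT syntax):
# Put the JDBC driver where Oozie picks it up, then restart Oozie
cp mysql-connector-java-5.1.29-bin.jar /opt/oozie_install/oozie-3.3.0-cdh4.2.2/libext/
# In the MySQL shell: give the oozie user full access to the oozie database
GRANT ALL PRIVILEGES ON oozie.* TO 'oozie'@'localhost' IDENTIFIED BY 'oozie';
FLUSH PRIVILEGES;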
I met the same problem. I finally solved it by editing oozie-env.sh and appending JAVA_HOME at the end: export JAVA_HOME=/usr/local/jdk1.7 (replace the path with your own Java installation).
I ran into this issue when I was converting my local Derby instance to MySQL. The difference between my issue and the others is that I did not install an RPM; my Oozie instance came pre-compiled in a tar.gz file. I had to copy the mysql-connector-java-bin.jar to the oozie-server/lib directory, in addition to copying it to the lib, libext, and libtools directories. I am not sure if all of those are needed, but I do know that oozie-server/lib is needed for Oozie to start. Hope this helps someone!
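A sketch for a tarball install, assuming $OOZIE_HOME points at the unpacked tree and the driver jar sits in the current directory:
# Make the driver visible to both the DB tool and the embedded server
cp mysql-connector-java-*-bin.jar $OOZIE_HOME/libext/
cp mysql-connector-java-*-bin.jar $OOZIE_HOME/oozie-server/lib/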
I get the following error while executing a MapReduce program.
I have placed all the jars in the hadoop/lib directory and have also passed them via -libjars.
This is the command I am executing:
$HADOOP_HOME/bin/hadoop --config $HADOOP_HOME/conf jar /home/shash/distinct.jar HwordCount -libjars $LIB_JARS WordCount HWordCount2
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:996)
at org.apache.hadoop.mapreduce.JobContext.getOutputFormatClass(JobContext.java:248)
at org.apache.hadoop.mapred.Task.initialize(Task.java:501)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:306)
at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
at org.apache.hadoop.mapred.Child.main(Child.java:264)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.HCatOutputFormat
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:943)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:994)
... 8 more
Make sure LIB_JARS is a comma-separated list (as opposed to colon-separated like CLASSPATH)
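For example (the jar paths are illustrative; point them at your actual HCatalog and Hive jars):
# -libjars expects commas, not the colons used by CLASSPATH
export LIB_JARS=/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar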
Applies to: CDH 5.0.x, CDH 5.1.x, CDH 5.2.x, CDH 5.3.x (Sqoop)
Cause: Sqoop cannot pick up the HCatalog libraries because Cloudera Manager does not set the HIVE_HOME environment variable. It needs to be set manually.
This problem is tracked in the JIRA below:
https://issues.apache.org/jira/browse/SQOOP-2145
The fix for this JIRA has been included in CDH since version 5.4.0.
Workaround (applicable to CDH versions lower than 5.4.0):
Execute the commands below in the shell before calling Sqoop, or add them to /etc/sqoop/conf/sqoop-env.sh (create the file if it does not already exist):
export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive (for parcel installation)
export HIVE_HOME=/usr/lib/hive (for package installation)
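For instance, a one-liner sketch that makes the parcel setting persistent:
# Append the export so every Sqoop invocation sees it
echo 'export HIVE_HOME=/opt/cloudera/parcels/CDH/lib/hive' >> /etc/sqoop/conf/sqoop-env.sh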
I have downloaded the latest stable release of Hive, and when I start /usr/local/hive/bin/hive it gives me this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 3 more
Hadoop DFS is started and working, and I have changed /usr/local/hive/conf/hive-env.sh to export HADOOP_HOME.
Does anyone know what else I can do?
Thanks.
Apart from editing hive-env.sh, you also need to edit your bash_profile.
vim ~/.bash_profile
Add the following lines to your bash_profile
export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
Save this file and then
source ~/.bash_profile
If this still doesn't work, please include your hive-env.sh and hive-site.xml files. Also, please tell me whether you are using Derby or MySQL as the metastore.
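A quick sanity check after sourcing, assuming the /usr/local/hive location above:
echo $HIVE_HOME   # should print /usr/local/hive
which hive        # should print /usr/local/hive/bin/hive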
Solved by moving:
export HADOOP_CLASSPATH=/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-test.jar:/usr/local/hbase/conf:/usr/local/hbase/lib/zookeeper-3.4.3.jar:/usr/local/hive/lib/*.jar:/usr/local/hbase
from /usr/local/hadoop/conf/hadoop-env.sh to ~/.bashrc.
Thanks for the help.
If you are facing this issue in Sqoop during Hive imports,
then you need to copy hive-common-3.1.2.jar (or whichever version of hive-common-x.x.x.jar you have) from the $HIVE_HOME/lib folder into the $SQOOP_HOME/lib folder, and the error will be gone.
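A sketch of that copy (the glob assumes a single hive-common jar in $HIVE_HOME/lib):
# Give Sqoop access to Hive's common classes
cp $HIVE_HOME/lib/hive-common-*.jar $SQOOP_HOME/lib/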