Terminal not executing hive queries - hadoop

I am able to execute Hive queries in the Hive shell after starting Hive:
hive> show databases;
OK
default
spot
Time taken: 0.051 seconds, Fetched: 2 row(s)
But I am unable to execute the same query directly from the terminal (I want to include it in a bash script later). I am getting the following error:
hive -e "show databases"
FAILED: HiveException java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
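For the eventual bash integration, a small wrapper around `hive -e` makes failures like this one visible immediately. This is a sketch under the assumption that `hive` is on `PATH`; `HIVE_BIN` is a hypothetical override added so the wrapper can be exercised without a cluster:

```shell
#!/usr/bin/env bash
# Sketch: run a Hive query non-interactively from a bash script.
# HIVE_BIN is a hypothetical override; it defaults to the real `hive` binary.
set -euo pipefail
HIVE_BIN="${HIVE_BIN:-hive}"

run_hive_query() {
    local query="$1"
    # -S silences Hive's progress chatter; -e executes the query and exits
    "$HIVE_BIN" -S -e "$query"
}

# Only invoke Hive when it is actually installed.
if command -v "$HIVE_BIN" >/dev/null 2>&1; then
    run_hive_query "show databases;"
fi
```

With `set -e`, the script aborts as soon as a query fails, so a metastore error like the one above stops the rest of the script from running.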

Related

Issue with Source Command in Hive

I am running the source command in the Hive CLI. In one scenario it works fine, but in the second it shows an error, even though the .hql files are the same in both locations.
Error Scenario:
hive> source /data/data01/dev/test/test1/project/code/scripts/abc_works_Production_Parameter/src_sys_cd_parm.hql;
File: /data/data01/dev/test/test1/project/code/scripts/abc_works_Production_Parameter/src_sys_cd_parm.hql is not a file.
hive>
Normal Case:
hive> source /data/data01/dev/test/test1/project/syscode/src_sys_cd_param.hql;
hive>
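Since `source` reports `is not a file` when the path cannot be resolved, a quick pre-check from the shell can rule out a typo or permission problem before blaming Hive; note, for instance, that the failing path above ends in `parm.hql` while the working one ends in `param.hql`. A sketch (`check_hql` is a hypothetical helper):

```shell
# Sketch: verify a script path is a readable regular file before
# `source`-ing it inside the Hive CLI. check_hql is a hypothetical helper.
check_hql() {
    if [ -f "$1" ] && [ -r "$1" ]; then
        echo "ok: $1"
    else
        echo "not a regular readable file: $1"
    fi
}

check_hql /data/data01/dev/test/test1/project/code/scripts/abc_works_Production_Parameter/src_sys_cd_parm.hql
```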

Hive: Unable to access database

I find myself in a bit of a 'hive' pickle here. On booting the Hive CLI from my home directory, I can access the 'fooDB' database, which I had created earlier:
hadoop@server-7:~$ hive
/usr/local/hive/hive-1.1.0-cdh5.5.2/bin/hive: line 258: no: command not found
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> SHOW DATABASES;
OK
default
fooDB
Time taken: 0.717 seconds, Fetched: 2 row(s)
But when I try to boot it from any other location in my file-system, I am unable to access 'fooDB':
hadoop@server-7:~/Downloads$ hive
/usr/local/hive/hive-1.1.0-cdh5.5.2/bin/hive: line 258: no: command not found
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> SHOW DATABASES;
OK
default
Time taken: 0.72 seconds, Fetched: 1 row(s)
Basically, objects created after starting the Hive CLI from one particular location in the file system, say '/home/hadoop/dir1', are not accessible from any other location via the Hive CLI, and vice versa.
The relevant-hive section from my .bashrc looks like this:
## HIVE VARIABLES ##
export HIVE_HOME=/usr/local/hive/hive-1.1.0-cdh5.5.2
export HIVE_CONF_DIR=$HIVE_HOME/conf
export PATH=$PATH:$HIVE_HOME/bin
So I am not really sure how to proceed here. I also tried using an alias for hive, which did not help. Any help would be appreciated. Thanks!
After scouring the web, I finally found an explanation that was exactly what I was looking for.
Hope this helps people who come across the above problem!
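The symptom (databases visible only from the directory where they were created) matches the default embedded Derby metastore: with a relative `databaseName`, Derby creates `metastore_db` in whatever directory the CLI is started from. A sketch of the usual fix, assuming you pick an absolute path in `hive-site.xml` (the path below is hypothetical):

```xml
<!-- hive-site.xml: pin the embedded Derby metastore to one absolute path
     so every working directory sees the same databases (path is an example) -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/home/hadoop/hive_metastore/metastore_db;create=true</value>
</property>
```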

Hive jdbc connection is giving error if MR is involved

I am working on a Hive JDBC connection in HDP 2.1.
The code works fine for queries where MapReduce is not involved, like "select * from tablename". The same code shows an error when the query has a 'where' clause or specifies column names (which runs MapReduce in the background).
I have verified the correctness of the query by executing it in the Hive CLI.
I have also verified that the user running the Java JDBC code has read/write permissions on the table.
The error is as follows
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:275)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:355)
at com.testing.poc.hivejava.HiveJDBCTest.main(HiveJDBCTest.java:25)
Today I also got this exception when I submitted a Hive task from Java. The output was:
hive_driver: org.apache.hive.jdbc.HiveDriver
hive_url: jdbc:hive2://10.174.242.28:10000/default
get connection success (Hive connection obtained successfully!)
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
The same SQL works well when executed in Hive directly. Then I looked at the log in /var/log/hive/hadoop-cmf-hive-HIVESERVER2-cloud000.log.out and found the reason for the error:
Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(Permission denied: user=anonymous, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
Solution
I used the following command :
sudo -u hdfs hadoop fs -chmod -R 777 /
This solved the error!
hive_driver: org.apache.hive.jdbc.HiveDriver
hive_url: jdbc:hive2://cloud000:10000/default
get connection success (Hive connection obtained successfully!)
Heart beat
insert executed successfully!
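A narrower alternative to `chmod -R 777 /` (which opens up the entire HDFS namespace) is to give the connecting user, `anonymous` in the log above, its own writable home directory. A sketch; `HDFS_CMD` is a hypothetical override so the commands can be dry-run without a cluster:

```shell
# Sketch: create an HDFS home for the user the JDBC session connects as,
# instead of loosening permissions on the whole filesystem.
# HDFS_CMD is a hypothetical override; it defaults to the real `hadoop fs`.
HDFS_CMD="${HDFS_CMD:-hadoop fs}"

make_hdfs_home() {
    local user="$1"
    # word-splitting on $HDFS_CMD is intentional ("hadoop fs" is two words)
    $HDFS_CMD -mkdir -p "/user/$user"
    $HDFS_CMD -chown "$user" "/user/$user"
}

# On a cluster node, run as the hdfs superuser:
#   make_hdfs_home anonymous
```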
If you use beeline to execute the same queries, do you see the same behaviour as you get while running your test program?
The beeline client also uses the open source JDBC driver and connects to Hive server, which is similar to what you do in your program. HiveCLI on the other hand has Hive embedded in it and does not connect to a remote Hive server by default. You can use HiveCLI to connect to a remote Hive Server 1 but I don't believe you can use it to connect to Hive Server2 (use beeline for Hive Server 2).
For this error, you can take a look at the hive.log and hiveserver2.log on the server side to get more insight into what might have caused the MapReduce error.
Hope this helps.
Cheers,
Holman
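Trying the failing query through beeline, as suggested, looks roughly like this. A sketch: the host, port, and query are placeholders, and `BEELINE_BIN` is a hypothetical override so the helper can be exercised without a cluster:

```shell
# Sketch: run the same query through beeline, which talks to HiveServer2
# over the same open-source JDBC driver as the Java program.
BEELINE_BIN="${BEELINE_BIN:-beeline}"

run_beeline() {
    local url="$1" query="$2"
    "$BEELINE_BIN" -u "$url" -e "$query"
}

# Only run when beeline is actually installed (placeholder host/port):
if command -v "$BEELINE_BIN" >/dev/null 2>&1; then
    run_beeline "jdbc:hive2://localhost:10000/default" "select * from tablename where 1=1"
fi
```

If beeline reproduces the MapRedTask failure, the problem is on the server side rather than in the Java code.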

Hive derby issue

I installed hive-0.12.0 recently, but when I run queries in the hive shell I get the error below:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
This is contained in my hive-default.xml.template:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/home/hduser/hive-0.12.0/metastore/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
Could any one help?
Seems to be a problem with your metastore. Since you are using the default embedded Derby metastore, a lock file is left behind after an abnormal exit. If you remove that lock file, the issue should be solved:
rm metastore_db/*.lck
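Expanded slightly, with the directory made explicit (a sketch; `clear_derby_locks` is a hypothetical helper, and make sure no Hive or Derby process is still running before deleting locks):

```shell
# Sketch: remove stale Derby lock files left behind by an unclean Hive exit.
# Run this only when no Hive/Derby process is still using the metastore.
clear_derby_locks() {
    local dir="$1"
    if ls "$dir"/*.lck >/dev/null 2>&1; then
        rm -f "$dir"/*.lck
        echo "removed stale locks in $dir"
    else
        echo "no lock files in $dir"
    fi
}

clear_derby_locks "${METASTORE_DIR:-./metastore_db}"
```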

java.sql.SQLException: Failed to start database 'metastore_db' ERROR, while initializing database using hive

I installed Hadoop and Hive on a 3-node cluster. I am able to log in to Hive from the cluster node where Hive is running:
[root@NODE_3 hive]# hive
Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-0.10.0-cdh4.2.0.jar!/hive-log4j.properties
Hive history file=/tmp/root/hive_job_log_root_201304020248_306369127.txt
hive> show tables;
OK
Time taken: 1.459 seconds
hive>
But when I try to run some Hive tests on my cluster nodes, I get the error given below.
Here it is trying to initialize the database as user=ashsshar (my username):
13/04/02 02:32:44 INFO mapred.JobClient: Cleaning up the staging area hdfs://scaj-ns/user/ashsshar/.staging/job_201304020010_0080
13/04/02 02:32:44 ERROR security.UserGroupInformation: PriviledgedActionException as:ashsshar (auth:SIMPLE) cause:java.io.IOException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
NestedThrowables: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
java.io.IOException: javax.jdo.JDOFatalDataStoreException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
NestedThrowables: java.sql.SQLException: Failed to create database '/var/lib/hive/metastore/metastore_db', see the next exception for details.
I have tried two things:
1. Giving permissions on /var/lib/hive/metastore/metastore_db
2. Removing the lock files: rm /var/lib/hive/metastore/metastore_db/*.lck
But I am still getting the same error.
It seems to be an issue with creating the metastore. I solved this by creating a directory and setting the value to that directory as follows:
step-1: create a directory in your home directory, say: hive-metastore-dir
step-2: as the super user, edit hive-site.xml (it's in /usr/lib/hive/conf) as follows:
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:derby:;databaseName=/var/lib/hive/metastore/metastore_db;create=true</value>
to
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:derby:;databaseName=/home/hive-metastore-dir/metastore/metastore_db;create=true</value>
step-3: start the CLI as sudo hive and perform your queries.
You may log in to the Hive client from a directory where the user has write access. By default, Hive will try to create temporary directories locally and in HDFS when a shell is opened.
Follow these steps if you are using CDH:
1. Copy /usr/lib/hive/conf/hive-site.xml into /usr/lib/spark/conf/
This will solve the "metastore_db" error.
Thanks
