Hive is throwing a permission error while creating a table/database - hadoop

I am getting a permission error in Hive.
I am using the IBM cloud -
my.imdemocloud.com
hive> create table a(key INT);
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException org.apache.hadoop.security.AccessControlException: Permission denied: user=nehpraka, access=WRITE, inode="warehouse":biadmin:biadmin:rwxrwxr-x)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Whereas I have granted all the roles to my user.
hive> SHOW GRANT USER nehpraka on DATABASE default;
OK
database default
principalName nehpraka
principalType USER
privilege Create
grantTime Wed Apr 16 14:17:51 EDT 2014
grantor nehpraka
Time taken: 0.051 seconds
Please help me out with this.
Thanks & Regards

I got the solution.
I was using the IBM BigInsights cloud; on it, you can create a database in this manner:
create database DataBaseName location 'hdfs://master-1-internal.imdemocloud.com:9000/user/<user_name>';
Thanks all for helping me.

This looks like an HDFS folder write-permission issue.
Have you tried with the hdfs user? Basically, check the Hive warehouse directory on HDFS; that folder should have read/write permission for your user.
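For example, you could inspect the warehouse directory and, if appropriate, grant group write access (a sketch assuming the default warehouse path /user/hive/warehouse; adjust to your layout):
hadoop fs -ls /user/hive/warehouse
sudo -u hdfs hadoop fs -chmod -R g+w /user/hive/warehouse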

If you are using the Cloudera distribution of Hadoop and Hive, try to open Hive using the command below:
sudo hive
If it asks for a password, enter cloudera.
In the Cloudera distribution, only the superuser has write permission to the Hive warehouse on HDFS.
This should help you.

Related

Permission error when setting up a Hadoop cluster in Pentaho

I am trying to set up a new Hadoop cluster in Pentaho 9.3, but I got a permission error.
It requires a username and password for HDFS, but I don't know how to create a user and password for HDFS.
(Screenshots showing step 1, step 2, and the resulting error.)
Hadoop, by default, uses regular OS user accounts. On Linux, you'd use the useradd command.
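A minimal sketch, assuming a Linux node and a hypothetical user name pentaho (the user's HDFS home directory is created as well, since tools often expect it):
sudo useradd pentaho
sudo passwd pentaho
sudo -u hdfs hdfs dfs -mkdir -p /user/pentaho
sudo -u hdfs hdfs dfs -chown pentaho:pentaho /user/pentaho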

Create a database in the Hive console without permission in Ranger

I have a non-Kerberized Hadoop cluster. I manage the Hive and HDFS permissions via Ranger.
The resource paths in Ranger for HDFS are:
/user/myLogin
/apps/hive/warehouse/mylogin_*
/apps/hive/warehouse
I can create a database in Hive (via the console) and also in Ambari.
But when I remove the permission on /apps/hive/warehouse, I can't create a database in Hive (console), though in Ambari I still can.
The following is the error:
hive> create database database_tesst;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.security.AccessControlException:
Permission denied: user=AAAAA, access=EXECUTE,
inode="/apps/hive/warehouse/database_tesst.db":hdfs:hdfs:d---------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
How can I create a database or run a query in Hive (console) without the permission on /apps/hive/warehouse? I need to remove this permission from Ranger so that users can access only their own data.
Thank you

Not able to create a new table in Hive from spark-shell

I am using a single-node setup on Red Hat and installed Hadoop, Hive, Pig, and Spark. I configured the Hive metastore with Derby. I created a new folder for the Hive tables and gave it full privileges (chmod 777). Then I created one table from the Hive CLI, and I am able to select that data in spark-shell and print the values to the console. But from spark-shell/Spark SQL I am not able to create new tables. It throws this error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:file:/2016/hive/test2 is not a directory or unable to create one)
I checked the permissions and the user (I am using the same user for the installation and for Hive, Hadoop, Spark, etc.).
Is there anything that needs to be done to get full integration of Spark and Hive?
Thanks
Check that the permissions in HDFS are correct (not just on the local filesystem):
hadoop fs -chmod -R 755 /user
If the error message persists afterwards, please update the question.
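To double-check the result, a listing should show the updated mode (same path as in the command above):
hadoop fs -ls /user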

HBase Bulk Loading Error. What's Wrong?

I tried to bulk-load data into an HBase table like below, and it succeeded.
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.bulk.output=/tmp/example_output -Dimporttsv.columns=HBASE_ROW_KEY,cf1:val1,cf1:val2,cf1:val3 so_table /user/uclab/smallbusiness/bulk3/
After doing this job, I performed the following:
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /tmp/example_output so_table
But some errors occurred recursively, like below:
2015-10-12 01:52:42.835 DEBUG [LoadIncrementalHFiles-0]
mapreduce.LoadIncrementalHFiles: Going to connect to server
region=so_table,,1444580736986.3c5aa99d4ca4dcb509c8cfb26c2b223f.,
hostname=datanode83,60020,1444578166533, seqNum=2 for row with hfile
group [{[B#5d37ce06,hdfs://namenode.uclab.com:8020/tmp/example_output/cf1/541f34680be24932afa54c3fa14e4ad4}]
and
Caused by: org.apache.hadoop.ipc.RemoteException
(org.apache.hadoop.security.AccessControlException):
Permission denied: user=hbase, access=WRITE,
inode="/tmp/example_output/cf1":uclab:hdfs:drwxr-xr-x
How can I give write permission, and how can I solve this problem?
I too faced a similar kind of problem, on the Cloudera Quickstart VM.
Change the owner to hbase, or HBase won't have the permission to move the files. Run the following command:
sudo -u hdfs hdfs dfs -chown -R hbase:hbase /tmp/example_output
Now run
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /tmp/example_output so_table
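If you want to confirm the ownership change before re-running the load, a quick listing works (same path as above):
sudo -u hdfs hdfs dfs -ls /tmp/example_output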

java.sql.SQLException: Failed to start database 'metastore_db' error while initializing the database using Hive

I installed Hadoop and Hive on a 3-node cluster. I am able to log in to Hive from the cluster node where Hive is running.
[root@NODE_3 hive]# hive
Logging initialized using configuration in
jar:file:/usr/lib/hive/lib/hive-common-0.10.0-cdh4.2.0.jar!/hive-log4j.properties
Hive history file=/tmp/root/hive_job_log_root_201304020248_306369127.txt
hive> show tables;
OK
Time taken: 1.459 seconds
hive>
But when I try to run some Hive tests on my cluster nodes, I get the error given below.
Here it is trying to initialize the database as user=ashsshar {my username}:
13/04/02 02:32:44 INFO mapred.JobClient: Cleaning up the staging area
hdfs://scaj-ns/user/ashsshar/.staging/job_201304020010_0080
13/04/02 02:32:44 ERROR security.UserGroupInformation:
PriviledgedActionException as:ashsshar (auth:SIMPLE)
cause:java.io.IOException: javax.jdo.JDOFatalDataStoreException:
Failed to create database '/var/lib/hive/metastore/metastore_db',
see the next exception for details.
NestedThrowables:
java.sql.SQLException: Failed to create database
'/var/lib/hive/metastore/metastore_db', see the next exception for details.
java.io.IOException: javax.jdo.JDOFatalDataStoreException:
Failed to create database '/var/lib/hive/metastore/metastore_db',
see the next exception for details.
NestedThrowables:
java.sql.SQLException: Failed to create database
'/var/lib/hive/metastore/metastore_db', see the next exception for details.
I have tried two things:
1. Giving permissions on /var/lib/hive/metastore/metastore_db
2. Removing the lock files: rm /var/lib/hive/metastore/metastore_db/*.lck
But I am still getting the same error.
It seems to be an issue with creating the metastore. I solved this by creating a directory and pointing the connection URL at that directory, as follows:
step-1: create a directory in your home, say hive-metastore-dir
step-2: as the superuser, edit hive-site.xml (it's in /usr/lib/hive/conf), changing:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/var/lib/hive/metastore/metastore_db;create=true</value>
</property>
to
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/home/hive-metastore-dir/metastore/metastore_db;create=true</value>
</property>
step-3: start the CLI with sudo hive and run your queries.
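For step-1, the directory referenced in the new connection URL can be created up front; Derby then creates metastore_db itself because of create=true:
mkdir -p /home/hive-metastore-dir/metastore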
You may log in to the Hive client from a directory where the user has write access. By default, Hive will try to create temporary directories both locally and in HDFS when a shell is opened.
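For example, starting the shell from your home directory (normally writable by your user) avoids the local half of that problem:
cd ~
hive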
Follow these steps if you are using CDH:
1. Copy /usr/lib/hive/conf/hive-site.xml into /usr/lib/spark/conf/, as shown below.
This will solve the "metastore_db" error.
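The copy is one command (paths as given in the step above; sudo is assumed since /usr/lib is typically root-owned):
sudo cp /usr/lib/hive/conf/hive-site.xml /usr/lib/spark/conf/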
Thanks
