Create database in the Hive console without permission in Ranger - hadoop

I have a non-Kerberized Hadoop cluster. I manage the Hive and HDFS permissions via Ranger.
The resource paths in Ranger for HDFS are:
/user/myLogin
/apps/hive/warehouse/mylogin_*
/apps/hive/warehouse
I can create a database in Hive (via the console) and also in Ambari.
But when I remove the permission on /apps/hive/warehouse I can no longer create a database from the Hive console, although I can still create it in Ambari.
This is the error:
hive> create database database_tesst;
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.security.AccessControlException:
Permission denied: user=AAAAA, access=EXECUTE,
inode="/apps/hive/warehouse/database_tesst.db":hdfs:hdfs:d---------
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:353)
How can I create a database or run a query from the Hive console without the permission on /apps/hive/warehouse? I need to remove this permission from Ranger so that users can only access their own data.
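For what it's worth, a workaround I am experimenting with (just a sketch that points the database at my own directory, where I still have a Ranger policy) is:
create database database_tesst location '/user/myLogin/database_tesst.db';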
Thank you

Related

Permission error when setting up a Hadoop cluster in Pentaho

I am trying to set up a new Hadoop cluster in Pentaho 9.3 but I got a permission error.
It requires a username and password for HDFS, but I don't know how to create a user and password for HDFS.
(screenshots of step 1, step 2 and the resulting error omitted)
Hadoop, by default, uses regular OS user accounts. For Linux, you'd use the useradd command.
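For example, a minimal sketch on a Linux node (the username pentaho_user and the HDFS paths are only illustrations):
# create a regular OS account (run as root)
useradd pentaho_user
passwd pentaho_user
# create the matching HDFS home directory and hand it over (run as the hdfs superuser)
sudo -u hdfs hdfs dfs -mkdir -p /user/pentaho_user
sudo -u hdfs hdfs dfs -chown pentaho_user:pentaho_user /user/pentaho_user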

HDP Sandbox SQOOP failed due to permission error

Below is the error message:
Unable to move source
hdfs://sandbox-hdp.hortonworks.com:8020/user/maria_dev/DimDepartmentGroup/part-m-00000
to destination
hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/dbodimemployee/delta_0000001_0000001_0000:
Permission denied: user=hive, access=WRITE,
inode="/user/maria_dev/DimDepartmentGroup":maria_dev:hdfs:drwxr-xr-x
I am totally confused. The error message itself shows that maria_dev has write permission on the folder: inode="/user/maria_dev/DimDepartmentGroup":maria_dev:hdfs:drwxr-xr-x
What did I miss?
When you run Sqoop, generally it first loads the data from your external database and stores it as a multi-part file at the given location (--target-dir /goldman/yahoo), and then moves it from that location into the Hive table (--hive-table topclient.mpool).
Now you can get access denied at two levels:
1) If you see access denied at the file location /goldman/yahoo, then open that location up, running as the hdfs user: sudo -u hdfs hadoop fs -chmod 777 /goldman/yahoo
2) If you see access denied while creating the table, run the Sqoop command as the user hive, because the user hive has access to the Hive tables, i.e.
sudo -u hive sqoop import --connect 'jdbc:sqlserver://test.goldman-invest.data:1433;databaseName=Investment_Banking' --username user_***_cqe --password ****** --table cases --target-dir /goldman/yahoo --hive-import --create-hive-table --hive-table topclient.mpool
Finally, I got it to work. I logged in as root and switched to the hive user using su - hive.
Then I was able to run the Sqoop command successfully. Previously I logged in as maria_dev and could not use the su command. I do not have the password for the hive user because hive is not a regular user in the HDP sandbox.
Still, it is strange to me that a user needs root access to load some data into Hive on HDP.
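If switching to the hive user via root is not an option, one possible sketch (assuming HDFS ACLs are enabled with dfs.namenode.acls.enabled=true) is to grant the hive user access to the staging directory from the error message instead:
# let the hive user traverse and move files out of maria_dev's staging directory
sudo -u hdfs hdfs dfs -setfacl -R -m user:hive:rwx /user/maria_dev/DimDepartmentGroup
# default ACL so files Sqoop writes later inherit the same access
sudo -u hdfs hdfs dfs -setfacl -m default:user:hive:rwx /user/maria_dev/DimDepartmentGroup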

Not able to create new table in hive from Spark-shell

I am using a single-node setup on Red Hat and installed Hadoop, Hive, Pig and Spark. I configured the Hive metastore in Derby and everything else. I created a new folder for Hive tables and gave it full privileges (chmod 777). Then I created one table from the Hive CLI, and I am able to select that data in spark-shell and print the values to the console. But from spark-shell/Spark SQL I am not able to create new tables. It throws this error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:file:/2016/hive/test2 is not a directory or unable to create one)
I checked the permissions and the user (I use the same user for the installation of Hive, Hadoop, Spark, etc.).
Is there anything that needs to be done to get full integration of Spark and Hive?
Thanks
Check that the permissions in HDFS are correct (not just on the local filesystem):
hadoop fs -chmod -R 755 /user
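A short sketch of how to find and check the directory involved (the path below is the common default warehouse location and may differ in your setup; also, the file:/ scheme in the error often means spark-shell is not picking up hive-site.xml from Spark's conf directory):
# what Hive thinks the warehouse location is
hive -e "set hive.metastore.warehouse.dir;"
# check its ownership and permissions in HDFS, not on the local disk
hadoop fs -ls -d /user/hive/warehouse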
If the error message persists afterwards, please update the question.

Table folder permission issues while using both Hive and Impala

We are using the latest versions of Hive and Impala. Impala is authenticated with LDAP and authorization is done via Sentry. Hive access is not authorized via Sentry yet. We are creating tables from Impala while /user/hive/warehouse has group-level ownership by the "hive" group, hence the folder ownership is impala:hive.
drwxrwx--T - impala hive 0 2015-08-24 21:16 /user/hive/warehouse/test1.db
drwxrwx--T - impala hive 0 2015-08-11 17:12 /user/hive/warehouse/test1.db/events_test_venus
As can be seen, the above folders are owned by impala with group hive, and are group-writable. The group “hive” has a user named “hive” as well:
[root@server ~]# groups hive
hive : hive impala data
[root@server ~]# grep hive /etc/group
hive:x:486:impala,hive,flasun,testuser,fastlane
But when I try to query the table created on the folder, it gives access errors:
[root@jupiter fastlane]# sudo -u hive hive
hive> select * from test1.events_test limit 1;
FAILED: SemanticException Unable to determine if hdfs://mycluster/user/hive/warehouse/test1.db/events_test_venus is encrypted: org.apache.hadoop.security.AccessControlException: Permission denied: user=hive, access=EXECUTE, inode="/user/hive/warehouse/test1.db":impala:hive:drwxrwx--T
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkTraverse(DefaultAuthorizationProvider.java:180)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:137)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:138)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6599)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6581)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPathAccess(FSNamesystem.java:6506)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getEZForPath(FSNamesystem.java:9141)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getEZForPath(NameNodeRpcServer.java:1582)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.getEZForPath(AuthorizationProviderProxyClientProtocol.java:926)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getEZForPath(ClientNamenodeProtocolServerSideTranslatorPB.java:1343)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:619)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1060)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2040)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2038)
Any ideas how to work around this? Basically, we are trying to exploit the fact that by giving group-level read and write permissions, any user in the group should be able to create and use the tables created by the folder owner, but that does not seem to be possible. Is it because Impala alone has Sentry authorization, which uses user impersonation, while stand-alone Hive doesn't?
Can someone please guide or confirm?
Thanks
You can set the HDFS umask to 000 and restart the cluster. This ensures that all directories and files created after the change get permissions of 777. After that, apply proper ownership and permissions to the relevant directories and folders so that other directories are not left open. Setting the umask to 000 will not change the permissions of existing directories; only newly created directories and files are affected. If you are using Cloudera Manager, it is very easy to make this change.
NB: a umask of 000 gives all new files and directories a default permission of 777, i.e. wide-open permissions, so handle this by applying permissions and ACLs at the parent directory level.
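A rough sketch of the same change without Cloudera Manager (fs.permissions.umask-mode is the standard property name; the test1.db path is only used as an example of tightening things up afterwards):
# in core-site.xml on the client/gateway hosts, then restart the affected services:
#   fs.permissions.umask-mode = 000
# afterwards, put sensible ownership and permissions back on the directories you care about, e.g.
sudo -u hdfs hadoop fs -chown -R impala:hive /user/hive/warehouse/test1.db
sudo -u hdfs hadoop fs -chmod -R 770 /user/hive/warehouse/test1.db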

Hive is throwing permission error while creating table/database

I am getting a permission error in Hive.
I am using IBM cloud -
my.imdemocloud.com
hive> create table a(key INT);
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException org.apache.hadoop.security.AccessControlException: Permission denied: user=nehpraka, access=WRITE, inode="warehouse":biadmin:biadmin:rwxrwxr-x)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
Whereas I have granted all the roles to my user.
hive> SHOW GRANT USER nehpraka on DATABASE default;
OK
database default
principalName nehpraka
principalType USER
privilege Create
grantTime Wed Apr 16 14:17:51 EDT 2014
grantor nehpraka
Time taken: 0.051 seconds
Please help me out in this.
Thanks & Regards
I got the solution.
I was using the IBM BigInsights cloud; on it we can create a database in this manner:
create database DataBaseName location 'hdfs://master-1-internal.imdemocloud.com:9000/user/<user_name>';
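If you want to double-check where the database actually ended up, plain HiveQL can show its location:
describe database extended DataBaseName;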
Thanks all for helping me.
This looks like an HDFS folder write-permission issue.
Have you tried with the hdfs user? Basically, check the Hive warehouse directory on HDFS; that folder should have read/write permission for your user.
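For example (the warehouse path below is the usual default; on BigInsights take the real path from the error message, and the user and group names come from the error as well):
# see who owns the warehouse directory and whether your user can write to it
hadoop fs -ls -d /user/hive/warehouse
# one option: add your user to the owning group on the cluster nodes
usermod -aG biadmin nehpraka
# or, as the HDFS superuser, open the directory up
sudo -u hdfs hadoop fs -chmod -R 777 /user/hive/warehouse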
If you are using the Cloudera distribution of Hadoop and Hive, then try to open Hive using the command below:
sudo hive
If it asks for a password, enter cloudera.
In the Cloudera version, only the superuser has write permission to the Hive warehouse on HDFS.
Hope this helps.
