Hadoop Access Control Exception: Permissions - hadoop

Job setup failed : org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE,
inode="/mnt/var/lib/hadoop/tmp/2204827016_Attaching_UU_Codes_5C4141BF22014C8FAD3CD045070589C0/_temporary/1":hadoop:hadoop:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
It seems that only the user 'hadoop' can write to this location. What are the possible workarounds?

Your user's rights are not sufficient for that directory. Try running the job via sudo as the hadoop user, or change the permissions/ownership of the directory with chmod/chown so your user can write there.
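A hedged sketch of both workarounds, assuming the path from the error above and shell access on the cluster (the jar name and arguments are placeholders):

# run the job as the 'hadoop' user, who owns the staging directory
sudo -u hadoop hadoop jar your-job.jar <args>
# or, acting as the owner, open the staging directory up for other users
sudo -u hadoop hdfs dfs -chmod -R 777 /mnt/var/lib/hadoop/tmp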

Related

I can't run start-dfs.sh in my Hadoop Grid5000 cluster (Permission denied)

I can navigate from node to node over an SSH connection without any problems, for example from parasilo-1 to parasilo-10.
Running cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys doesn't change anything, unfortunately.
I am connected via SSH to my master node (parasilo-1) on Grid5000 and want to run an HDFS command:
user@parasilo-1:~$ ./hadoop/hadoop-3.3.4/sbin/start-dfs.sh
Starting namenodes on [parasilo-1.rennes.grid5000.fr]
parasilo-1.rennes.grid5000.fr: user@parasilo-1.rennes.grid5000.fr: Permission denied (publickey,password).
Starting datanodes
parasilo-1.rennes.grid5000.fr: user@parasilo-1.rennes.grid5000.fr: Permission denied (publickey,password).
parasilo-10.rennes.grid5000.fr: user@parasilo-10.rennes.grid5000.fr: Permission denied (publickey,password).
Starting secondary namenodes [parasilo-1.rennes.grid5000.fr]
parasilo-1.rennes.grid5000.fr: user@parasilo-1.rennes.grid5000.fr: Permission denied (publickey,password).
2023-01-12 15:54:57,462 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Does anyone have an idea how to make this command run correctly?
You need to ssh-copy-id your key to all datanodes, not only append it to the localhost authorized_keys file, and there should be no password prompt when you connect.
If it's still not working, there's no harm in generating a new key and trying again.
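A minimal sketch of that setup, assuming the node names from the log above and that "user" stands in for your actual account:

# generate a key pair if you don't have one (accept an empty passphrase)
ssh-keygen -t rsa
# copy the public key to every node in the workers file, including the master itself
ssh-copy-id user@parasilo-1.rennes.grid5000.fr
ssh-copy-id user@parasilo-10.rennes.grid5000.fr
# each node should now accept the connection without prompting for a password
ssh user@parasilo-10.rennes.grid5000.fr hostname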

org.apache.hadoop.security.AccessControlException: Permission denied: user=hbase, access=EXECUTE, inode="/tmp"

HBase error: HBase is down, caused by:
org.apache.hadoop.security.AccessControlException: Permission denied:
user=hbase, access=EXECUTE, inode="/tmp":hdfs:supergroup:d-wx------
The solution to the above error is to change the permissions on the /tmp directory so that the hbase user can traverse it:
sudo -u hdfs hdfs dfs -chmod 775 /tmp
Alternatively, set hdfs as the Hadoop user name so the chmod is executed with superuser rights:
$ export HADOOP_USER_NAME=hdfs
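To confirm the change took effect, you can list the directory entry itself (a quick check, not specific to HBase):

hdfs dfs -ls -d /tmp
# the permissions column should now read drwxrwxr-x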

hadoop distcp permission denied error while executing "distcp" command with non-super user

I am trying to copy data between two clusters using hadoop distcp, and the command is executed as a user that doesn't have any permissions; only the superuser has full file permissions (777). Is there a way to solve this problem?
Command executed on the shell:
hadoop distcp hdfs://[cluster1]:9000/home/[usr1]/data1 hdfs://[cluster2]:9000/home/[usr1]/data2
Error:
ERROR [main] tools.DistCp (DistCp.java:run(162)) - Exception encountered
org.apache.hadoop.security.AccessControlException: Permission denied: user=[usr1], access=EXECUTE, inode="/tmp":[superuser]:supergroup:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:350)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:311)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:238)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:189)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1751)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1735)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkOwner(FSDirectory.java:1680)
at org.apache.hadoop.hdfs.server.namenode.FSDirAttrOp.setPermission(FSDirAttrOp.java:64)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setPermission(FSNamesystem.java:1777)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.setPermission(NameNodeRpcServer.java:830)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.setPermission(ClientNamenodeProtocolServerSideTranslatorPB.java:469)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:503)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:871)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:817)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1889)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2606)
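The failed check is again an EXECUTE (traverse) check on /tmp, so the same workarounds as in the other threads apply here; a hedged sketch, assuming you can act as the superuser on the cluster that owns /tmp (the bracketed placeholders are kept from the question):

# as the superuser, let other users traverse /tmp (o+x), or open it fully with 1777
hdfs dfs -chmod o+x /tmp
# or run the copy itself under the superuser identity (only works on clusters without Kerberos)
export HADOOP_USER_NAME=[superuser]
hadoop distcp hdfs://[cluster1]:9000/home/[usr1]/data1 hdfs://[cluster2]:9000/home/[usr1]/data2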

error creating Hive table

While creating a table in Hive I am getting the error below:
Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied:
user=admin, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:281)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:262)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:242)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:169)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6621)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6603)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAncestorAccess(FSNamesystem.java:6555)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:4350)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:4320)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:4293)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:869)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.mkdirs(AuthorizationProviderProxyClientProtocol.java:323)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:608)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2086)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2082)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2080) )
You should read the log before posting. It would also be helpful to include the Hive statement you tried.
org.apache.hadoop.security.AccessControlException Permission denied:
user=admin, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
means that you are running the statement as a user (admin) who has no authorization to write under /user on HDFS. You should either run the statement as another user or give the admin user the rights to write to that location.
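For example, a hedged sketch of the second option, assuming the standard home directory layout under /user; run these as the HDFS superuser (the account that owns /user, which is root in the log above, or hdfs on many distributions):

# create a home directory for admin and hand it over to that user
hdfs dfs -mkdir -p /user/admin
hdfs dfs -chown admin /user/admin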

Where do we configure the credentials for the PutHDFS processor in Apache NiFi?

I have configured a path in PutHDFS and it is throwing an authentication error.
LOG:
2017-03-03 01:52:29,200 DEBUG [IPC Client (1496249304) connection to dnn01.com/10.4.151.88:8020 from root] org.apache.hadoop.ipc.Client IPC Client (1496249304) connection to dnn01.com/10.4.151.88:8020 from root got value #39976 2017-03-03 01:52:29,201 TRACE [Timer-Driven Process Thread-7] org.apache.hadoop.ipc.ProtobufRpcEngine 105: Exception <- dnn01.com/10.4.151.88:8020: create {org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/raw/externaltbls/falcon/testing/.1PUGETSLA_PO810.P0125.EDIINV.P20150125.107.20160304025143629.gz":hdfs:hdfs:drwxrwxr-x at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
2017-03-03 01:52:29,201 ERROR [Timer-Driven Process Thread-7] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=015a1010-9c64-1ed3-c39b-d19ab2dfe19b] Failed to write to HDFS due to org.apache.nifi.processor.exception.ProcessException: IOException thrown from PutHDFS[id=015a1010-9c64-1ed3-c39b-d19ab2dfe19b]: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/raw/externaltbls/falcon/testing/.1PUGETSLA_PO810.P0125.EDIINV.P20150125.107.20160304025143629.gz":hdfs:hdfs:drwxrwxr-x at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
The PutHDFS processor is going to execute as the OS user that NiFi is running as. In your case it looks like you are running NiFi as root because the log says "Permission denied: user=root, access=WRITE".
Your options are:
Give root WRITE access to the directory you are writing to (/raw/externaltbls/falcon/testing/), as shown in the sketch after this list
Run NiFi as a different user who has WRITE access
Use Kerberos and specify the principal and keytab in the PutHDFS processor
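For the first option, a hedged sketch, assuming the directory from the log above and that you can run commands as the hdfs user (the owner shown in the error):

# let other users (including root) write into the target directory
sudo -u hdfs hdfs dfs -chmod -R 777 /raw/externaltbls/falcon/testing
# or hand the directory over to the user NiFi runs as
sudo -u hdfs hdfs dfs -chown -R root /raw/externaltbls/falcon/testing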
