Unable to write to HDFS as a non-sudo user - hadoop

I've changed the permissions of an HDFS directory via
hdfs dfs -chmod 777 /path/to/dir
but when writing to that directory as a non-sudo user, I get a permission error:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=crtemois, access=WRITE, inode="/aggregation/system/data/clean":owners:hdfs:drwxr-xr-x

The reason is that Apache Ranger was layered on top. Even though the permissions were changed via chmod 777, writing is not possible unless the user is also granted access in an Apache Ranger policy.
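As a sanity check, you can confirm that the plain HDFS permissions really are open; if they are and writes still fail, an external authorizer such as Ranger is making the decision. A minimal sketch using the path from the error above:
# Inspect the POSIX bits and ACLs that HDFS itself sees
hdfs dfs -ls -d /aggregation/system/data/clean
hdfs dfs -getfacl /aggregation/system/data/clean
# If these look wide open but this write still fails, authorization is
# coming from elsewhere (e.g. a Ranger policy for the 'crtemois' user)
hdfs dfs -touchz /aggregation/system/data/clean/_perm_test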

Related

Hadoop returns permission denied

I am trying to install Hadoop (2.7) on a cluster (two machines, hmaster and hslave1). I installed Hadoop in the folder /opt/hadoop/.
I followed this tutorial, but when I run the command start-dfs.sh, I get the following error:
hmaster: starting namenode, logging to /opt/hadoop/logs/hadoop-hadoop-namenode-hmaster.out
hmaster: starting datanode, logging to /opt/hadoop/logs/hadoop-hadoop-datanode-hmaster.out
hslave1: mkdir: cannot create directory « /opt/hadoop\r »: Permission denied
hslave1: chown: cannot access « /opt/hadoop\r/logs »: No such file or directory
/logs/hadoop-hadoop-datanode-localhost.localdomain.out
I used chmod 777 on the hadoop folder on hslave1, but I still get this error.
Instead of using /opt/, use /usr/local/. If you get that permission issue again, grant the permissions using chmod; I have already configured Hadoop 2.7 on 5 machines this way. Alternatively, use sudo chown user:user on your log files directory.
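For instance, a minimal sketch of fixing ownership of the install and its logs on the slave (the hadoop user and the /opt/hadoop path are taken from the question; adjust to your layout):
# Run on hslave1: give the hadoop user ownership of the install directory
sudo chown -R hadoop:hadoop /opt/hadoop
sudo chmod -R 755 /opt/hadoop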
It seems you have already given the master passwordless access to log in to the slave.
Make sure you are logged in with a username that exists on both servers
(hadoop in your case, as the tutorial you are following uses the 'hadoop' user).
You can edit the '/etc/sudoers' file using 'sudo', or directly type 'visudo' in the terminal, and add the following permission for the newly created user 'hadoop':
hadoop ALL = NOPASSWD: ALL
This might resolve your issue.
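A quick way to verify both points; a minimal sketch (the hostname hslave1 and the 'hadoop' user come from the question):
# From hmaster: passwordless SSH should log in as the same user on the slave
ssh hadoop@hslave1 'whoami && id'
# Edit sudoers safely through visudo instead of modifying the file directly,
# then append: hadoop ALL = NOPASSWD: ALL
sudo visudo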

get : permission denied in hadoop

When I execute the get command it says permission denied.
I tried the solution already given, but it didn't work. Following is the command and its output:
hduser@ubuntu:~$ hadoop fs -get /user/hduser/Input/pg*.txt /home/vilas/Desktop/
Warning: $HADOOP_HOME is deprecated.
get: Permission denied
Check the permissions of the /user/hduser directory; maybe hduser does not have permission to access it. If so, you can execute the following command (as the hdfs user):
hdfs dfs -chown hduser:hduser /user/hduser
More information about chown is in the File System Shell documentation.
Then try again.
You must go into the Desktop directory, open a terminal there, and run the command
hadoop fs -get /user/hduser/Input/pg*.txt .
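If the command still fails, check both sides of the copy: read access on the HDFS source and write access on the local destination. A minimal sketch with the paths from the question:
# HDFS side: can hduser read the input files?
hdfs dfs -ls /user/hduser/Input
# Local side: is the destination directory writable by the current user?
ls -ld /home/vilas/Desktop
touch /home/vilas/Desktop/_write_test && rm /home/vilas/Desktop/_write_test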

Can't create directory on hadoop file system

I installed Hadoop 2.7.1 as root in /usr/local.
Now I want to give access to multiple users.
When I executed the following command
hdfs dfs -mkdir /user
as the hadoop user, I got the error
mkdir: Permission denied: user=hadoop, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
How can I resolve this problem? Please help me with it.
Thanks,
suchetan
The hdfs user is the admin user for HDFS. Change to the hdfs user and give the necessary permissions to the user you want (hadoop),
or
you can disable dfs.permissions.enabled in hdfs-site.xml and restart. After that you can create the folder.
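A minimal sketch of the first option, assuming the superuser account is named hdfs as on most distributions (the hadoop username comes from the question):
# As the HDFS superuser, create the user's home directory and hand over ownership
sudo -u hdfs hdfs dfs -mkdir -p /user/hadoop
sudo -u hdfs hdfs dfs -chown hadoop:hadoop /user/hadoop
For the second option, the property to change in hdfs-site.xml would look like the snippet below; note that disabling permission checks is only sensible on a sandbox cluster:
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>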

AccessControlException in Hadoop for access=EXECUTE

I have a small application which reads a file from my local machine and writes the data into HDFS.
Now I want to list the files present in the HDFS folder, say HadoopTest. When I try to do that, I get the exception below:
org.apache.hadoop.security.AccessControlException: Permission denied: user=rpoornima, access=EXECUTE, inode="/hbase/HadoopTest/Hadoop_File_1.txt":rpoornima:hbase:-rw-r--r--
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:161)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4547)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:4523)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:3312)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:3289)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:652)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:431)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44098)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
I'm not sure how to resolve this issue. Kindly give your inputs.
Your exception is clear enough to show the problem.
As the exception says:
Permission denied: user=rpoornima, access=EXECUTE,
inode="/hbase/HadoopTest/Hadoop_File_1.txt":rpoornima:hbase:-rw-r--r--
This means your account rpoornima only has -rw-r--r-- permission (no execute) on the file /hbase/HadoopTest/Hadoop_File_1.txt. So you have to use another account with full privileges to do the execution.
UPDATE
If you want to give access to a specified user, use the chown command.
chown
Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
Change the owner of files. The user must be a super-user. Additional information is in the Permissions Guide.
Options
The -R option will make the change recursively through the directory structure.
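For example, a hedged sketch of granting rpoornima the needed access, using the paths from the question (run as a superuser); note that listing a directory requires EXECUTE on the directory itself:
hadoop fs -chown -R rpoornima:hbase /hbase/HadoopTest
hadoop fs -chmod -R 755 /hbase/HadoopTest
# Listing should now succeed for rpoornima
hadoop fs -ls /hbase/HadoopTest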

Permission Denied error while creating database in hive

I am trying to create a database in Hive, but when I run the query below in Hive:
CREATE DATABASE BIGDATA;
I receive the following error message:
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException org.apache.hadoop.security.AccessControlException: Permission denied: user=aseema, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
What is causing this?
This is because the user aseema lacks permissions in HDFS. Follow the steps below.
Log in as hduser and perform the following operations (from the logs, it seems hduser is a superuser):
hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -mkdir /tmp
hadoop fs -chmod -R 777 /user/hive
hadoop fs -chmod 777 /tmp
After this, try executing the CREATE DATABASE statement as the aseema user.
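As a quick verification before retrying, a minimal sketch (run as the aseema user):
# Both paths should now show write access for others
hadoop fs -ls -d /user/hive/warehouse /tmp
hive -e 'CREATE DATABASE BIGDATA;'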
If you are running in local mode, then you should run this command as the hdfs user:
su hdfs
Then change the ownership as below if you want:
hdfs dfs -chown -R <username_of_new_owner> /user
