Permission denied at hdfs - shell

I am new to the Hadoop distributed file system. I have done a complete single-node installation of Hadoop on my machine, but after that, when I try to upload data to HDFS, it gives a Permission Denied error.
Message from the terminal, with the command:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)
hduser@ubuntu:/usr/local/hadoop$
After using sudo and adding hduser to the sudoers list:
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
hduser@ubuntu:/usr/local/hadoop$

I solved this problem temporarily by disabling the DFS permission check, by adding the property below
to conf/hdfs-site.xml:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
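Note that the NameNode has to be restarted before this change takes effect; with a classic single-node setup like the one above, something along these lines should work (script locations vary by Hadoop version):

# Restart HDFS so the new dfs.permissions value is picked up
# (in Hadoop 2+ these scripts live under sbin/ instead of bin/)
bin/stop-dfs.sh
bin/start-dfs.sh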

I had a similar situation; here is my approach, which is somewhat different:
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /
What you actually do is read the local file in accordance with your local permissions, but when placing the file on HDFS you are authenticated like the user hdfs. You can do this with another ID (beware of real authentication scheme configurations, but this is usually not the case).
Advantages:
Permissions are kept on HDFS.
You don't need sudo.
You don't actually need a local user 'hdfs' at all.
You don't need to copy anything or change permissions because of the previous points.
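Applied to the original question, the same trick would look like this (a hypothetical adaptation; it assumes no real authentication scheme such as Kerberos is configured):

# Write to HDFS as hduser, the owner of the target directory,
# regardless of which local user runs the command
HADOOP_USER_NAME=hduser hadoop fs -put /usr/local/input-data/ /input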

You are experiencing two separate problems here:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)
Here, the user hduser does not have access to the local directory /usr/local/input-data. That is, your local permissions are too restrictive. You should change them.
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
Here, the user root (since you are using sudo) does not have access to the HDFS directory /input. As you can see: hduser:supergroup:rwxr-xr-x says only hduser has write access. Hadoop doesn't really respect root as a special user.
To fix this, I suggest you change the permissions on the local data:
sudo chmod -R og+rx /usr/local/input-data/
Then, try the put command again as hduser.
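You can verify the local permissions before and after the chmod with a plain ls:

# -d lists the directory entry itself instead of its contents
ls -ld /usr/local/input-data/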

I've solved this problem by using the following steps:
su hdfs
hadoop fs -put /usr/local/input-data/ /input
exit

Start a shell as hduser (from root) and run your command
sudo -u hduser bash
hadoop fs -put /usr/local/input-data/ /input
[update]
Also note that the hdfs user is the super user and has all r/w privileges.
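If you only need the one command, sudo -u can also run it directly, without starting an interactive shell:

# Run a single put as hduser instead of switching users
sudo -u hduser hadoop fs -put /usr/local/input-data/ /input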

For Hadoop 3.x, if you try to create a file on HDFS while unauthenticated (e.g. user=dr.who), you will get this error.
It is not recommended for systems that need to be secure; however, if you'd like to disable file permissions entirely in Hadoop 3, the hdfs-site.xml setting has changed to:
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
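As with the older setting, the NameNode must be restarted before the change takes effect; in Hadoop 3 that is typically:

# Restart the NameNode daemon so the new value is read
hdfs --daemon stop namenode
hdfs --daemon start namenode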

Related

Not able to access /tmp folder in HDFS

I have started the name node, data node and MR services on my local machine, and all the services are running. Here is the result of the jps command:
kv:~ karan.verma$ jps
4499 SecondaryNameNode
420
4676 NodeManager
4741 JobHistoryServer
5125 Jps
4406 DataNode
4600 ResourceManager
4333 NameNode
And I could easily browse through the "Browse Directory" section of the name node web UI. But when I try to browse the /tmp directory, it shows me the following error:
Permission denied: user=root, access=READ_EXECUTE, inode="/tmp":karan.verma:karan.verma:drwxrwx-w-
I tried to change the permissions using the following commands:
hadoop fs -chown -R karan.verma:karan.verma hdfs://localhost/
hadoop fs -chmod a+w /
but no luck. Please suggest what could be the issue. I executed the above commands with sudo, but still the same result. Any help?
It looks like you are running as root, and the file system is owned by karan.verma.
You can confirm this by running:
whoami
Either su to karan.verma or add root to the karan.verma group.
Executing the following command solved the issue for me:
hadoop fs -chmod -R 777 hdfs://localhost/
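A narrower fix than 777 on the entire filesystem is to open up only /tmp (run it as karan.verma, the directory's owner, or as the HDFS superuser); the sticky bit keeps users from deleting each other's files:

# Open up /tmp only, with the sticky bit set
hadoop fs -chmod 1777 /tmp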

Permission denied issue in mapreduce?

I have tried the command below:
hadoop jar /home/cloudera/workspace/para.jar word.Paras examples/wordcount /home/cloudera/Desktop/words/output
The map reduce job starts, but after that it shows the error below. Can anyone please help with this issue?
15/11/04 10:33:57 INFO mapred.JobClient: Task Id : attempt_201511040935_0008_m_000002_0, Status : FAILED
org.apache.hadoop.security.AccessControlException: Permission denied: user=cloudera, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
Do I need to change anything in a config file or in Cloudera Manager?
The exception suggests that you are trying to write to the HDFS root directory "/", which you (user cloudera) do not have permission to do.
Without knowing what your specific jar does:
I guess that the last argument ("/home/cloudera/Desktop/words/output") is where you wish to place the output.
I guess this is supposed to be within HDFS, where /home does not exist.
Try changing this to somewhere you can write, possibly "/user/cloudera/words/output".
There is a set of default directories to be created before you start using the Hadoop cluster.
Run the following; it should show you the directories:
$ hadoop fs -ls /
For example, if you want to run as the cloudera user, you need the following on HDFS (see the sketch below):
/user/cloudera -- the user running the program
/user/hadoop -- your hadoop file system user
/user/mapred -- your mapred user
/tmp -- temporary directory, which needs to have permission 1777 (hadoop fs -chmod 1777 /tmp)
HTH.
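A minimal sketch of creating that layout, assuming a local hdfs superuser account exists (as it does on Cloudera installs):

# Create the default directories and hand them to their owners
sudo -u hdfs hadoop fs -mkdir -p /user/cloudera /user/hadoop /user/mapred /tmp
sudo -u hdfs hadoop fs -chown cloudera /user/cloudera
sudo -u hdfs hadoop fs -chmod 1777 /tmp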
The last argument that you are passing should be an output path in HDFS, not on the local file system.
As you are running as the cloudera user, you can point it to /user/cloudera/words/output. But first you need to check whether you have /user/cloudera in your HDFS and whether you have write permission, by issuing the following:
hadoop fs -ls /user/
Once you have it, change your command to the following:
hadoop jar /home/cloudera/workspace/para.jar word.Paras examples/wordcount <path_where_you_have_write_permission_in_HDFS>
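For example, using the /user/cloudera path suggested above (the jar and class names are taken from the original command):

# Make sure the parent directory exists and is writable by cloudera first;
# the output directory itself must not exist yet
hadoop fs -mkdir -p /user/cloudera/words
hadoop jar /home/cloudera/workspace/para.jar word.Paras examples/wordcount /user/cloudera/words/output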

get : permission denied in hadoop

When I execute the get command, it says permission denied.
I tried the solutions already given, but they didn't work. Following is the command and its output:
hduser@ubuntu:~$ hadoop fs -get /user/hduser/Input/pg*.txt /home/vilas/Desktop/
Warning: $HADOOP_HOME is deprecated.
get: Permission denied
Check the permissions of the /user/hduser directory; maybe hduser does not have permission to access it. If so, you can execute the following command (as the hdfs user):
hdfs dfs -chown hduser:hduser /user/hduser
More information about chown is available in the Hadoop FileSystem shell documentation.
Then try again.
You must go into the Desktop directory, open the terminal there, and run the command:
hadoop fs -get /user/hduser/Input/pg*.txt .
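Since -get writes to the local file system, it is also worth checking that your current user can write to the destination directory:

# Shows owner and permissions of the local destination
ls -ld /home/vilas/Desktop/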

Can't create directory on hadoop file system

I installed Hadoop 2.7.1 as root in /usr/local.
Now I want to give access to multiple users.
When I executed the following command
hdfs dfs -mkdir /user
as the hadoop user, I got the error
mkdir: Permission denied: user=hadoop, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
How do I resolve this problem? Please help me with this.
Thanks,
suchetan
The hdfs user is the admin user for HDFS. Change to the hdfs user and give the necessary permissions to the user you want (hadoop),
or
you can disable dfs.permissions.enabled in hdfs-site.xml and restart. After that you can create the folder.
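A minimal sketch of the first option, assuming an hdfs superuser account exists on your system (on a plain install, the superuser is whichever account started the NameNode):

# As the HDFS superuser, give the hadoop user its own home directory
sudo -u hdfs hdfs dfs -mkdir -p /user/hadoop
sudo -u hdfs hdfs dfs -chown hadoop:hadoop /user/hadoop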

Permission Denied error while creating database in hive

I am trying to create a database in Hive, but when I run the query below in Hive:
CREATE DATABASE BIGDATA;
I receive the following error message:
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException org.apache.hadoop.security.AccessControlException: Permission denied: user=aseema, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
What is causing this?
This is because the user aseema lacks permissions in HDFS. Follow the steps below.
Log in as hduser and perform the following operations (from the logs, it seems hduser is a superuser):
hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -mkdir /tmp
hadoop fs -chmod -R 777 /user/hive
hadoop fs -chmod 777 /tmp
After this, try executing the CREATE DATABASE statement as the aseema user.
If you are running in local mode, then you should run this command as the hdfs user:
su hdfs
Then change the ownership as below if you want:
hdfs dfs -chown -R <username_of_new_owner> /user
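For example, to give aseema (the user from the error above) her own home directory instead of handing over all of /user:

# Run as the hdfs superuser; aseema's home path here is an assumption
hdfs dfs -mkdir -p /user/aseema
hdfs dfs -chown -R aseema /user/aseema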
