get: Permission denied in Hadoop

When I execute the get command it says permission denied.
I tried the solution already given, but it didn't work. Following is the command and its output:
hduser@ubuntu:~$ hadoop fs -get /user/hduser/Input/pg*.txt /home/vilas/Desktop/
Warning: $HADOOP_HOME is deprecated.
get: Permission denied

Check the permissions of the /user/hduser directory; maybe hduser does not have permission to access it. If so, you can execute the following command (as the hdfs user):
hdfs dfs -chown hduser:hduser /user/hduser
More information about chown is in the HDFS FileSystem shell documentation.
Then try again.
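For example, a rough sketch of checking the ownership first and then applying the change as the HDFS superuser (assuming the superuser account is called hdfs):
hdfs dfs -ls /user                                          # check the owner/group of /user/hduser
sudo -u hdfs hdfs dfs -chown -R hduser:hduser /user/hduser  # hand the directory over to hduser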

You must go into the Desktop directory, open a terminal there, and run the command:
hadoop fs -get /user/hduser/Input/pg*.txt .
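If the denial actually comes from the local side (hduser may not be allowed to write into /home/vilas/Desktop), fetching into a directory that hduser owns should also work; a rough sketch, where the /home/hduser destination is just an example:
hadoop fs -get /user/hduser/Input/pg*.txt /home/hduser/   # destination owned by hduser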

Related

Unable to write to HDFS as a non-sudo user

I've changed the permissions of an HDFS directory via
hdfs dfs -chmod 777 /path/to/dir
but when writing to that directory as a non-sudo user, I get a permission error:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=crtemois, access=WRITE, inode="/aggregation/system/data/clean":owners:hdfs:drwxr-xr-x
The reason was that Apache Ranger was layered on top. Even though the permissions were changed via chmod 777, writing was not possible unless the user's permission was also set in Apache Ranger.

Hadoop returns permission denied

I am trying to install Hadoop 2.7 on a cluster (two machines, hmaster and hslave1). I installed Hadoop in the folder /opt/hadoop/.
I followed this tutorial, but when I run the command start-dfs.sh I get the following error:
hmaster: starting namenode, logging to /opt/hadoop/logs/hadoop-hadoop-namenode-hmaster.out
hmaster: starting datanode, logging to /opt/hadoop/logs/hadoop-hadoop-datanode-hmaster.out
hslave1: mkdir: cannot create directory '/opt/hadoop\r': Permission denied
hslave1: chown: cannot access '/opt/hadoop\r/logs': No such file or directory
/logs/hadoop-hadoop-datanode-localhost.localdomain.out
I used chmod 777 on the hadoop folder on hslave1, but I still get this error.
Instead of using /opt/, use /usr/local/. If you get that permission issue again, grant the required permissions using chmod (I have already configured Hadoop 2.7 on 5 machines). Or else change ownership with sudo chown user:user on your log files directory.
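For example, a minimal sketch of fixing ownership on the slave, assuming the Hadoop user is called hadoop and the install stays under /opt/hadoop as in the question:
sudo chown -R hadoop:hadoop /opt/hadoop   # give the hadoop user ownership of the install directory
sudo chmod -R 755 /opt/hadoop             # readable/executable for everyone; owner keeps write access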
It seems you have already given the master passwordless access to log in to the slave.
Make sure you are logged in with a username that exists on both servers
(hadoop in your case, as the tutorial you are following uses the 'hadoop' user).
You can edit the '/etc/sudoers' file using 'sudo', or directly type 'visudo' in the terminal, and add the following entry for the newly created user 'hadoop':
hadoop ALL = NOPASSWD: ALL
This might resolve your issue.

Hortonworks Practice Exam - Copy File from local machine to hdfs ERROR

I am currently working on the Hortonworks practice exam and I am getting errors I have not been able to troubleshoot.
During the first step, the prompt asks me to put the three files from the /home/horton/datasets/flightdelays directory on the local machine into the /user/horton/flightdelays directory in HDFS, and I run into permission denied errors. When on the node that HDFS is installed on (root@namenode), I run the simple command:
hadoop fs -copyFromLocal /home/horton/datasets/flightdelays/flight_delays1.csv /user/horton/flightdelays
This returns the error: /home/horton/datasets/flightdelays/flight_delays1.csv: No such file or directory
When I run the exact same command from the command line on the local machine (horton@some-ip), instead of running it after SSHing onto the namenode, I get a permission denied error:
Permission denied: user=horton, access=WRITE, inode="/user/horton/flightdelays":hdfs:hdfs:drwxr-xr-x
If anyone has done this practice exam before or knows what this error is and could lend any assistance, it would be greatly appreciated. When researching online, a lot of people run into the same permission denied issue, but I'm going to assume that on a practice exam they set up, you shouldn't need to use sudo for every command you run.
Again any help would be fantastic thanks!!
Try this on CLI
sudo -u hdfs hdfs dfs -copyFromLocal /input/file/path /hdfs/path/
Try this in your command line
hadoop fs -put /localfile.txt /hdfs/path
The issue is that the folder you're trying to write to has ownership and permissions of hdfs:hdfs:drwxr-xr-x, meaning it is owned by the 'hdfs' user and group. Only the hdfs user has write permission to that folder; everyone else has read and execute permissions only. Thus writing to that folder as the 'horton' user will not work.
You need to run the command as hdfs like so:
sudo -u hdfs hadoop fs -copyFromLocal /home/horton/datasets/flightdelays/flight_delays1.csv /user/horton/flightdelays
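Alternatively, a one-time ownership change on the target directory lets the horton user write there without sudo afterwards; a rough sketch, assuming the paths from the question:
sudo -u hdfs hdfs dfs -chown -R horton:horton /user/horton/flightdelays   # hand the directory to horton
hadoop fs -copyFromLocal /home/horton/datasets/flightdelays/flight_delays1.csv /user/horton/flightdelays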

Can't create directory on hadoop file system

I installed Hadoop 2.7.1 as root in /usr/local.
Now I want to give access to multiple users.
When I executed the following command
hdfs dfs -mkdir /user
as the hadoop user, I got the error:
mkdir: Permission denied: user=hadoop, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
How can I resolve this problem? Please help me with this.
Thanks,
suchetan
The hdfs user is the admin user for HDFS. Switch to the hdfs user and give the necessary permissions to the user you want (hadoop),
or
you can disable dfs.permissions.enabled in hdfs-site.xml and restart. After that you can create the folder.
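A rough sketch of the first approach, assuming the HDFS superuser account is called hdfs and you want the hadoop user to own its home directory:
sudo -u hdfs hdfs dfs -mkdir -p /user/hadoop              # create the directory as the HDFS superuser
sudo -u hdfs hdfs dfs -chown hadoop:hadoop /user/hadoop   # then hand it over to the hadoop user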

Permission denied at hdfs

I am new to the Hadoop Distributed File System. I have done a complete single-node installation of Hadoop on my machine, but after that, when I try to upload data to HDFS, it gives the error message Permission Denied.
Message from the terminal with the command:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)
hduser@ubuntu:/usr/local/hadoop$
After using sudo and adding hduser to sudoers:
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
hduser@ubuntu:/usr/local/hadoop$
I solved this problem temporarily by disabling the DFS permission check, by adding the property below
to conf/hdfs-site.xml:
<property>
<name>dfs.permissions</name>
<value>false</value>
</property>
I had a similar situation and here is my approach, which is somewhat different:
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /
What you actually do is read the local file in accordance with your local permissions, but when placing the file on HDFS you are authenticated as the user hdfs. You can do this with another ID (beware of clusters with real authentication schemes configured, but this is usually not the case).
Advantages:
Permissions are kept on HDFS.
You don't need sudo.
You don't actually need a local user 'hdfs' at all.
You don't need to copy anything or change permissions, because of the previous points.
You are experiencing two separate problems here:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input put: /usr/local/input-data (Permission denied)
Here, the user hduser does not have access to the local directory /usr/local/input-data. That is, your local permissions are too restrictive. You should change them.
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
Here, the user root (since you are using sudo) does not have access to the HDFS directory /input. As you can see: hduser:supergroup:rwxr-xr-x says only hduser has write access. Hadoop doesn't really respect root as a special user.
To fix this, I suggest you change the permissions on the local data:
sudo chmod -R og+rx /usr/local/input-data/
Then, try the put command again as hduser.
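Putting both fixes together, a minimal sequence using the paths from the question might look like this:
sudo chmod -R og+rx /usr/local/input-data/    # let hduser read the local data
hadoop fs -put /usr/local/input-data/ /input  # run the put as hduser, not via sudo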
I solved this problem by using the following steps:
su hdfs
hadoop fs -put /usr/local/input-data/ /input
exit
Start a shell as hduser (from root) and run your command
sudo -u hduser bash
hadoop fs -put /usr/local/input-data/ /input
[update]
Also note that the hdfs user is the super user and has all r/w privileges.
For Hadoop 3.x, if you try to create a file on HDFS when unauthenticated (e.g. user=dr.who) you will get this error.
This is not recommended for systems that need to be secure; however, if you'd like to disable file permissions entirely in Hadoop 3, the hdfs-site.xml setting has changed to:
<property>
<name>dfs.permissions.enabled</name>
<value>false</value>
</property>
https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
