Cloudera user not allowed to manipulate the HDFS filesystem on Hadoop

I am trying to create a folder in the HDFS file system, but it does not let me create it as the cloudera user nor as root. What should I configure to allow this? Here was my attempt:
[cloudera@quickstart ~]$ sudo hadoop fs -mkdir /solr/test_core
mkdir: Permission denied: user=root, access=WRITE, inode="/solr":solr:supergroup:drwxr-xr-x
[cloudera@quickstart ~]$ su
Password:
[root@quickstart cloudera]# hadoop fs -mkdir /solr/test_core
mkdir: Permission denied: user=root, access=WRITE, inode="/solr":solr:supergroup:drwxr-xr-x
[root@quickstart cloudera]#

Neither the cloudera nor the root user has permission to write under /solr.
To run a command there you need to switch to the hdfs user and then issue it, like below:
su - hdfs
hadoop fs -mkdir /solr/test_core/
exit
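If the cloudera user needs to write into the new directory afterwards, ownership can be handed over once it exists. A minimal sketch assuming the same paths as above; the chown step is an addition, not part of the original answer:
su - hdfs
hadoop fs -mkdir /solr/test_core/
# hand the new directory to the cloudera user so it can write there without switching users
hadoop fs -chown cloudera:cloudera /solr/test_core/
exit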

Found the answer:
You should use this slightly odd command:
sudo -u hdfs hdfs dfs -mkdir /solr/test_core/

To switch to the hdfs user:
sudo su - hdfs
Then you can create directories under /solr.
To switch back to the cloudera user:
su - cloudera
and enter the password for cloudera.
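To confirm the core directory exists and check who owns it, a quick listing works (a verification step, not part of the original answer):
sudo -u hdfs hdfs dfs -ls /solr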

Related

Permission is denied when moving a file from one repository to another

Assume that I want to move a csv file from /home/user to /hdfs/data/adhoc/PR/02/RDO0/OUTPUT/
So:
hadoop fs -mkdir -m 777 /hdfs/data/adhoc/PR/02/RDO0/OUTPUT/
hadoop fs -moveFromLocal RDO07J420.csv $OUTPUT_FILE_OCRE/MGM7J420-${OPC_DISO8601}.csv
But, I get this problem :
moveFromLocal: Permission denied: user=fs191, access=WRITE,
inode="/hdfs/data/adhoc/PR/02/RDO0/OUTPUT/MGM7J420-.csv.COPYING":RDO0-mdoPR:bfRDO0:drwxr-x---
Your local user does not have write rights in HDFS.
Try
sudo -u hdfs hadoop fs -moveFromLocal RDO07J420.csv $OUTPUT_FILE_OCRE/MGM7J420-${OPC_DISO8601}.csv
hdfs is the HDFS superuser and has write rights everywhere, but I suggest managing users and permissions properly; see:
http://www.informit.com/articles/article.aspx?p=2755708&seqNum=3
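A minimal sketch of that "manage permissions properly" route, using the user, group, and paths shown in the error message; the 775 mode is an assumption, and it only helps if fs191's HDFS group mapping includes bfRDO0:
# as the HDFS superuser, let the owning group write into the target directory
sudo -u hdfs hadoop fs -chmod 775 /hdfs/data/adhoc/PR/02/RDO0/OUTPUT/
# then the move can run as the local user fs191
hadoop fs -moveFromLocal RDO07J420.csv $OUTPUT_FILE_OCRE/MGM7J420-${OPC_DISO8601}.csv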

How do you create a Hive warehouse directory?

I've installed Hadoop and Hive. I am trying to configure Hive as follows:
hadoop fs -mkdir /data/hive/warehouse
I keep getting this error:
mkdir: '/data/hive/warehouse': No such file or directory
Do I need to create the directories with os commands before issuing the hadoop fs command? Any ideas?
You're missing the -p option, similar to mkdir -p on UNIX/Linux.
$ hadoop fs -mkdir -p /data/hive/warehouse
In addition, you should chmod 1777 this directory if you're setting it up for multiple users, and create /user/hive if you're running Hive as the hive user.
$ hadoop fs -chmod -R 1777 /data/hive/warehouse
$ hadoop fs -mkdir -p /user/hive
$ hadoop fs -chown hive:hive /user/hive
See Apache Hive File System Permissions in CDH and Where does Hive store files in HDFS?.
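To sanity-check the setup afterwards, a quick listing of the parent directories should show the sticky-bit mode on the warehouse and hive's ownership of /user/hive (a verification step, not part of the original answer):
$ hadoop fs -ls /data/hive
$ hadoop fs -ls /user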

"hadoop fs -mkdir" permission denied despite being in the correct group

I am trying to create a folder in HDFS from the command line with a user other than hdfs. The directory has permissions 775 and is owned by hdfs:hdfs:
$ hadoop fs -ls /
... directories ...
drwxrwxr-x - hdfs hdfs 0 2018-02-21 11:37 /data
... more directories
My user is in the group hdfs:
$ cat /etc/group
hdfs:x:nnnn:myusername
However, when I do hadoop fs -mkdir /data/foo I get:
mkdir: Permission denied: user=myusername, access=WRITE, inode="/data/foo":hdfs:hdfs:drwxrwxr-x
Does hdfs have to be my primary group for this?
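One thing worth checking: HDFS resolves group membership on the NameNode (or through its configured group-mapping service), not from the client's /etc/group, so the NameNode may not see your group entry. A quick way to verify, sketched here as an assumption about the likely cause:
# show the groups HDFS actually resolves for the user
hdfs groups myusername
# if the group entry is new, ask the NameNode to reload its user-to-groups mapping
sudo -u hdfs hdfs dfsadmin -refreshUserToGroupsMappings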

Hadoop directory file to user folder

I have created a folder in the root directory and I'm trying to copy it to HDFS, but I'm getting an error message. These are the steps I have followed:
[root@dh] ls
XXdirectoryXX
[root@dh] sudo -u hdfs hadoop fs -mkdir /user/uname
[root@dh] uname
[root@dh] sudo -u hdfs hadoop fs -chown uname /user/uname
[root@dh] su - uname
[uname@dh] hadoop fs -copyFromLocal XXdirectoryXX/ /user/uname
copyFromLocal: 'XXdirectoryXX/': No such file or directory
Is there a problem with the commands I've run, or should I use another command to copy the files over?
I'm using CentOS 6.8 on the machine.
Any ideas?
Thanks
Thanks to the comments I've managed to resolve the issue. Here is the code in case it helps someone:
[root@dh] sudo -u hdfs hadoop fs -chown -R root /user/uname
[root@dh] hadoop fs -copyFromLocal XXdirectoryXX/ /user/uname
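If uname should own the copied data afterwards, a small follow-up (an addition, not part of the original resolution) would be:
sudo -u hdfs hadoop fs -chown -R uname /user/uname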
Regards

No such file or directory error when using the hadoop fs -copyFromLocal command

I have a local VM that has Hortonworks Hadoop and HDFS installed on it. I ssh'ed into the VM from my machine and now I am trying to copy a file from my local filesystem into HDFS through the following set of commands:
[root@sandbox ~]# sudo -u hdfs hadoop fs -mkdir /folder1/
[root@sandbox ~]# sudo -u hdfs hadoop fs -copyFromLocal /root/folder1/file1.txt /hdfs_folder1/
When I execute it I get the following error: copyFromLocal: '/root/folder1/file1.txt': No such file or directory
I can see that file right in the /root/folder1/ directory, but with the hdfs command it throws the above error. I also tried to cd to /root/folder1/ and then execute the command, but the same error comes. Why is the file not found when it is right there?
By running sudo -u hdfs hadoop fs ..., the command tries to read the file /root/folder1/file1.txt as the hdfs user, which cannot access /root.
You can do this.
Run chmod -R 755 /root. It will change permissions on the directory and files recursively, but it is not recommended to open up permissions on the root home directory.
Then you can run copyFromLocal as sudo -u hdfs to copy the file from the local file system to HDFS.
A better practice is to create a user space for root and copy files directly as root.
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
hadoop fs -copyFromLocal
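Put together, that better practice looks roughly like this; the local path comes from the question, and landing the file directly under /user/root is an assumption:
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
# now run the copy as root; root can read the local file and owns the HDFS target
hadoop fs -copyFromLocal /root/folder1/file1.txt /user/root/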
I had the same problem running a Hortonworks 4-node cluster. As mentioned, the "hdfs" user doesn't have permission to the root user's home directory. The solution is to copy the data from the root folder to somewhere the "hdfs" user can access; in the standard Hortonworks installation this is /home/hdfs.
As root, run the following...
mkdir /home/hdfs/folder1
cp /root/folder1/file1.txt /home/hdfs/folder1
Now switch to the hdfs user and run the commands from a directory the hdfs user can access:
su hdfs
cd /home/hdfs/folder1
Now you can put the file into HDFS as the hdfs user:
hdfs dfs -put file1.txt /hdfs_folder1
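A quick check that the upload landed where expected (not part of the original answer; it assumes the same /hdfs_folder1 destination):
hdfs dfs -ls /hdfs_folder1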
