Problems creating a volume in MapR Hadoop

I want to create a volume (MyVolume), but it cannot mount the volume. It fails with this error: Failed to mount volume MyVolume, Permission denied. Its permissions are root and admin. How can I create a volume that shows up as a folder in my cluster? Thanks.

It sounds like the user you are logging in as doesn't have the right permissions. This link will help you set the permissions up correctly:
http://www.mapr.com/doc/display/MapR/Managing+Permissions
When starting a new cluster, I typically create a 'mapr' user and give him admin permissions with this command:
maprcli acl edit -type cluster -user mapr:fc
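Once that acl is in place, creating and mounting the volume as the admin user should go through. A rough sketch (the volume name and mount path below are only illustrative, adjust them to your cluster):
maprcli volume create -name MyVolume -path /MyVolume
hadoop fs -ls /
The second command simply checks that the new mount point shows up at the cluster root.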
Hope this helps!

Related

Hadoop returns permission denied

I am trying to install Hadoop (2.7) on a cluster (two machines, hmaster and hslave1). I installed Hadoop in the folder /opt/hadoop/.
I followed this tutorial, but when I run the command start-dfs.sh, I get the following error:
hmaster: starting namenode, logging to /opt/hadoop/logs/hadoop-hadoop-namenode-hmaster.out
hmaster: starting datanode, logging to /opt/hadoop/logs/hadoop-hadoop-datanode-hmaster.out
hslave1: mkdir: cannot create directory « /opt/hadoop\r »: Permission denied
hslave1: chown: cannot access « /opt/hadoop\r/logs »: No such file or directory
/logs/hadoop-hadoop-datanode-localhost.localdomain.out
I used the command chmod 777 on the hadoop folder on hslave1, but I still get this error.
Instead of using /opt/, use /usr/local/. If you get that permission issue again, grant the permissions using chmod (I have already configured Hadoop 2.7 on 5 machines this way). Or else use sudo chown user:user on your log files directory.
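For example, assuming the tutorial's 'hadoop' user should own the install on the slave (the path matches the question; adjust if yours differs), something like:
sudo chown -R hadoop:hadoop /opt/hadoop
sudo chmod -R 755 /opt/hadoop
Run this on hslave1 so the startup scripts can create the logs directory there.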
It seems you have already given the master passwordless access to log in to the slave.
Make sure you are logged in with a username that exists on both servers
('hadoop' in your case, as the tutorial you are following uses the 'hadoop' user).
You can edit the '/etc/sudoers' file using 'sudo', or directly type 'visudo' in the terminal, and add the following permission for the newly created user 'hadoop':
hadoop ALL = NOPASSWD: ALL
This might resolve your issue.

Can't create directory on hadoop file system

I installed Hadoop 2.7.1 as root in /usr/local.
Now I want to give access to multiple users.
When I executed the following command
hdfs dfs -mkdir /user
as the hadoop user, I got the error
mkdir: Permission denied: user=hadoop, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
How can I resolve this problem? Please help me with this.
Thanks
suchetan
The hdfs user is the admin user for HDFS. Change to the hdfs user and give the necessary permissions to the user you want (hadoop)
or
you can disable dfs.permissions.enabled in hdfs-site.xml and restart. After that you can create a folder.
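A minimal sketch of the first option (run as whichever account is the HDFS superuser; here Hadoop was started as root, on packaged installs it is typically hdfs):
hdfs dfs -mkdir -p /user/hadoop
hdfs dfs -chown hadoop:hadoop /user/hadoop
For the second option, the property to put in hdfs-site.xml before restarting would be:
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>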

"Permission denied" for almost everything after a successful ssh into gcloud instance that was created using bdutil

Just created an instance and deployed a cluster using bdutil. SSH works fine, as I can ssh into the instance using ./bdutil shell.
When I try to access directories such as Hadoop, hdfs etc., it throws an error:
Permission Denied
The terminal prompt appears like this: username@hadoop-m $. I know hadoop-m is the name of the instance. What is the username? It shows my name, but I don't know where it got this from or what the password is.
I am using Ubuntu to ssh into the instance.
Not a Hadoop expert, so I can only answer a bit generally. On GCE, when you ssh in, gcloud creates a username from your Google account name. Hadoop directories such as hadoop or hdfs are probably owned by a different user. Please try using sudo chmod to give your username permission to read/write the directories you need.
To elaborate on Jeff's answer, bdutil-deployed clusters set up the user hadoop as the Hadoop admin (this 'admin' user may differ on other Hadoop systems, where admin accounts may be split into separate users hdfs, yarn, mapred, etc.). Note that bdutil clusters should work without needing to deal with Hadoop admin accounts for normal jobs, but if you need to access those Hadoop directories, you can either do:
sudo su hadoop
or
sudo su
to open a shell as hadoop or root, respectively. Or as Jeff mentions, you can sudo chmod to grant broader access to your own username.
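If you only need read/write access to a few directories rather than a full admin shell, something along these lines may be enough (the path is a placeholder, point it at whichever directory was denied):
sudo chown -R $(whoami) /path/to/the/denied/directory
For files that live inside HDFS rather than on the local filesystem, you would instead change ownership with hadoop fs -chown while running as the hadoop user.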

SafeModeException at cosmos.lab.fi-ware.org

According to the wiki
http://forge.fiware.org/plugins/mediawiki/wiki/fiware/index.php/BigData_Analysis_-_Quick_Start_for_Programmers#Step_1._Create_a_Cosmos_account
and via
ssh my_user@cosmos.lab.fi-ware.org
1) I realize that there is no folder '/user/my_user', only '/home/my_user'. Why? May I suppose that my user is not properly created?
2) I am trying to create a folder, but I get the SafeModeException:
hadoop fs -mkdir /home/my_user/test
mkdir: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /home/my_user/test. Name node is in safe mode.
I have tried:
hadoop dfsadmin -safemode leave
with this result:
safemode: org.apache.hadoop.security.AccessControlException: Access denied for user my_user. Superuser privilege is required
Thanks!
Pablo, yesterday the Namenode of the Cosmos instance entered safe mode because the HDD was running out of space. It should be fixed now, but while safe mode was enabled, nothing could be done with HDFS, including your user account creation.
I have completed the registration process manually, try it and let me know if something is still wrong (I also answered you by private email, with all the details regarding your user).
Regarding the Hadoop commands you tried (leaving safe mode and creating a folder under /user), these are privileged operations :)
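For reference, this is roughly what those privileged operations look like when run by the HDFS superuser (an ordinary account such as my_user will keep getting AccessControlException):
hadoop dfsadmin -safemode leave
hadoop fs -mkdir /user/my_user
hadoop fs -chown my_user /user/my_user
The last two commands are a sketch of what a manual account registration would involve, which is why /user/my_user only appears once an operator has created it.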

Hadoop Single Node : Permission Denied

I just installed Hadoop in single-node mode, but when I run it by logging on to localhost, it gives an error that it cannot make changes to files because permission is denied.
Have you followed all the steps as suggested in: http://hadoop.apache.org/common/docs/current/single_node_setup.html ?
You may want to look at this : http://getsatisfaction.com/cloudera/topics/permission_denied_error_in_desktop
Also, some more information would definitely help.
You have not given the necessary permissions. Make a different user other than root, and follow this tutorial step by step: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
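If it helps, the user creation step from that tutorial looks roughly like this (the names are the tutorial's; pick your own if you prefer):
sudo addgroup hadoop
sudo adduser --ingroup hadoop hduser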
It seems the user is missing permissions on the directory containing the files.
Make sure that the user you are logged on as is the owner of the Hadoop directory by running
the ls -la command.
If you are not the owner, run chown -R user:group on the Hadoop directory and it will work fine.
You can also follow Michael Noll's tutorial:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
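For example, that tutorial creates an hduser account in the hadoop group and installs under /usr/local/hadoop; in that layout the ownership fix would look like (adjust the user, group and path to your setup):
sudo chown -R hduser:hadoop /usr/local/hadoop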
