impala-shell query failing with Error(13) - shell

I'm running impala-shell on a 3-node cluster. Some queries work just fine, but a few return the following error:
Create file /tmp/impala-scratch/924abcb4827fd7ba:d15cd3585951f4b2_c8e0146a-37cd-457a-96f6-ac5d933cd4da failed with errno=13 description=Error(13): Permission denied
I have checked my local directory: /tmp/impala-scratch does exist and is readable, writable, and executable by me. Any tips would be greatly appreciated!

Okay, so I figured it out. It turns out that the old /tmp/impala-scratch had these access permissions:
drwxr-xr-x
According to:
Hiveserver2: Failed to create/change scratchdir permissions to 777: Could not create FileClient
You have to change the permissions to 777:
chmod -R 777 /tmp/impala-scratch/
And this fixed it.
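For anyone hitting the same thing, here is a minimal check-and-fix sequence, assuming the default /tmp/impala-scratch location (adjust the path if your scratch directory is configured elsewhere):
# Inspect the current permissions on the scratch directory
ls -ld /tmp/impala-scratch
# drwxr-xr-x means the impala service user cannot write here.
# Open it up recursively:
sudo chmod -R 777 /tmp/impala-scratch/
# Verify the change took effect
ls -ld /tmp/impala-scratch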

Related

Unable to write to HDFS as non sudo user

I've changed the permissions of an HDFS directory via
hdfs dfs -chmod 777 /path/to/dir
but when writing to that directory as a non-sudo user, I get a permission error:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=crtemois, access=WRITE, inode="/aggregation/system/data/clean":owners:hdfs:drwxr-xr-x
The reason is that Apache Ranger was layered on top. Even though the permissions were changed via chmod 777, writing is not possible unless the user is also granted permission in Apache Ranger.
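As a rough way to see the mismatch, you can compare the POSIX bits chmod set against what HDFS actually reports; the path below is taken from the error message above:
# The POSIX permission bits chmod set on the directory
hdfs dfs -ls -d /aggregation/system/data/clean
# Any extended ACLs layered on top of the POSIX bits
hdfs dfs -getfacl /aggregation/system/data/clean
# If both look permissive and writes still fail, check the Ranger HDFS
# policies for this path in the Ranger admin UI; a Ranger policy takes
# precedence over the filesystem permissions.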

Hadoop returns permission denied

I am trying to install Hadoop (2.7) on a cluster of two machines, hmaster and hslave1. I installed Hadoop in the folder /opt/hadoop/.
I followed this tutorial, but when I run the command start-dfs.sh, I get the following error:
hmaster: starting namenode, logging to /opt/hadoop/logs/hadoop-hadoop-namenode-hmaster.out
hmaster: starting datanode, logging to /opt/hadoop/logs/hadoop-hadoop-datanode-hmaster.out
hslave1: mkdir: cannot create directory '/opt/hadoop\r': Permission denied
hslave1: chown: cannot access '/opt/hadoop\r/logs': No such file or directory
/logs/hadoop-hadoop-datanode-localhost.localdomain.out
I used the command chmod 777 on the hadoop folder on hslave1, but I still get this error.
Instead of using /opt/, use /usr/local/. If you get that permission issue again, grant the permissions using chmod; I have already configured Hadoop 2.7 on 5 machines this way. Alternatively, use sudo chown user:user on your log files directory.
It seems you have already given the master passwordless SSH access to the slave.
Make sure you are logged in with a username that exists on both servers
(hadoop in your case, as the tutorial you are following uses the 'hadoop' user).
You can edit the '/etc/sudoers' file using 'sudo', or directly type 'visudo' in the terminal, and add the following entry for the newly created 'hadoop' user:
hadoop ALL = NOPASSWD: ALL
This might resolve your issue.
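Putting that advice together, a minimal sketch of the fix on hslave1, assuming the 'hadoop' user and the /opt/hadoop install path from the question:
# On hslave1, give the hadoop user ownership of the install directory
sudo chown -R hadoop:hadoop /opt/hadoop
# Grant the hadoop user passwordless sudo; visudo opens /etc/sudoers safely
sudo visudo
# then add the line:
#   hadoop ALL = NOPASSWD: ALL
# and re-run start-dfs.sh from hmaster.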

Cloudera Installation issue (scm_prepare_node.sh: Permission denied)

When I try to install Cloudera Hadoop, I get the error below during the file-copying stage:
/tmp/scm_prepare_node.BggVxw3l
bash: /tmp/scm_prepare_node.BggVxw3l/scm_prepare_node.sh: Permission denied
Can anyone help fix this issue?
P.S.: /tmp has 777 permissions: drwxrwxrwt. 41 root root 4096 May 9 14:59 tmp
My /tmp was mounted with the noexec option, so scripts in it could not be executed no matter what their permission bits were. I removed noexec in /etc/fstab and restarted the machines. Now everything is working fine.
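For reference, you can confirm whether /tmp is mounted noexec and fix it without a reboot; the device and options below are just an example of what a typical /etc/fstab line might look like:
# Check the current mount options for /tmp
findmnt /tmp
# Remount with exec enabled immediately, without rebooting
sudo mount -o remount,exec /tmp
# To make it permanent, drop "noexec" from the /tmp line in /etc/fstab, e.g.
#   /dev/mapper/vg0-tmp  /tmp  ext4  defaults,nosuid,nodev  0 2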

Problems in creating volume in mapr hadoop

I want to create a volume (MyVolume), but it cannot mount the volume. It gives this error: Failed to mount volume MyVolume, Permission denied. Its permissions are root and admin. How can I create a volume that mounts to a folder in my cluster? Thanks.
It sounds like the user you are logging in as doesn't have the right permissions. This link will help you set up the permissions correctly:
http://www.mapr.com/doc/display/MapR/Managing+Permissions
When starting a new cluster, I typically create a 'mapr' user and give it admin permissions with this command:
maprcli acl edit -type cluster -user mapr:fc
Hope this helps!
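Once the user has the cluster ACLs, creating and mounting the volume is a one-liner; here is a sketch with an illustrative mount path (see the maprcli volume create documentation for the full option list):
# Create the volume and mount it at a path in the cluster namespace
maprcli volume create -name MyVolume -path /MyVolume
# Confirm the volume exists and is mounted
maprcli volume info -name MyVolume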

Hadoop Single Node : Permission Denied

I just installed Hadoop as a single node, but when I run it by logging in on localhost, it gives an error that it cannot make changes to files because permission is denied.
Have you followed all the steps suggested in http://hadoop.apache.org/common/docs/current/single_node_setup.html?
You may want to look at this: http://getsatisfaction.com/cloudera/topics/permission_denied_error_in_desktop
Also, some more information would definitely help.
You have not given the necessary permissions. Create a user other than root, and follow this tutorial exactly: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
It seems the user is missing permissions on the directory containing the files.
Make sure that the user you are logged in as is the owner of the Hadoop directory by running the
ls -la
command. If you are not the owner, run
chown -R user:group <hadoop directory>
and it will work fine.
You can also follow Michael Noll's tutorial:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
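As a concrete sketch of that check, using the hduser/hadoop user and group names and the install path from the Michael Noll tutorial (substitute your own):
# Who owns the Hadoop directory?
ls -ld /usr/local/hadoop
# If it isn't the user you run Hadoop as, take ownership recursively
sudo chown -R hduser:hadoop /usr/local/hadoop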
