Permission Denied error while creating database in hive - hadoop

I am trying to create a database in Hive, but when I run the query below in Hive:
CREATE DATABASE BIGDATA;
I receive the following error message:
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException org.apache.hadoop.security.AccessControlException: Permission denied: user=aseema, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
What is causing this?

This is because the user aseema lacks write permission in HDFS. Follow the steps below.
Log in as hduser and perform the following operations (from the logs, it appears hduser is a superuser):
hadoop fs -mkdir -p /user/hive/warehouse
hadoop fs -mkdir /tmp
hadoop fs -chmod -R 777 /user/hive
hadoop fs -chmod 777 /tmp
After this, try executing the CREATE DATABASE statement as the aseema user.
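To verify the change took effect before retrying, a quick check (paths as above):
hadoop fs -ls -d /user/hive/warehouse /tmp
# both entries should now show mode drwxrwxrwx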

If you are running in local mode, you should run this command as the hdfs user:
su hdfs
Then change the owner as below if you want:
hdfs dfs -chown -R <username_of_new_owner> /user
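For example, to hand the whole /user tree to the user from the first question (aseema here is just an illustration):
su hdfs
hdfs dfs -chown -R aseema:aseema /user
exit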

Related

Unable to write to HDFS as a non-sudo user

I've changed the permissions of an HDFS directory via
hdfs dfs -chmod 777 /path/to/dir
but when writing to that directory as a non-sudo user, I get a permission error:
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=crtemois, access=WRITE, inode="/aggregation/system/data/clean":owners:hdfs:drwxr-xr-x
The reason was that Apache Ranger was layered on top. Even though the POSIX permissions were changed via chmod 777, writing was still impossible unless the user was also granted access in an Apache Ranger policy.
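A quick way to tell the two layers apart (a sketch; the Ranger host, port, credentials, and service name are assumptions for illustration):
hdfs dfs -ls -d /aggregation/system/data/clean   # POSIX bits should read drwxrwxrwx after the chmod
# if the bits are correct but writes still fail, check Ranger's audit page, or list
# the HDFS policies through Ranger's public REST API:
curl -u admin:admin "http://ranger-host:6080/service/public/v2/api/policy?serviceName=<hdfs_service_name>"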

hive hadoop permissions not correct

I installed Apache Kylin, which requires Hadoop, Hive, HBase, and Java to work. All of them are installed correctly. Now, when I try to run the sample, I get an error after the first command, i.e. ${KYLIN_HOME}/bin/sample.sh.
Below is the error I am getting:
Loading data to table default.kylin_sales
Failed with exception Unable to move source file:/usr/lib/kylin/sample_cube/data/DEFAULT.KYLIN_SALES.csv to destination hdfs://localhost:54310/user/hive/warehouse/kylin_sales/DEFAULT.KYLIN_SALES.csv
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
I have set 777 permissions on both of the above paths, and I am operating as root.
Check the HDFS directory permissions. If the warehouse directory is not group-writable, change it like below:
hdfs dfs -chmod g+w /user/hive/warehouse
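It can also help to confirm who owns the warehouse directory, since Hive's MoveTask writes as the user running the query. A quick check (the exact owner and group will vary by install):
hdfs dfs -ls -d /user/hive/warehouse
# expect something like: drwxrwxr-x - hive hive ... /user/hive/warehouse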

get: permission denied in hadoop

When I execute the get command, it says permission denied. I tried the solution already given, but it didn't work. Following is the command and its output:
hduser@ubuntu:~$ hadoop fs -get /user/hduser/Input/pg*.txt /home/vilas/Desktop/
Warning: $HADOOP_HOME is deprecated.
get: Permission denied
Check the permissions of the /user/hduser directory; maybe hduser does not have permission to access it. If so, you can execute the following command (as the hdfs user):
hdfs dfs -chown hduser:hduser /user/hduser
More information about chown is in the Hadoop FileSystem shell documentation.
Then try again.
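To confirm whether ownership was the problem, listing the parent directory shows the owner and mode (a quick check):
hdfs dfs -ls /user
# after the chown, the /user/hduser entry should show hduser hduser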
You must go into the Desktop directory, open a terminal there, and run the command
hadoop fs -get /user/hduser/Input/pg*.txt .
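If it still fails, note that a bare "get: Permission denied" (with no AccessControlException) usually points at the local filesystem: hduser may not be allowed to write into another user's Desktop. A hedged check, using the paths from the question:
ls -ld /home/vilas/Desktop
# if hduser has no write access here, run the get as vilas instead,
# or loosen the directory, e.g.: sudo chmod o+w /home/vilas/Desktop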

Can't create directory on hadoop file system

I installed Hadoop 2.7.1 as root in /usr/local.
Now I want to give access to multiple users.
When I executed the following command
hdfs dfs -mkdir /user
as the hadoop user, I got the error
mkdir: Permission denied: user=hadoop, access=WRITE, inode="/user":root:supergroup:drwxr-xr-x
How can I resolve this problem? Please help me with this.
Thanks,
suchetan
The hdfs user is the admin user for HDFS. Switch to the hdfs user and give the necessary permissions to the user you want (hadoop),
or
you can disable dfs.permissions.enabled in hdfs-site.xml and restart. After that you can create the folder.
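A minimal sketch of the first option, assuming the superuser account is literally named hdfs and sudo is available:
# create a home directory as the HDFS superuser, then hand it to hadoop
sudo -u hdfs hdfs dfs -mkdir -p /user/hadoop
sudo -u hdfs hdfs dfs -chown hadoop:hadoop /user/hadoop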

Permission denied at hdfs

I am new to the Hadoop distributed file system. I have done a complete single-node installation of Hadoop on my machine, but after that, when I try to upload data to HDFS, it gives the error message Permission Denied.
Message from the terminal, with the command:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)
hduser@ubuntu:/usr/local/hadoop$
After using sudo and adding hduser to sudoers:
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
hduser@ubuntu:/usr/local/hadoop$
I solved this problem temporarily by disabling DFS permissions, adding the property below
to conf/hdfs-site.xml:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
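For the change to take effect, HDFS needs a restart. With the standard scripts shipped in the Hadoop distribution (assuming they are on your PATH):
# restart HDFS so the NameNode re-reads hdfs-site.xml
stop-dfs.sh
start-dfs.sh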
I had a similar situation, and here is my approach, which is somewhat different:
HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /
What you actually do is read the local file according to your local permissions, but when placing the file on HDFS you are authenticated as the user hdfs. You can do this with another user ID (beware of real authentication scheme configurations, but this is usually not the case).
Advantages:
Permissions are kept on HDFS.
You don't need sudo.
You don't actually need an appropriate local user 'hdfs' at all.
You don't need to copy anything or change permissions, because of the previous points.
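The same trick can also set up a proper HDFS home first, so later writes work without any impersonation. A sketch; the username aseema is borrowed from the first question purely as an illustration:
HADOOP_USER_NAME=hdfs hdfs dfs -mkdir -p /user/aseema
HADOOP_USER_NAME=hdfs hdfs dfs -chown aseema:aseema /user/aseema
# from now on, aseema can write into /user/aseema without impersonating hdfs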
You are experiencing two separate problems here:
hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input put: /usr/local/input-data (Permission denied)
Here, the user hduser does not have access to the local directory /usr/local/input-data. That is, your local permissions are too restrictive. You should change them.
hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x
Here, the user root (since you are using sudo) does not have access to the HDFS directory /input. As you can see: hduser:supergroup:rwxr-xr-x says only hduser has write access. Hadoop doesn't really respect root as a special user.
To fix this, I suggest you change the permissions on the local data:
sudo chmod -R og+rx /usr/local/input-data/
Then, try the put command again as hduser.
I solved this problem using the following steps:
su hdfs
hadoop fs -put /usr/local/input-data/ /input
exit
Start a shell as hduser (from root) and run your command
sudo -u hduser bash
hadoop fs -put /usr/local/input-data/ /input
[update]
Also note that the hdfs user is the super user and has all r/w privileges.
For Hadoop 3.x, if you try to create a file on HDFS while unauthenticated (e.g. as user=dr.who), you will get this error.
This is not recommended for systems that need to be secure; however, if you'd like to disable file permissions entirely in Hadoop 3, the hdfs-site.xml setting has changed to:
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
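As with the older property, the NameNode must be restarted after editing hdfs-site.xml. On Hadoop 3 the per-daemon form looks like this (a sketch, assuming a single-node setup):
hdfs --daemon stop namenode
hdfs --daemon start namenode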
