Running Pig scripts gives me the error: You are a Hue admin but not a HDFS superuser (which is "hdfs") - hadoop

I'm using the Cloudera QuickStart VM 4.7.
I'm unable to run Pig scripts as it's throwing the following error message:
Cannot access: /pigwordcount/wordcountinput.txt. Note: You are a Hue admin but not a HDFS superuser (which is "hdfs").
[Errno 2] File /pigwordcount/wordcountinput.txt not found
It says "file not found", but the file is already present in the directory /user/cloudera/pigwordcount.
How can I resolve this issue?
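As a quick sanity check, compare the path the script uses with where the file actually lives (the paths below are taken from the question; the LOAD statement is only a sketch):
hdfs dfs -ls /                                     # there is probably no /pigwordcount at the HDFS root, hence the superuser note
hdfs dfs -ls /user/cloudera/pigwordcount           # the uploaded file should show up here
# then reference the full path in the script, e.g.
# lines = LOAD '/user/cloudera/pigwordcount/wordcountinput.txt' AS (line:chararray);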

Related

Getting a write permission error from HDFS after updating flink-1.4.0 to flink-1.4.2

Environment
Flink-1.4.2
Hadoop 2.6.0-cdh5.13.0 with 4 nodes in service; security is off.
Ubuntu 16.04.3 LTS
Java 8
Description
I have a Java job in flink-1.4.0 that writes to HDFS at a specific path.
After updating to flink-1.4.2, I'm getting the following error from Hadoop, complaining that the user doesn't have write permission on the given path:
WARN org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:xng (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=user1, access=WRITE, inode="/user":hdfs:hadoop:drwxr-xr-x
NOTE:
If I run the same job on flink-1.4.0, the error disappears, regardless of which version of the Flink dependencies (1.4.0 or 1.4.2) the job uses.
Also, if I run the job's main method from my IDE and pass the same parameters, I don't get the above error.
Question
Any idea what's wrong, or how to fix it?
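One way to narrow this down (only a sketch; the user name comes from the error message, and the fix assumes the job should write under its own HDFS home directory):
hdfs dfs -ls /                                     # /user is hdfs:hadoop drwxr-xr-x, so other users cannot create files directly in it
sudo -u hdfs hdfs dfs -mkdir -p /user/user1        # create a home directory for the job's user as the hdfs superuser
sudo -u hdfs hdfs dfs -chown user1:user1 /user/user1
Then point the job's output path under /user/user1 (or another directory the user owns) rather than directly under /user.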

permission denied error on hdfs while using put command

While trying to use the put command to add patternsToSkip file to hdfs, I get an error saying: Permission denied: user=root, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x
In the image below, you can see the sequence of commands written along with the error:
I tried accessing it as biadmin, root, and even hdfs, but with no luck (details in the image).
Please help me fix this error. Thanks, folks.
The reason it is giving a permission issue is that you are trying to put the file directly into the /user directory in HDFS, because you used two dots (..) as the target of the put statement. You would need to be the HDFS superuser (or a member of the supergroup) to copy a file into that particular directory.
What I would suggest is trying one of the commands below to copy the file to HDFS.
Target with a single dot (your HDFS home directory):
hadoop fs -put patternsToSkip .
Or give the complete target directory path:
hadoop fs -put patternsToSkip /user/<instance_name>/output
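If the target directory does not exist yet, it may also help to create it first and verify the copy afterwards (<instance_name> and the output path are placeholders, as above):
hdfs dfs -mkdir -p /user/<instance_name>/output    # create the target directory if it is missing
hadoop fs -put patternsToSkip /user/<instance_name>/output
hdfs dfs -ls /user/<instance_name>/output          # confirm the file landed where expected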

Authentication failed, status: 503 error on Hortonworks HDP 2.4

I am getting the following error (through the command line as well as the web interface).
Useful info:
1. Hive, HDFS, and YARN services are up and running.
2. I can even get into the Hive prompt through the command line and the web interface. The error occurs when I use show databases (or click the refresh symbol on the database explorer of the web interface).
3. I logged in as the root user and as the hdfs user.
4. I tried changing permissions to 755 for the directory /user/root
Any help would be greatly appreciated.
------------------ start of error message (copied from the web-interface log)
Unable to submit statement. Error while processing statement: FAILED: Hive Internal Error: com.sun.jersey.api.client.ClientHandlerException(java.io.IOException: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 503, message: Service Unavailable) [ERROR_STATUS].
Step 1) Restart Atlas on the Sandbox.
Step 2) Restart the Hive services on the Sandbox.
For me, this resolved the issue.
Cheers
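The restarts can be done from the Ambari UI; if you prefer the command line, a rough equivalent via the Ambari REST API looks like this (the host, admin/admin credentials, and the cluster name "Sandbox" are assumptions for a default HDP sandbox):
# stop Atlas by setting the desired service state, then start it again
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop Atlas"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://localhost:8080/api/v1/clusters/Sandbox/services/ATLAS
curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start Atlas"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  http://localhost:8080/api/v1/clusters/Sandbox/services/ATLAS
# repeat the same two calls against .../services/HIVE to restart Hive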

hive hadoop permissions not correct

I installed Apache Kylin, which requires Hadoop, Hive, HBase, and Java to work. All of them are installed correctly. Now, when I try to run this example, I get an error after the first command, i.e. ${KYLIN_HOME}/bin/sample.sh,
and below is the error I am getting:
Loading data to table default.kylin_sales
Failed with exception Unable to move source file:/usr/lib/kylin/sample_cube/data/DEFAULT.KYLIN_SALES.csv to destination hdfs://localhost:54310/user/hive/warehouse/kylin_sales/DEFAULT.KYLIN_SALES.csv
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
I have set 777 permissions on both of the above paths, and I am operating as root.
Check the HDFS directory permissions. If the Hive warehouse directory is not group-writable like below, change the permissions:
hdfs dfs -chmod g+w /user/hive/warehouse
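For completeness, a quick way to inspect and fix it (run the chmod as the hdfs superuser if the directory is owned by hive or hdfs; the names assume a default Hive setup):
hdfs dfs -ls /user/hive                            # check owner, group and mode of the warehouse directory
sudo -u hdfs hdfs dfs -chmod g+w /user/hive/warehouse
# if the user running the sample is not in the warehouse directory's group, a chown or group change may also be needed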

Permission denied error for logged in user for Apache Pig

I am getting the following error when I try to run pig -help.
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1717)
at java.io.File.createTempFile0(File.java:1738)
at java.io.File.createTempFile(File.java:1815)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
Here is my configuration:
Apache Hadoop - 1.0.3
Apache Pig - 0.10.0
OS - Ubuntu 64-bit
User for whom the error is seen: "sumod". This is an admin-level account, and I have also created a directory for him in HDFS.
User for whom this error is NOT seen: "hadoop". I created this user for Hadoop jobs. He is not an admin user, but he belongs to the "supergroup" group on HDFS.
The paths are properly set for both users.
I do not have to start Hadoop to run the "pig -help" command; I only want to make sure that Pig is installed properly.
I am following the Apache docs, and my understanding is that I do not have to be the hadoop user to run Pig; I can be a general system user.
Why am I getting these errors? What am I doing wrong?
I have seen the same exception. The reason in my case was that the user I was running Pig as did not have write permission on ${hadoop.tmp.dir}.
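A rough way to check this (the property lookup and the default path are assumptions for a Hadoop 1.x layout; use the value from your own core-site.xml):
grep -A 1 'hadoop.tmp.dir' $HADOOP_HOME/conf/core-site.xml   # find where hadoop.tmp.dir points
ls -ld /tmp/hadoop-$USER                                     # default location if the property is not set
# make sure the user running pig can write there, e.g.
# sudo chown -R sumod /tmp/hadoop-sumod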
Please check the permissions of the directory from which you run the Pig script.
Whenever a Pig script is executed, errors are logged to a log file, which is written to your present working directory.
Say your Pig script is in dir1 and your pwd is dir2; since you are executing as user sumod, sumod should have write permission on dir2.
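A simple check before running the script (dir2 here is just the present working directory from the example above):
ls -ld .                                           # check owner and mode of the current directory
touch pig_write_test && rm pig_write_test          # succeeds only if the current user can write here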
