Apache Pig 0.10 with CDH3u0 Hadoop fails to work as a normal user

I have used Pig before, but I am new to Hadoop/Pig installation.
I have CDH3u0 Hadoop installed, running Pig 0.8.
I downloaded Pig 0.10 and installed it in a separate directory.
I can start Pig as the root user, but starting it as a normal user fails with the following error:
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1792)
at org.apache.hadoop.util.RunJar.main(RunJar.java:146)
Any pointer to the problem would be greatly appreciated.
Also, the log file defaults to the Pig installation directory. Is there a way to default the log directory to the user's home directory without using the -l option?
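On the log-file question, one thing worth trying (a sketch, assuming the stock Pig 0.10 conf layout; the path below is only an example): Pig reads a pig.logfile property from its properties file, which should change the default without passing -l on every run.
# in $PIG_HOME/conf/pig.properties
pig.logfile=/home/youruser/pig_logs/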

Related

Spark 2.0.1 not finding file passed in through archives flag

I was running a Spark job that makes use of other files, passed in through the --archives flag of Spark:
spark-submit .... --archives hdfs:///user/{USER}/{some_folder}.zip .... {file_to_run}.py
Spark is running on YARN, and when I tried this with Spark 1.5.1 it worked fine.
However, when I ran the same commands with Spark 2.0.1, I got
ERROR yarn.ApplicationMaster: User class threw exception: java.io.IOException: Cannot run program "/home/{USER}/{some_folder}/.....": error=2, No such file or directory
Since the resource is managed by YARN, it is challenging to manually check whether the file gets successfully decompressed and exists when the job runs.
I wonder if anyone has experienced a similar issue.
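One thing worth checking (a sketch based on how YARN localizes archives; the alias name myfolder is an assumption): files passed via --archives are unpacked into each container's working directory, not under /home/{USER}, so an absolute home-directory path only resolves on the submitting machine. You can give the unpacked directory a predictable name with a # fragment and reference it relatively:
spark-submit .... --archives hdfs:///user/{USER}/{some_folder}.zip#myfolder .... {file_to_run}.py
# inside the job, open ./myfolder/... instead of /home/{USER}/{some_folder}/...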

Exception in thread "main" java.io.IOException: Permission denied on the command line (Hadoop)

ramubuntu@ubuntu:~$ hadoop jar ./wordcount.jar com.hadoop.ram.wc.WordCountDriver /input /output
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2024)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
I also tried changing the permissions of the jar and the folders.
I wrote the code using Eclipse, targeting Java 6, but I have Java 8 installed on my Ubuntu machine; while creating the Java project I changed the JRE to 1.6.
The file owner is the current user only. I hope you understand my problem.
Is SELinux enabled? Try disabling it first:
setenforce 0
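If you want to confirm SELinux is the culprit before disabling it (getenforce is the standard query command; the config path is the usual Red Hat location, and note that Ubuntu installs often use AppArmor instead):
getenforce        # prints Enforcing, Permissive, or Disabled
sudo setenforce 0 # Permissive until the next reboot; set SELINUX=permissive in /etc/selinux/config to persist
Also worth a quick look: RunJar creates its temp files under the JVM temp directory (usually /tmp), so check that it is world-writable with the sticky bit:
ls -ld /tmp       # should show drwxrwxrwt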

Executing Mahout against Hadoop cluster

I have a jar file which contains the mahout jars as well as other code I wrote.
It works fine on my local machine.
I would like to run it in a cluster that has Hadoop already installed.
When I do
$HADOOP_HOME/bin/hadoop jar myjar.jar args
I get the error
Exception in thread "main" java.io.IOException: Mkdirs failed to create /some/hdfs/path (exists=false, cwd=file:local/folder/where/myjar/is)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
...
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
I checked that I can access and create the directory in HDFS.
I have also run Hadoop code (without Mahout) without a problem.
I am running this on a Linux machine.
Check that the Mahout user and the Hadoop user are the same, and also check Mahout and Hadoop version compatibility.
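Given the cwd=file:... in the trace, the job appears to be using the local filesystem rather than HDFS, so it may also be worth confirming which filesystem the job jar resolves (a sketch; the namenode address is a placeholder, and the -D override only applies if your driver uses ToolRunner):
hadoop fs -ls /some/hdfs/path   # confirms your shell user can reach HDFS at all
$HADOOP_HOME/bin/hadoop jar myjar.jar -Dfs.default.name=hdfs://namenode:8020 args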

Steps to install Hive

I have Hadoop configured on my Red Hat system. I am getting the following error when $HIVE_HOME/bin/hive is executed:
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1792)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
Hive uses a 'metastore'; it creates this directory when you invoke Hive for the first time. The metastore directory is usually created in the current working directory (i.e. where you are running the hive command).
Which directory are you invoking the hive command from? Do you have write permission there?
Try this:
cd    # with no arguments, cd takes you to your home dir (you will have write permission there)
hive
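As a quick sanity check (assuming the default embedded Derby metastore), a successful first run should leave Derby's artifacts in the directory you started from:
cd && hive                      # start Hive from a directory you can write to
ls -d metastore_db derby.log    # both are created by embedded Derby on first use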

Permission denied error for logged in user for Apache Pig

I am getting the following error when I try to run pig -help.
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1717)
at java.io.File.createTempFile0(File.java:1738)
at java.io.File.createTempFile(File.java:1815)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
Here is my configuration:
Apache Hadoop - 1.0.3
Apache Pig - 0.10.0
OS - Ubuntu 64-bit
User for whom the error is seen - "sumod"; this is an admin-level account. I have also created a directory for him in HDFS.
User for whom this error is NOT seen - "hadoop". I created this user for Hadoop jobs. He is not an admin user, but he belongs to "supergroup" on HDFS.
The paths are properly set for both users.
I do not have to start Hadoop to run the "pig -help" command; I only want to make sure that Pig is installed properly.
I am following the Apache docs, and my understanding is that I do not have to be the hadoop user to run Pig; I can be a general system user.
Why am I getting these errors? What am I doing wrong?
I have seen the same exception. The reason for me was that the user I was running Pig as did not have write permission on ${hadoop.tmp.dir}.
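To verify this (a sketch; hadoop.tmp.dir defaults to /tmp/hadoop-${user.name} unless core-site.xml overrides it, and the chown target below is a placeholder):
grep -A1 hadoop.tmp.dir $HADOOP_HOME/conf/core-site.xml   # see if the default is overridden
ls -ld /tmp/hadoop-sumod                                   # or whatever the configured value is
sudo chown -R sumod /tmp/hadoop-sumod                      # hypothetical fix if another user owns it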
Please check the permissions of the directory where the Pig script is placed.
Whenever a Pig script is executed, errors are logged to a log file, which is written to your present working directory.
Assume your Pig script is in dir1 and your pwd is dir2; since you are executing as user sumod, sumod must have write permission in dir2.
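For example (paths here are illustrative), either run from a directory you own or point the client log elsewhere:
cd ~ && pig -help                 # the log lands in your home directory
pig -l /tmp/pig.log myscript.pig  # or redirect the client-side log explicitly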
