Exception in thread "main" java.io.IOException: Permission denied in command line (Hadoop)

ramubuntu@ubuntu:~$ hadoop jar ./wordcount.jar com.hadoop.ram.wc.WordCountDriver /input /output
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:2024)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
I also tried changing the permissions of the jar and the folders.
I wrote the code in Eclipse under Java 6, but I have Java 8 installed on my Ubuntu machine; while creating the Java project I changed the JRE to 1.6.
The file owner is the current user only. I hope you understand my problem.

Is SELinux enabled? Try disabling it first:
setenforce 0
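If SELinux is not the culprit, the other usual suspect is the temp directory: RunJar unpacks the job jar into a temp directory before running it, and the createTempFile in the stack trace fails when that directory is not writable. A minimal check, assuming the default hadoop.tmp.dir of /tmp/hadoop-${user.name} (adjust the path if your core-site.xml overrides it):
getenforce                                    # prints Enforcing, Permissive, or Disabled
ls -ld /tmp /tmp/hadoop-$USER                 # both must be writable by the current user
sudo chown -R $USER:$USER /tmp/hadoop-$USER   # reclaim it if an earlier run as root owns it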

Related

Running a Hadoop-free Spark build on Windows 10

I currently have Hadoop 3.2.3 installed on Windows 10 and there is no problem running it. I also downloaded spark-3.3.0-bin-without-hadoop.tgz and set the environment variable SPARK_HOME appropriately.
I also checked hadoop classpath and it returned:
C:\hadoop-3.2.3\etc\hadoop;C:\hadoop-3.2.3\share\hadoop\common;C:\hadoop-3.2.3\share\hadoop\common\lib\*;C:\hadoop-3.2.3\share\hadoop\common\*;C:\hadoop-3.2.3\share\hadoop\hdfs;C:\hadoop-3.2.3\share\hadoop\hdfs\lib\*;C:\hadoop-3.2.3\share\hadoop\hdfs\*;C:\hadoop-3.2.3\share\hadoop\yarn;C:\hadoop-3.2.3\share\hadoop\yarn\lib\*;C:\hadoop-3.2.3\share\hadoop\yarn\*;C:\hadoop-3.2.3\share\hadoop\mapreduce\lib\*;C:\hadoop-3.2.3\share\hadoop\mapreduce\*
Seems fine.
I also added
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
in spark-env.sh.
But still, when I type the spark-shell or pyspark command on the command line, I get the famous error:
Error: A JNI error has occurred, please check your installation and try again. Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
...
The path seems to be right.
Any idea what I should do now?
Thanks for your attention!
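One thing worth checking, offered as an assumption rather than a confirmed fix: on Windows the spark-shell and pyspark launchers are .cmd scripts, and they read conf\spark-env.cmd rather than spark-env.sh, so the export line above may never be picked up. A sketch of the equivalent setting in batch syntax:
rem conf\spark-env.cmd -- loaded by Spark's Windows launch scripts
rem capture the output of `hadoop classpath` into SPARK_DIST_CLASSPATH
for /f "delims=" %%i in ('hadoop classpath') do set SPARK_DIST_CLASSPATH=%%i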

Error while trying to run jar in Hadoop

I am getting the following error while trying to run a jar through the hadoop command prompt:
Exception in thread "main" java.io.IOException: Error opening job jar: /tmp/NewJar.jar
at org.apache.hadoop.util.RunJar.main(RunJar.java:124)
Caused by: java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:127)
at java.util.jar.JarFile.<init>(JarFile.java:136)
at java.util.jar.JarFile.<init>(JarFile.java:73)
at org.apache.hadoop.util.RunJar.main(RunJar.java:122)
Most probable causes:
- Incorrect path of the jar.
- Improper permissions on the folder where Hadoop is trying to run the jar file.
Please make sure you have specified the correct path and you have proper directory permissions.
For me, this error was caused by a permissions issue.
My jar file had the permissions rw-r--r-- by default. I changed them to rwxrwxrwx with the command chmod 777 my_jar.jar, and the error went away.
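Before reaching for chmod 777, note that a ZipException on open usually points at the jar itself rather than at its mode bits, so it is worth checking both. A quick sanity check, assuming the jar sits at /tmp/NewJar.jar as in the trace:
ls -l /tmp/NewJar.jar       # confirm the path exists and the file is readable
unzip -t /tmp/NewJar.jar    # "error in opening zip file" often means a truncated or corrupt jar
jar tf /tmp/NewJar.jar      # alternatively, list its entries with the jar tool
chmod +r /tmp/NewJar.jar    # grants read access without making the jar world-writable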

Steps to install Hive

I have Hadoop configured on my Red Hat system. I am getting the following error when $HIVE_HOME/bin/hive is executed:
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1792)
at org.apache.hadoop.util.RunJar.main(RunJar.java:115)
Hive uses a 'metastore'; it creates this directory when you invoke Hive for the first time. The metastore directory is usually created in the current working directory (i.e. wherever you are running the hive command from).
Which directory are you invoking the hive command from? Do you have write permissions there?
Try this:
cd    <--- this will take you to your home dir (you will have write permissions there)
hive
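To verify the write-permission theory before moving directories, here is a minimal check, assuming the default embedded Derby metastore (which creates a metastore_db folder in the current working directory on first run):
pwd                                  # where is hive being launched from?
ls -ld .                             # check the directory's owner and mode
touch .perm_test && rm .perm_test    # can the current user write here?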

Impala on Cloudera CDH "Could not create logging file: Permission denied"

I installed Impala via a parcel in Cloudera Manager 4.5 on a CDH 4.2.0-1.cdh4.2.0.p0.10 cluster.
When I try to start the service, it fails on all nodes with this message:
perl -pi -e 's#{{CMF_CONF_DIR}}#/run/cloudera-scm-agent/process/800-impala-IMPALAD#g' /run/cloudera-scm-agent/process/800-impala-IMPALAD/impala-conf/impalad_flags
'[' impalad = impalad ']'
exec /opt/cloudera/parcels/IMPALA-0.6-1.p0.109/lib/impala/../../bin/impalad --flagfile=/run/cloudera-scm-agent/process/800-impala-IMPALAD/impala-conf/impalad_flags
Could not create logging file: Permission denied
COULD NOT CREATE A LOGGINGFILE 20130326-204959.15015!
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /var/log/impalad/impalad.INFO (Permission denied)
at java.io.FileOutputStream.openAppend(Native Method)
...
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:685)
at org.apache.hadoop.fs.FileSystem.<clinit>(FileSystem.java:92)
+ date
Complete StdErr Log
I'm unsure whether the permission issue is the cause of Impala not running, or whether something else crashes and the permission issue only comes up because the crash log cannot be written.
Any help would be great!
Run Impala from the debug binaries as described here:
https://issues.cloudera.org/browse/IMPALA-160
Seems to be related to the JVM on Ubuntu 12.04.1 LTS.
Original answer: https://groups.google.com/a/cloudera.org/forum/?fromgroups=#!topic/impala-user/4MRZYbn5hI0
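Before going to debug binaries, it may be worth ruling out the straightforward reading of the trace: the daemon simply cannot write to /var/log/impalad. A hedged sketch, assuming the daemon runs as the impala user (the usual service account on CDH; adjust if your cluster differs):
ls -ld /var/log/impalad                         # who owns the log directory?
sudo mkdir -p /var/log/impalad                  # create it if the parcel did not
sudo chown -R impala:impala /var/log/impalad    # hand it to the daemon's user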

Apache Pig 0.10 with CDH3u0 Hadoop failed to work as normal user

I have used Pig before, but I am new to Hadoop/Pig installation.
I have CDH3u0 Hadoop installed, running Pig 0.8.
I downloaded Pig 0.10 and installed it in a separate directory.
I am able to start Pig as the root user, but it fails to start as a normal user with the following error:
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1792)
at org.apache.hadoop.util.RunJar.main(RunJar.java:146)
Any pointer to the problem would be greatly appreciated.
Also, the log file defaults to the Pig installation directory; is there a way to default the log directory to the user's home directory without using the -l option?
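On the logging question, one possible approach, assuming Pig reads conf/pig.properties from its installation directory and honors a pig.logfile property there (check the property name against the pig.properties template that ships with your build):
# write one user's expanded home path into the shared properties file
mkdir -p $HOME/pig_logs
echo "pig.logfile=$HOME/pig_logs/" >> $PIG_HOME/conf/pig.properties
Since pig.properties is shared by all users, a per-user alternative is passing the same property through PIG_OPTS (e.g. export PIG_OPTS=-Dpig.logfile=$HOME/pig_logs/), if your Pig build accepts system properties.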
