Hive CLI startup throws error "Unrecognized Hadoop major version number: 1.0.4"

I am facing the below issue when starting Hive/Beeline:
Logging initialized using configuration in jar:file:/home/mine/work/apache-hive-2.3.6-bin/lib/hive-common-2.3.6.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: Unrecognized Hadoop major version number: 1.0.4
at org.apache.hadoop.hive.shims.ShimLoader.getHadoopShims(ShimLoader.java:91)
I followed the below URL to set up Hive:
https://www.bogotobogo.com/Hadoop/BigData_hadoop_Hive_Install_On_Ubuntu_16_04.php
Previously, I had Hadoop 1.2.1; now I have installed 2.7.3.
My .bashrc contains:
mine@ubuntu:~$ echo $HADOOP_HOME
/home/mine/work/hadoop-2.7.3
mine@ubuntu:~$ echo $HIVE_HOME
/home/mine/work/apache-hive-2.3.6-bin
hive-env.sh contains:
export HADOOP_HOME=/home/mine/work/hadoop-2.7.3
Derby server started.
I do not understand where Hadoop 1.0.4 comes from. Is there a compatibility issue?
Please help me with your valuable suggestions.
Thanks in advance.

Try: export HADOOP_VERSION="2.7.3"
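If that alone does not help, it is worth checking which Hadoop the Hive launcher actually resolves, since the shim error means it is still picking up a 1.x version from somewhere. A minimal check, assuming a standard bash shell and the paths from the question:

# confirm the hadoop binary on PATH reports 2.7.3, not 1.x
hadoop version | head -1

# check that no stale 1.x settings are still exported
echo $HADOOP_HOME
echo $HADOOP_VERSION

# override them for the current shell before starting Hive again
export HADOOP_HOME=/home/mine/work/hadoop-2.7.3
export HADOOP_VERSION="2.7.3"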

Related

Spark-shell error on windows "Illegal character in path at index 32"

I am trying to set up Spark on my new Windows laptop. I am getting the below error while running spark-shell:
ERROR Main: Failed to initialize Spark session.
java.lang.reflect.InvocationTargetException
Caused by: java.net.URISyntaxException: Illegal character in path at index 32: spark://DESKTOP-RCMDGS4:49985/C:\classes
I am using the below software:
Spark 3.2.1
Java 11
Hadoop: winutils
I have set below environment variables :
HADOOP_HOME, SPARK_HOME, JAVA_HOME, PATH
This is a known issue in the latest Spark version. Downgrading to 3.0.3 could fix the issue.
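One way to confirm the downgrade took effect is to check which release the launcher resolves before retrying spark-shell. A minimal sketch for a Unix-style shell (adjust to set/%VAR% syntax in cmd.exe; the install path is illustrative):

# print the Spark version the launcher resolves
spark-submit --version

# point SPARK_HOME at the downgraded 3.0.3 build and retry
export SPARK_HOME=/opt/spark-3.0.3-bin-hadoop2.7
export PATH="$SPARK_HOME/bin:$PATH"
spark-shell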

Unable to start hadoop problem with namenode

Once I install Hadoop and type hdfs namenode -format or hadoop namenode -format in cmd for the first time,
I get the below error; can anyone help me solve this?
First it asks me this:
Re-format filesystem in Storage Directory root= C:\hadoop-3.2.1\data\namenode; location= null ? (Y or N)
No matter what I answer, Y or N, I get the below error.
ERROR namenode.NameNode: Failed to start namenode.
java.lang.UnsupportedOperationException
INFO util.ExitUtil: Exiting with status 1: java.lang.UnsupportedOperationException
A quick answer is much appreciated.
Regards
ShaX
This is a bug in the 3.2.1 release and is supposed to be fixed in 3.2.2 or 3.3.0.
The fix is to change the StorageDirectory class by adding a FileUtil fallback for the Windows permission setup:
if (permission != null) {
  try {
    Set<PosixFilePermission> permissions =
        PosixFilePermissions.fromString(permission.toString());
    Files.setPosixFilePermissions(curDir.toPath(), permissions);
  } catch (UnsupportedOperationException uoe) {
    // Default to FileUtil for non posix file systems
    FileUtil.setPermission(curDir, permission);
  }
}
I found this issue when publishing a Hadoop 3.2.1 installation guide on Windows:
Latest Hadoop 3.2.1 Installation on Windows 10 Step by Step Guide
I published a temporary resolution and it is working. Refer to my above post for details and you can follow it to complete Hadoop 3.2.1 installation on Windows 10. I've uploaded my updated Hadoop HDFS jar file to the following location:
https://github.com/FahaoTang/big-data/blob/master/hadoop-hdfs-3.2.1.jar
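If you use the pre-built jar, the swap comes down to backing up the shipped jar and dropping the replacement into the HDFS share directory. A hedged sketch, assuming a default unpack layout (on Windows the same directory lives under %HADOOP_HOME%\share\hadoop\hdfs):

cd "$HADOOP_HOME/share/hadoop/hdfs"
# keep the original jar around in case you need to roll back
mv hadoop-hdfs-3.2.1.jar hadoop-hdfs-3.2.1.jar.bak
# copy in the patched jar downloaded from the link above
cp /path/to/downloaded/hadoop-hdfs-3.2.1.jar .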

select query errored out in Hive

I am using Hadoop 1.0.4 and Hive 1.2.1.
I am facing an issue with a select query in the Hive CLI; a snippet of the error log is attached. Please help me resolve the issue.
Thanks Nirmal. It's resolved after upgrading the Hadoop version to 2.6.0.
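For anyone hitting the same mismatch, it is easy to confirm what each CLI is running against before and after the upgrade (standard commands, nothing specific to this setup):

# both should line up: Hive 1.2.x is normally paired with a Hadoop 2.x install
hadoop version | head -1
hive --version | head -1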

How to remove Apache Oozie completely?

I want to remove Oozie and reinstall a fresh copy.
I installed Oozie by following the steps here:
http://hadooptutorial.info/apache-oozie-installation-on-ubuntu-14-04/ Can anyone please help me remove Oozie completely from my laptop?
I am using the latest Ubuntu version, with Hadoop 2.6.0.
Earlier I removed the /usr/lib/oozie folder, but it did not work out for me after installing a fresh copy of Oozie (I got many errors and exceptions).
I am describing a few of the errors below, seen after installing the fresh copy of Oozie:
oozie admin -oozie http://localhost:11000/oozie -status
Connection exception has occurred [ java.net.ConnectException Connection refused ]. Trying after 1 sec. Retry count = 1
oozied.sh stop
PID file found but no matching process was found. Stop aborted.
oozie-setup.sh sharelib create -fs hdfs://localhost:9000
setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/io/filefilter/IOFileFilter
Thank you
Removing /usr/lib/oozie will not remove Oozie entirely; something more is required.
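As a rough sketch of what a fuller cleanup can look like, hedged because the exact paths depend on how the tutorial's steps were followed on your machine:

# stop the server first so no stale PID file is left behind
oozied.sh stop

# remove the extracted install plus its local data and logs
sudo rm -rf /usr/lib/oozie

# remove the sharelib that oozie-setup.sh created in HDFS, if present
hdfs dfs -rm -r -f /user/$USER/share/lib

# finally, delete any OOZIE_HOME / PATH entries added to ~/.bashrc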

Hadoop issue with Sqoop installation

I have Hadoop (pseudo-distributed mode), Hive, Sqoop, and MySQL installed on my local machine.
But when I try to run Sqoop, it gives me the following error:
Error: /usr/lib/hadoop does not exist!
Please set $HADOOP_COMMON_HOME to the root of your Hadoop installation.
Then I set the sqoop-env-template.sh file with all the information. Below is a snapshot of the sqoop-env-template.sh file.
Even after providing the Hadoop and Hive paths, I face the same error.
I've installed
hadoop in /home/hduser/hadoop version 1.0.3
hive in /home/hduser/hive version 0.11.0
sqoop in /home/hduser/sqoop version 1.4.4
and mysql connector jar java-5.1.29
Could anybody please throw some light on what is going wrong?
sqoop-env-template.sh is just a template, meaning it is not sourced by the launcher scripts on its own. If you want your custom configuration to be loaded, copy it to $SQOOP_HOME/conf/sqoop-env.sh.
Note: here is the relevant excerpt from bin/configure-sqoop for version 1.4.4:
SQOOP_CONF_DIR=${SQOOP_CONF_DIR:-${SQOOP_HOME}/conf}
if [ -f "${SQOOP_CONF_DIR}/sqoop-env.sh" ]; then
  . "${SQOOP_CONF_DIR}/sqoop-env.sh"
fi
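For reference, a minimal sqoop-env.sh built from the paths in the question might look like the following; the variable names come from the shipped template, and the values are the questioner's install locations, so adjust them to your machine:

# create the file the launcher actually sources
cp $SQOOP_HOME/conf/sqoop-env-template.sh $SQOOP_HOME/conf/sqoop-env.sh

# then edit $SQOOP_HOME/conf/sqoop-env.sh to export the real paths:
export HADOOP_COMMON_HOME=/home/hduser/hadoop
export HADOOP_MAPRED_HOME=/home/hduser/hadoop
export HIVE_HOME=/home/hduser/hive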
