Error to load main class for Sqoop after successful installation - Windows

OS: Windows 10
Sqoop Version: 1.4.7
Hadoop Version: 2.9.2
Hi, after installing Sqoop on Windows 10 following the instructions from
https://medium.com/analytics-vidhya/sqoop-how-to-install-in-5-steps-in-windows-10-ca2f17e11e75
I tried to check the install using the "sqoop version" command, but it fails with the following error:
C:\Users\iamme>sqoop version
Warning: HBASE_HOME and HBASE_VERSION not set.
Warning: HCAT_HOME not set
Warning: HCATALOG_HOME does not exist HCatalog imports will fail.
Please set HCATALOG_HOME to the root of your HCatalog installation.
Warning: ACCUMULO_HOME not set.
Warning: ZOOKEEPER_HOME not set.
Warning: HBASE_HOME does not exist HBase imports will fail.
Please set HBASE_HOME to the root of your HBase installation.
Warning: ACCUMULO_HOME does not exist Accumulo imports will fail.
Please set ACCUMULO_HOME to the root of your Accumulo installation.
Warning: ZOOKEEPER_HOME does not exist Accumulo imports will fail.
Please set ZOOKEEPER_HOME to the root of your Zookeeper installation.
Error: Could not find or load main class org.apache.sqoop.Sqoop
Caused by: java.lang.ClassNotFoundException: org.apache.sqoop.Sqoop
I checked all Hadoop and Sqoop environment variables, and they seem to be fine.
Any clue what might be wrong?
Best Regards,
R.S.

Related

Which version of sqoop should I use for hadoop 3.3.0?

I am trying to install Sqoop 1.4.7 on Windows 10 with Hadoop 3.3.0.
When I run ./configure-sqoop in Git Bash, I get the following output:
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
On verifying the installation using sqoop.cmd version, I get:
Warning: HBASE_HOME and HBASE_VERSION not set.
Warning: HCAT_HOME not set
Warning: HCATALOG_HOME does not exist HCatalog imports will fail.
Please set HCATALOG_HOME to the root of your HCatalog installation.
Warning: ACCUMULO_HOME not set.
Warning: ZOOKEEPER_HOME not set.
Warning: HBASE_HOME does not exist HBase imports will fail.
Please set HBASE_HOME to the root of your HBase installation.
Warning: ACCUMULO_HOME does not exist Accumulo imports will fail.
Please set ACCUMULO_HOME to the root of your Accumulo installation.
Warning: ZOOKEEPER_HOME does not exist Accumulo imports will fail.
Please set ZOOKEEPER_HOME to the root of your Zookeeper installation.
The system cannot find the path specified.
Please help with a solution to this problem.
It depends on whether you are working on the server side or the client side.
If you are on the server side, do the following:
Copy the Sqoop artifact to the machine where you want to run the Sqoop server. The Sqoop server acts as a Hadoop client, therefore Hadoop libraries (YARN, MapReduce, and HDFS jar files) and configuration files (core-site.xml, mapreduce-site.xml, ...) must be available on this node. You do not need to run any Hadoop-related services - running the server on a “gateway” node is perfectly fine.
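Before going further, it's worth confirming that the Hadoop client side on that node is usable; a minimal check, assuming the hadoop binary is already on the PATH:
# Both commands only exercise the local client libraries and configuration;
# if they fail here, Sqoop will not find its Hadoop dependencies either.
hadoop version
hadoop classpath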
Decompress the Sqoop distribution tarball:
tar -xvf sqoop-<version>-bin-hadoop<hadoop-version>.tar.gz
Move the decompressed content to the desired location:
mv sqoop-<version>-bin-hadoop<hadoop-version> /usr/lib/sqoop
Change the working directory:
cd /usr/lib/sqoop
Sqoop needs the environment variables to point at the Hadoop libraries, so set and export them like this:
# Export HADOOP_HOME variable
export HADOOP_HOME=/...
# Or alternatively HADOOP_*_HOME variables
export HADOOP_COMMON_HOME=/...
export HADOOP_HDFS_HOME=/...
export HADOOP_MAPRED_HOME=/...
export HADOOP_YARN_HOME=/...
The Sqoop server will need to impersonate users to access HDFS, so edit the core-site.xml file:
<property>
  <name>hadoop.proxyuser.sqoop2.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.sqoop2.groups</name>
  <value>*</value>
</property>
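After editing core-site.xml, the NameNode has to pick up the new proxyuser settings; a sketch, assuming a stock tarball layout where the scripts live under $HADOOP_HOME/sbin:
# Restart HDFS so the hadoop.proxyuser.* settings take effect
$HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh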
For a client installation
Just copy the Sqoop distribution artifact onto the target machine and unzip it in the desired location. You can start the client with the following command:
sqoop2-shell
As for the PATH variable: all user- and administrator-facing shell commands are stored in the bin/ directory. It is recommended to add this directory to your $PATH for easier execution, for example:
PATH=$PATH:`pwd`/bin/
This combination works well for me:
Hadoop: 3.3.1
Sqoop: 1.4.7
Connector/J: 5.1.49
Note:
You must replace "commons-lang3-3.4.jar" with "commons-lang-2.6.jar" in "$SQOOP_HOME/lib" to avoid the error "java.lang.ClassNotFoundException: org.apache.commons.lang.StringUtils" when running "sqoop import".
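A minimal sketch of that swap, assuming commons-lang-2.6.jar has already been downloaded to the current directory (the path is illustrative):
# Remove the jar Sqoop 1.4.7 cannot use and drop in the older one it expects
rm $SQOOP_HOME/lib/commons-lang3-3.4.jar
cp commons-lang-2.6.jar $SQOOP_HOME/lib/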

Which version of sqoop is compatible with hadoop 3.0

When running sqoop version, I get this error:
hadoopusr@houssein:~$ sqoop version
Warning: /usr/lib/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/lib/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2326: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2421: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
2019-02-24 04:43:16,533 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Sqoop 1.4.7
git commit id 2328971411f57f0cb683dfb79d19d4d19d185dd8
Compiled by maugli on Thu Dec 21 15:59:58 STD 2017
Any help?
First, change the version of Hadoop to 2.6.x.
The environment warnings say that you need to put each dependency in the corresponding directory.
I am assuming the below:
SQOOP_HOME and the other params are configured in .profile
HADOOP_COMMON_HOME & HADOOP_MAPRED_HOME are already set up under the $SQOOP_HOME/conf path
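For reference, a minimal sqoop-env.sh along those lines might look like this (the paths are illustrative, not taken from the question):
# $SQOOP_HOME/conf/sqoop-env.sh - example values only
export HADOOP_COMMON_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=/usr/local/hadoop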
I too faced a similar issue, but it was resolved after I added the MySQL connector.
I downloaded the file mysql-connector-java-8.0.15.tar.gz from the link given below (you can google and download it from a different link as well):
[Link to download][1]
Steps to configure the MySQL connector:
Once the mysql-connector-java gz package is downloaded, run the commands below to unzip it and move the jar into place:
tar -xvf mysql-connector-java-8.0.15.tar.gz
mv mysql-connector-java-8.0.15/mysql-connector-java-8.0.15.jar $SQOOP_HOME/lib
Finally, to verify the installation, run:
sqoop version
This will print the Sqoop version.
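With the connector in place, a quick end-to-end check against MySQL (the hostname and credentials here are placeholders):
sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root -P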
Hope this helps!!

Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation

My VM details:
Cloudera Quickstart VM 5.5.0
VM = VMware Workstation 12 Player
Windows = Windows 10 / 64 bit
Java = Java 1.8
When I run the "sqoop" command, I'm facing the error below:
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Can you please help to rectify this error?
In addition to
ACCUMULO_HOME='/var/lib/accumulo'
and
export ACCUMULO_HOME
don't forget to create the directory:
sudo mkdir /var/lib/accumulo
Set the variable ACCUMULO_HOME to /var/lib/accumulo and export this variable.
This will resolve this warning.
If you happen to be installing through Apache Bigtop, you may need to use
$ export ACCUMULO_HOME=/usr/bin/sqoop
and test it out with something like
$ sqoop help import
Late, but I hope this workaround can help you. I had the same problem, but I'm using the Cloudera sandbox.
In my case, to solve it I just created an empty directory and set the $ACCUMULO_HOME env variable as follows:
$ sqoop help
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
...
As mentioned, first create the accumulo dir:
$ mkdir /var/lib/accumulo
Next, create the var:
$ ACCUMULO_HOME='/var/lib/accumulo'
$ export ACCUMULO_HOME
That's it:
$ echo $ACCUMULO_HOME
$ sqoop help
usage: sqoop COMMAND [ARGS]
Available commands:
codegen Generate code to interact with database records
create-hive-table Import a table definition into Hive
eval Evaluate a SQL statement and display the results
....
Use the following commands:
1. sudo mkdir /var/lib/accumulo
2. ACCUMULO_HOME='/var/lib/accumulo'
3. export ACCUMULO_HOME
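To make the variable survive new shells, you can also append it to your shell profile; a sketch, assuming bash:
# Persist the variable and reload the profile
echo 'export ACCUMULO_HOME=/var/lib/accumulo' >> ~/.bashrc
source ~/.bashrc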
It's just a warning that you can ignore. Sqoop tries to validate the environment and emits these warnings.
If it really bothers you, you could try the following:
Set $ACCUMULO_HOME to some directory that you will not otherwise use.
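For example, a throwaway stub (the directory name here is arbitrary):
# An empty directory is enough to silence the warning
mkdir -p ~/accumulo-stub
export ACCUMULO_HOME=~/accumulo-stub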

Error running pseudo-distributed hbase

I installed Hadoop and HBase on Mac OS X 10.9 through Homebrew. The version of Hadoop is 2.5.1, and the version of HBase is 0.98.6.1.
After I started HDFS and tried to start HBase, I got these errors:
Error: Could not find or load main class org.apache.hadoop.hbase.util.HBaseConfTool
Error: Could not find or load main class org.apache.hadoop.hbase.zookeeper.ZKServerTool
starting master, logging to /usr/local/Cellar/hbase/0.98.6.1/logs/hbase-lsphate-master-Ethans-MacBook-Pro.local.out
Error: Could not find or load main class org.apache.hadoop.hbase.master.HMaster
localhost: starting regionserver, logging to /usr/local/Cellar/hbase/0.98.6.1/logs/hbase-lsphate-regionserver-Ethans-MacBook-Pro.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hbase.regionserver.HRegionServer
Does anyone have a suggestion for this error? I've googled it and tried every solution I could find, but none of them worked.
Your HBASE_HOME might not be pointing to the correct location. Try exporting HBASE_HOME and HBASE_CONF_DIR like:
export HBASE_HOME=/usr/local/Cellar/hbase/0.98.6.1/libexec
export HBASE_CONF_DIR=$HBASE_HOME/conf
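A quick sanity check that the path really contains the HBase jars (jar names vary by build):
# If nothing matches, HBASE_HOME still points at the wrong directory
ls $HBASE_HOME/lib/hbase-*.jar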
Thanks.

Hadoop issue with Sqoop installation

I have Hadoop (pseudo-distributed mode), Hive, Sqoop, and MySQL installed on my local machine.
But when I try to run Sqoop, it gives me the following error:
Error: /usr/lib/hadoop does not exist!
Please set $HADOOP_COMMON_HOME to the root of your Hadoop installation.
Then I set up the sqoop-env-template.sh file with all the information.
Even after providing the Hadoop and Hive paths, I face the same error.
I've installed:
hadoop in /home/hduser/hadoop, version 1.0.3
hive in /home/hduser/hive, version 0.11.0
sqoop in /home/hduser/sqoop, version 1.4.4
and the mysql-connector-java-5.1.29 jar
Could anybody please shed some light on what is going wrong?
sqoop-env-template.sh is a template, meaning it doesn't get sourced by the configurator on its own. If you want a custom conf to be loaded, make a copy of it as $SQOOP_HOME/conf/sqoop-env.sh.
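A minimal sketch of that copy, assuming the usual conf layout:
cp $SQOOP_HOME/conf/sqoop-env-template.sh $SQOOP_HOME/conf/sqoop-env.sh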
Note: here is the relevant excerpt from bin/configure-sqoop for version 1.4.4:
SQOOP_CONF_DIR=${SQOOP_CONF_DIR:-${SQOOP_HOME}/conf}
if [ -f "${SQOOP_CONF_DIR}/sqoop-env.sh" ]; then
  . "${SQOOP_CONF_DIR}/sqoop-env.sh"
fi
