Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail. Please set $ACCUMULO_HOME to the root of your Accumulo installation - sqoop

My VM details:
Cloudera Quickstart VM 5.5.0
VM = VMware Workstation 12 Player
Windows = Windows 10 / 64 bit
Java = Java 1.8
When I run the "sqoop" command, I'm facing the error below:
**Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.**
Can you please help to rectify this error?

In addition to
>>> ACCUMULO_HOME='/var/lib/accumulo'
and
>>> export ACCUMULO_HOME
don't forget to create the directory
>>> sudo mkdir /var/lib/accumulo

Set the variable ACCUMULO_HOME to /var/lib/accumulo and export this variable.
This will resolve this warning.
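If you want the variable to survive new shells, a minimal sketch (assuming a bash shell on the Quickstart VM; using ~/.bashrc is an assumption) is:
$ echo 'export ACCUMULO_HOME=/var/lib/accumulo' >> ~/.bashrc
$ source ~/.bashrc
$ echo $ACCUMULO_HOME
/var/lib/accumulo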

If you happen to be installing through Apache Bigtop, you may need to use
$ export ACCUMULO_HOME=/usr/bin/sqoop
Test it out with something like
$ sqoop help import

Late, but I hope this workaround helps. I had the same problem, but I'm using the Cloudera sandbox.
In my case, to solve it I just created an empty directory and set the $ACCUMULO_HOME environment variable as follows:
$ sqoop help
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
...
As said, first create the accumulo dir
$ mkdir /var/lib/accumulo
Next, set and export the variable
$ ACCUMULO_HOME='/var/lib/accumulo'
$ export ACCUMULO_HOME
That's it
$ echo $ACCUMULO_HOME
$ sqoop help
usage: sqoop COMMAND [ARGS]
Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  ...

Use the following commands:
1. sudo mkdir /var/lib/accumulo
2. ACCUMULO_HOME='/var/lib/accumulo'
3. export ACCUMULO_HOME

It's just a warning that you can ignore. Sqoop validates the environment and emits these warnings.
If it really bothers you, set $ACCUMULO_HOME to some directory that you do not otherwise use.
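For example, a throwaway placeholder directory is enough (a sketch; the path is arbitrary):
$ mkdir -p ~/accumulo-placeholder
$ export ACCUMULO_HOME=~/accumulo-placeholder
$ sqoop help
With the directory in place, the Accumulo warning should no longer appear.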

Related

Error loading main class for sqoop after successful installation

OS: Windows 10
Sqoop Version: 1.4.7
Hadoop Version: 2.9.2
Hi, after installing Sqoop on Windows 10 following the instructions from
https://medium.com/analytics-vidhya/sqoop-how-to-install-in-5-steps-in-windows-10-ca2f17e11e75
I then try to check the install using the "sqoop version" command, but all I get is the following error:
C:\Users\iamme>sqoop version
Warning: HBASE_HOME and HBASE_VERSION not set.
Warning: HCAT_HOME not set
Warning: HCATALOG_HOME does not exist HCatalog imports will fail.
Please set HCATALOG_HOME to the root of your HCatalog installation.
Warning: ACCUMULO_HOME not set.
Warning: ZOOKEEPER_HOME not set.
Warning: HBASE_HOME does not exist HBase imports will fail.
Please set HBASE_HOME to the root of your HBase installation.
Warning: ACCUMULO_HOME does not exist Accumulo imports will fail.
Please set ACCUMULO_HOME to the root of your Accumulo installation.
Warning: ZOOKEEPER_HOME does not exist Accumulo imports will fail.
Please set ZOOKEEPER_HOME to the root of your Zookeeper installation.
Error: Could not find or load main class org.apache.sqoop.Sqoop
Caused by: java.lang.ClassNotFoundException: org.apache.sqoop.Sqoop
I checked all Hadoop and Sqoop ENV Variables, and they seem to be fine.
Any clue what might be wrong?
Best Regards,
R.S.

Which version of sqoop should I use for hadoop 3.3.0?

I am trying to install Sqoop 1.4.7 on Windows 10 with Hadoop 3.3.0.
On running ./configure-sqoop in Git Bash I get the following output:
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: C:\sqoop_data\sqoop-1.4.7.bin__hadoop-2.6.0/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
On verifying the installation using sqoop.cmd version, I get:
Warning: HBASE_HOME and HBASE_VERSION not set.
Warning: HCAT_HOME not set
Warning: HCATALOG_HOME does not exist HCatalog imports will fail.
Please set HCATALOG_HOME to the root of your HCatalog installation.
Warning: ACCUMULO_HOME not set.
Warning: ZOOKEEPER_HOME not set.
Warning: HBASE_HOME does not exist HBase imports will fail.
Please set HBASE_HOME to the root of your HBase installation.
Warning: ACCUMULO_HOME does not exist Accumulo imports will fail.
Please set ACCUMULO_HOME to the root of your Accumulo installation.
Warning: ZOOKEEPER_HOME does not exist Accumulo imports will fail.
Please set ZOOKEEPER_HOME to the root of your Zookeeper installation.
The system cannot find the path specified.
Please help with a solution to this problem
It depends on whether you are working on the server side or the client side.
If you are on the server side, do the following:
Copy the Sqoop artifact to the machine where you want to run Sqoop server. The Sqoop server acts as a Hadoop client, therefore Hadoop libraries (Yarn, Mapreduce, and HDFS jar files) and configuration files (core-site.xml, mapreduce-site.xml, ...) must be available on this node. You do not need to run any Hadoop related services - running the server on a “gateway” node is perfectly fine.
Decompress Sqoop distribution tarball
tar -xvf sqoop-<version>-bin-hadoop<hadoop-version>.tar.gz
Move decompressed content to any location
mv sqoop-<version>-bin-hadoop<hadoop-version> /usr/lib/sqoop
Change working directory
cd /usr/lib/sqoop
And Sqoop needs the environment variables to point at the Hadoop libraries, so set and export the variables like this:
# Export HADOOP_HOME variable
export HADOOP_HOME=/...
# Or alternatively HADOOP_*_HOME variables
export HADOOP_COMMON_HOME=/...
export HADOOP_HDFS_HOME=/...
export HADOOP_MAPRED_HOME=/...
export HADOOP_YARN_HOME=/...
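For a single-node tarball install, the concrete values might look like this (the /usr/local/hadoop path is an assumption; adjust to your layout):
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME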
Sqoop server will need to impersonate users to access HDFS, so edit the core-site.xml file:
<property>
  <name>hadoop.proxyuser.sqoop2.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.sqoop2.groups</name>
  <value>*</value>
</property>
For client installation
Just copy the Sqoop distribution artifact to the target machine and unzip it in the desired location. You can start the client with the following command:
sqoop2-shell
For the PATH variable: all user- and administrator-facing shell commands are stored in the bin/ directory. It's recommended to add this directory to your $PATH for easier execution, for example:
PATH=$PATH:`pwd`/bin/
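To sanity-check the client against a running server, a session along these lines should work (a sketch; localhost is an assumption and 12000 is the default Sqoop 2 port):
$ sqoop2-shell
sqoop:000> set server --host localhost --port 12000 --webapp sqoop
sqoop:000> show version --all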
This combination works well for me:
Hadoop: 3.3.1
Sqoop: 1.4.7
Connector/J: 5.1.49
Note:
You must replace "commons-lang3-3.4.jar" with "commons-lang-2.6.jar" in "$SQOOP_HOME/lib" to avoid the error "java.lang.ClassNotFoundException: org.apache.commons.lang.StringUtils" when using "sqoop import".
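A sketch of the swap, assuming you have downloaded commons-lang-2.6.jar (e.g. from Maven Central) into ~/Downloads:
$ cd $SQOOP_HOME/lib
$ mv commons-lang3-3.4.jar ~/commons-lang3-3.4.jar.bak    # keep the 3.x jar as a backup outside lib/
$ cp ~/Downloads/commons-lang-2.6.jar .                   # the 2.x line still provides org.apache.commons.lang.StringUtils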

Which version of sqoop is compatible with hadoop 3.0

When running sqoop version, getting this error:
hadoopusr@houssein:~$ sqoop version
Warning: /usr/lib/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/lib/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2326: HADOOP_ORG.APACHE.SQOOP.SQOOP_USER: bad substitution
/usr/local/hadoop/libexec/hadoop-functions.sh: line 2421: HADOOP_ORG.APACHE.SQOOP.SQOOP_OPTS: bad substitution
2019-02-24 04:43:16,533 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
Sqoop 1.4.7
git commit id 2328971411f57f0cb683dfb79d19d4d19d185dd8
Compiled by maugli on Thu Dec 21 15:59:58 STD 2017
Any help?
First, change the version of Hadoop to 2.6.x.
The environment warnings say that you need to put each dependency in the corresponding directory.
I am assuming the below:
SQOOP_HOME and other parameters are configured in .profile
HADOOP_COMMON_HOME & HADOOP_MAPRED_HOME are already set up under the $SQOOP_HOME/conf path.
I too faced a similar issue, but it was resolved after I added the MySQL connector.
I downloaded the file mysql-connector-java-8.0.15.tar.gz from the link given below (you can Google it and download it from a different source as well).
[Link to download][1]
Steps to configure the MySQL connector:
Once the mysql-connector-java gz package is downloaded, run the commands below to unzip it and copy the jar:
tar -xvf mysql-connector-java-8.0.15.tar.gz
mv mysql-connector-java-8.0.15/mysql-connector-java-8.0.15.jar $SQOOP_HOME/lib
Finally, to verify the installation, run:
sqoop version
This will print the Sqoop version information.
Hope this helps!!
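To confirm the connector is actually picked up (beyond sqoop version), a quick check against a local MySQL might look like this (a sketch; host, port, and user are placeholders):
$ sqoop list-databases \
    --connect jdbc:mysql://localhost:3306 \
    --username root -P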

passing mysql properties via sqoop eval

sqoop eval command:
sqoop eval --connect 'jdbc:mysql://<connection url>' --driver com.mysql.jdbc.Driver --query "select max(rdate) from test.sqoop_test"
gives me output:
Warning: /usr/hdp/2.3.2.0-2950/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/hdp/2.3.2.0-2950/zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/10/05 18:38:17 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950
16/10/05 18:38:17 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/10/05 18:38:17 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
16/10/05 18:38:17 INFO manager.SqlManager: Using default fetchSize of 1000
--------------
| max(rdate) |
--------------
| 2014-01-25 |
But I want the output without the warnings and table boundaries, like:
max(rdate) 2014-01-25
I basically want to store this output in a file.
Thanks in advance.
You can perform a Sqoop import operation to save the output in HDFS.
The warnings are straightforward: you can set $ACCUMULO_HOME and $ZOOKEEPER_HOME if those components are available, and you can set --connection-manager to the connection manager corresponding to MySQL.
For the sake of security, it's recommended to use -P for the password rather than writing it on the command line.
These are not errors; you can live with these warnings.
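Putting those suggestions together, the eval call might look like this (a sketch; the connection URL is a placeholder, and org.apache.sqoop.manager.MySQLManager is the stock MySQL connection manager class):
$ sqoop eval \
    --connect 'jdbc:mysql://<connection url>' \
    --connection-manager org.apache.sqoop.manager.MySQLManager \
    --query "select max(rdate) from test.sqoop_test" \
    -P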
You can create a .sh file, write your sqoop commands into it, and then run it as
shell_file_name.sh > your_output_file.txt
(see the sketch at the end of this answer).
There are two ways to get the query results: redirecting the shell output as above, or writing to HDFS by importing the query results (--target-dir /path) and reading from there.
You can also change the file system option in the sqoop command so the import writes the results to the local file system rather than HDFS, e.g.:
sqoop import -fs local -jt local --connect "connection string" --username root --password root --query 'select * from table where $CONDITIONS' -m 1 --target-dir /home/output
https://sqoop.apache.org/docs/1.4.0-incubating/SqoopUserGuide.html#id1762587
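A minimal version of the wrapper-script approach mentioned above (the file name and output path are placeholders):
$ cat run_eval.sh
#!/bin/bash
sqoop eval --connect 'jdbc:mysql://<connection url>' \
      --query "select max(rdate) from test.sqoop_test"
$ chmod +x run_eval.sh
$ ./run_eval.sh > your_output_file.txt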

Doubts in configuration of SQOOP

Scenario:
I have configured SQOOP on my PC, but I am facing a problem:
when I run bin/sqoop I get the following error:
Error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getInstances(Ljava/lang/String;Ljava/lang/Class;)Ljava/util/List;
at com.cloudera.sqoop.tool.SqoopTool.loadPlugins(SqoopTool.java:139)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:209)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
Question:
What could be the problem? I have also set the paths of $HBASE_HOME and $ZOOKEEPER_HOME.
Please suggest how I can fix it.
Thanks.
I am giving you the steps as I configured on my terminal.
Downloaded sqoop-1.3.0-cdh3u1 from the Cloudera archive.
Download mysql-connector-java-5.0.8 and copy the mysql-connector-java-5.0.8.jar file to the lib and bin directories of Sqoop (for the Sqoop-MySQL connection)
Copy all jars from lib to bin (optional)
Add 2 lines in .bash_profile file
export SQOOP_HOME=/home/hadoop/Desktop/Cloudera/sqoop-1.3.0-cdh3u1
export PATH=$PATH:$SQOOP_HOME/bin
Save it and just type sqoop help on terminal
It worked on my terminal. Post the steps you followed.
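To pick up the change in the current session (assuming a bash login shell), something like this works before retrying:
$ source ~/.bash_profile
$ sqoop help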
Maybe this helps:
https://issues.apache.org/jira/browse/SQOOP-384
Try to downgrade to a different version of Sqoop.
