sqoop server start failed - hadoop

I am trying to connect to a sqoop server on localhost:
sqoop:000> set server --host manager --port 12000 --webapp sqoop
Server is set successfully
sqoop:000> show version -all
client version:
Sqoop 1.99.6 source revision 07244c3915975f26f03d9e1edf09ab7d06619bb8
Compiled by root on Wed Apr 29 10:40:43 CST 2015
0 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception has occurred during processing command
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0000:An unknown error has occurred
sqoop:000>
Port 12000 is closed
$ netstat -na|grep 12000
Why is this happening?

Hadoop libraries need to be set in a file named catalina.properties inside the server/conf directory. In this file, you need to set the Hadoop library paths in the common.loader property. The defaults are /usr/lib/hadoop and /usr/lib/hadoop/lib. If your Hadoop libraries are at a different location, point this property at that directory.
sqoop2-tool verify can be used to verify all the Sqoop server configurations. If it succeeds, you can start the server using sqoop2-server start.
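For illustration, the relevant line in server/conf/catalina.properties might look like this (a sketch: the leading Tomcat entries are the stock defaults, and the two Hadoop entries at the end are what you append; adjust them to wherever your Hadoop jars actually live):
common.loader=${catalina.base}/lib,${catalina.base}/lib/*.jar,${catalina.home}/lib,${catalina.home}/lib/*.jar,/usr/lib/hadoop/*.jar,/usr/lib/hadoop/lib/*.jar
After editing, verify the configuration and start the server:
$ sqoop2-tool verify
$ sqoop2-server start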
Ref:
https://sqoop.apache.org/docs/1.99.6/Installation.html

Related

HBase Shell works but Unable to connect to Web UI

Following the "2. Quick Start - Standalone HBase" instructions, I installed HBase standalone on my MacBook. The shell works - I can create tables, drop tables, etc.
However, I can't seem to get the Web UI working. I've tried a variety of configurations for the hbase-site.xml file. I do get one HBase error that I don't think is related, but I get "Unable to connect" from the browser.
Has anyone else had this issue and solved it?
Errors:
Thu Nov 1 06:26:42 PDT 2018 Starting master on USERS-MacBook-Pro.local
.
.
.
2018-11-01 06:26:49,477 INFO [main] server.NIOServerCnxnFactory: binding to port 0.0.0.0/0.0.0.0:2181
2018-11-01 06:26:49,621 ERROR [main] server.ZooKeeperServer: ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
2018-11-01 06:26:49,632 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxnFactory: Accepted socket connection from /127.0.0.1:56148
.
.
.
2018-11-01 06:26:50,248 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
.
.
.

Hadoop command `hadoop fs -ls` gives ConnectionRefused error

When I run a hadoop command like hadoop fs -ls, I get the following error/warnings:
16/08/04 11:24:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: Call From master/172.17.100.54 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Am I doing anything wrong with the hadoop path?
The Hadoop Native Libraries Guide says this is something to do with the installation; please check that documentation to resolve the warning.
Native Hadoop Library
Hadoop has native implementations of certain components for performance reasons and for non-availability of Java implementations. These components are available in a single, dynamically-linked native library called the native hadoop library. On the *nix platforms the library is named libhadoop.so.
Please note the following:
It is mandatory to install both the zlib and gzip development packages on the target platform in order to build the native hadoop library; however, for deployment it is sufficient to install just one package if you wish to use only one codec.
It is necessary to have the correct 32/64 libraries for zlib, depending on the 32/64 bit jvm for the target platform, in order to build and deploy the native hadoop library.
Runtime
The bin/hadoop script ensures that the native hadoop library is on the library path via the system property: -Djava.library.path=<path>
During runtime, check the hadoop log files for your MapReduce tasks.
If everything is all right, then:
DEBUG util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
INFO util.NativeCodeLoader - Loaded the native-hadoop library
If something goes wrong, then:
INFO util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Check
NativeLibraryChecker is a tool to check whether native libraries are loaded correctly. You can launch NativeLibraryChecker as follows
$ hadoop checknative -a
14/12/06 01:30:45 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
14/12/06 01:30:45 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /home/ozawa/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib/x86_64-linux-gnu/libz.so.1
snappy: true /usr/lib/libsnappy.so.1
lz4: true revision:99
bzip2: false
Second, the Connection refused error is related to your setup; please double-check it.
Also see the links below as pointers:
Hadoop cluster setup - java.net.ConnectException: Connection refused
Hadoop - java.net.ConnectException: Connection refused
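As a quick sanity check for the Connection refused part, it can help to confirm what address the client is dialling and whether a NameNode is actually listening there. A sketch, assuming core-site.xml lives under $HADOOP_HOME/etc/hadoop and port 9000 as in the error message:
$ grep -A1 fs.defaultFS $HADOOP_HOME/etc/hadoop/core-site.xml   # address the client connects to
$ jps | grep -i namenode                                        # is a NameNode process running?
$ netstat -tln | grep 9000                                      # is it listening on that port?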

Hive Internal Error: java.lang.ClassNotFoundException(org.apache.atlas.hive.hook.HiveHook)

I am running a Hive query through Oozie using Hue.
I am creating a table through a Hue-Oozie workflow.
My job is failing, but when I check in Hive, the table has been created.
The log shows the error below:
16157 [main] INFO org.apache.hadoop.hive.ql.hooks.ATSHook - Created ATS Hook
2015-09-24 11:05:35,801 INFO [main] hooks.ATSHook (ATSHook.java:<init>(84)) - Created ATS Hook
16159 [main] ERROR org.apache.hadoop.hive.ql.Driver - hive.exec.post.hooks Class not found:org.apache.atlas.hive.hook.HiveHook
2015-09-24 11:05:35,803 ERROR [main] ql.Driver (SessionState.java:printError(960)) - hive.exec.post.hooks Class not found:org.apache.atlas.hive.hook.HiveHook
16159 [main] ERROR org.apache.hadoop.hive.ql.Driver - FAILED: Hive Internal Error: java.lang.ClassNotFoundException(org.apache.atlas.hive.hook.HiveHook)
java.lang.ClassNotFoundException: org.apache.atlas.hive.hook.HiveHook
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
I am not able to identify the issue.
I am using HDP 2.3.1.
Basically, this error is due to the Atlas jar missing from the Oozie share lib.
In HDP, the Atlas jars are available in /usr/hdp/2.3.0.0-2557/atlas/
Put all the Atlas-related jars in the Oozie share lib:
hadoop fs -put /usr/hdp/2.3.0.0-2557/atlas/hook/hive/* /user/oozie/share/lib/lib200344/hive
Add 'export HIVE_AUX_JARS_PATH=<atlas package>/hook/hive' in hive-env.sh.
Copy <atlas package>/conf/application.properties to the Hive conf directory.
Restart the Oozie services. This should solve the problem. If anybody faces the same problem, please comment here so that I can help.
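Concretely, the last two steps might look like the following (a sketch assuming the HDP 2.3 paths above and the usual /etc/hive/conf location; adjust to your install):
echo 'export HIVE_AUX_JARS_PATH=/usr/hdp/2.3.0.0-2557/atlas/hook/hive' >> /etc/hive/conf/hive-env.sh
cp /usr/hdp/2.3.0.0-2557/atlas/conf/application.properties /etc/hive/conf/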
[Comment by Immo Huneke: when using the Hortonworks sandbox VM, I found that just putting the jar files in the share/lib folder under HDFS was enough to resolve the problem. I didn't have to update hive-env.sh or copy the application.properties file. But check the exact path of your share/lib folder by executing the command hdfs dfs -ls /user/oozie/share/lib before copying.]
hive> add jar /usr/hdp//atlas/hook/hive/hive-bridge-${VERSION}.jar
and it should be OK.
Hope this helps.
It seems you have a class-not-found exception.
Have you installed the Oozie sharelib? If yes, please update all the Hive-dependent jars in the sharelib location and check the status.
Also check that the Hive client is available on all nodes in the cluster and is running.
I tried each and every possible solution mentioned in this forum and on Stack Overflow, but it did not resolve my issue.
Finally, I resolved it by copying all the jars in /hook/hive to the lib folder of my Oozie workflow (create a new lib folder at the job.properties level).
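For illustration, with the workflow stored on HDFS, that approach might look like this (the workflow path, user name and Atlas location are hypothetical; use whatever matches your setup):
hdfs dfs -mkdir -p /user/<your-user>/my-workflow/lib
hdfs dfs -put /usr/hdp/2.3.0.0-2557/atlas/hook/hive/*.jar /user/<your-user>/my-workflow/lib/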

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable with hadoop-2.6.0

I have started working with Hadoop; I am a beginner. I have successfully installed hadoop-2.6.0 on Ubuntu 15.04 64-bit.
Commands like start-all.sh, start-dfs.sh etc. are working nicely.
I am facing a problem when I try to move files from the local file system to HDFS.
For example, with the copyFromLocal command:
hadoop dfs -copyFromLocal ~/Hadoop/test/text2.txt ~/Hadoop/test_hds/input.txt
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
15/06/04 23:18:29 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
copyFromLocal: Call From royaljay-Inspiron-N4010/127.0.1.1 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
The same problem occurs with the put command:
hadoop dfs -put ~/test/test/test1.txt hd.txt
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
15/06/03 20:49:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: Cannot create file/user/hduser/hd.txt.COPYING. Name node is in safe mode.
I have found many suggested solutions, but none of them is working.
If anyone has an idea about this, please tell me.
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
You should not use hadoop dfs; instead, use the following command:
hdfs dfs -copyFromLocal ...
Do not use ~; instead, mention the full path, like /home/hadoop/Hadoop/test/text2.txt
Call From royaljay-Inspiron-N4010/127.0.1.1 to localhost:9000 failed
on connection exception: java.net.ConnectException: Connection
refused; For more details see:
http://wiki.apache.org/hadoop/ConnectionRefused
127.0.1.1 will cause loopback problems. Remove the line with 127.0.1.1 from /etc/hosts.
NOTE: For copying files from local filesystem to HDFS, try using -put command instead of -copyFromLocal.
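Putting the suggestions together, a working copy might look like the following (the /home/hduser source path and the HDFS target are assumptions based on the question; the safemode command is only needed because the second error reported that the NameNode was in safe mode):
hdfs dfsadmin -safemode leave
hdfs dfs -put /home/hduser/Hadoop/test/text2.txt /user/hduser/input.txt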

Installing Hue, permission denied error?

I'm getting the following while trying to build Hue:
(6211) *** Controller starting at Thu Aug 8 11:29:50 2013
Should start 1 new children
Controller.spawn_children(number=1)
$HADOOP_HOME=
$HADOOP_BIN=/usr/local/hadoop/bin/hadoop
$HIVE_CONF_DIR=~/hive-0.10.0/conf
$HIVE_HOME=~/hive-0.10.0
find: `~/hive-0.10.0/lib': No such file or directory
$HADOOP_CLASSPATH=:
$HADOOP_OPTS=-Dlog4j.configuration=log4j.properties
$HADOOP_CONF_DIR=~/hive-0.10.0/conf:/usr/local/hadoop/conf
$HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
CWD=/usr/local/hue/desktop/conf
Executing /usr/local/hadoop/bin/hadoop jar /usr/local/hue/apps/beeswax/src/beeswax/../../java-lib/BeeswaxServer.jar --beeswax 8002 --desktop-host 127.0.0.1 --desktop-port 8888 --query-lifetime 604800000 --metastore 8003
Exception in thread "main" java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:1879)
at org.apache.hadoop.util.RunJar.main(RunJar.java:119)
I've changed the configuration file so it doesn't use the hue user but the user I'm logged in as, which has read and write permissions in HDFS, Hadoop, Hive, etc. I'm not sure why it's doing this...
It seems that it is trying to start Beeswax from /usr/local/hue/desktop/conf. Beeswax runs as the 'hue' user by default (https://github.com/cloudera/hue/blob/master/desktop/core/src/desktop/supervisor.py#L67), so that location needs to be writable by 'hue'.
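For example, assuming the default /usr/local/hue install shown in the log, making it writable by the hue user could look like this (a sketch, not part of the original answer; the exact directory that createTempFile complains about may differ on your system):
$ sudo chown -R hue:hue /usr/local/hue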

Resources