HBase Shell works but Unable to connect to Web UI - macOS

Following the 2. Quick Start - Standalone HBase instructions, I installed HBase standalone on my MacBook. The shell works - I can create tables, drop tables, etc.
However, I can't get the Web UI working. I've tried a variety of configurations in the hbase-site.xml file. I do get one HBase error, but I don't think it's related; the browser just shows "Unable to connect".
Has anyone else had this issue and solved it?
Errors:
Thu Nov 1 06:26:42 PDT 2018 Starting master on USERS-MacBook-Pro.local
.
.
.
2018-11-01 06:26:49,477 INFO [main] server.NIOServerCnxnFactory: binding to port 0.0.0.0/0.0.0.0:2181
2018-11-01 06:26:49,621 ERROR [main] server.ZooKeeperServer: ZKShutdownHandler is not registered, so ZooKeeper server won't take any action on ERROR or SHUTDOWN server state changes
2018-11-01 06:26:49,632 INFO [NIOServerCxn.Factory:0.0.0.0/0.0.0.0:2181] server.NIOServerCnxnFactory: Accepted socket connection from /127.0.0.1:56148
.
.
.
2018-11-01 06:26:50,248 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
.
.
.
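For reference, a quick way to check whether the master UI is listening at all - a minimal sketch, assuming HBase 1.x or later, where the master UI default moved from port 60010 to 16010:
$ lsof -nP -iTCP:16010 -sTCP:LISTEN    # HBase 1.x+ master UI default
$ lsof -nP -iTCP:60010 -sTCP:LISTEN    # pre-1.0 default
If neither port is bound, check whether hbase.master.info.port is set to -1 in hbase-site.xml, which disables the UI entirely.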

Related

Apache NiFi on Windows: unable to load NAR library bundles

I'm only attempting to launch the NiFi UI as a local instance to start playing with it. I've unzipped the package and made sure to set the JAVA_HOME variable to my Java 1.8. When I try to run bin/run-nifi.bat, the error message in my nifi-app.log is:
2018-05-03 15:03:50,585 INFO [main] org.apache.nifi.NiFi Launching NiFi...
2018-05-03 15:03:52,330 INFO [main] o.a.nifi.properties.NiFiPropertiesLoader Determined default nifi.properties path to be 'Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\conf\nifi.properties'
2018-05-03 15:03:52,363 INFO [main] o.a.nifi.properties.NiFiPropertiesLoader Loaded 146 properties from Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\conf\nifi.properties
2018-05-03 15:03:52,423 INFO [main] org.apache.nifi.NiFi Loaded 146 properties
2018-05-03 15:03:52,779 INFO [main] org.apache.nifi.BootstrapListener Started Bootstrap Listener, Listening for incoming requests on port 64802
2018-05-03 15:03:53,071 INFO [main] org.apache.nifi.BootstrapListener Successfully initiated communication with Bootstrap
2018-05-03 15:03:53,181 WARN [main] org.apache.nifi.nar.NarUnpacker Unable to load NAR library bundles due to java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework directory does not have read/write privilege Will proceed without loading any further Nar bundles
2018-05-03 15:03:53,242 ERROR [main] org.apache.nifi.NiFi Failure to launch NiFi due to java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework could not be created
java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework could not be created
at org.apache.nifi.util.FileUtils.ensureDirectoryExistAndCanReadAndWrite(FileUtils.java:48)
at org.apache.nifi.nar.NarClassLoaders.load(NarClassLoaders.java:155)
at org.apache.nifi.nar.NarClassLoaders.init(NarClassLoaders.java:131)
at org.apache.nifi.NiFi.<init>(NiFi.java:133)
at org.apache.nifi.NiFi.<init>(NiFi.java:71)
at org.apache.nifi.NiFi.main(NiFi.java:292)
2018-05-03 15:03:53,383 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server...
2018-05-03 15:03:53,387 INFO [Thread-1] org.apache.nifi.NiFi Jetty web server shutdown completed (nicely or otherwise).
I've followed the installation instructions and haven't been able to troubleshoot. How do I load these NAR files when running NiFi?
Thanks
I believe the underlying error in your output is java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework could not be created.
NiFi requires permission to create and write to several directories; there is a list in the NiFi Admin Guide: How to install and start NiFi. NiFi uses these to unpack the NAR files, write logs, and hold the various data repositories that make up your data flow.
You have a few options:
1 - Modify the permissions of the directories to give NiFi read/write access. This can be done for each individual child directory.
2 - Copy the entire NiFi distribution to a read/write location and run it from there.
3 - Edit conf/nifi.properties to point these directories at read/write locations (see the sketch after this list). See NiFi Admin Guide: System Properties for help with the individual properties.
Symlinks are also a great solution on systems that support them.
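For option 3, a minimal sketch of the relevant entries - the target paths here are illustrative assumptions, not defaults:
# in conf/nifi.properties - point the unpack directories at a writable location
nifi.nar.working.directory=C:/Users/YOUR_USER/nifi/work/nar/
nifi.documentation.working.directory=C:/Users/YOUR_USER/nifi/work/docs/components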
Two things you can try:
1 - Run NiFi with administrator privileges (not a good practice) by going to <NIFI_INSTALLATION_DIR>\bin, right-clicking run-nifi.bat, and clicking Run as Administrator.
2 - Move the NiFi directory to a location the logged-in user has full access to, e.g. C:\Users\<YOUR_USER>\Documents\, then try to execute bin\run-nifi.bat.
Similar to the resolution James proposed, I had to follow the three-step process below.
My scenario: I'm using Docker containers and had the same problem. Even changing the user of my container to root didn't work, so I did the following:
1 - Download MiNiFi: https://nifi.apache.org/minifi/download.html
2 - Untar and run the MiNiFi agent on my own laptop (I'm on a Mac) so that the necessary folders and files get created.
3 - Tar it up again and add it to the Dockerfile for my container build.
Done! Everything worked fine after that.
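A rough sketch of steps 2 and 3, with the release file name as a placeholder:
$ tar xzf minifi-x.y.z-bin.tar.gz
$ minifi-x.y.z/bin/minifi.sh start    # first start creates the work/ and logs/ directories
$ minifi-x.y.z/bin/minifi.sh stop
$ tar czf minifi-prepared.tar.gz minifi-x.y.z/
# then ADD minifi-prepared.tar.gz in the Dockerfile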

Apache Nutch 2.3: won't inject URLs (hangs) & Hadoop log shows warning

I'm stuck trying to set up Nutch 2.3 with Elasticsearch 5.4. The problem is in Nutch, as I cannot get it to inject my URLs. The Hadoop log shows the following warning:
Console:
aurora apache-nutch-2.3.1 # runtime/local/bin/nutch inject urls/seed.txt
InjectorJob: starting at 2017-06-14 17:08:28
InjectorJob: Injecting urlDir: urls/seed.txt
(it hangs here)
and the
Hadoop log:
aurora apache-nutch-2.3.1 # cat runtime/local/logs/hadoop.log
2017-06-14 17:08:28,339 INFO crawl.InjectorJob - InjectorJob: starting at 2017-06-14 17:08:28
2017-06-14 17:08:28,340 INFO crawl.InjectorJob - InjectorJob: Injecting urlDir: urls/seed.txt
2017-06-14 17:08:28,992 WARN util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I've tried setting my Hadoop environment variables following this thread (Hadoop "Unable to load native-hadoop library for your platform" warning) but I'm still getting the same error.
Any ideas?
Don't worry about the warning. I believe you are running on a Linux distribution.
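The hang itself is more likely elsewhere. One thing worth ruling out (an assumption on my part, since the log stops right where the storage backend would be opened): Nutch 2.x writes to a Gora-backed store, and an unreachable backend such as HBase can make inject appear to hang. A few quick checks:
$ grep -A 2 'storage.data.store.class' runtime/local/conf/nutch-site.xml
$ jps                              # is an HMaster / HQuorumPeer process running?
$ echo ruok | nc localhost 2181    # ZooKeeper should answer imok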
Nutch 2.3 is not compatible with ES 5.x. I wrote a custom IndexWriter that invokes Logstash on a given port, which in turn invokes Elasticsearch. You could try this approach or something along those lines.
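Purely as an illustration of that indirection (the port and pipeline here are hypothetical): with a Logstash http input listening, documents can be pushed as JSON and the Logstash pipeline forwards them to Elasticsearch:
$ curl -XPOST http://localhost:8080 -H 'Content-Type: application/json' \
      -d '{"url": "http://example.com/", "title": "Example"}'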

Sqoop server start failed

I am trying to connect to a sqoop server on localhost:
sqoop:000> set server --host manager --port 12000 --webapp sqoop
Server is set successfully
sqoop:000> show version -all
client version:
Sqoop 1.99.6 source revision 07244c3915975f26f03d9e1edf09ab7d06619bb8
Compiled by root on Wed Apr 29 10:40:43 CST 2015
0 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception has occurred during processing command
Exception: org.apache.sqoop.common.SqoopException Message: CLIENT_0000:An unknown error has occurred
sqoop:000>
Port 12000 is closed
$ netstat -na|grep 12000
Why is this happening?
The Hadoop library paths need to be set in a file named catalina.properties inside the server/conf directory, via the common.loader property. The defaults are /usr/lib/hadoop and /usr/lib/hadoop/lib; if your Hadoop libraries live at a different location, point this property at that directory.
sqoop2-tool verify can be used to verify the whole Sqoop server configuration. If it succeeds, you can start the server with sqoop2-server start.
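A minimal sketch of the sequence, assuming the default /usr/lib/hadoop layout:
# in server/conf/catalina.properties, make sure common.loader includes:
#   /usr/lib/hadoop/*.jar,/usr/lib/hadoop/lib/*.jar
$ sqoop2-tool verify       # validates the server configuration
$ sqoop2-server start
$ netstat -na | grep 12000   # the port should now show up as LISTEN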
Ref:
https://sqoop.apache.org/docs/1.99.6/Installation.html

Running YARN in Managed Mode

I am trying to run the simple-yarn-app at https://github.com/hortonworks/simple-yarn-app in the Managed Mode.
I have copied the jar from local to hdfs like
$ hadoop fs -copyFromLocal simple-yarn-app-1.0-SNAPSHOT.jar /simple-yarn-app-1.0-SNAPSHOT.jar
When I try to run it like
hadoop jar simple-yarn-app-1.0-SNAPSHOT.jar com.hortonworks.simpleyarnapp.Client /bin/date 2 /simple-yarn-app-1.0-SNAPSHOT.jar
I get FAILED as the state. I looked for the log under $HADOOP_HOME/logs, but I cannot find an associated log.
Will there be a log present somewhere in the HDFS?
Below is the error I get:
14/07/10 12:23:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/07/10 12:23:32 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
Submitting application application_1405002979164_0004
14/07/10 12:23:32 INFO impl.YarnClientImpl: Submitted application application_1405002979164_0004 to ResourceManager at /0.0.0.0:8032
Application application_1405002979164_0004 finished with state FAILED at 1405013016326
I get the following diagnostic report:
----------------Failure Diagnostics----------------
Application application_1405062676632_0002 failed 2 times due to AM Container for appattempt_1405062676632_0002_000002 exited with exitCode: -1000 due to: .
Failing this attempt..
Failing the application.
How should I debug or fix this?
In case anybody else is running into the same problem: when running this application in managed mode, upload the jar to HDFS and pass the HDFS path.
bin/hadoop jar simple-yarn-app-1.0-SNAPSHOT.jar com.hortonworks.simpleyarnapp.Client /bin/date 2 hdfs://XXX/user/XYZ/apps/simple/simple-yarn-app-1.0-SNAPSHOT.jar
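On the logging question: if log aggregation is enabled (yarn.log-aggregation-enable), the container logs can be pulled with the YARN CLI instead of hunting under $HADOOP_HOME/logs:
$ yarn logs -applicationId application_1405002979164_0004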

Hortonworks DataNode install: Exception in secureMain

I am trying to install a Hortonworks Hadoop single-node cluster. I am able to start the namenode and the secondary namenode, but the datanode fails with the following error. How do I solve this issue?
2014-04-04 18:22:49,975 FATAL datanode.DataNode (DataNode.java:secureMain(1841)) - Exception in secureMain
java.lang.RuntimeException: Although a UNIX domain socket path is configured as /var/lib/hadoop-hdfs/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.
See the Native Libraries Guide. Make sure libhadoop.so is available in $HADOOP_HOME/lib/native. Look in the logs for this message:
INFO util.NativeCodeLoader - Loaded the native-hadoop library
If instead you find
INFO util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
then libhadoop.so is not available, and you'll have to investigate why. Alternatively, you can turn off HDFS short-circuit reads if you wish, or switch to the legacy short-circuit implementation using dfs.client.use.legacy.blockreader.local, to remove the libhadoop dependency. But I reckon it would be better to find out what the problem with your library is.
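A quick way to check both is the self-test that ships with Hadoop 2.x (the path below is the standard native library location; adjust if yours differs):
$ ls $HADOOP_HOME/lib/native/libhadoop.so*
$ hadoop checknative -a    # prints true/false for each native library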
Make sure you read and understand the articles linked before asking further questions.
