could not load snappy native libraries for HBase - hadoop

I have been trying different approaches and reading various blogs, but I still cannot get the Snappy library check to report true.
OS in use: CentOS 6.9
Java Version & Path
java -version
java version "1.8.0_121"
Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)
[root@hadoop1 bin]# $JAVA_HOME
-bash: /usr/local/jdk1.8.0_121: is a directory
Output of hadoop checknative -a:
17/10/26 11:16:13 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
17/10/26 11:16:13 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/hadoop-2.7.1/lib/native/libhadoop.so
zlib: true /lib64/libz.so.1
snappy: false
lz4: true revision:99
bzip2: false
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
17/10/26 11:16:13 INFO util.ExitUtil: Exiting with status 1
Output of hbase org.apache.hadoop.util.NativeLibraryChecker:
2017-10-26 10:46:07,878 WARN [main] bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
2017-10-26 10:46:07,881 INFO [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/hadoop-2.7.1/lib/native/libhadoop.so
zlib: true /lib64/libz.so.1
snappy: false
lz4: true revision:99
bzip2: false
A few relevant lines from hbase-env.sh:
export JAVA_HOME="/usr/local/jdk1.8.0_121"
export HBASE_LIBRARY_PATH=/usr/local/hadoop-2.7.1/lib/native/Linux-amd64-64:/usr/local/hadoop-2.7.1/lib/native
(this line is commented out for now; I have tried uncommenting it too)
export LD_LIBRARY_PATH=/usr/local/hbase-1.2.6/lib/native/Linux-amd64-64
export JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:/usr/local/hadoop-2.7.1/lib/native
All the required *.so files are present in the expected paths.
I also checked the output of ps -ef | grep hbase to verify which library paths the HBase process picks up.
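For reference, the usual way to satisfy the Snappy check on CentOS is to install the system Snappy package and make libsnappy.so visible next to libhadoop.so. A minimal sketch, assuming the stock yum package names/locations and the Hadoop path from above (verify the paths on your own box):

yum install -y snappy snappy-devel            # system Snappy runtime + headers (assumed package names)
ln -sf /usr/lib64/libsnappy.so /usr/local/hadoop-2.7.1/lib/native/libsnappy.so   # expose it where libhadoop looks
hadoop checknative -a                         # "snappy: true" means the loader now finds it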

Related

Unable to run NiFi on Windows Server 2016

I am not able to run NiFi on Windows Server 2016, even though I changed to Java 8.
java version "1.8.0_202"
Java(TM) SE Runtime Environment (build 1.8.0_202-b08)
Java HotSpot(TM) 64-Bit Server VM (build 25.202-b08, mixed mode)
C:\Users\HelloWorld\Desktop\nifi-1.16.1-bin\nifi-1.16.1\bin>run-nifi.bat
2022-05-11 23:20:58,804 INFO [main] org.apache.nifi.bootstrap.Command Starting Apache NiFi...
2022-05-11 23:20:58,804 INFO [main] org.apache.nifi.bootstrap.Command Working Directory: C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1
2022-05-11 23:20:58,804 INFO [main] org.apache.nifi.bootstrap.Command Command: C:\Program Files\Java\jdk1.8.0_202\bin\java.exe -classpath C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\conf;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\logback-classic-1.2.11.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\logback-core-1.2.11.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-api-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-nar-utils-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-properties-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-property-utils-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-runtime-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-stateless-api-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\nifi-stateless-bootstrap-1.16.1.jar;C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\lib\slf4j-api-1.7.36.jar -Dorg.apache.jasper.compiler.disablejsr199=true -Xmx512m -Xms512m -Dcurator-log-only-first-connection-issue-as-error-level=true -Djavax.security.auth.useSubjectCredsOnly=true -Djava.security.egd=file:/dev/urandom -Dzookeeper.admin.enableServer=false -Dsun.net.http.allowRestrictedHeaders=true -Djava.net.preferIPv4Stack=true -Djava.awt.headless=true -Djava.protocol.handler.pkgs=sun.net.www.protocol -Dnifi.properties.file.path=C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\.\conf\nifi.properties -Dnifi.bootstrap.listen.port=54233 -Dapp=NiFi -Dorg.apache.nifi.bootstrap.config.log.dir=C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\bin\..\\logs org.apache.nifi.NiFi
2022-05-11 23:20:59,078 WARN [main] org.apache.nifi.bootstrap.Command Failed to set permissions so that only the owner can read pid file C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\bin\..\run\nifi.pid; this may allows others to have access to the key needed to communicate with NiFi. Permissions should be changed so that only the owner can read this file
2022-05-11 23:20:59,078 WARN [main] org.apache.nifi.bootstrap.Command Failed to set permissions so that only the owner can read status file C:\Users\HelloWorld\Desktop\NIFI-1~1.1-B\NIFI-1~1.1\bin\..\run\nifi.status; this may allows others to have access to the key needed to communicate with NiFi. Permissions should be changed so that only the owner can read this file
2022-05-11 23:20:59,078 INFO [main] org.apache.nifi.bootstrap.Command Launched Apache NiFi with Process ID 964
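The bootstrap output above only shows that the JVM was launched; when NiFi exits right after this, the actual failure is normally recorded in logs\nifi-app.log rather than on the console. A first diagnostic step (paths taken from the directories shown in the log above):

rem Inspect the application log for the real startup error
type C:\Users\HelloWorld\Desktop\nifi-1.16.1-bin\nifi-1.16.1\logs\nifi-app.log
rem Confirm which JVM the bootstrap resolved (should match the jdk1.8.0_202 path in the Command line above)
echo %JAVA_HOME%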

Logstash - "Logstash stopped processing because of an error: (SystemExit) exit" while trying to install netflow

I am trying to install NetFlow, following its documentation.
My logstash.yml settings are as follows:
modules:
  - name: netflow
    var.input.udp.port: 9996
I ran this command:
/usr/share/logstash/bin/logstash --modules netflow -M netflow.var.input.udp.port=9996
and got the following error:
JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[INFO ] 2022-02-15 23:44:29.148 [main] runner - Starting Logstash {"logstash.version"=>"7.17.0", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.13+8 on 11.0.13+8 +indy +jit [linux-x86_64]"}
[INFO ] 2022-02-15 23:44:29.163 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
Your settings are invalid. Reason: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.
[FATAL] 2022-02-15 23:44:29.208 [main] Logstash - Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.20.1.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.20.1.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:94) ~[?:?]
Is there a solution for it?
The error message states the following:
Your settings are invalid. Reason: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.
So you simply need to make the /usr/share/logstash/data folder writable by the logstash user.
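For example, assuming Logstash runs under the logstash system user (substitute whichever account actually runs the process):

# Hand the data directory to the user that runs Logstash
sudo chown -R logstash:logstash /usr/share/logstash/data

# Or, when running the CLI as your own user, point Logstash at a writable data path instead
/usr/share/logstash/bin/logstash --path.data /tmp/logstash-data --modules netflow -M netflow.var.input.udp.port=9996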

OSX: pyarrow.lib.ArrowIOError: Unable to load libhdfs

import pyarrow as pa
client = pa.hdfs.connect('localhost', 9000)
ERROR
Traceback (most recent call last):
File "/Users/wyx/project/py3.7aio/hdfs/list_dir.py", line 13, in <module>
client = pa.hdfs.connect('localhost', 9000)
File "/Users/wyx/project/py3.7aio/.env/lib/python3.6/site-packages/pyarrow/hdfs.py", line 207, in connect
extra_conf=extra_conf)
File "/Users/wyx/project/py3.7aio/.env/lib/python3.6/site-packages/pyarrow/hdfs.py", line 38, in __init__
self._connect(host, port, user, kerb_ticket, driver, extra_conf)
File "pyarrow/io-hdfs.pxi", line 89, in pyarrow.lib.HadoopFileSystem._connect
File "pyarrow/error.pxi", line 83, in pyarrow.lib.check_status
pyarrow.lib.ArrowIOError: Unable to load libhdfs
I installed Hadoop with brew, but it did not ship any native libs, so I built Hadoop 3.1.1 following the Native Libraries Guide. However, I cannot get the libhdfs.so that pyarrow needs; I only get libhdfs.dylib.
➜ native git:(branch-3.1.1) ✗ hadoop checknative -a
2019-02-24 22:05:31,686 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
2019-02-24 22:05:31,689 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2019-02-24 22:05:31,695 WARN erasurecode.ErasureCodeNative: ISA-L support is not available in your platform... using builtin-java codec where applicable
Native library checking:
hadoop: true /usr/local/Cellar/hadoop/3.1.1/libexec/lib/native/libhadoop.dylib
zlib: true /usr/lib/libz.1.dylib
zstd : false
snappy: true /usr/local/lib/libsnappy.1.dylib
lz4: true revision:10301
bzip2: false
openssl: false build does not support openssl.
ISA-L: false libhadoop was built without ISA-L support
2019-02-24 22:05:31,723 INFO util.ExitUtil: Exiting with status 1: ExitException
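One thing worth checking, not stated in the question: pyarrow locates libhdfs through the ARROW_LIBHDFS_DIR environment variable, and on macOS it can load the .dylib directly, so the missing libhdfs.so may not matter. It also needs the Hadoop jars on the CLASSPATH. A sketch, with the library directory assumed from the checknative output above (point it wherever your libhdfs.dylib actually lives):

export ARROW_LIBHDFS_DIR=/usr/local/Cellar/hadoop/3.1.1/libexec/lib/native   # directory containing libhdfs.dylib (example path)
export CLASSPATH=$(hadoop classpath --glob)                                  # jars that libhdfs needs at runtime
python -c "import pyarrow as pa; pa.hdfs.connect('localhost', 9000)"         # retry the connection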

Is NativeLoader supported on Windows?

I have built Hadoop 2.7.3 from source, and everything succeeded. I am using a prebuilt Spark 2.0 binary with Hadoop 2.7 support. When I start spark-shell, I get this warning:
16/09/23 14:53:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hadoop checknative -a gives me
16/09/23 14:59:47 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
16/09/23 14:59:47 WARN zlib.ZlibFactory: Failed to load/initialize native-zlib library
Native library checking:
hadoop: true D:\hadoop-2.7.3\bin\hadoop.dll
zlib: false
snappy: false
lz4: true revision:99
bzip2: false
openssl: false build does not support openssl.
winutils: true D:\hadoop-2.7.3\bin\winutils.exe
16/09/23 14:59:47 INFO util.ExitUtil: Exiting with status 1
Do I have to build native versions of all the libraries? I checked the Hadoop build instructions and could not find any information about building the other libraries.
Or maybe there is some misconfiguration in my Spark setup, but I could not figure out what. I have these environment variables set for Spark:
set HADOOP_HOME=D:/hadoop-2.7.3
set HADOOP_CONF_DIR=%HADOOP_HOME%/etc/hadoop
set SPARK_HOME=D:/spark-2.0.0-bin-hadoop2.7
set HADOOP_COMMON_LIB_NATIVE_DIR=%HADOOP_HOME%/bin
set SPARK_LOCAL_IP=
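One configuration worth double-checking (an assumption, not something confirmed by the question): on Windows the native code lives in hadoop.dll under %HADOOP_HOME%\bin, and Spark's JVM only finds it if that directory is on PATH or java.library.path, for example:

rem Make hadoop.dll (and winutils.exe) discoverable by any JVM started from this shell
set PATH=%HADOOP_HOME%\bin;%PATH%
rem Or hand the library path to the Spark driver JVM explicitly
spark-shell --driver-java-options "-Djava.library.path=%HADOOP_HOME%\bin"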

Hadoop command `hadoop fs -ls` gives ConnectionRefused error

When I run a hadoop command like hadoop fs -ls, I get the following error/warnings:
16/08/04 11:24:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: Call From master/172.17.100.54 to master:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Am I doing anything wrong with the hadoop path?
The Hadoop Native Libraries Guide says this has something to do with the installation; please check the documentation to resolve it.
Native Hadoop Library
Hadoop has native implementations of certain components for performance reasons and for non-availability of Java implementations. These components are available in a single, dynamically-linked native library called the native hadoop library. On the *nix platforms the library is named libhadoop.so.
Please note the following:
It is mandatory to install both the zlib and gzip development packages on the target platform in order to build the native hadoop library; however, for deployment it is sufficient to install just one package if you wish to use only one codec.
It is necessary to have the correct 32/64 libraries for zlib, depending on the 32/64 bit jvm for the target platform, in order to build and deploy the native hadoop library.
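As a concrete illustration of those build prerequisites (package names assumed for a CentOS/yum system; Debian equivalents would be zlib1g-dev and friends):

yum install -y gcc make zlib-devel                      # zlib/gzip development packages for the native build
yum install -y snappy-devel bzip2-devel openssl-devel   # optional codecs, needed only if you want those checks to pass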
Runtime
The bin/hadoop script ensures that the native hadoop library is on the library path via the system property: -Djava.library.path=<path>
During runtime, check the hadoop log files for your MapReduce tasks.
If everything is all right, then:
DEBUG util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
INFO util.NativeCodeLoader - Loaded the native-hadoop library
If something goes wrong, then:
INFO util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
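In practice, if bin/hadoop is not picking the path up on its own, you can set it yourself. A sketch, assuming the conventional $HADOOP_HOME/lib/native layout:

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"   # passed to the JVM by bin/hadoop
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH                 # helps processes that bypass bin/hadoop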
Check
NativeLibraryChecker is a tool to check whether native libraries are loaded correctly. You can launch NativeLibraryChecker as follows:
$ hadoop checknative -a
14/12/06 01:30:45 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
14/12/06 01:30:45 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /home/ozawa/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib/x86_64-linux-gnu/libz.so.1
snappy: true /usr/lib/libsnappy.so.1
lz4: true revision:99
bzip2: false
Second, the Connection refused error is related to your setup; please double-check it.
Also see the links below as pointers:
Hadoop cluster setup - java.net.ConnectException: Connection refused
Hadoop - java.net.ConnectException: Connection refused
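As a quick sanity check before digging into those links (assuming a typical single-node setup where the NameNode should be listening on master:9000, as in the error message):

jps | grep -i namenode                                        # is the NameNode process running at all?
netstat -tlnp | grep 9000                                     # is anything listening on the port from the error?
grep -A1 fs.defaultFS $HADOOP_HOME/etc/hadoop/core-site.xml   # does the configured address match?
$HADOOP_HOME/sbin/start-dfs.sh                                # restart HDFS if the NameNode is down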
