Hadoop 2 client on Windows for a Linux cluster

We have a Linux Hadoop cluster, but for a variety of reasons we have some Windows clients connecting and pushing data to it.
In Hadoop 1 we had been able to run the Hadoop client via Cygwin.
However, in Hadoop 2, as stated on the website, Cygwin is not required and not supported.
Questions
What exactly has changed? Why would a client (only) not run under Cygwin, or could it? Apart from paths, what other considerations are at play?
Apart from the property below for job submissions, is there anything else that needs to be considered for a Windows client interacting with a Linux cluster?
conf.set("mapreduce.app-submission.cross-platform", "true");
Extracting hadoop-2.6.0-cdh5.5.2 and running it from Cygwin with the right configurations under $HADOOP_HOME/etc yields classpath-formation and class-not-found issues. For instance, the following run:
hdfs dfs -ls
Error: Could not find or load main class org.apache.hadoop.fs.FsShell
Looking at the classpath, it appears to contain Cygwin paths. An attempt to convert them to Windows paths so that the jars can be looked up:
In $HADOOP_HOME/etc/hdfs.sh, locate the dfs command and change it to
elif [ "$COMMAND" = "dfs" ] ; then
if $cygwin; then
CLASSPATH=`cygpath -p -w "$CLASSPATH"`
fi
CLASS=org.apache.hadoop.fs.FsShell
This results in the following:
16/04/07 16:01:05 ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:378)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:393)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:386)
at org.apache.hadoop.util.GenericOptionsParser.preProcessForWindows(GenericOptionsParser.java:438)
at org.apache.hadoop.util.GenericOptionsParser.parseGeneralOptions(GenericOptionsParser.java:484)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:170)
at org.apache.hadoop.util.GenericOptionsParser.<init>(GenericOptionsParser.java:153)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:64)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:362)
16/04/07 16:01:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Warning: fs.defaultFs is not set when running "ls" command.
Found 15 items
-ls: Fatal internal error
java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
at org.apache.hadoop.util.Shell.run(Shell.java:478)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1100)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:582)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getOwner(RawLocalFileSystem.java:565)
at org.apache.hadoop.fs.shell.Ls.adjustColumnWidths(Ls.java:139)
at org.apache.hadoop.fs.shell.Ls.processPaths(Ls.java:110)
at org.apache.hadoop.fs.shell.Command.recursePath(Command.java:373)
at org.apache.hadoop.fs.shell.Ls.processPathArgument(Ls.java:98)
at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:118)
at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:305)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:362)
Given the above, should I go further and try to fix this so that I can reuse my existing client .sh scripts, or should I just convert them to .bat?

The problem is that Cygwin needs to return Windows paths rather than Cygwin paths. Also, winutils.exe needs to be installed on the path as described here.
Simply fix the scripts to return the actual Windows paths and turn off a few commands which don't run under Cygwin:
#!/bin/bash
# fix $HADOOP_HOME/bin/hdfs
sed -i -e "s/bin=/#bin=/g" $HADOOP_HOME/bin/hdfs
sed -i -e "s#DEFAULT_LIBEXEC_DIR=\"\$bin\"/../libexec#DEFAULT_LIBEXEC_DIR=\"\$HADOOP_HOME\\\libexec\"#g" $HADOOP_HOME/bin/hdfs
sed -i "/export CLASSPATH=$CLASSPATH/i CLASSPATH=\`cygpath -p -w \"\$CLASSPATH\"\`" $HADOOP_HOME/bin/hdfs
# fix $HADOOP_HOME/libexec/hdfs-config.sh
sed -i -e "s/bin=/#bin=/g" $HADOOP_HOME/libexec/hdfs-config.sh
sed -i -e "s#DEFAULT_LIBEXEC_DIR=\"\$bin\"/../libexec#DEFAULT_LIBEXEC_DIR=\"\$HADOOP_HOME\\\libexec\"#g" $HADOOP_HOME/libexec/hdfs-config.sh
# fix $HADOOP_HOME/libexec/hadoop-config.sh
sed -i "/HADOOP_DEFAULT_PREFIX=/a HADOOP_PREFIX=" $HADOOP_HOME/libexec/hadoop-config.sh
sed -i "/export HADOOP_PREFIX/i HADOOP_PREFIX=\`cygpath -p -w \"\$HADOOP_PREFIX\"\`" $HADOOP_HOME/libexec/hadoop-config.sh
# fix $HADOOP_HOME/bin/hadoop
sed -i -e "s/bin=/#bin=/g" $HADOOP_HOME/bin/hadoop
sed -i -e "s#DEFAULT_LIBEXEC_DIR=\"\$bin\"/../libexec#DEFAULT_LIBEXEC_DIR=\"\$HADOOP_HOME\\\libexec\"#g" $HADOOP_HOME/bin/hadoop
sed -i "/export CLASSPATH=$CLASSPATH/i CLASSPATH=\`cygpath -p -w \"\$CLASSPATH\"\`" $HADOOP_HOME/bin/hadoop

Related

Suppressing warnings for hadoop fs -get -p command

I am copying a huge number of files using the hadoop fs -get -p command, because I want to retain timestamps and ownership. Many of the files cannot retain their ownership because the user IDs are not available on the local machine, so for those files I get "get: chown: changing ownership /a/b/c.txt: Operation not permitted".
Is it possible to suppress the error? It is quite possible that I will hit other issues as well, and if I use 2>/dev/null it will suppress all of them, so I don't want that option. Is there any way I can suppress ONLY the issues related to privileges?
Any hint would be really helpful.
Not very elegant, but functional: use grep -v your_undesired_pattern
hadoop fs -get -p command 2>&1 | grep -v "changing ownership"
From the Hadoop side, no. The error is printed using System.err.println and comes from the OS as the command execs chown.
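Since the message ultimately comes from the chown that the client execs, any filtering has to happen on the shell side. A variant of the grep approach above that keeps other errors on stderr instead of merging them into stdout (paths are placeholders):
# filter only the ownership messages; everything else stays on stderr
hadoop fs -get -p /source/dir /local/dir 2> >(grep -v "changing ownership" >&2)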

Apache Kylin Unable to find HBase common lib

I have installed Hadoop version 2.6.0, HBase version 0.99.0, Hive version 1.2 and Kylin version 1.5.0.
I have set up all of the above in standalone mode. When Kylin starts, it checks early on for Hadoop, HBase and Hive. Everything has been installed, but when I start Kylin it gives an error that the HBase common lib is not found.
Following is the Apache Kylin log:
KYLIN_HOME is set to bin/../
16/03/24 18:02:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
kylin.security.profile is set to testing
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Downloads/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/kunalgupta/Downloads/spark-1.6.0-bin-hadoop2.6/lib/spark-assembly-1.6.0-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/lib/hive-common-1.2.1.jar!/hive-log4j.properties
cut: illegal option -- -
usage: cut -b list [-n] [file ...]
cut -c list [file ...]
cut -f list [-s] [-d delim] [file ...]
HIVE_CONF is set to: /Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/conf/, use it to locate hive configurations.
HCAT_HOME is set to: /Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/hcatalog, use it to find hcatalog path:
usage: dirname path
find: -printf: unknown primary or operator
hive dependency: /Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/conf/::/Users/kunalgupta/Desktop/kunal/Desktop/Backup/Kunal/Downloads/apache-hive-1.2.1-bin/hcatalog/share/hcatalog/hive-hcatalog-core-1.2.1.jar
cut: illegal option -- -
usage: cut -b list [-n] [file ...]
cut -c list [file ...]
cut -f list [-s] [-d delim] [file ...]
hbase-common lib not found
Please somebody help me out.
The issue you encountered is that the cut command on Mac OS X does not support the "--output-delimiter" option. I encountered the same error while installing kylin-1.5.1.
The easy way to solve it is to use the GNU binaries in your shell instead of the OS X binaries.
Use brew to install coreutils (I changed all the commonly used shell utils to their GNU versions).
Use the command below for that:
brew install coreutils findutils gnu-tar gnu-sed gawk gnutls gnu-indent gnu-getopt --default-names
Now, to make the shell use these instead of the Mac binaries, add the path to these utils in your shell profile file:
vi ~/.profile
Add the following line to this file:
PATH="/usr/local/opt/coreutils/libexec/gnubin:$PATH"
After this, open a new terminal window and run
echo $PATH
The result should have the path we set in the previous step at the front (prepended).
Now start Kylin; it should work smoothly.
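As a quick sanity check that the GNU versions are actually the ones being picked up (the install prefix may differ on your machine):
which cut        # should resolve under .../coreutils/libexec/gnubin/
cut --version    # should report GNU coreutils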
Some reference links that helped me:
Mac forum link
Installation guide from apple.se
vi /etc/profile
export JAVA_HOME=/usr/java/jdk1.8.0_181-cloudera/
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$CLASSPATH
export KYLIN_HOME=/usr/local/kylin
export CDH_HOME=/opt/cloudera/parcels/CDH
export SPARK_HOME=${CDH_HOME}/lib/spark
export HBASE_HOME=${CDH_HOME}/lib/hbase
export HIVE_HOME=${CDH_HOME}/lib/hive
export HADOOP_HOME=${CDH_HOME}/lib/hadoop
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
source /etc/profile
Having added ${HBASE_HOME} to /etc/profile, check the relevant part of the HBase launcher script:
cat /opt/cloudera/parcels/CDH/lib/hbase/bin/hbase
if [ -n "${shaded_jar}" ] && [ -f "${shaded_jar}" ]; then
CLASSPATH="${CLASSPATH}:${shaded_jar}"
# fall through to grabbing all the lib jars and hope we're in the omnibus tarball
#
# N.B. shell specifically can't rely on the shaded artifacts because RSGroups is only
# available as non-shaded
#
# N.B. pe and ltt can't easily rely on shaded artifacts because they live in hbase-mapreduce:test-jar
# and need some other jars that haven't been relocated. Currently enumerating that list
# is too hard to be worth it.
#
else
for f in $HBASE_HOME/lib/*.jar; do
CLASSPATH=${CLASSPATH}:$f;
done
# make it easier to check for shaded/not later on.
shaded_jar=""
fi
for f in "${HBASE_HOME}"/lib/client-facing-thirdparty/*.jar; do
if [[ ! "${f}" =~ ^.*/htrace-core-3.*\.jar$ ]] && \
[ "${f}" != "htrace-core.jar$" ] && \
[[ ! "${f}" =~ ^.*/slf4j-log4j.*$ ]]; then
CLASSPATH="${CLASSPATH}:${f}"
fi
done
I use CentOS 7, apache-kylin-3.0.0-alpha2-bin-hadoop3 and HBase 2.1.4.
I found the solution in these links:
https://issues.apache.org/jira/browse/KYLIN-3564
http://92072234.wiz03.com/share/s/2i1O8Q1L1k042IDoOy3h7BgH2K4G6J2SoQv42Xc4b01xpCrj
The second link solved my problem.
Edit the hbase file
$HBASE_HOME/bin/hbase
and change
CLASSPATH=${CLASSPATH}:$JAVA_HOME/lib/tools.jar
to
CLASSPATH=${CLASSPATH}:$JAVA_HOME/lib/tools.jar:$HBASE_HOME/lib/*
(or use your full HBase lib path in place of $HBASE_HOME/lib/*)
I solved it as follows:
1. export HBASE_CLASSPATH=/opt/cloudera/parcels/CDH-6.2.0-/hbase-common-2.0.0-cdh6.2.0.jar
2. Then start Kylin again.
You are running on Windows? Sorry, Kylin only runs on Linux as of version 1.5.

Unable to find Namenode class when setting up Hadoop on Windows 8

I'm trying to set up Hadoop 2.4.1 on my machine using Cygwin, and I'm stuck when I try to run
$ hdfs namenode -format
which gives me
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
I think it's due to an undefined environment variable, since I can run
$ hadoop version
without a problem. I've defined the following:
JAVA_HOME
HADOOP_HOME
HADOOP_INSTALL
as well as adding the Hadoop \bin and \sbin (and Cygwin's \bin) to the Path. Am I missing an environment variable that I need to define?
OK, it looks like the file hadoop\bin\hdfs also has to be changed like the hadoop\bin\hadoop file described here.
The end of the file must be changed from:
exec "$JAVA" -Dproc_$COMMAND $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$#"
to
exec "$JAVA" -classpath "$(cygpath -pw "$CLASSPATH")" -Dproc_$COMMAND $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$#"
I assume I'll have to make similar changes to the hadoop\bin\mapred and hadoop\bin\yarn when I get to using those files.
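For reference, the analogous edit in hadoop\bin\yarn would presumably look like this (an untested sketch; the exact tail of the script may differ between releases):
# hadoop\bin\yarn -- wrap the classpath with cygpath in the same way
exec "$JAVA" -classpath "$(cygpath -pw "$CLASSPATH")" -Dproc_$COMMAND $JAVA_HEAP_MAX $YARN_OPTS $CLASS "$@"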

How to make mahout interact with hadoop HDFS

I am using HDP Mahout version 0.8. I have set MAHOUT_LOCAL="". When I run Mahout, I see the message HADOOP LOCAL NOT SET RUNNING ON HADOOP, but my program is not writing its output to the HDFS directory.
Can anyone tell me how to make my Mahout program take input from HDFS and write output to HDFS?
Did you add $MAHOUT_HOME/bin and $HADOOP_HOME/bin to the PATH?
For example on Linux:
export PATH=$PATH:$MAHOUT_HOME/bin/:$HADOOP_HOME/bin/
export HADOOP_CONF_DIR=$HADOOP_HOME/conf/
Then, almost all of Mahout's commands use the options -i (input) and -o (output).
For example:
mahout seqdirectory -i <input_path> -o <output_path> -chunk 64
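If MAHOUT_LOCAL is unset and HADOOP_CONF_DIR points at the cluster configuration, the output path above refers to HDFS; you can confirm the results landed there rather than on the local filesystem (the path is a placeholder):
hadoop fs -ls <output_path>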
Assuming you have built your Mahout jar so that it takes its input from and writes its output to HDFS, do the following from the hadoop bin directory:
./hadoop jar /home/kuntal/Kuntal/BIG_DATA/mahout-recommender.jar mia.recommender.RecommenderIntro --tempDir /home/kuntal/Kuntal/BIG_DATA --recommenderClassName org.apache.mahout.cf.taste.impl.recommender.GenericItemBasedRecommender
# Specify the input/output arguments if required:
-Dmapred.input.dir=./ratingsLess.txt -Dmapred.output.dir=/input/output
Please check this:
http://chimpler.wordpress.com/2013/02/20/playing-with-the-mahout-recommendation-engine-on-a-hadoop-cluster/

Error: Could not find or load main class org.apache.flume.node.Application - Install flume on hadoop version 1.2.1

I have built a Hadoop cluster with one master node and one slave node, and now I want to set up Flume on the master machine to collect all of the cluster's logs. However, when I try to install Flume from the tarball I always get:
Error: Could not find or load main class org.apache.flume.node.Application
Please help me find the answer, or the best way to install Flume on my cluster.
Many thanks!
It is basically because of FLUME_HOME.
Try this command:
$ unset FLUME_HOME
I know it's been almost a year since this question was asked, but I saw it!
When you start your agent using sudo bin/flume-ng..., make sure to specify the file where the agent configuration is:
--conf-file flume_Agent.conf -> -f conf/flume_Agent.conf
This did the trick!
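For context, a full invocation might look like the following (the agent name and config file name are placeholders carried over from the answer above):
sudo bin/flume-ng agent -n agent1 -c conf -f conf/flume_Agent.conf -Dflume.root.logger=INFO,console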
It looks like you are running flume-ng from the /bin folder of the source tree. After the build, Flume lives in
/flume-ng-dist/target/apache-flume-1.5.0.1-bin/apache-flume-1.5.0.1-bin
Run flume-ng from there.
I suppose you are trying to run Flume from Cygwin on Windows? If that is the case, I had a similar issue. The problem might be with the flume-ng script.
Find the following line in bin/flume-ng:
$EXEC java $JAVA_OPTS $FLUME_JAVA_OPTS "${arr_java_props[@]}" -cp "$FLUME_CLASSPATH" \
  -Djava.library.path=$FLUME_JAVA_LIBRARY_PATH "$FLUME_APPLICATION_CLASS" $*
and replace it with this:
$EXEC java $JAVA_OPTS $FLUME_JAVA_OPTS "${arr_java_props[@]}" -cp `cygpath -wp "$FLUME_CLASSPATH"` \
  -Djava.library.path=`cygpath -wp $FLUME_JAVA_LIBRARY_PATH` "$FLUME_APPLICATION_CLASS" $*
Notice that the paths have been converted to Windows directories: Java cannot resolve the library paths from the cygdrive paths, so we have to convert them to the correct Windows paths wherever applicable.
Maybe you are using the source files; you should first compile the source code and generate the binary distribution. Then, inside the binary directory, you can execute: bin/flume-ng agent --conf ./conf/ -f conf/flume.conf -Dflume.root.logger=DEBUG,console -n agent1. You can follow all of this in: https://cwiki.apache.org/confluence/display/FLUME/Getting+Started
I got the same issue before; it is simply due to FLUME_CLASSPATH not being set.
The best way to debug is to look at the java command being fired and make sure that the Flume lib directory is included in the classpath (-cp).
In the following command it is looking for /lib/*, which is where the flume-ng-*.jar files should be, but that is incorrect because there is nothing in /lib; see the fragment -cp '/staging001/Flume/server/conf://lib/*:/lib/*'. It has to be ${FLUME_HOME}/lib.
/usr/lib/jvm/java-1.8.0-ibm-1.8.0.3.20-1jpp.1.el7_2.x86_64/jre/bin/java -Xms100m -Xmx500m $'-Dcom.sun.management.jmxremote\r' \
-Dflume.monitoring.type=http \
-Dflume.monitoring.port=34545 \
-cp '/staging001/Flume/server/conf://lib/*:/lib/*' \
-Djava.library.path= org.apache.flume.node.Application \
-f /staging001/Flume/server/conf/flume.conf -n client
So, if you look at the flume-ng script, there is a FLUME_CLASSPATH setup; if it is absent, it is built from FLUME_HOME:
# prepend $FLUME_HOME/lib jars to the specified classpath (if any)
if [ -n "${FLUME_CLASSPATH}" ] ; then
FLUME_CLASSPATH="${FLUME_HOME}/lib/*:$FLUME_CLASSPATH"
else
FLUME_CLASSPATH="${FLUME_HOME}/lib/*"
fi
So make sure either of those environment variables is set. With FLUME_HOME set (I'm using systemd):
Environment=FLUME_HOME=/staging001/Flume/server/
Here's the working java exec.
/usr/lib/jvm/java-1.8.0-ibm-1.8.0.3.20-1jpp.1.el7_2.x86_64/jre/bin/java -Xms100m -Xmx500m \
$'-Dcom.sun.management.jmxremote\r' \
-Dflume.monitoring.type=http \
-Dflume.monitoring.port=34545 \
-cp '/staging001/Flume/server/conf:/staging001/Flume/server/lib/*:/lib/*' \
-Djava.library.path= org.apache.flume.node.Application \
-f /staging001/Flume/server/conf/flume.conf -n client

Resources