Unable to set up Hadoop single node on Windows 7

I am trying to set up Hadoop to run on a single node on Windows, as per this guide.
When I get to this part:
%HADOOP_PREFIX%\bin\hdfs namenode -format
I get this error:
C:\deploy\etc\hadoop>%HADOOP_PREFIX%\bin\hdfs namenode -format
15/07/29 14:51:11 ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable C:\deploy\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:355)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:370)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:363)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1472)
What is winutils.exe? I don't see it anywhere in the Hadoop tar file :(
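For context: winutils.exe is a native Windows helper that Hadoop shells out to for filesystem and permission operations. The Apache release tarball does not ship it; it is produced by a native Windows build of Hadoop (or obtained prebuilt for your exact version) and must sit in the bin folder of the directory Hadoop treats as its home, C:\deploy here, going by the log. A minimal sketch once you have a matching winutils.exe:
rem assumes the C:\deploy layout from the log; winutils.exe must match your Hadoop version
copy winutils.exe C:\deploy\bin\
%HADOOP_PREFIX%\bin\hdfs namenode -format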

Related

How to install Hadoop?

I'm trying to get Hadoop running in 'Local Mode' on my Windows 10 machine.
When I run the command:
bin/hadoop namenode -format
I get the following error message, and the hadoop folder that contains hdfs isn't being created. Why do I get this error?
C:\hadoop-2.3.0\bin>hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
18/04/12 01:52:41 FATAL namenode.NameNode: Exception in namenode join
java.lang.ExceptionInInitializerError
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1324)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3107)
at java.base/java.lang.String.substring(String.java:1873)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:49)
... 2 more
Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.util.StringUtils
at org.apache.hadoop.util.ExitUtil.terminate(ExitUtil.java:170)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1331)
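Going by the java.base/ prefixes in the trace, the JVM here is Java 9 or newer, and Hadoop 2.3's Shell class parses the runtime version with something like System.getProperty("java.version").substring(0, 3); with a short modern version string such as "10", that substring call throws exactly this StringIndexOutOfBoundsException. A sketch of the workaround, pointing the shell at a Java 8 install (the JDK path below is a placeholder):
rem placeholder path; use your actual JDK 8 install location
set "JAVA_HOME=C:\Program Files\Java\jdk1.8.0_161"
set PATH=%JAVA_HOME%\bin;%PATH%
bin\hadoop namenode -format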

java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO fails to start DFS

I have installed and configured Hadoop 2.6.0 on Windows.
I couldn't successfully run the "sbin\start-dfs" command.
I am getting the error below:
16/12/20 13:03:56 FATAL namenode.NameNode: Failed to start namenode.
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:557)
at org.apache.hadoop.fs.FileUtil.canWrite(FileUtil.java:996)
at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.analyzeStorage(Storage.java:490)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverStorageDirs(FSImage.java:308)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:202)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.loadFSImage(FSNamesystem.java:1020)
There was a similar question about running YARN, where the suggested fix was to include hadoop-2.6.0/sbin and hadoop-2.6.0/bin in the PATH, but I am still facing the error.
Can anyone help me fix this?
Please make sure you have the proper access permissions on the namenode directory. Also, format the namenode and then start the HDFS services.
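If permissions are not the problem, this particular UnsatisfiedLinkError usually means hadoop.dll, the native Windows library, is missing from the bin directory or was built against a different Hadoop version, so the JVM cannot bind NativeIO$Windows.access0. A quick check, assuming an install at C:\hadoop-2.6.0:
rem both files must exist and match the 2.6.0 release
dir C:\hadoop-2.6.0\bin\hadoop.dll
dir C:\hadoop-2.6.0\bin\winutils.exe
set HADOOP_HOME=C:\hadoop-2.6.0
set PATH=%HADOOP_HOME%\bin;%PATH%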

Unable to write file on HDFS

I have installed hadoop-2.6.0 and checked that all the Hadoop daemons are running. I am able to create or copy a directory in HDFS, but not able to copy a file to HDFS.
Command:
bin/hadoop fs -copyFromLocal /home/130853/Hadoop_Data/abc /trial/abc
It gives the following exception:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(II[BI[BIILjava/lang/String;JZ)V
at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSumsByteArray(Native Method)
at org.apache.hadoop.util.NativeCrc32.calculateChunkedSumsByteArray(NativeCrc32.java:86)
at org.apache.hadoop.util.DataChecksum.calculateChunkedSums(DataChecksum.java:430)
at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:202)
at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:163)
at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:144)
at org.apache.hadoop.hdfs.DFSOutputStream.close(DFSOutputStream.java:2217)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:72)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:106)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:54)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:466)
at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:391)
at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:328)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:306)
at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:278)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:243)
at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:260)
at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:244)
at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:220)
at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
Can someone please help with this?
What is your Linux system, 64-bit or 32-bit?
If it's 64-bit, I suggest you recompile the Hadoop source code so that it supports the native libraries.
If it's 32-bit, I suggest you switch to a 64-bit system.
To check, try executing this command:
hadoop checknative -a
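For reference, recompiling the native libraries is done from the Hadoop source tree with the native Maven profile enabled (a sketch, assuming the prerequisites from the source tree's BUILDING.txt, such as Maven, a C toolchain, and protobuf, are installed):
mvn package -Pdist,native -DskipTests -Dtar
Copy the resulting lib/native directory over the one in your installation, then re-run hadoop checknative -a to confirm.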

Hadoop: error while formatting namenode - 'Could not find or load main class namenodes'

After I installed everything needed to run Hadoop (a Unix environment on 64-bit Windows 7), I got this error:
roeygol@roeygol-PC /etc/hadoop-2.5.1/bin
$ ./hdfs namenodes -format
Error: Could not find or load main class namenodes
I defined the needed nodes and all the other configuration as requested. How can I solve this issue?
It's "namenode", not "namenodes":
hdfs namenode -format
The hdfs script treats an unrecognized subcommand as a Java class name, which is why the typo surfaces as "Could not find or load main class namenodes".

Hadoop\HDFS: "no such file or directory"

I have installed Hadoop 2.2 on a single machine using this tutorial: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
Some details were changed a little bit - for example, I used Java 8 and /hadoop as the root dir. Users, SSH, and config keys are the same.
Namenode was successfully formatted:
13/12/22 05:42:31 INFO common.Storage: Storage directory /hadoop/tmp/dfs/name has been successfully formatted.
13/12/22 05:42:31 INFO namenode.FSImage: Saving image file /hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
13/12/22 05:42:32 INFO namenode.FSImage: Image file /hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 198 bytes saved in 0 seconds.
13/12/22 05:42:32 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
13/12/22 05:42:32 INFO util.ExitUtil: Exiting with status 0
13/12/22 05:42:32 INFO namenode.NameNode: SHUTDOWN_MSG:
However, neither the 'mkdir' nor even the 'ls' command worked:
$ /hadoop/hadoop/bin/hadoop fs -ls
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/12/22 05:39:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Thanks for any help guys.
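The `.': No such file or directory part is the clue: with no path argument, hadoop fs -ls lists your HDFS home directory, /user/<username>, and formatting the namenode does not create it. A sketch of creating it first, assuming the hduser account from the tutorial:
hdfs dfs -mkdir -p /user/hduser
hdfs dfs -ls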
Try
hadoop fs -ls /
Tested on hadoop 2.4
In Hadoop 2.4
hdfs dfs -mkdir /input
hdfs dfs -ls /
Worked in my case:
First, get the Hadoop install path:
echo ${HADOOP_INSTALL}    # in my case the output is /user/local/hadoop
Then create a directory at your Hadoop install path (if you already know your Hadoop install directory, ignore the command above):
hadoop fs -mkdir -p /user/local/hadoop/your_directory
Here your_directory is the directory being created.
Tested on Hadoop 2.4.
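A caveat on the answer above, since the paths happen to coincide: the path passed to hadoop fs -mkdir lives in the HDFS namespace, not on the local disk where Hadoop is installed, so /user/local/hadoop/your_directory is created inside HDFS regardless of where ${HADOOP_INSTALL} points:
hadoop fs -ls /user/local/hadoop   # inspects HDFS
ls ${HADOOP_INSTALL}               # inspects the local filesystem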
I have verified this worked in Hadoop 2.5
hdfs dfs -mkdir /input
(where /input is the HDFS directory)
