Running Hadoop On Windows 7

I'm having trouble running Hadoop 1.0.3 on Windows 7 64-bit.
I'm following the directions from this link. I downloaded Cygwin and successfully started the SSH daemon. I unpacked Hadoop into the Cygwin /usr/local folder using WinZip 16.5, and I edited hadoop-env.sh in Metapad to point to my JDK:
export JAVA_HOME="C:\\Program Files\\Java\\jdk1.7.0_02"
I ran dos2unix to make sure that I didn't have any issues with characters.
But when I run the hadoop command in the Cygwin terminal to check the version, I see this:
$ bin/hadoop version
bin/hadoop: line 2: $'\r': command not found
bin/hadoop: line 17: $'\r': command not found
bin/hadoop: line 18: $'\r': command not found
bin/hadoop: line 49: $'\r': command not found
: No such file or directoryn
bin/hadoop: line 52: $'\r': command not found
bin/hadoop: line 60: syntax error near unexpected token `$'in\r''
'in/hadoop: line 60: `case "`uname`" in
Michael@Michael-PC /usr/local/hadoop
$
Has anyone seen this? Is there an easy correction that I missed?

It looks like the carriage-return difference between Unix and Windows line endings is causing the problem.
Try running dos2unix on the shell script. Go to the Hadoop bin directory and run:
dos2unix.exe hadoop
(The launcher script is named hadoop, with no extension.) Then try the hadoop command again.
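A minimal sketch (not from the original answer), assuming Hadoop was unpacked to /usr/local/hadoop as in the question: since bin/hadoop also sources conf/hadoop-env.sh and the other bin scripts, convert all of them at once and then retry:
cd /usr/local/hadoop
dos2unix bin/hadoop bin/*.sh conf/*.sh
bin/hadoop version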

Related

How to download a Shell script from Github with Unix formatting?

I am trying to share a shell script from my GitHub repository. It is for a workshop, and I thought the easiest way would be to have people download it with:
wget https://raw.githubusercontent.com/user/repo/master/script.sh
But I am getting this when I try to run it:
script.sh: line 2: $'\r': command not found
script.sh: line 4: $'\r': command not found
script.sh: line 6: $'\r': command not found
script.sh: line 7: syntax error near unexpected token `$'\r''
'cript.sh: line 7: `helpFunction()
However, it works when I convert it to Unix with dos2unix using:
dos2unix script.sh
How can I avoid having to use this extra step? I am also open to other suggestions about sharing my code in a simple way.
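One way to avoid the extra step (a suggestion, not from this thread) is to make sure the script is stored and checked out with LF endings in the first place, for example with a .gitattributes entry at the repository root so Git never converts it to CRLF on a Windows machine:
# .gitattributes -- always use LF line endings for shell scripts
*.sh text eol=lf
After committing this (and re-adding the script, e.g. with git add --renormalize . on Git 2.16+), the raw file served by raw.githubusercontent.com should already be LF-terminated.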

Unable to run HBase from Cygwin on Windows 10

I configured HBase using the link below:
https://hbase.apache.org/0.94/cygwin.html
The configuration completed successfully, but I am unable to run HBase, and the errors that are displayed are not meaningful to me.
$ ./start-hbase.sh
./start-hbase.sh: line 20: $'\r': command not found
./start-hbase.sh: line 22: $'\r': command not found
./start-hbase.sh: line 28: $'\r': command not found
./start-hbase.sh: line 30: cd: $'.\r': No such file or directory
./start-hbase.sh: line 31: $'\r': command not found
./start-hbase.sh: line 35: $'\r': command not found
: No such file or directory/usr/local/hbase/bin
./start-hbase.sh: line 37: $'\r': command not found
./start-hbase.sh: line 66: syntax error: unexpected end of file
I cannot figure out where to start debugging.
The error is caused by the file using CRLF line endings instead of the expected LF.
Run d2u start-hbase.sh to convert it; d2u is part of the dos2unix package.
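A minimal sketch, assuming HBase lives under /usr/local/hbase as the error output suggests: convert all of the launcher and environment scripts at once, then retry:
cd /usr/local/hbase
d2u bin/*.sh conf/*.sh
bin/start-hbase.sh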

Executing hadoop namenode -format

I am trying to work with Hadoop, and for that I am using:
Java 1.6
Eclipse Europa 3.3.2
Cygwin
Hadoop 0.19.1
When I run the command, this error occurs:
$ hadoop-*/bin/hadoop namenode -format
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 2: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 7: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 10: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 13: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 16: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 19: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 29: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 32: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 35: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 38: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 41: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 46: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 49: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 52: $'\r': command not found
/home/user/hadoop-0.19.1/bin/../conf/hadoop-env.sh: line 55: $'\r': command not found
cygwin warning:
MS-DOS style path detected: C:\CYGWIN~1\home\user\HADOOP~1.1\/build/native
Preferred POSIX equivalent is: /cygdrive/c/CYGWIN~1/home/user/HADOOP~1.1/build/native
CYGWIN environment variable option "nodosfilewarning" turns off this warning.
Consult the user's guide for more details about POSIX paths:
http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
/bin/java: No such file or directoryC:\Program Files\Java\jdk1.6.0_37
/bin/java: No such file or directoryC:\Program Files\Java\jdk1.6.0_37
/bin/java: cannot execute: No such file or directory Files\Java\jdk1.6.0_37
Can anyone help me fix this error?
Thank you.
It seems like you have a problem with your newline characters, which prevents Hadoop from finding your Java binaries. See this question on Stack Overflow.
Try running dos2unix on your hadoop-env.sh.
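A minimal sketch combining the line-ending fix with a JAVA_HOME that avoids the space in "C:\Program Files" (the PROGRA~1 short name is an assumption and may differ on your machine):
cd /home/user/hadoop-0.19.1
dos2unix conf/hadoop-env.sh bin/hadoop
# then, in conf/hadoop-env.sh, use the 8.3 short name:
export JAVA_HOME="C:/PROGRA~1/Java/jdk1.6.0_37"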

Hadoop 2.2 configuration fails on the first ./hdfs namenode -format

I put the JDK in "E:/cygwin/jdk7", so I updated JAVA_HOME both in the Windows environment variables and in hadoop-env.sh to point to it.
But when I first run ./hdfs namenode -format, I get this:
$ ./hdfs namenode -format
E:/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 18: $'\r': command not found
E:/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 20: $'\r': command not found
E:/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 25: $'\r': command not found
E:/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 32: $'\r': command not found
E:/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 34: $'\r': command not found
E:/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 36: syntax error near unexpected token `$'do\r''':/cygwin/hadoop-2.2.0/etc/hadoop/hadoop-env.sh: line 36: `for f in $HADOOP_HOME/contrib/capacity-scheduler/*.jar; do/bin/java: No such file or directory
/bin/java: cannot execute: No such file or directory
I have changed core-site.xml, hdfs-site.xml, mapred-site.xml, and yarn-site.xml like a hundred times, but it still doesn't work.
I would appreciate any suggestions!
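A minimal sketch of the usual fix (not from the thread), assuming the E:/cygwin/hadoop-2.2.0 layout shown in the error output; in Cygwin the E: drive is mounted as /cygdrive/e:
cd /cygdrive/e/cygwin/hadoop-2.2.0
dos2unix etc/hadoop/*.sh libexec/*.sh bin/hdfs bin/hadoop
bin/hdfs namenode -format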

Not able to set JAVA_HOME for Hadoop using Cygwin

I am trying to set JAVA_HOME in Hadoop's hadoop-env.sh. I am using Cygwin on Windows 7.
I have edited hadoop-env.sh as:
export JAVA_HOME= "/cygdrive/C/Program Files/Java/jdk1.6.0_26"
In the Windows environment variables I have set JAVA_HOME to C:\Program Files\Java\jdk1.6.0_26
and PATH to %JAVA_HOME%\bin;c:\cygwin\bin;c:\cygwin\usr\sbin.
But I am still getting these errors:
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 2: $'\r': command not found
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 7: $'\r': command not found
': not a valid identifierlibexec/../conf/hadoop-env.sh: line 9: export:
`/cygdrive/C/Program Files/Java/jdk1.6.0_26
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 12: $'\r': command not found
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 41: $'\r': command not found
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 46: $'\r': command not found
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 52: $'\r': command not found
/cygdrive/d/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 55: $'\r': command not found
Error: JAVA_HOME is not set.
I have checked everything possible on this website, but I am still facing the problem.
Use export JAVA_HOME="C:/PROGRA~2/Java/jdk1.6.0_03" # for a 32-bit Java install
Use export JAVA_HOME="C:/PROGRA~1/Java/jdk1.6.0_03" # for a 64-bit Java install
This will fix your problem. The answer I am giving is pretty late, but it should help other beginners facing the same problem.
Use C:\"Program Files"\Java\jdk1.6.0_26 instead; that should solve the Java problem.
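Putting this together, a hedged sketch of the relevant line in conf/hadoop-env.sh (no space after the '=', and no literal space in the path; the PROGRA~1 short name is an assumption that may differ per machine), followed by the line-ending fix for the errors above:
# conf/hadoop-env.sh
export JAVA_HOME="C:/PROGRA~1/Java/jdk1.6.0_26"
# then, from Cygwin:
dos2unix /cygdrive/d/hadoop-1.2.1/conf/hadoop-env.sh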
