Hive on Windows 10 using Cygwin: path error - hadoop

I got this error when installing Hive on Windows 10 using Cygwin.
I had run this command:
C:\WINDOWS\system32>mklink /J D:\cygdrive\d\ d:\
But the path comes out as shown below. I think Cygwin doesn't understand cygdrive\d\hadoop\hive3.1.2\... with backslashes; it must be /cygdrive/d/hadoop/hive3.1.2/....
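For reference, Cygwin ships a cygpath tool that converts between the two path styles, which makes the mismatch easy to see (the Hive directory below is only an example):
$ cygpath -u 'D:\hadoop\hive3.1.2'
/cygdrive/d/hadoop/hive3.1.2
$ cygpath -w /cygdrive/d/hadoop/hive3.1.2
D:\hadoop\hive3.1.2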
Error
org.apache.hadoop.hive.metastore.HiveMetaException: File /cygdrive/d/hadoop/hive3.1.2\scripts\metastore\upgrade\mssql\upgrade.order.mssql not found
Underlying cause: java.io.FileNotFoundException: \cygdrive\d\hadoop\hive3.1.2\scripts\metastore\upgrade\mssql\upgrade.order.mssql (The system cannot find the path specified)
How can I fix this?
Update: Solved.
I fixed it by using WSL instead of Cygwin. In WSL, install OpenJDK 8, set JAVA_HOME to /usr/lib/jvm/... (the OpenJDK 8 directory), then rerun the Hive command. Done.
It seems Cygwin calls the Windows version of Java; it should be using the Linux version.
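A minimal sketch of that WSL setup, assuming an Ubuntu-based WSL distribution (the exact JVM directory varies; check it with update-alternatives --list java):
$ sudo apt-get update && sudo apt-get install -y openjdk-8-jdk
$ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # example path on Ubuntu
$ export PATH="$JAVA_HOME/bin:$PATH"
$ which java && java -version   # should report the Linux OpenJDK, not the Windows Java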

Related

Error for Kafka install on Windows 7 (without Cygwin)

I am trying to install kafka_2.10-0.10.1.0 on Windows 7 (without Cygwin), but I am getting the error below when I try to start the server:
.\bin\windows\zookeeper-server-start.bat .\config\server.properties
The syntax of the command is incorrect.
The syntax of the command is incorrect.
I have followed the steps below:
Downloaded the kafka_2.10-0.10.1.0 folder.
Changed the log.dirs path in server.properties.
Set the JAVA_HOME path in the system environment.
Please guide me through the install.
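In case it helps, "The syntax of the command is incorrect." from the Kafka batch scripts on Windows is commonly caused by paths containing spaces (the Kafka folder or JAVA_HOME) or by a log.dirs value written with backslashes. A hedged sketch of the usual settings, with example paths only:
# config/server.properties - use forward slashes (or doubled backslashes)
log.dirs=C:/kafka_2.10-0.10.1.0/kafka-logs
:: Command Prompt - a JAVA_HOME without spaces, set before running the .bat scripts
set JAVA_HOME=C:\Java\jdk1.8.0_91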

Apache Spark with Hadoop distribution failing to run on Windows

I tried running the spark-1.5.1-bin-hadoop2.6 distribution (and newer versions of Spark, with the same results) on Windows using Cygwin.
When trying to execute the spark-shell script in the bin folder, I get the output below:
Error: Could not find or load main class org.apache.spark.launcher.Main
I tried to set CLASSPATH to the location of lib/spark-assembly-1.5.1-hadoop2.6.0.jar but to no avail.
(FYI: I am able to run the same distribution fine on my Mac with no extra setup steps required.)
Please assist in finding a resolution for Cygwin execution on Windows.
I ran into and solved a similar problem with Cygwin on Windows 10 and spark-1.6.0.
build with Maven (maybe you're past this step)
mvn -DskipTests package
make sure JAVA_HOME is set to a JDK
$ export JAVA_HOME="C:\Program Files\Java\jdk1.8.0_60"
$ ls "$JAVA_HOME"
bin include LICENSE THIRDPARTYLICENSEREADME.txt ....
use the Windows batch file. Launch from PowerShell or Command Prompt if you have terminal problems with Cygwin.
$ chmod a+x bin/spark-shell.cmd
$ ./bin/spark-shell.cmd
My solution to the problem was to move the Spark installation into a path that didn't have spaces in it. Under Program Files I got the above error, but moving it directly under C:\ and running the spark-shell.bat file cleared it up.
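A minimal sketch of that workaround from a Command Prompt, assuming the distribution was unpacked under Program Files (directory names are examples only):
> move "C:\Program Files\spark-1.5.1-bin-hadoop2.6" C:\spark-1.5.1-bin-hadoop2.6
> cd C:\spark-1.5.1-bin-hadoop2.6
> bin\spark-shell.cmd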

JSP, OSX 10.9.4 & ImageMagick - "/opt/local/bin/identify": error=2, No such file or directory

I am currently working on a large legacy JSP project - not that I know anything about JSP. I have tried to set up the app in IntelliJ and I have set numerous keys in my .MacOSX/environment.plist file. However, when I try to upload an image (ImageMagick is used) in my local environment, I get the following error in the debug terminal (the error is all on one line):
.ImageUploadException: command /opt/local/bin/identify -format %w;%h; - failed: java.util.concurrent.ExecutionException: java.io.IOException: Cannot run program "/opt/local/bin/identify": error=2, No such file or directory
To make the error easier to read:
.ImageUploadException: command /opt/local/bin/identify -format %w;%h;
- failed: java.util.concurrent.ExecutionException: java.io.IOException:
Cannot run program "/opt/local/bin/identify": error=2, No such file or directory
After some research I created a launchd.conf file in the /etc folder and added the following line, as advised on Stack Overflow and some other places:
setenv PATH /usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin
I rebooted and I still get the same error. If I run echo $PATH in my terminal I get the following:
/opt/local/bin:/opt/local/sbin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/local/git/bin
Does anyone know what I am doing wrong? I am using a MacBook Pro running OS X 10.9.4, and MacPorts is installed: when I run $ which port I get /opt/local/bin/port.
Thanks in advance
What I did was reinstall the Xcode tools and reinstall ImageMagick after the amendments above... it all seems to work now.
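If anyone else hits this, a hedged sketch of that sequence with MacPorts (error=2 simply means the identify binary genuinely isn't at that path, so the goal is to get it back):
$ xcode-select --install            # reinstall the command line tools
$ sudo port selfupdate
$ sudo port install ImageMagick
$ ls -l /opt/local/bin/identify     # confirm the binary the app calls now exists
$ /opt/local/bin/identify -version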

Apache Pig in Cygwin

Are there any sources available for running Apache Pig in Cygwin? With the latest Hadoop version I was able to set up a Hadoop cluster on a Windows machine successfully, but I can't make Pig run in a Cygwin terminal. The following error is returned while attempting to invoke the Pig grunt shell.
$ pig -x local
cygwin warning:
MS-DOS style path detected: c:\pig/conf/pig.properties
Preferred POSIX equivalent is: /cygdrive/c/pig/conf/pig.properties
CYGWIN environment variable option "nodosfilewarning" turns off this warning.
Consult the user's guide for more details about POSIX paths:
http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
cygpath: cannot create short name of C:\pig\logs
Cannot locate pig-withouthadoop.jar. do 'ant jar-withouthadoop', and try again.
Any help would be appreciated.
Thanks
To resolve the above error, I rebuilt Pig for hadoop-2.2.0 as described in the link below and was able to get rid of the exception.
http://javatute.com/javatute/faces/post/hadoop/2014/installing-pig-11-for-hadoop-2-on-ubuntu-12-lts.xhtml
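For reference, the error itself asks for pig-withouthadoop.jar, and the rebuild described in that link amounts roughly to running Pig's Ant build against Hadoop 2.x from the Pig directory (the hadoopversion flag value is my assumption for Pig 0.11/0.12-era builds; the directory is an example):
$ cd /cygdrive/c/pig
$ ant clean jar-withouthadoop -Dhadoopversion=23
$ bin/pig -x local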

bin/hadoop: line 133: C:Java/jdk1.7.0_45/bin/java: No such file or directory

Can someone help with this? I am trying to get Hadoop version 2.2.0 running and got this error message:
$ bin/hadoop version
bin/hadoop: line 133: C:Java/jdk1.7.0_45/bin/java: No such file or directory
bin/hadoop: line 133: exec: C:Java/jdk1.7.0_45/bin/java: cannot execute: No such file or directory
I am trying to install a single-instance Hadoop on Windows 7 / 64-bit.
I installed Cygwin64, and Hadoop is under "c/+1/Hadoop/hadoop-2.2.0".
JAVA_HOME is:
$ echo $JAVA_HOME
c:Java/jdk1.7.0_45
Any idea will be more than welcome, so feel free to fire away!
"C:Java/jdk1.7.0_45/bin/java" is neither a valid Windows path nor a valid cygwin path. So your JAVA_HOME is set incorrectly. Set it to the directory where you installed the JDK. Maybe you mean "/cydrive/c/Java/jdk1.7.0_45/bin/java". Using "where java" or "which java" might help a bit.
(opinion follows...)
In my experience, trying to set up Hadoop on Windows using Cygwin is a tough battle and usually not worth the effort. When I have to develop on Windows machines I usually set up a virtual machine running Linux, and everything tends to go much smoother.
