Hadoop 2.6.0 build fails on Windows, but Hadoop 2.5.0 built successfully. I am building the source from the Visual Studio 2010 command prompt.
Build Failure
[INFO] Apache Hadoop Common ............................... SUCCESS [03:18 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 15.649 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 29.325 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.031 s]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 31.917 s]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
Error
[exec] (Link target) ->
[exec] jni_helper.obj : error LNK2019: unresolved external symbol __imp_JNI_CreateJavaVM referenced in function getGlobalJNIEnv [K:\Hadoop-2.6.0\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs.vcxproj]
[exec] jni_helper.obj : error LNK2019: unresolved external symbol __imp_JNI_GetCreatedJavaVMs referenced in function getGlobalJNIEnv [K:\Hadoop-2.6.0\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\native\hdfs.vcxproj]
Exception
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run
(make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="K:\Hadoop-2.6.0\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="msbuild" failonerror="true">... # 8:140 in K:\Hadoop-2.6.0\hadoop-2.6.0-src\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
[ERROR] -> [Help 1]
I suggest you build Hadoop in a Linux environment and then copy the built files to Windows.
I have built Hadoop 2.2; here is the download link for the 64-bit build.
Finally I found a solution to the problem; my answer may help someone.
I had installed 64-bit Windows 8 with a 32-bit Java JDK. That mismatch is the problem when compiling the Hadoop HDFS native source.
I installed the 64-bit Java JDK and the problem was resolved.
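Before starting a native build, you can confirm the JDK's word size directly from Java. This is a minimal sketch (the class name BitnessCheck is mine); it relies on the HotSpot-specific sun.arch.data.model property alongside the standard os.arch property:

```java
// BitnessCheck.java - print the JVM's word size before building native code.
// sun.arch.data.model is HotSpot-specific and reports "32" or "64";
// os.arch is a standard property (e.g. "amd64" or "x86").
public class BitnessCheck {
    public static void main(String[] args) {
        String model = System.getProperty("sun.arch.data.model");
        String arch = System.getProperty("os.arch");
        System.out.println("JVM data model: " + model + "-bit (os.arch=" + arch + ")");
    }
}
```

A 32-bit JVM on a 64-bit toolchain is exactly the kind of mismatch that produces unresolved __imp_JNI_CreateJavaVM link errors, since the linker cannot resolve x64 symbols against a 32-bit jvm.lib.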
Related
I am trying to build the Spark 2.2.0-rc2 release using mvn, but I am unable to do so.
$ uname -a
Linux knoldus-Vostro-15-3568 4.4.0-46-generic #67-Ubuntu SMP Thu Oct 20 15:05:12 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
$ java -version
openjdk version "1.8.0_131"
Below is the error stack that I am getting:
$ ./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install
...
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-tags_2.11 ---
[INFO] Using zinc server for incremental compilation
java.lang.NoClassDefFoundError: Could not initialize class sun.util.calendar.ZoneInfoFile
at sun.util.calendar.ZoneInfo.getTimeZone(ZoneInfo.java:589)
at java.util.TimeZone.getTimeZone(TimeZone.java:560)
at java.util.TimeZone.setDefaultZone(TimeZone.java:666)
at java.util.TimeZone.getDefaultRef(TimeZone.java:636)
at java.util.Date.<init>(Date.java:254)
at java.util.zip.ZipUtils.dosToJavaTime(ZipUtils.java:71)
at java.util.zip.ZipUtils.extendedDosToJavaTime(ZipUtils.java:88)
at java.util.zip.ZipEntry.getTime(ZipEntry.java:194)
at sbt.IO$.next$1(IO.scala:278)
at sbt.IO$.sbt$IO$$extract(IO.scala:286)
at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255)
at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255)
at sbt.Using.apply(Using.scala:24)
at sbt.IO$.unzipStream(IO.scala:255)
at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249)
at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249)
at sbt.Using.apply(Using.scala:24)
at sbt.IO$.unzip(IO.scala:249)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138)
at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:140)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139)
at sbt.IO$.withTemporaryDirectory(IO.scala:344)
at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139)
at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:58)
at com.typesafe.zinc.Compiler$.compilerInterface(Compiler.scala:148)
at com.typesafe.zinc.Compiler$.create(Compiler.scala:53)
at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40)
at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40)
at com.typesafe.zinc.Cache.get(Cache.scala:41)
at com.typesafe.zinc.Compiler$.apply(Compiler.scala:40)
at com.typesafe.zinc.Main$.run(Main.scala:96)
at com.typesafe.zinc.Nailgun$.zinc(Nailgun.scala:93)
at com.typesafe.zinc.Nailgun$.nailMain(Nailgun.scala:82)
at com.typesafe.zinc.Nailgun.nailMain(Nailgun.scala)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.martiansoftware.nailgun.NGSession.run(NGSession.java:280)
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 5.657 s]
[INFO] Spark Project Tags ................................. FAILURE [ 0.371 s]
[INFO] Spark Project Sketch ............................... SKIPPED
[INFO] Spark Project Networking ........................... SKIPPED
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Unsafe ............................... SKIPPED
[INFO] Spark Project Launcher ............................. SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project ML Local Library ..................... SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Mesos ................................ SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6.855 s
[INFO] Finished at: 2017-05-30T13:47:02+05:30
[INFO] Final Memory: 50M/605M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-tags_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-tags_2.11
I am not able to understand this error.
scala> java.util.TimeZone.getDefault
res0: java.util.TimeZone = sun.util.calendar.ZoneInfo[id="Asia/Kolkata",offset=19800000,dstSavings=0,useDaylight=false,transitions=6,lastRule=null]
The locale is:
LANG=en_IN
LANGUAGE=en_IN:en
LC_CTYPE="en_IN"
LC_NUMERIC="en_IN"
LC_TIME="en_IN"
LC_COLLATE="en_IN"
LC_MONETARY="en_IN"
LC_MESSAGES="en_IN"
LC_PAPER="en_IN"
LC_NAME="en_IN"
LC_ADDRESS="en_IN"
LC_TELEPHONE="en_IN"
LC_MEASUREMENT="en_IN"
LC_IDENTIFICATION="en_IN"
LC_ALL=
I guess the issue is with your time zone. Export LC_ALL=en_US.UTF-8 and start over. Make sure to have en_US.UTF-8 in locale for all entries.
$ locale
LANG="en_US.UTF-8"
LC_COLLATE="en_US.UTF-8"
LC_CTYPE="en_US.UTF-8"
LC_MESSAGES="en_US.UTF-8"
LC_MONETARY="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
LC_TIME="en_US.UTF-8"
LC_ALL="en_US.UTF-8"
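A quick way to double-check which default locale and time zone the JVM actually resolves (the same lookup that fails above inside sun.util.calendar.ZoneInfoFile) is a two-line Java program; EnvCheck is just an illustrative name:

```java
import java.util.Locale;
import java.util.TimeZone;

// EnvCheck.java - show the default locale and time zone the JVM picks up
// from the environment, so locale/timezone problems surface before a build.
public class EnvCheck {
    public static void main(String[] args) {
        System.out.println("Default locale:    " + Locale.getDefault());
        System.out.println("Default time zone: " + TimeZone.getDefault().getID());
    }
}
```

If this small program itself throws NoClassDefFoundError on ZoneInfoFile, the problem is in the JDK's timezone data rather than in the Spark build.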
I had a similar issue, and the reason was a missing timezone info file in the Java installation (after updating Java 8 to a newer version).
When I looked for the tzdb.dat file, there was only a link pointing to a missing target.
I was able to solve it on Red Hat simply by running:
yum update tzdata-java
and this site was helpful here:
https://ekuric.wordpress.com/tag/tzdb-dat/
For other OSes the solution might be similar (there is even a timezone updater tool from Oracle: https://www.oracle.com/technetwork/java/javase/documentation/tzupdater-readme-136440.html).
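To see whether the running JDK can actually find its timezone database, a small sketch like the following (TzdbCheck is an illustrative name) looks for tzdb.dat under java.home. The lib/tzdb.dat layout is an assumption matching standard OpenJDK installs (on JDK 8, java.home points at the jre directory, so the file resolves to jre/lib/tzdb.dat):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// TzdbCheck.java - locate tzdb.dat under the running JDK and report
// whether it is readable, which catches the dangling-symlink case
// left behind by a partial tzdata update.
public class TzdbCheck {
    public static void main(String[] args) throws Exception {
        Path tzdb = Paths.get(System.getProperty("java.home"), "lib", "tzdb.dat");
        System.out.println("Looking for: " + tzdb);
        if (Files.isReadable(tzdb)) {
            System.out.println("OK: " + Files.size(tzdb) + " bytes");
        } else {
            System.out.println("MISSING or unreadable - update tzdata (e.g. yum update tzdata-java)");
        }
    }
}
```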
On Windows 7 64-bit, I am trying to build Hadoop version 2.7.1 as described in the thread: Apache Hadoop 2.7.1 binary for Windows 64-bit platform
I installed all the needed software; for the C compiler I use the C++ compiler from Windows SDK 7.1 (Visual Studio 2010 is not installed).
From the Windows SDK 7.1 Command Prompt, with Release x64, I run the following command:
mvn package -Pdist,native-win -DskipTests -Dtar
but the build fails with these errors:
-- The C compiler identification is unknown
-- The CXX compiler identification is unknown
CMake Error in CMakeLists.txt:
No CMAKE_C_COMPILER could be found.
CMake Error in CMakeLists.txt:
No CMAKE_CXX_COMPILER could be found.
The main settings of the Command Prompt are:
APPVER=6.1
CL=/AI C:\Windows\Microsoft.NET\Framework64\v4.0.30319
CommandPromptType=Native
CURRENT_CPU=x64
FrameworkVersion=v4.0.30319
platform=x64
PlatformToolset=Windows7.1SDK
PROCESSOR_ARCHITECTURE=AMD64
sdkdir=C:\Program Files\Microsoft SDKs\Windows\v7.1\
SESSIONNAME=Console
TARGET_CPU=x64
TARGET_PLATFORM=WIN7
ToolsVersion=4.0
USERDOMAIN=WIN7X64
VS100COMNTOOLS=C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\Tools\
WindowsSDKDir=C:\Program Files\Microsoft SDKs\Windows\v7.1\
WindowsSDKVersionOverride=v7.1
The following is the console output:
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS 2.7.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-os) @ hadoop-hdfs ---
....
....
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (make) @ hadoop-hdfs ---
[INFO] Executing tasks
main:
[exec] -- The C compiler identification is unknown
[exec] -- The CXX compiler identification is unknown
[exec] CMake Error in CMakeLists.txt:
[exec] No CMAKE_C_COMPILER could be found.
[exec]
[exec]
[exec]
[exec] CMake Error in CMakeLists.txt:
[exec] No CMAKE_CXX_COMPILER could be found.
[exec]
[exec]
[exec]
[exec] -- Configuring incomplete, errors occurred!
[exec] See also "E:/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/target
/native/CMakeFiles/CMakeOutput.log".
[exec] See also "E:/hadoop-2.7.1-src/hadoop-hdfs-project/hadoop-hdfs/target
/native/CMakeFiles/CMakeError.log".
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 2.995 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 4.477 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 4.696 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.250 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 3.759 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 3.775 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 3.354 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 4.056 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 3.807 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:09 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 12.776 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 15.304 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.031 s]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 42.105 s]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
.....
.....
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:55 min
[INFO] Finished at: 2016-12-03T14:30:39+02:00
[INFO] Final Memory: 83M/494M
[INFO] ------------------------------------------------------------------------
.....
....
I googled phrases like "The CXX compiler identification is unknown",
but I didn't find a solution to my problem.
How do I configure CMake or Maven to avoid this error and build Hadoop 2.7.1 using Windows SDK 7.1?
Do I need to install Visual Studio 2010?
What have I missed?
The build failure is due to CMake 3.7.1 breaking support for the Microsoft Windows SDK for Windows 7.
I used cmake-3.6.3-win64-x64 instead and the build succeeded; see the screenshot.
I posted Issue #16483 at: 3.7.1 broke support for Microsoft Windows SDK for Windows 7
The CMake issue is planned to be resolved in CMake version 3.7.2.
I used the following software tools:
apache-maven-3.3.9
cmake-3.6.3-win64-x64
cygwin64
jdk1.7.0_79
protoc-2.5.0-win32
UPDATE-2
I have updated the Windows SDK compilers and now have reduced the errors to 2.
c:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\include\intrin.h(26): fatal error C1083: Cannot open include file: 'ammintrin.h': No such file or directory [C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\native\native.vcxproj]
Everywhere I have searched, I have found that to get this .h file I need to install Microsoft Visual Studio Service Pack 1. I don't have Visual Studio; I am using Windows SDK 7.1. Where can I find the equivalent of Service Pack 1 for Windows SDK 7.1?
UPDATE-2 END
UPDATE
After staring at the stack trace for some time, I see that the error occurs here:
Midl:
C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\midl.exe /W2 /WX /nologo /char signed /env x64 /Oicf /app_config /out"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/" /h "hadoopwinutilsvc_h.h" /tlb "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.tlb" /robust hadoopwinutilsvc.idl
TRACKER : error TRK0002: Failed to execute command: ""C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\midl.exe" /W2 /WX /nologo /char signed /env x64 /Oicf /app_config /outC:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/ /h hadoopwinutilsvc_h.h /tlb C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.tlb /robust hadoopwinutilsvc.idl". The handle is invalid. [C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj]
So, I tried to run just this command
"C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\midl.exe" /W2 /WX /nologo /char signed /env x64 /Oicf /app_config /outC:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/ /h hadoopwinutilsvc_h.h /tlb C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.tlb /robust hadoopwinutilsvc.idl"
Now I get the following error
64 bit Processing .\hadoopwinutilsvc.idl
hadoopwinutilsvc.idl
midl : command line error MIDL1001 : cannot open input file oaidl.idl
When I searched for the file oaidl.idl, I found it in the folder
C:\Program Files\Microsoft SDKs\Windows\v7.1\Include
I added this folder to the PATH and ran the command again, with the same error. I noticed that the file is named 'OAIdl.Idl' and not 'oaidl.idl'.
The first line in the file hadoopwinutilsvc.idl is 'include oaidl.idl'.
Does this mean that there is an error in the Hadoop source?
END UPDATE - Original question below.
I am trying to build hadoop on Windows 7 x64.
I keep getting the following error.
I have provided the relevant stack trace below. Any help will be much appreciated.
The command I ran on the command prompt is mvn package -Pdist,native-win -DskipTests -Dtar
[DEBUG] env: PATH=C:\csvn\bin\;C:\csvn\Python25\;C:\Program Files\CollabNet\Subversion Client;C:\Windows\system32;C:\Windows;C:\Windows\system32\wbem;C:\Windows\system32\windowspowershell\v1.0\;c:\program files\ibm\gsk8\lib64;c:\program files (x86)\ibm\gsk8\lib;C:\IBM\WEBSPH~1\APPSER~1\db2\BIN;C:\IBM\WEBSPH~1\APPSER~1\db2\FUNCTION;C:\IBM\WEBSPH~1\APPSER~1\db2\SAMPLES\REPL;C:\Program Files\nodejs\;%JAVA_HOME%\bin;C:\Program Files (x86)\Skype\Phone\;C:\Program Files (x86)\CMake\bin;C:\Program Files (x86)\Git\cmd;C:\Program Files\Microsoft Windows Performance Toolkit\;C:\Users\ajayamrite\AppData\Roaming\npm;C:\apache-maven-3.3.3/bin;C:\cygwin64\bin;C:\protoc-2.5.0-win32;C:\Windows\Microsoft.NET\Framework64\v4.0.30319;C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin;C:\hadoop\bin;c:\hadoop\sbin;C:\Program Files (x86)\CMake\bin;C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\bin\x86_amd64
[DEBUG] env: PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC
[DEBUG] env: PLATFORM=x64
[DEBUG] env: PROCESSOR_ARCHITECTURE=AMD64
[DEBUG] env: PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 70 Stepping 1, GenuineIntel
[DEBUG] env: PROCESSOR_LEVEL=6
[DEBUG] env: PROCESSOR_REVISION=4601
[DEBUG] env: PROGRAMDATA=C:\ProgramData
[DEBUG] env: PROGRAMFILES=C:\Program Files
[DEBUG] env: PROGRAMFILES(X86)=C:\Program Files (x86)
[DEBUG] env: PROGRAMW6432=C:\Program Files
[DEBUG] env: PROMPT=$P$G
[DEBUG] env: PSMODULEPATH=C:\Windows\system32\WindowsPowerShell\v1.0\Modules\
[DEBUG] env: PUBLIC=C:\Users\Public
[DEBUG] env: PYTHONHOME=C:\csvn\Python25\
[DEBUG] env: SESSIONNAME=Console
[DEBUG] env: SYSTEMDRIVE=C:
[DEBUG] env: SYSTEMROOT=C:\Windows
[DEBUG] env: TEMP=C:\Users\AJAYAM~1\AppData\Local\Temp
[DEBUG] env: TMP=C:\Users\AJAYAM~1\AppData\Local\Temp
[DEBUG] env: USERDOMAIN=bedouinvm
[DEBUG] env: USERNAME=ajayamrite
[DEBUG] env: USERPROFILE=C:\Users\ajayamrite
[DEBUG] env: WDIR=C:\
[DEBUG] env: WINDIR=C:\Windows
[DEBUG] env: ZLIB_HOME=C:\zlib127-dll
[DEBUG] Executing command line: [msbuild, C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common/src/main/winutils/winutils.sln, /nologo, /p:Configuration=Release, /p:OutDir=C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/bin/, /p:IntermediateOutputPath=C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/, /p:WsceConfigDir=../etc/hadoop, /p:WsceConfigFile=wsce-site.xml]
Building the projects in this solution one at a time. To enable parallel build, please add the "/m" switch.
Build started 5/27/2015 3:13:28 PM.
Project "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.sln" on node 1 (default targets).
ValidateSolutionConfiguration:
Building solution configuration "Release|x64".
Project "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.sln" (1) is building "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj.metaproj" (2) on node 1 (default targets).
Project "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj.metaproj" (2) is building "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj" (3) on node 1 (default targets).
C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\Microsoft.CppBuild.targets(297,5): warning MSB8003: Could not find WindowsSDKDir variable from the registry. TargetFrameworkVersion or PlatformToolset may be set to an invalid version number. [C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj]
InitializeBuildStatus:
Touching "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.unsuccessfulbuild".
Midl:
C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\midl.exe /W2 /WX /nologo /char signed /env x64 /Oicf /app_config /out"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/" /h "hadoopwinutilsvc_h.h" /tlb "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.tlb" /robust hadoopwinutilsvc.idl
TRACKER : error TRK0002: Failed to execute command: ""C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\midl.exe" /W2 /WX /nologo /char signed /env x64 /Oicf /app_config /outC:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/ /h hadoopwinutilsvc_h.h /tlb C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.tlb /robust hadoopwinutilsvc.idl". The handle is invalid. [C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj]
Done Building Project "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj" (default targets) -- FAILED.
Done Building Project "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj.metaproj" (default targets) -- FAILED.
Done Building Project "C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.sln" (default targets) -- FAILED.
Build FAILED.
"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.sln" (default target) (1) ->
"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj.metaproj" (default target) (2) ->
"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj" (default target) (3) ->
(PrepareForBuild target) ->
C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\Microsoft.CppBuild.targets(297,5): warning MSB8003: Could not find WindowsSDKDir variable from the registry. TargetFrameworkVersion or PlatformToolset may be set to an invalid version number. [C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj]
"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.sln" (default target) (1) ->
"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj.metaproj" (default target) (2) ->
"C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj" (default target) (3) ->
(Midl target) ->
TRACKER : error TRK0002: Failed to execute command: ""C:\Program Files\Microsoft SDKs\Windows\v7.1\Bin\midl.exe" /W2 /WX /nologo /char signed /env x64 /Oicf /app_config /outC:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/ /h hadoopwinutilsvc_h.h /tlb C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\target/winutils/libwinutils.tlb /robust hadoopwinutilsvc.idl". The handle is invalid. [C:\hadoop-2.7.0-src\hadoop-common-project\hadoop-common\src\main\winutils\libwinutils.vcxproj]
1 Warning(s)
1 Error(s)
Time Elapsed 00:00:01.36
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.558 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.334 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 2.580 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.170 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.133 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 2.585 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 2.008 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 3.022 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 2.663 s]
[INFO] Apache Hadoop Common ............................... FAILURE [ 22.818 s]
[INFO] Apache Hadoop NFS .................................. SKIPPED
[INFO] Apache Hadoop KMS .................................. SKIPPED
[INFO] Apache Hadoop Common Project ....................... SKIPPED
[INFO] Apache Hadoop HDFS ................................. SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-server-sharedcachemanager .............. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Azure support ........................ SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 43.747 s
[INFO] Finished at: 2015-05-27T15:13:30+01:00
[INFO] Final Memory: 68M/306M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (compile-ms-winutils) on project hadoop-common: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.3.1:exec (compile-ms-winutils) on project hadoop-common: Command execution failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Command execution failed.
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:303)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 20 more
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
at org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:402)
at org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:164)
at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:750)
at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:292)
... 22 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
I'm also trying to build Hadoop 2.7.1 on Windows 7 x64.
For the question at UPDATE-2:
I downloaded the file "ammintrin.h" and put it in "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\include" (I don't have Visual Studio, only the Windows SDK 7.1 installed).
It worked for me without installing Microsoft Visual Studio Service Pack 1.
The file is here: http://www.mathworks.com/matlabcentral/answers/uploaded_files/735/ammintrin.m
(Note that you have to rename it with the .h extension.)
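The manual steps above (download the file, rename it to .h, copy it into the include directory) can be sketched as a small shell helper. The helper name and paths here are my own illustration, not from the answer; adjust them for your SDK layout:

```shell
#!/bin/sh
# install_header: copy the downloaded ammintrin.m into an include
# directory under the .h name the compiler expects.
# Both arguments are illustrative paths supplied by the caller.
install_header() {
  src="$1"         # e.g. the ammintrin.m file as downloaded
  includedir="$2"  # e.g. ".../Microsoft Visual Studio 10.0/VC/include"
  cp "$src" "$includedir/ammintrin.h"
}
```

On Windows you would run the equivalent `copy` from cmd.exe, but the rename-and-place logic is the same.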
I'm trying to compile Spark 1.2.0 using Maven 3.2.2, Scala 2.10.4, and Java 1.8.0_05, and this is what I'm getting:
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 3.513 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 8.909 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 6.625 s]
[INFO] Spark Project Core ................................. FAILURE [01:06 min]
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:26 min
[INFO] Finished at: 2015-01-17T22:10:43+01:00
[INFO] Final Memory: 41M/554M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-core_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-core_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:347)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:154)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:143)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: Compilation failed
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:105)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
at sbt.inc.Incremental$.compile(Incremental.scala:37)
at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
at com.typesafe.zinc.Compiler.compile(Compiler.scala:184)
at com.typesafe.zinc.Compiler.compile(Compiler.scala:164)
at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:92)
at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
... 20 more
Could you please guide me as to where the problem might be? I tried to find a solution elsewhere. There was a problem reported in SPARK-3794, but I think I have the patched version (although the version description in the JIRA ticket is weird).
I'm running the packaging with the following mvn command:
mvn -Dhadoop.version=2.4.1 -DskipTests clean package -X
Concise
The issue could be caused by one of the following, or a combination of them:
OpenJDK being used instead of the Oracle JDK
a Zinc server still running
JAVA_HOME set incorrectly
Verbose
The issue could be caused because OpenJDK was used:
user#host $ java -version
openjdk version "1.8.0_111"
OpenJDK Runtime Environment (build 1.8.0_111-b15)
OpenJDK 64-Bit Server VM (build 25.111-b15, mixed mode)
instead of the Oracle one. Once the PATH was updated, i.e.:
user#host $ java -version
java version "1.8.0_92"
Java(TM) SE Runtime Environment (build 1.8.0_92-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.92-b14, mixed mode)
the issue no longer occurred when the following commands:
git clone https://github.com/apache/spark.git && \
cd spark && \
build/mvn -DskipTests clean package
were run, and the build succeeded:
[INFO] Kafka 0.10 Source for Structured Streaming ......... SUCCESS [ 9.715 s]
[INFO] Spark Project Java 8 Tests ......................... SUCCESS [ 5.586 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:12 min (Wall Clock)
[INFO] Finished at: 2016-12-19T14:20:39+01:00
[INFO] Final Memory: 71M/859M
[INFO] ------------------------------------------------------------------------
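To check which JDK the PATH resolves to without reading the banner by eye, the first line of `java -version` can be classified. The `jdk_vendor` helper name is my own, not from the answer:

```shell
#!/bin/sh
# jdk_vendor: classify a `java -version` banner line as openjdk or
# oracle (a sketch; Oracle's banner starts with plain "java version").
jdk_vendor() {
  case "$1" in
    *openjdk*|*OpenJDK*) echo openjdk ;;
    *)                   echo oracle ;;
  esac
}

# usage against the live JVM (uncomment to run):
# jdk_vendor "$(java -version 2>&1 | head -n 1)"
```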
The issue could also be caused by a Zinc server that was still running, e.g.:
user#host $ ps -ef | grep -i zinc
Once this process was killed using kill <zinc server pid>, the issue was solved.
Ensure that JAVA_HOME is correct by issuing export JAVA_HOME=/opt/jdk1.8.0_92 (adjust the path to your install) and verify it with echo $JAVA_HOME.
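A quick sanity check that JAVA_HOME actually points at a JDK can be scripted. The `check_java_home` function is a name I made up for illustration:

```shell
#!/bin/sh
# check_java_home: succeed only if the given path looks like a JDK,
# i.e. it is non-empty and contains an executable bin/java.
check_java_home() {
  [ -n "$1" ] && [ -x "$1/bin/java" ]
}

# example usage (path is illustrative):
# check_java_home /opt/jdk1.8.0_92 && echo "JAVA_HOME looks valid"
```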
This might work for you.
Before the build, run:
./dev/change-scala-version.sh 2.11
to change the Scala version.
To compile Spark with Maven, follow these steps:
Change the Scala version to the one installed on your machine: ./dev/change-scala-version.sh <version>
Shut down Zinc: ./build/zinc-<version>/bin/zinc -shutdown
Compile Spark: ./build/mvn -Pyarn -Phadoop-<version> -Dscala-<version> -DskipTests clean package
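The final command line from the steps above can be assembled for concrete versions. The `spark_build_cmd` helper and the version values are my own illustration; substitute your own:

```shell
#!/bin/sh
# spark_build_cmd: assemble the Maven command line for given Hadoop
# and Scala versions, mirroring the compile step above (a sketch).
spark_build_cmd() {
  hadoop="$1"
  scala="$2"
  printf './build/mvn -Pyarn -Phadoop-%s -Dscala-%s -DskipTests clean package' \
    "$hadoop" "$scala"
}

# example: spark_build_cmd 2.4 2.11
```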
You might try disabling the Zinc server in ./spark/pom.xml by commenting it out:
<!--<useZincServer>true</useZincServer>-->
I did this, then saw a JVM out-of-memory exception. That might also be the issue in your case.
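If you run into the out-of-memory error, raising the Maven JVM's heap before building usually helps. The exact values below are only a suggestion, not from the answer:

```shell
#!/bin/sh
# Give the Maven JVM more heap and code-cache room before building
# (values are illustrative; tune them to your machine).
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```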
I'm trying to build Hadoop 2.6 on Windows and installed the prerequisites as mentioned on their website. I'm getting the following fatal error while building, and the process stops. Any suggestions? Hadoop 2.5.2 built fine without any issues.
Thank you!
s\winutils.vcxproj" (default target) (4) ->
(Link target) ->
LINK : fatal error LNK1123: failure during conversion to COFF: file invalid or corrupt [C:\Hadoop\hadoop-2.6.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj]
57 Warning(s)
1 Error(s)
Time Elapsed 00:00:05.67
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [0.671s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.577s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.734s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.124s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.498s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1.734s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1.390s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [2.434s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.920s]
[INFO] Apache Hadoop Common .............................. FAILURE [13.970s]
................................
................................
................................
................................
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27.253s
[INFO] Finished at: Tue Dec 23 16:19:50 EST 2014
[INFO] Final Memory: 82M/1045M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2:exec (compile-ms-winutils) on project hadoop-common: Command execution failed. Process exited with an error: 1(Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
You need to remove .NET Framework 4.5 and install .NET Framework 4. After that, try to rebuild with the following command:
mvn package -Pdist,native-win -DskipTests -Dtar
Hope it helps.
I am not sure if this is still relevant to you, but I found the answer. You need .NET Framework 4. I had to remove .NET Framework 4.5 and install 4. Once I did that, the error went away.
Someone had answered it in another question, but it was not specific to Hadoop:
Failure during conversion to COFF: file invalid or corrupt