I am attempting to install these technologies on OS X Mountain Lion for testing purposes. The setup is a single-node configuration using 'localhost'.
I am running into a few issues...
1) Running sudo zkCli and then ls / throws an error:
Exception in thread "main" org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /
at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1468)
at org.apache.zookeeper.ZooKeeper.getChildren(ZooKeeper.java:1496)
at org.apache.zookeeper.ZooKeeperMain.processZKCmd(ZooKeeperMain.java:725)
at org.apache.zookeeper.ZooKeeperMain.processCmd(ZooKeeperMain.java:593)
at org.apache.zookeeper.ZooKeeperMain.executeLine(ZooKeeperMain.java:365)
at org.apache.zookeeper.ZooKeeperMain.run(ZooKeeperMain.java:323)
at org.apache.zookeeper.ZooKeeperMain.main(ZooKeeperMain.java:282)
2) I attempt to run bin/accumulo init and receive this error...
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/start/Platform
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.start.Platform
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/accumulo/start/Main
Caused by: java.lang.ClassNotFoundException: org.apache.accumulo.start.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
It seems that something is wrong with my classpath, but I am not sure what I need to do.
Here is the summary of the Accumulo compile:
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Accumulo Project ........................... SUCCESS [17.267s]
[INFO] Trace ............................................. SUCCESS [7.819s]
[INFO] Fate .............................................. SUCCESS [2.638s]
[INFO] Start ............................................. SUCCESS [49.560s]
[INFO] Core .............................................. SUCCESS [2:57.195s]
[INFO] Server ............................................ SUCCESS [23.385s]
[INFO] Examples .......................................... SUCCESS [0.321s]
[INFO] Simple Examples ................................... SUCCESS [19.038s]
[INFO] MiniCluster ....................................... SUCCESS [38.770s]
[INFO] Accumulo Maven Plugin ............................. SUCCESS [20.568s]
[INFO] Testing ........................................... SUCCESS [2:55.802s]
[INFO] Proxy ............................................. SUCCESS [1:36.702s]
[INFO] Assemblies ........................................ SUCCESS [17.033s]
[INFO] Documentation ..................................... SUCCESS [0.282s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10:49.977s
[INFO] Finished at: Thu Aug 29 15:05:17 EDT 2013
[INFO] Final Memory: 33M/83M
[INFO] ------------------------------------------------------------------------
1) Is ZooKeeper running?
2) I would recommend running Accumulo from the binary tarball download instead of building it yourself. If you must build it, use mvn package -P assemble. Are you running from trunk? If so, we no longer run Accumulo from the source directory: look for the built tarball in assemble/target, install that, and run it. Is accumulo-env.sh configured properly? Do you have any old environment variables in your bash scripts setting an incorrect ACCUMULO_HOME?
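To make both points concrete: ZooKeeper answers a four-letter health command on its client port, and the tarball route looks roughly like this. All paths and the install prefix below are illustrative, not taken from your setup:

```shell
# Check that ZooKeeper is listening on its default client port (2181).
# A healthy server answers "imok" to the "ruok" four-letter command.
echo ruok | nc localhost 2181

# Or ask the bundled script directly (path depends on your install):
/usr/local/zookeeper/bin/zkServer.sh status

# Build the Accumulo distribution tarball instead of running from source:
cd /path/to/accumulo-src          # illustrative path
mvn package -P assemble

# Install the tarball it produces under assemble/target and run from there:
tar xzf assemble/target/accumulo-*-bin.tar.gz -C /usr/local
cd /usr/local/accumulo-*          # this directory becomes ACCUMULO_HOME
bin/accumulo init
```

If nc prints nothing or the connection is refused, start ZooKeeper first (zkServer.sh start) before retrying zkCli and accumulo init.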
Related
I've had problems compiling Hadoop version 3.0.0-alpha1 on Kubuntu 12.04 (64-bit).
Apparently, this version requires the HADOOP-8887 patch to be applied.
After preparing the installation with all dependencies on a machine according to BUILD.txt and applying the patch, the error that the patch should have corrected still remains, as shown in the log file.
Any information to help resolve the issue is welcome.
command for compiling:
mvn clean package -Pdist -Dtar -Dmaven.javadoc.skip=true -DskipTests -fail-at-end -Pnative
log file tail:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [0.601s]
[INFO] Apache Hadoop Build Tools ......................... SUCCESS [0.365s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.584s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [0.793s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.100s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.017s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1.050s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.407s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [0.978s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.697s]
[INFO] Apache Hadoop Common .............................. SUCCESS [23.360s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [0.600s]
[INFO] Apache Hadoop KMS ................................. SUCCESS [4.903s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.024s]
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [11.244s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [17.609s]
[INFO] Apache Hadoop HDFS Native Client .................. SUCCESS [2.746s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [12.556s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.564s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.758s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.021s]
[INFO] Apache Hadoop YARN ................................ SUCCESS [0.021s]
[INFO] Apache Hadoop YARN API ............................ SUCCESS [3.985s]
[INFO] Apache Hadoop YARN Common ......................... SUCCESS [3.795s]
[INFO] Apache Hadoop YARN Server ......................... SUCCESS [0.056s]
[INFO] Apache Hadoop YARN Server Common .................. SUCCESS [1.618s]
[INFO] Apache Hadoop YARN NodeManager .................... SUCCESS [4.077s]
[INFO] Apache Hadoop YARN Web Proxy ...................... SUCCESS [0.401s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ...... SUCCESS [0.847s]
[INFO] Apache Hadoop YARN Timeline Service ............... SUCCESS [1.369s]
[INFO] Apache Hadoop YARN ResourceManager ................ SUCCESS [5.544s]
[INFO] Apache Hadoop YARN Server Tests ................... SUCCESS [0.749s]
[INFO] Apache Hadoop YARN Client ......................... SUCCESS [1.743s]
[INFO] Apache Hadoop YARN SharedCacheManager ............. SUCCESS [0.449s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ........ SUCCESS [0.512s]
[INFO] Apache Hadoop YARN Timeline Service HBase tests ... SUCCESS [1.503s]
[INFO] Apache Hadoop YARN Applications ................... SUCCESS [0.017s]
[INFO] Apache Hadoop YARN DistributedShell ............... SUCCESS [0.500s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher .......... SUCCESS [0.338s]
[INFO] Apache Hadoop YARN Site ........................... SUCCESS [0.018s]
[INFO] Apache Hadoop YARN Registry ....................... SUCCESS [0.844s]
[INFO] Apache Hadoop YARN Project ........................ SUCCESS [8.001s]
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [0.061s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [2.869s]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [1.362s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [0.390s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [1.791s]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [0.848s]
[INFO] Apache Hadoop MapReduce JobClient ................. SUCCESS [2.293s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SUCCESS [0.214s]
[INFO] Apache Hadoop MapReduce NativeTask ................ FAILURE [9.687s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.666s]
[INFO] Apache Hadoop MapReduce ........................... FAILURE [0.515s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.657s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [2.721s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [0.972s]
[INFO] Apache Hadoop Archive Logs ........................ SUCCESS [0.393s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [0.554s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.620s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.264s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [0.304s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [2.345s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.336s]
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [0.645s]
[INFO] Apache Hadoop Azure support ....................... SUCCESS [0.622s]
[INFO] Apache Hadoop Client .............................. SUCCESS [3.436s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.432s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [1.044s]
[INFO] Apache Hadoop Azure Data Lake support ............. SUCCESS [0.674s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [4.542s]
[INFO] Apache Hadoop Kafka Library support ............... SUCCESS [0.188s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.015s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [18.374s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:58.223s
[INFO] Finished at: Tue Nov 01 17:10:11 BRST 2016
[INFO] Final Memory: 250M/629M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1:cmake-compile (cmake-compile) on project hadoop-mapreduce-client-nativetask: make failed with error code 2 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:3.0.0-alpha1:cmake-compile (cmake-compile) on project hadoop-mapreduce-client-nativetask: make failed with error code 2
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: make failed with error code 2
at org.apache.hadoop.maven.plugin.cmakebuilder.CompileMojo.runMake(CompileMojo.java:229)
at org.apache.hadoop.maven.plugin.cmakebuilder.CompileMojo.execute(CompileMojo.java:96)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-mapreduce: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="/home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target" executable="bash">... # 10:125 in /home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target/antrun/build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-mapreduce: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec failonerror="true" dir="/home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target" executable="bash">... # 10:125 in /home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target/antrun/build-main.xml
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec failonerror="true" dir="/home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target" executable="bash">... # 10:125 in /home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target/antrun/build-main.xml
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
Caused by: /home/hadoop/mnt/hadoop-3.0.0-alpha1-src/hadoop-mapreduce-project/target/antrun/build-main.xml:10: exec returned: 1
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:646)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327)
... 21 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-mapreduce-client-nativetask
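Following the resume hint at the end of the log, one way to zero in on the underlying make error is to re-run just the failing module with full debug logging. This is a sketch using the same flags as the original command, not verified against this tree:

```shell
# Resume the build at the module that failed, with debug output (-X)
# so the underlying cmake/make error is printed in detail.
mvn package -Pdist -Dtar -Dmaven.javadoc.skip=true -DskipTests -Pnative \
    -X -rf :hadoop-mapreduce-client-nativetask
```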
I'm following this tutorial to build and install Hadoop:
http://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os
However, when I run the command below from the VS2010 command prompt:
mvn package -Pdist,native-win -DskipTests -Dtar
I get the error below:
main:
[mkdir] Skipping C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native because it already exists.
[exec] Current OS is Windows 8.1
[exec] Executing 'cmake' with arguments:
[exec] 'C:\hdfs\hadoop-hdfs-project\hadoop-hdfs/src/'
[exec] '-DGENERATED_JAVAH=C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
[exec] '-DJVM_ARCH_DATA_MODEL=64'
[exec] '-DREQUIRE_LIBWEBHDFS=false'
[exec] '-DREQUIRE_FUSE=false'
[exec] '-G'
[exec] 'Visual Studio 10 Win64'
[exec]
[exec] The ' characters around the executable and arguments are
[exec] not part of the command.
Execute:Java13CommandLauncher: Executing 'cmake' with arguments:
'C:\hdfs\hadoop-hdfs-project\hadoop-hdfs/src/'
'-DGENERATED_JAVAH=C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
'-DJVM_ARCH_DATA_MODEL=64'
'-DREQUIRE_LIBWEBHDFS=false'
'-DREQUIRE_FUSE=false'
'-G'
'Visual Studio 10 Win64'
The ' characters around the executable and arguments are not part of the command.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 1.781s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 1.333s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 1.030s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.375s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 2.104s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 6.628s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 1.047s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 1.173s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 1.594s]
[INFO] Apache Hadoop Common ............................... SUCCESS [ 59.046s]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 1.905s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 6.491s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.150s]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 19.351s]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] -----------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] -----------------------------------------------------------------------
[INFO] Total time: 01:47 min
[INFO] Finished at: 2015-03-28T21:18:11+00:00
[INFO] Final Memory: 78M/363M
[INFO] -----------------------------------------------------------------------
[WARNING] The requested profile "native-bin" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified [ERROR] around Ant part ...<exec failonerror="true" dir="C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... # 5:107 in C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specifiedaround Ant part ...<exec failonerror="true" dir="C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... # 5:107 in C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specifiedaround Ant part ...<exec failonerror="true" dir="C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake">... # 5:107 in C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 20 more
Caused by: C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml:5: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:675)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor19.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327)
... 22 more
Caused by: java.io.IOException: Cannot run program "cmake" (in directory "C:\hdfs\hadoop-hdfs-project\hadoop-hdfs\target\native"): CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at java.lang.Runtime.exec(Runtime.java:620)
at org.apache.tools.ant.taskdefs.Execute$Java13CommandLauncher.exec(Execute.java:862)
at org.apache.tools.ant.taskdefs.Execute.launch(Execute.java:481)
at org.apache.tools.ant.taskdefs.Execute.execute(Execute.java:495)
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:631)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672)
... 34 more
Caused by: java.io.IOException: CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessImpl.create(Native Method)
at java.lang.ProcessImpl.<init>(ProcessImpl.java:386)
at java.lang.ProcessImpl.start(ProcessImpl.java:137)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
... 40 more
I don't understand the error at all. I have cmake installed and C:\cmake\bin added to the path as well.
Hmm...this error makes me think you had a typo in your Maven command:
[WARNING] The requested profile "native-bin" could not be activated because it does not exist.
Check your command; make sure it's native-win, not native-bin
I also wrote a guide recently for building Hadoop on Server 2012 R2 (should be identical for 8.1). Let me know.
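Concretely, the corrected invocation, plus a quick check that the VS2010 prompt can actually see cmake (assuming it lives in C:\cmake\bin as you described), would look like this:

```shell
:: From the VS2010 command prompt, confirm cmake resolves on PATH:
where cmake
cmake --version

:: Then rebuild with the correct profile name (native-win, not native-bin):
mvn package -Pdist,native-win -DskipTests -Dtar
```

If "where cmake" finds nothing, the PATH change has not reached this prompt; open a fresh VS2010 prompt after editing the system PATH.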
Hello, I'm trying to build Hadoop 2.6.0 on Windows 8.1, so far without luck.
I have installed:
jdk1.7.0_71 (added a JAVA_HOME variable with the value C:\Program Files\Java\jdk1.7.0_71 to the User Variables)
cygwin64 (added its bin directory, D:\cygwin64\bin, to the PATH variable under System Variables)
Maven 3.2.5 (added its bin directory, D:\maven\bin, to the PATH variable under System Variables)
Protocol Buffers 2.5 (added its installation directory, D:\protobuf, to the PATH variable under System Variables)
Visual Studio 2010
In the Visual Studio 2010 Command Prompt (started as Administrator) I changed the drive to D: and used the folder of the Hadoop source, "D:\hdp", as a starting point.
I then set JAVA_HOME again, in short (8.3) notation, and set the Platform variable:
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.7.0_71
set Platform=x64
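To double-check that the short path and the JDK are what the build will see, the same prompt can echo them back. These are plain cmd built-ins, illustrative only:

```shell
:: Print the 8.3 short name for "C:\Program Files" (should match PROGRA~1,
:: unless 8.3 name generation is disabled on the volume):
for %I in ("C:\Program Files") do @echo %~sI

:: Confirm which JDK the build will use:
echo %JAVA_HOME%
"%JAVA_HOME%\bin\java" -version
```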
Afterwards I try to build Hadoop using the following Maven command:
mvn -e -X package -Pdist -DskipTests -Dtar
The build fails at "Apache Hadoop Common", stating the following:
[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.6.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc failure -> [Help 1]
I slightly changed the file "ProtocMojo.java" under D:\hdp\hadoop-maven-plugins\src\main\java\org\apache\hadoop\maven\plugin\protoc by inserting the following after line 56 of the file:
protocCommand = "D:\\protobuf\\protoc.exe";
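A less invasive alternative to patching the plugin source, which I have not verified on this setup, is to make sure protoc.exe itself resolves from the build prompt (D:\protobuf as described above):

```shell
:: Put the protobuf directory at the front of PATH for this prompt session:
set PATH=D:\protobuf;%PATH%

:: The version must match what Hadoop 2.6.0 expects
:: (protoc 2.5.0 reports itself as "libprotoc 2.5.0"):
protoc --version
```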
This gets the build further, to "Apache Hadoop HDFS", where it fails again as follows:
[INFO] Executing tasks
Build sequence for target(s) `main' is [main]
Complete build sequence is [main, ]
main:
[mkdir] Skipping D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\native because it already exists.
[exec] Current OS is Windows 8.1
[exec] Executing 'cmake' with arguments:
[exec] 'D:\hdp\hadoop-hdfs-project\hadoop-hdfs/src/'
[exec] '-DGENERATED_JAVAH=D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
[exec] '-DJVM_ARCH_DATA_MODEL=64'
[exec] '-DREQUIRE_LIBWEBHDFS=false'
[exec] '-DREQUIRE_FUSE=false'
[exec] '-G'
[exec] 'Visual Studio 10 Win64'
[exec]
[exec] The ' characters around the executable and arguments are
[exec] not part of the command.
Execute:Java13CommandLauncher: Executing 'cmake' with arguments:
'D:\hdp\hadoop-hdfs-project\hadoop-hdfs/src/'
'-DGENERATED_JAVAH=D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
'-DJVM_ARCH_DATA_MODEL=64'
'-DREQUIRE_LIBWEBHDFS=false'
'-DREQUIRE_FUSE=false'
'-G'
'Visual Studio 10 Win64'
The ' characters around the executable and arguments are
not part of the command.
[exec] CMake Error: Could not create named generator Visual Studio 10 Win64
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 0.906 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 0.719 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 1.469 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.265 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 1.766 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 5.516 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 1.431 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 2.119 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 1.969 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:11 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 4.087 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 11.742 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.110 s]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 11.782 s]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:57 min
[INFO] Finished at: 2015-03-09T17:08:10+01:00
[INFO] Final Memory: 88M/1092M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake" failonerror="true">... # 5:106 in D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec dir="D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake" failonerror="true">... # 5:106 in D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:355)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:216)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:160)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec dir="D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake" failonerror="true">... # 5:106 in D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml:5: exec returned: 1
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:646)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327)
... 21 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-hdfs
This is where I'm stuck, and I do not know how to solve the problem. Help is appreciated, as I am new to Hadoop and to building in general.
Thanks!
elro
I had the same error. I fixed it by making sure the cmake on my PATH wasn't the one that comes with Cygwin; for some reason, that cmake doesn't recognize the Visual Studio generators.
Did you forget the native-win profile? Try mvn package -Pdist,native-win -DskipTests -Dtar
I've built Hadoop on Windows before and didn't have to change any source code. Revert your source change, run the above command, and post more of the protoc error you got from Maven.
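Before re-running, it can also help to confirm that protoc.exe (and, for the later CMake failure, cmake.exe) actually resolves from the PATH of the prompt Maven runs in. A throwaway sketch, not part of Hadoop; the class and method names here are made up for illustration:

```java
import java.io.File;

// Throwaway helper: report where an executable resolves on the current PATH,
// scanning directories in the same order the OS would search them.
public class WhichExe {
    static String findOnPath(String name) {
        String path = System.getenv("PATH");
        if (path == null) return null;
        for (String dir : path.split(File.pathSeparator)) {
            File candidate = new File(dir, name);
            if (candidate.isFile()) return candidate.getAbsolutePath();
        }
        return null;
    }

    public static void main(String[] args) {
        for (String exe : new String[] {"protoc.exe", "cmake.exe"}) {
            String hit = findOnPath(exe);
            System.out.println(exe + " -> " + (hit != null ? hit : "NOT FOUND on PATH"));
        }
    }
}
```

Run it from the same Visual Studio command prompt you build in; if protoc.exe does not resolve at all, the protoc failure is expected without any source change, and the fix is a PATH entry rather than an edit to ProtocMojo.java.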
I need help, as I have been trying to figure this out for the last 2-3 days.
I am setting up Hadoop on a Windows 7 (64-bit) machine, to try out the integration of R with Hadoop.
I followed instructions for Hadoop installation as given in the URL - http://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os
My environment variables are as below:
JAVE_HOME : C:\Program Files\Java\jdk1.6.0_45
M2_HOME : C:\Hadoop\apache-maven-3.1.1
PATH : C:\cygwin64\bin;C:\Hadoop\apache-maven-3.1.1\bin;C:\Hadoop\protoc-2.5.0-win32
mvn package -Pdist,native-win -DskipTests -Dtar
I ran the above command from the Windows SDK 7.1 command prompt and get the error below. Early help is hugely appreciated.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [8.893s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [4.782s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [9.500s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [2.145s]
[INFO] Apache Hadoop Project Dist POM .................... FAILURE [4.141s]
[INFO] Apache Hadoop Maven Plugins ....................... SKIPPED
[INFO] Apache Hadoop Auth ................................ SKIPPED
[INFO] Apache Hadoop Auth Examples ....................... SKIPPED
[INFO] Apache Hadoop Common .............................. SKIPPED
[INFO] Apache Hadoop NFS ................................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] Apache Hadoop HDFS ................................ SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SKIPPED
[INFO] hadoop-yarn ....................................... SKIPPED
[INFO] hadoop-yarn-api ................................... SKIPPED
[INFO] hadoop-yarn-common ................................ SKIPPED
[INFO] hadoop-yarn-server ................................ SKIPPED
[INFO] hadoop-yarn-server-common ......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
[INFO] hadoop-yarn-server-tests .......................... SKIPPED
[INFO] hadoop-yarn-client ................................ SKIPPED
[INFO] hadoop-yarn-applications .......................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
[INFO] hadoop-mapreduce-client ........................... SKIPPED
[INFO] hadoop-mapreduce-client-core ...................... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SKIPPED
[INFO] hadoop-yarn-site .................................. SKIPPED
[INFO] hadoop-yarn-project ............................... SKIPPED
[INFO] hadoop-mapreduce-client-common .................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
[INFO] hadoop-mapreduce-client-app ....................... SKIPPED
[INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
[INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] hadoop-mapreduce .................................. SKIPPED
[INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
[INFO] Apache Hadoop Distributed Copy .................... SKIPPED
[INFO] Apache Hadoop Archives ............................ SKIPPED
[INFO] Apache Hadoop Rumen ............................... SKIPPED
[INFO] Apache Hadoop Gridmix ............................. SKIPPED
[INFO] Apache Hadoop Data Join ........................... SKIPPED
[INFO] Apache Hadoop Extras .............................. SKIPPED
[INFO] Apache Hadoop Pipes ............................... SKIPPED
[INFO] Apache Hadoop Tools Dist .......................... SKIPPED
[INFO] Apache Hadoop Tools ............................... SKIPPED
[INFO] Apache Hadoop Distribution ........................ SKIPPED
[INFO] Apache Hadoop Client .............................. SKIPPED
[INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32.981s
[INFO] Finished at: Thu Feb 13 14:06:51 IST 2014
[INFO] Final Memory: 32M/190M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (pre-dist) on project hadoop-project-dist: An Ant BuildException has occured : exec returned: 2 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (pre-dist) on project hadoop-project-dist: An Ant BuildException has occured: exec returned: 2
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:317)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:152)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:555)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:214)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:158)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 2
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:283)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:106)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: C:\Hadoop\hdfs\hadoop-project-dist\target\antrun\build-main.xml:31: exec returned: 2
at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:650)
at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:390)
at org.apache.tools.ant.Target.performTasks(Target.java:411)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:270)
... 21 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-project-dist
Thanks in advance
Gopal
Have you solved the problem? I had a similar problem when I was installing Hadoop. I found that all the problems in the source-build phase were related to the Path variable settings.
1. Make sure you have the JDK installed, not only the JRE.
2. One additional path setting is needed if you are following the instructions at this link: http://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os :
Add the .NET MSBuild.exe home directory to Path; it should be similar to C:\Windows\Microsoft.NET\Framework64\v4.0.30319
3. Check that all the path values are correct. During my installation I had a problem with the Cygwin path: Cygwin was installed in a folder named cygwin64, but at first I just added cygwin to the path value, which took me some time to figure out.
I use the following command, and it succeeds:
mvn clean package -DskipTests -Dtar
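For item 1 in particular, a quick way to tell a JDK from a JRE is to look for javac: a JRE ships bin/java but no bin/javac, and the Hadoop build needs the compiler. A minimal sketch (the class name is made up for illustration):

```java
import java.io.File;

// Sketch: does the directory JAVA_HOME points at look like a full JDK?
// A JRE has bin/java but no bin/javac; the Hadoop build needs javac.
public class JdkCheck {
    static boolean looksLikeJdk(File home) {
        return new File(home, "bin/javac").isFile()
            || new File(home, "bin/javac.exe").isFile();
    }

    public static void main(String[] args) {
        String home = System.getenv("JAVA_HOME");
        if (home == null) {
            System.out.println("JAVA_HOME is not set");
        } else if (looksLikeJdk(new File(home))) {
            System.out.println(home + " looks like a JDK");
        } else {
            System.out.println(home + " has no javac; this looks like a JRE");
        }
    }
}
```

It is also worth double-checking the variable name itself: the environment listing in the question above says JAVE_HOME, while the build only reads JAVA_HOME.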
With the command "mvn package -Pdist,native-win -DskipTests -Dtar" (the native-win profile) you are trying to compile and package the native libraries, and the packages given in your link are not sufficient for that.
To solve this you have two alternatives:
1) If you don't need the native libraries, Hadoop will use its "builtin-java classes" instead. Use this command in place of yours: "mvn package -Pdist -DskipTests -Dtar"
2) If you do want the native libraries, follow the Native Libraries Guide; all the needed packages and other important information for building them is given there.
My suggestion is to use the first option, because I have read that building the native libraries on Windows has a lot of issues and is not preferred.
Also read "BUILDING.txt" in the Hadoop source folder; it describes the various packaging commands.
This may be the solution to your problem; try it.
I had the same error. When I ran mvn package -Pdist,native-win -DskipTests -Dtar with -X, Maven showed me that there was trouble executing the dist-copynativelibs.sh script.
The problem was that the Windows command prompt can't run sh commands, only bash commands. I don't know why; Cygwin was installed correctly and added to the path. So I changed executable="sh" to executable="bash" in the pom file D:\hdfs\hadoop-project-dist\pom.xml:
<exec executable="bash" dir="${project.build.directory}" failonerror="true">
<arg line="./dist-copynativelibs.sh"/>
</exec>
and the build succeeded.
I'm using Maven to build my project and Cobertura for unit-test code-coverage collection.
My problem is that when I try to build the project I hit the error below. I eventually found that the failure is caused by the Cobertura plugin when issuing the command mvn cobertura:instrument.
[INFO] Unable to obtain CommandsFile location.
Embedded error: Permission denied
I went through the directories and I think I have the necessary permissions.
Has anyone seen this failure before? How can I debug it?
$ mvn cobertura:instrument
[INFO] Scanning for projects...
[INFO] Reactor build order:
[INFO] MyProject
[INFO] ------------------------------------------------------------------------
[INFO] Building XXXXXX
[INFO] task-segment: [cobertura:instrument]
[INFO] ------------------------------------------------------------------------
[INFO] [cobertura:instrument]
[INFO] Skipping cobertura mojo for project with packaging type 'pom'
[INFO] ------------------------------------------------------------------------
[INFO] Building ato-client
[INFO] task-segment: [cobertura:instrument]
[INFO] ------------------------------------------------------------------------
[INFO] [cobertura:instrument]
[INFO] ------------------------------------------------------------------------
[ERROR] BUILD ERROR
[INFO] ------------------------------------------------------------------------
[INFO] Unable to obtain CommandsFile location.
Embedded error: Permission denied
[INFO] ------------------------------------------------------------------------
[INFO] Trace
org.apache.maven.lifecycle.LifecycleExecutionException: Unable to obtain CommandsFile location.
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:703)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeStandaloneGoal(DefaultLifecycleExecutor.java:553)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:523)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:371)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:332)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:181)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:356)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:137)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:356)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
Caused by: org.apache.maven.plugin.MojoExecutionException: Unable to obtain CommandsFile location.
at org.codehaus.mojo.cobertura.tasks.AbstractTask.executeJava(AbstractTask.java:195)
at org.codehaus.mojo.cobertura.tasks.InstrumentTask.execute(InstrumentTask.java:131)
at org.codehaus.mojo.cobertura.CoberturaInstrumentMojo.execute(CoberturaInstrumentMojo.java:145)
at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:483)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:678)
... 16 more
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1792)
at java.io.File.createTempFile(File.java:1828)
at net.sourceforge.cobertura.util.CommandLineBuilder.<init>(CommandLineBuilder.java:96)
at org.codehaus.mojo.cobertura.tasks.CommandLineArguments.getCommandsFile(CommandLineArguments.java:82)
at org.codehaus.mojo.cobertura.tasks.AbstractTask.executeJava(AbstractTask.java:191)
... 20 more
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2 seconds
[INFO] Finished at: Tue Jan 22 09:30:25 CST 2013
[INFO] Final Memory: 22M/241M
[INFO] ------------------------------------------------------------------------
I think your current user doesn't have write permissions to the temporary folder (check java.io.tmpdir system property value):
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1704)
at java.io.File.createTempFile(File.java:1792)
at java.io.File.createTempFile(File.java:1828)
at net.sourceforge.cobertura.util.CommandLineBuilder.<init>(CommandLineBuilder.java:96)
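A quick way to test this diagnosis is to do exactly what Cobertura's CommandLineBuilder does: ask File.createTempFile for a file, which goes into java.io.tmpdir. A minimal sketch (class name made up for illustration); run it as the same user that runs Maven:

```java
import java.io.File;
import java.io.IOException;

// Probe java.io.tmpdir the same way Cobertura's CommandLineBuilder does:
// File.createTempFile writes into java.io.tmpdir, so this reproduces the
// "Permission denied" if the directory isn't writable by the current user.
public class TmpDirProbe {
    static boolean tmpDirWritable() {
        try {
            File probe = File.createTempFile("cobertura-probe", ".cmdline");
            probe.delete();  // clean up the probe file on success
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.io.tmpdir = " + System.getProperty("java.io.tmpdir"));
        System.out.println(tmpDirWritable()
            ? "temp dir is writable; look elsewhere for the problem"
            : "temp dir is NOT writable; this matches the build failure");
    }
}
```

If the directory is not writable, either fix its permissions or point the build at a different one, for example via MAVEN_OPTS=-Djava.io.tmpdir=$HOME/tmp (the directory name here is just an example).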
The error says the user of the Java process has no write permission on the temp directory (/tmp).
The Java process writes its pid to a file in that directory;
jps/jstat use this pid file to obtain JVM information.
See also:
jstat
How can I prevent Java from creating hsperfdata files
jps
Have you checked that the user has permission to write to the standard temporary-file directory configured for the JVM?