When I try to build Hadoop using:
mvn install -e -DskipTests
it always throws the following error:
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/xiu/myGit/hadoop2/hadoop-mr1-project/hadoop-mr1/src/test/java/org/apache/hadoop/mapreduce/security/TestTokenCache.java:[153,4] getDelegationTokenSecretManager() is not public in org.apache.hadoop.hdfs.server.namenode.FSNamesystem; cannot be accessed from outside package
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop MR1 ................................. FAILURE [3.144s]
[INFO] Apache Hadoop MR1 Examples ........................ SKIPPED
[INFO] Apache Hadoop MR1 Project ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 3.353s
[INFO] Finished at: Thu Jul 18 11:01:30 PDT 2013
[INFO] Final Memory: 32M/100M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-mr1: Compilation failure
[ERROR] /home/xiu/myGit/hadoop2/hadoop-mr1-project/hadoop-mr1/src/test/java/org/apache/hadoop/mapreduce/security/TestTokenCache.java:[153,4] getDelegationTokenSecretManager() is not public in org.apache.hadoop.hdfs.server.namenode.FSNamesystem; cannot be accessed from outside package
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
I have tried cleaning and reinstalling different versions of Maven with no luck. Does any expert know what is going on here?
I came across the same problem while building the Hadoop 2.2.0 source code. During "mvn install -DskipTests" the error came up in the "Hadoop Auth" folder.
From somewhere (I don't remember where) I learned that a dependency is missing from the pom.xml of this Hadoop Auth folder.
The dependency is:
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
I added this dependency and tried "mvn install -DskipTests" again, and my error was resolved.
Try adding this dependency to the pom.xml of "Apache Hadoop MR1"; it may resolve your error as well.
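Note that the snippet above has no <version> element, which only works if the artifact's version is managed elsewhere (for example in a parent pom's dependencyManagement section). If it is not, you may need to pin one explicitly; a minimal sketch, where the version shown is only an assumption based on the Jetty line that Hadoop 2.x-era builds commonly used:
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <version>6.1.26</version> <!-- assumption: verify against your Hadoop tree -->
  <scope>test</scope>
</dependency>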
It seems like the Maven compiler plugin is complaining that the getDelegationTokenSecretManager() method in Hadoop's FSNamesystem is not accessible due to its visibility, i.e. it is not public:
[ERROR] /home/xiu/myGit/hadoop2/hadoop-mr1-project/hadoop-mr1/src/test/java/org/apache/hadoop/mapreduce/security/TestTokenCache.java:[153,4] getDelegationTokenSecretManager() is not public in org.apache.hadoop.hdfs.server.namenode.FSNamesystem; cannot be accessed from outside package
[INFO] 1 error
I would suggest checking the Hadoop version defined in your pom.xml, and double-checking in that version's Javadoc that the method is really accessible.
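If the test module is compiling against an HDFS build in which that method is package-private, explicitly aligning the HDFS dependency with a version whose Javadoc lists the method as public may help. A hedged sketch (the coordinates are the standard Hadoop 2.x ones; the hadoop.version property is an assumption about this pom's conventions):
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-hdfs</artifactId>
  <version>${hadoop.version}</version> <!-- assumption: property defined in the parent pom -->
</dependency>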
I'm trying to install Giraph 1.1 but ran into an issue. According to this thread, I should apply a patch to my installation, and that is exactly where my problem stems from. I downloaded the .patch file linked there, copied it to the source folder, and added the following to the pom.xml:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-patch-plugin</artifactId>
    <version>1.2</version>
    <configuration>
      <patches>
        <patch>GIRAPH-1110.02.patch</patch>
      </patches>
      ...
</plugins>
Unfortunately when I run Maven with:
sudo mvn -Phadoop_yarn -Dhadoop.version=2.8.1 clean package -DskipTests
it still fails with the same error as it did before:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /usr/local/giraph/giraph-core/target/munged/main/org/apache/giraph/job/GiraphJob.java:[213,11] setPingInterval(org.apache.hadoop.conf.Configuration,int) is not public in org.apache.hadoop.ipc.Client; cannot be accessed from outside package
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Giraph Parent ............................... SUCCESS [ 6.298 s]
[INFO] Apache Giraph Core ................................. FAILURE [ 9.359 s]
[INFO] Apache Giraph Examples ............................. SKIPPED
[INFO] Apache Giraph Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16.062 s
[INFO] Finished at: 2018-02-06T22:53:00+02:00
[INFO] Final Memory: 51M/640M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project giraph-core: Compilation failure
[ERROR] /usr/local/giraph/giraph-core/target/munged/main/org/apache/giraph/job/GiraphJob.java:[213,11] setPingInterval(org.apache.hadoop.conf.Configuration,int) is not public in org.apache.hadoop.ipc.Client; cannot be accessed from outside package
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
which clearly shows that the issue has not been patched. When I check the file manually, I can confirm that there is no change. If I change the file manually before compilation, the changes get discarded (I'm guessing the file gets redownloaded). I also can't run the build with the offline flag, since it depends on downloading some dependencies. I'm at my wits' end here.
My Maven version is 3.3.9.
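One thing worth checking, based on the configuration shown above: a plugin listed in <plugins> with only a <configuration> does nothing during the build unless its goal is bound to a lifecycle phase through an <executions> block; also, the maven-patch-plugin reads patches from a patch directory (src/main/patches by default, if I recall the plugin's documentation correctly), not necessarily the project root. A minimal sketch of such a binding (the phase shown is an assumption to verify against the plugin's docs):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-patch-plugin</artifactId>
  <version>1.2</version>
  <configuration>
    <patches>
      <patch>GIRAPH-1110.02.patch</patch>
    </patches>
    <!-- assumption: the patch file lives in the plugin's default patch directory -->
  </configuration>
  <executions>
    <execution>
      <phase>process-sources</phase> <!-- assumption: apply the patch before compilation -->
      <goals>
        <goal>apply</goal>
      </goals>
    </execution>
  </executions>
</plugin>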
I am trying to add the dependency below to use the version mentioned or the latest. I expect Maven to take the dependency with any version >= 0.0.1, but Maven is failing to resolve it. If I specify an exact version, the dependency is downloaded fine.
I am using Maven 3.5.
<dependency>
  <groupId>com.company.esb.fuse</groupId>
  <artifactId>common</artifactId>
  <version>[0.0.1,)</version>
  <scope>provided</scope>
</dependency>
Maven logs:
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building source out Write 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.401 s
[INFO] Finished at: 2018-01-22T17:26:21+01:00
[INFO] Final Memory: 14M/245M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project source: Could not resolve dependencies for project com.company.esb.source.out:source:war:0.0.1-SNAPSHOT: Failed to collect dependencies at com.company.esb.fuse:common:jar:[0.0.1,): No versions available for com.company.esb.fuse:common:jar:[0.0.1,) within specified range -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
The details above are not sufficient to determine the exact problem, but according to your Maven logs:
No versions available for com.company.esb.fuse:common:jar:[0.0.1,)
Note that it says com.company, not com.compant as you typed in this question.
Check your pom.xml first and put in the correct groupId.
If you are still getting the same error, check your local .m2 folder and confirm that you have at least one artifact with a version that falls within the declared range.
UPDATE: the error may be in the Maven metadata file. Check the maven-metadata-local.xml file inside .m2/repository/com/company/esb/fuse/common; all versions should be listed under the <versions> tag.
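For reference, a healthy maven-metadata-local.xml for this artifact would look roughly like this; the version entries and timestamp are purely illustrative, but the range [0.0.1,) can only match versions that actually appear under <versions>:
<metadata>
  <groupId>com.company.esb.fuse</groupId>
  <artifactId>common</artifactId>
  <versioning>
    <versions>
      <version>0.0.1</version>
      <version>0.0.2</version> <!-- illustrative entries only -->
    </versions>
    <lastUpdated>20180122152600</lastUpdated>
  </versioning>
</metadata>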
I have downloaded Apache Spark 1.4.1 from the official site. I don't have Hadoop installed on my machine.
Apache provides a build command, so I tried to build the project using the following command:
build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
But the build failed with the following error:
[INFO] Spark Project External Kafka Assembly ............. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.840s
[INFO] Finished at: Wed Jul 29 10:43:04 IST 2015
[INFO] Final Memory: 15M/43M
[INFO] ------------------------------------------------------------------------
[ERROR] Plugin org.apache.maven.plugins:maven-enforcer-plugin:1.4 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.apache.maven.plugins:maven-enforcer-plugin:jar:1.4: Could not transfer artifact org.apache.maven.plugins:maven-enforcer-plugin:pom:1.4 from/to central (https://repo1.maven.org/maven2): peer not authenticated -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
I am new to Apache Spark; please give suggestions.
Since what you have downloaded is a precompiled binary distribution, you do not need to compile it again with Maven. Just put it on your PATH and use it directly.
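For example, assuming the prebuilt package was extracted to /opt/spark-1.4.1-bin-hadoop2.4 (the path is illustrative):
export SPARK_HOME=/opt/spark-1.4.1-bin-hadoop2.4
export PATH=$PATH:$SPARK_HOME/bin
spark-shell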
I'm trying to add Giraph 1.1.0 to Hadoop 2.6.0.
I have to edit the pom.xml somehow in order to package Giraph correctly. I run the command mvn -Phadoop_yarn -Dhadoop.version=2.6.0 package.
I edited the default pom.xml file at line (1292):
<id>hadoop_2</id>
<modules>
  <module>giraph-accumulo</module>
  <module>giraph-hbase</module>
  <module>giraph-hcatalog</module>
  <module>giraph-hive</module>
  <module>giraph-gora</module>
  <module>giraph-rexster</module>
  <module>giraph-dist</module>
</modules>
<properties>
  <hadoop.version>2.6.0</hadoop.version>
But when I run the command, it gives:
[INFO] Apache Giraph Parent .............................. SUCCESS [4.183s]
[INFO] Apache Giraph Core ................................ FAILURE [5.364s]
[INFO] Apache Giraph Examples ............................ SKIPPED
[INFO] Apache Giraph Distribution ........................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 9.911s
[INFO] Finished at: Mon Mar 16 19:05:38 EET 2015
[INFO] Final Memory: 55M/1020M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project giraph-core: Compilation failure: Compilation failure:
[ERROR] /usr/local/giraph/giraph/giraph-core/target/munged/main/org/apache/giraph/comm/netty/SaslNettyClient.java:[84,68] cannot find symbol
[ERROR] symbol: variable SASL_PROPS
[ERROR] location: class org.apache.hadoop.security.SaslRpcServer
[ERROR] /usr/local/giraph/giraph/giraph-core/target/munged/main/org/apache/giraph/comm/netty/SaslNettyServer.java:[105,62] cannot find symbol
[ERROR] symbol: variable SASL_PROPS
[ERROR] location: class org.apache.hadoop.security.SaslRpcServer
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :giraph-core
My question is: how should the pom.xml be modified for it to work with Hadoop 2.6.0?
You are changing the Hadoop version of the wrong profile; with the -P parameter you are selecting the wrong one.
To build Giraph with the hadoop_2 profile, you should use -Phadoop_2 instead of -Phadoop_yarn, like this:
mvn -Phadoop_2 -Dhadoop.version=2.6.0 package
But if you use the hadoop_2 profile, you won't be able to resolve the error related to the SASL_PROPS variable, so to avoid it you should use the -Phadoop_yarn profile (but edit the Hadoop version in the hadoop_yarn profile instead).
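A minimal sketch of that edit inside Giraph's pom.xml (the surrounding profile content is elided; the exact layout in your copy of the file may differ):
<profile>
  <id>hadoop_yarn</id>
  ...
  <properties>
    <hadoop.version>2.6.0</hadoop.version> <!-- was a 2.x default; set to your target version -->
  </properties>
</profile>
Then run mvn -Phadoop_yarn -Dhadoop.version=2.6.0 package again.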
I'm trying to build Hadoop 2.6 on Windows and installed the prerequisites as mentioned on their website. I'm getting the following fatal error while building, and the process stops. Any suggestions? Hadoop 2.5.2 built fine without any issues.
Thank you!
s\winutils.vcxproj" (default target) (4) ->
(Link target) ->
LINK : fatal error LNK1123: failure during conversion to COFF: file invalid or corrupt [C:\Hadoop\hadoop-2.6.0-src\hadoop-common-project\hadoop-common\src\main\winutils\winutils.vcxproj]
57 Warning(s)
1 Error(s)
Time Elapsed 00:00:05.67
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [0.671s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.577s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.734s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.124s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.498s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1.734s]
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1.390s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [2.434s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.920s]
[INFO] Apache Hadoop Common .............................. FAILURE [13.970s]
................................
................................
................................
................................
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27.253s
[INFO] Finished at: Tue Dec 23 16:19:50 EST 2014
[INFO] Final Memory: 82M/1045M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2:exec (compile-ms-winutils) on project hadoop-common: Command execution failed. Process exited with an error: 1(Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-common
You need to remove .NET Framework 4.5 and install .NET Framework 4. After that, try to rebuild with the following command:
mvn package -Pdist,native-win -DskipTests -Dtar
Hope it helps.
I am not sure if this is still relevant to you, but I found the answer. You need .NET Framework 4. I had to remove .NET Framework 4.5 and install 4. Once I did that, the error went away.
Someone had answered it in another question, but it was not specific to Hadoop:
Failure during conversion to COFF: file invalid or corrupt