I want to use Oozie to automate Spark jobs. I'm trying to build Oozie 4.3.0 with Maven 3.3.9, and I have modified the pom.xml for Hadoop 2.7.1, Java 1.7, and HBase 1.2.0. I got the error described below when running the ./mkdistro.sh Maven build command. It seems to be related to the HBase credentials class. What modifications do I need to make to get past this error?
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ oozie-core ---
[INFO] Compiling 517 source files to /usr/lib/oozie/oozie-4.3.0/core/target/classes
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[28,45] error: package org.apache.hadoop.hbase.security.token does not exist
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[29,45] error: package org.apache.hadoop.hbase.security.token does not exist
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[83,14] error: cannot find symbol
[ERROR] class HbaseCredentials
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[84,48] error: cannot find symbol
[ERROR] class HbaseCredentials
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[85,29] error: cannot find symbol
[ERROR] class AuthenticationTokenIdentifier
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[86,27] error: cannot find symbol
[INFO] 6 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Oozie Main .................................. SUCCESS [ 2.452 s]
[INFO] Apache Oozie Hadoop Utils hadoop-2-4.3.0 ........... SUCCESS [ 4.780 s]
[INFO] Apache Oozie Hadoop Distcp hadoop-2-4.3.0 .......... SUCCESS [ 0.151 s]
[INFO] Apache Oozie Hadoop Auth hadoop-2-4.3.0 Test ....... SUCCESS [ 0.698 s]
[INFO] Apache Oozie Hadoop Libs ........................... SUCCESS [ 0.082 s]
[INFO] Apache Oozie Client ................................ SUCCESS [ 19.330 s]
[INFO] Apache Oozie Share Lib Oozie ....................... SUCCESS [ 4.273 s]
[INFO] Apache Oozie Share Lib HCatalog .................... SUCCESS [ 4.933 s]
[INFO] Apache Oozie Share Lib Distcp ...................... SUCCESS [ 1.104 s]
[INFO] Apache Oozie Core .................................. FAILURE [ 11.598 s]
[INFO] Apache Oozie Share Lib Streaming ................... SKIPPED
[INFO] Apache Oozie Share Lib Pig ......................... SKIPPED
[INFO] Apache Oozie Share Lib Hive ........................ SKIPPED
[INFO] Apache Oozie Share Lib Hive 2 ...................... SKIPPED
[INFO] Apache Oozie Share Lib Sqoop ....................... SKIPPED
[INFO] Apache Oozie Examples .............................. SKIPPED
[INFO] Apache Oozie Share Lib Spark ....................... SKIPPED
[INFO] Apache Oozie Share Lib ............................. SKIPPED
[INFO] Apache Oozie Docs .................................. SKIPPED
[INFO] Apache Oozie WebApp ................................ SKIPPED
[INFO] Apache Oozie Tools ................................. SKIPPED
[INFO] Apache Oozie MiniOozie ............................. SKIPPED
[INFO] Apache Oozie Distro ................................ SKIPPED
[INFO] Apache Oozie ZooKeeper Security Tests .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 52.026 s
[INFO] Finished at: 2018-07-25T11:42:47+03:00
[INFO] Final Memory: 127M/797M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile (default-compile) on project oozie-core: Compilation failure: Compilation failure:
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[28,45] error: package org.apache.hadoop.hbase.security.token does not exist
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[29,45] error: package org.apache.hadoop.hbase.security.token does not exist
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[83,14] error: cannot find symbol
[ERROR] class HbaseCredentials
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[84,48] error: cannot find symbol
[ERROR] class HbaseCredentials
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[85,29] error: cannot find symbol
[ERROR] class AuthenticationTokenIdentifier
[ERROR] /usr/lib/oozie/oozie-4.3.0/core/src/main/java/org/apache/oozie/action/hadoop/HbaseCredentials.java:[86,27] error: cannot find symbol
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
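Since the compiler reports that the org.apache.hadoop.hbase.security.token package does not exist, the HBase artifact that ships those classes is evidently not on the oozie-core compile classpath for the versions you set. As a first check, the sketch below (the local-repository paths and the split between hbase-client and hbase-server are assumptions) shows which HBase 1.2.0 jar actually contains TokenUtil and AuthenticationTokenIdentifier, so the matching dependency and version can be declared for oozie-core:
# Sketch: find which HBase 1.2.0 jar ships the security token classes
# (~/.m2 paths are assumptions; adjust to your local Maven repository)
for jar in ~/.m2/repository/org/apache/hbase/hbase-client/1.2.0/hbase-client-1.2.0.jar \
           ~/.m2/repository/org/apache/hbase/hbase-server/1.2.0/hbase-server-1.2.0.jar; do
  echo "== $jar"
  unzip -l "$jar" | grep 'hbase/security/token' || echo "   (not present)"
done
# whichever jar lists the missing classes is the dependency oozie-core needs,
# at the HBase version you configured in pom.xml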
Related
Hi, I'm trying to build Giraph on Ubuntu in VirtualBox.
I followed these two links:
http://giraph.apache.org/quick_start.html
https://lab.hypotheses.org/1207
Hadoop worked well, but in both cases Giraph failed to build:
mvn package -DskipTests
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Apache Giraph Parent
[INFO] Apache Giraph Core
[INFO] Apache Giraph Blocks Framework
[INFO] Apache Giraph Examples
[INFO] Apache Giraph Accumulo I/O
[INFO] Apache Giraph HBase I/O
[INFO] Apache Giraph HCatalog I/O
[INFO] Apache Giraph Gora I/O
[INFO] Apache Giraph Distribution
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Giraph Parent 1.3.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: https://repo.maven.apache.org/maven2/org/apache/maven/plugins/maven-remote-resources-plugin/1.4/maven-remote-resources-plugin-1.4.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Giraph Parent ............................... FAILURE [ 1.126 s]
[INFO] Apache Giraph Core ................................. SKIPPED
[INFO] Apache Giraph Blocks Framework ..................... SKIPPED
[INFO] Apache Giraph Examples ............................. SKIPPED
[INFO] Apache Giraph Accumulo I/O ......................... SKIPPED
[INFO] Apache Giraph HBase I/O ............................ SKIPPED
[INFO] Apache Giraph HCatalog I/O ......................... SKIPPED
[INFO] Apache Giraph Gora I/O ............................. SKIPPED
[INFO] Apache Giraph Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.995 s
[INFO] Finished at: 2020-03-23T23:53:13+09:00
[INFO] Final Memory: 16M/74M
[INFO] ------------------------------------------------------------------------
[ERROR] Plugin org.apache.maven.plugins:maven-remote-resources-plugin:1.4 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.apache.maven.plugins:maven-remote-resources-plugin:jar:1.4: Could not transfer artifact org.apache.maven.plugins:maven-remote-resources-plugin:pom:1.4 from/to central (https://repo.maven.apache.org/maven2): Received fatal alert: protocol_version -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
If there is any information that could help solve this problem, please tell me. I will reply as fast as I can.
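One thing worth checking, based on the "Received fatal alert: protocol_version" message at the bottom of the log: Maven Central only accepts TLS 1.2 connections, and older Java 7 runtimes do not offer TLS 1.2 by default. A hedged sketch (the Java version on your VM is an assumption, so verify it first):
java -version                                  # if this reports 1.7, TLS 1.2 is likely not enabled by default
export MAVEN_OPTS="-Dhttps.protocols=TLSv1.2"  # ask the JVM's HTTPS client to offer TLS 1.2
mvn package -DskipTests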
I am trying to build the Spark 2.2.0-rc2 release using mvn but am unable to do so.
$ uname -a
Linux knoldus-Vostro-15-3568 4.4.0-46-generic #67-Ubuntu SMP Thu Oct 20 15:05:12 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux
$ java -version
openjdk version "1.8.0_131"
Below is the error stack that I am getting:
$ ./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install
...
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-tags_2.11 ---
[INFO] Using zinc server for incremental compilation
java.lang.NoClassDefFoundError: Could not initialize class sun.util.calendar.ZoneInfoFile
at sun.util.calendar.ZoneInfo.getTimeZone(ZoneInfo.java:589)
at java.util.TimeZone.getTimeZone(TimeZone.java:560)
at java.util.TimeZone.setDefaultZone(TimeZone.java:666)
at java.util.TimeZone.getDefaultRef(TimeZone.java:636)
at java.util.Date.<init>(Date.java:254)
at java.util.zip.ZipUtils.dosToJavaTime(ZipUtils.java:71)
at java.util.zip.ZipUtils.extendedDosToJavaTime(ZipUtils.java:88)
at java.util.zip.ZipEntry.getTime(ZipEntry.java:194)
at sbt.IO$.next$1(IO.scala:278)
at sbt.IO$.sbt$IO$$extract(IO.scala:286)
at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255)
at sbt.IO$$anonfun$unzipStream$1.apply(IO.scala:255)
at sbt.Using.apply(Using.scala:24)
at sbt.IO$.unzipStream(IO.scala:255)
at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249)
at sbt.IO$$anonfun$unzip$1.apply(IO.scala:249)
at sbt.Using.apply(Using.scala:24)
at sbt.IO$.unzip(IO.scala:249)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$5.apply(AnalyzingCompiler.scala:140)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:111)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at scala.collection.TraversableOnce$class.$div$colon(TraversableOnce.scala:138)
at scala.collection.AbstractTraversable.$div$colon(Traversable.scala:105)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:140)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:139)
at sbt.IO$.withTemporaryDirectory(IO.scala:344)
at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:139)
at sbt.compiler.IC$.compileInterfaceJar(IncrementalCompiler.scala:58)
at com.typesafe.zinc.Compiler$.compilerInterface(Compiler.scala:148)
at com.typesafe.zinc.Compiler$.create(Compiler.scala:53)
at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40)
at com.typesafe.zinc.Compiler$$anonfun$apply$1.apply(Compiler.scala:40)
at com.typesafe.zinc.Cache.get(Cache.scala:41)
at com.typesafe.zinc.Compiler$.apply(Compiler.scala:40)
at com.typesafe.zinc.Main$.run(Main.scala:96)
at com.typesafe.zinc.Nailgun$.zinc(Nailgun.scala:93)
at com.typesafe.zinc.Nailgun$.nailMain(Nailgun.scala:82)
at com.typesafe.zinc.Nailgun.nailMain(Nailgun.scala)
at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.martiansoftware.nailgun.NGSession.run(NGSession.java:280)
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 5.657 s]
[INFO] Spark Project Tags ................................. FAILURE [ 0.371 s]
[INFO] Spark Project Sketch ............................... SKIPPED
[INFO] Spark Project Networking ........................... SKIPPED
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Unsafe ............................... SKIPPED
[INFO] Spark Project Launcher ............................. SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project ML Local Library ..................... SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Mesos ................................ SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6.855 s
[INFO] Finished at: 2017-05-30T13:47:02+05:30
[INFO] Final Memory: 50M/605M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-tags_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.: CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-tags_2.11
I am not able to understand this error.
scala> java.util.TimeZone.getDefault
res0: java.util.TimeZone = sun.util.calendar.ZoneInfo[id="Asia/Kolkata",offset=19800000,dstSavings=0,useDaylight=false,transitions=6,lastRule=null]
My locale is:
LANG=en_IN
LANGUAGE=en_IN:en
LC_CTYPE="en_IN"
LC_NUMERIC="en_IN"
LC_TIME="en_IN"
LC_COLLATE="en_IN"
LC_MONETARY="en_IN"
LC_MESSAGES="en_IN"
LC_PAPER="en_IN"
LC_NAME="en_IN"
LC_ADDRESS="en_IN"
LC_TELEPHONE="en_IN"
LC_MEASUREMENT="en_IN"
LC_IDENTIFICATION="en_IN"
LC_ALL=
I guess the issue is with your time zone and locale settings. Export LC_ALL=en_US.UTF-8 and start over, and make sure every entry reported by locale is en_US.UTF-8, as in the listing below (a command sketch follows it):
$ locale
LANG="en_US.UTF-8"
LC_COLLATE="en_US.UTF-8"
LC_CTYPE="en_US.UTF-8"
LC_MESSAGES="en_US.UTF-8"
LC_MONETARY="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
LC_TIME="en_US.UTF-8"
LC_ALL="en_US.UTF-8"
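A minimal sketch of that fix, reusing the exact build command from the question above (only the two locale exports are added; nothing else is assumed):
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8
./build/mvn -Phadoop-2.7,yarn,mesos,hive,hive-thriftserver -DskipTests clean install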
I had a similar issue, and the cause was a missing timezone info file in the Java installation (after updating Java 8 to a newer version).
When I looked for the tzdb.dat file, there was only a symlink pointing to a missing target.
I was able to solve it on Red Hat simply by running:
yum update tzdata-java
This site was helpful:
https://ekuric.wordpress.com/tag/tzdb-dat/
For other OSes the solution should be similar (there is even a timezone updater tool from Oracle: https://www.oracle.com/technetwork/java/javase/documentation/tzupdater-readme-136440.html).
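For reference, a rough sketch of the same check and fix (the jre/lib location of tzdb.dat and the use of $JAVA_HOME are assumptions based on a typical OpenJDK 8 layout):
ls -l "$JAVA_HOME/jre/lib/tzdb.dat"   # a dangling symlink here matches the ZoneInfoFile failure above
sudo yum update tzdata-java           # RHEL/CentOS; other distributions ship an equivalent Java tzdata package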
While building Hadoop with Maven, I encountered an error: a command in the Ant script exited abnormally.
More details are in the following output.
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 2.372 s]
[INFO] Apache Hadoop KMS .................................. FAILURE [ 5.222 s]
[INFO] Apache Hadoop Common Project ....................... SKIPPED
[INFO] Apache Hadoop HDFS ................................. SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 51.908 s
[INFO] Finished at: 2017-03-18T00:01:56+08:00
[INFO] Final Memory: 77M/771M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (dist) on project hadoop-kms: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec failonerror="true" dir="/Users/jinteng/work/hadoop-compile/hadoop-2.6.5-src/hadoop-common-project/hadoop-kms/target" executable="sh">... @ 10:142 in /Users/jinteng/work/hadoop-compile/hadoop-2.6.5-src/hadoop-common-project/hadoop-kms/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
I am posting this question to share the solution.
Background:
I killed the build because of a slow internet connection while it was downloading Tomcat's tar archive.
Because the download was interrupted, the archive was incomplete, and tomcat-untar.sh threw an exception while trying to untar Tomcat.
Later, I downloaded the archive separately, placed it in the downloads directory, and the build worked.
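For anyone hitting the same failure, a rough sketch of the recovery steps (the downloads path and archive name are assumptions; check the antrun output under hadoop-kms/target for the exact location and Tomcat version):
cd hadoop-common-project/hadoop-kms/downloads   # assumed cache location, relative to the Hadoop source root
ls -lh apache-tomcat-*.tar.gz                   # a truncated download is visibly smaller than the full archive
rm -f apache-tomcat-*.tar.gz                    # remove the partial archive
# either let the build re-download it, or fetch the tarball manually and place it here, then rebuild:
cd -
mvn package -Pdist -DskipTests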
[Run command]
export MAVEN_OPTS="-Xmx1536m -XX:MaxPermSize=512m" && mvn clean install
[Error message]
Running org.apache.atlas.web.adapters.TestEntityREST
Tests run: 11, Failures: 1, Errors: 0, Skipped: 8, Time elapsed: 13.995 sec <<< FAILURE! - in org.apache.atlas.web.adapters.TestEntityREST
cleanup(org.apache.atlas.web.adapters.TestEntityREST)  Time elapsed: 0.217 sec  <<< FAILURE!
java.lang.NullPointerException: null
    at org.apache.atlas.RequestContext.clear(RequestContext.java:97)
    at org.apache.atlas.web.adapters.TestEntityREST.cleanup(TestEntityREST.java:80)
...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Atlas Server Build Tools .................... SUCCESS [ 1.489 s]
[INFO] apache-atlas ....................................... SUCCESS [ 8.805 s]
[INFO] Apache Atlas Integration ........................... SUCCESS [ 42.642 s]
[INFO] Apache Atlas Common ................................ SUCCESS [ 14.276 s]
[INFO] Apache Atlas Typesystem ............................ SUCCESS [01:15 min]
[INFO] Apache Atlas Client ................................ SUCCESS [ 17.168 s]
[INFO] Apache Atlas Server API ............................ SUCCESS [ 8.440 s]
[INFO] Apache Atlas Notification .......................... SUCCESS [ 38.640 s]
[INFO] Apache Atlas Graph Database Projects ............... SUCCESS [ 0.553 s]
[INFO] Apache Atlas Graph Database API .................... SUCCESS [ 5.432 s]
[INFO] Graph Database Common Code ......................... SUCCESS [ 5.636 s]
[INFO] Shaded version of Apache hbase client .............. SUCCESS [ 10.728 s]
[INFO] Apache Atlas Titan 0.5.4 Graph DB Impl ............. SUCCESS [01:58 min]
[INFO] Apache Atlas Graph Database Implementation Dependencies SUCCESS [ 0.846 s]
[INFO] Shaded version of Apache hbase server .............. SUCCESS [ 23.850 s]
[INFO] Apache Atlas Repository ............................ SUCCESS [13:25 min]
[INFO] Apache Atlas Authorization ......................... SUCCESS [ 13.707 s]
[INFO] Apache Atlas Business Catalog ...................... SUCCESS [ 31.334 s]
[INFO] Apache Atlas UI .................................... SUCCESS [01:11 min]
[INFO] Apache Atlas Web Application ....................... FAILURE [04:29 min]
[INFO] Apache Atlas Documentation ......................... SKIPPED
[INFO] Apache Atlas FileSystem Model ...................... SKIPPED
[INFO] Apache Atlas Plugin Classloader .................... SKIPPED
[INFO] Apache Atlas Hive Bridge Shim ...................... SKIPPED
[INFO] Apache Atlas Hive Bridge ........................... SKIPPED
[INFO] Apache Atlas Falcon Bridge Shim .................... SKIPPED
[INFO] Apache Atlas Falcon Bridge ......................... SKIPPED
[INFO] Apache Atlas Sqoop Bridge Shim ..................... SKIPPED
[INFO] Apache Atlas Sqoop Bridge .......................... SKIPPED
[INFO] Apache Atlas Storm Bridge Shim ..................... SKIPPED
[INFO] Apache Atlas Storm Bridge .......................... SKIPPED
[INFO] Apache Atlas Distribution .......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 26:06 min
[INFO] Finished at: 2017-02-09T01:20:53+09:00
[INFO] Final Memory: 165M/1437M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (default-test) on project atlas-webapp: There are test failures.
[ERROR]
[ERROR] Please refer to /Users/dongkillee/dev/bin/atlas/webapp/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :atlas-webapp
This is a test-case error. Use the following command; it will complete your build successfully.
export MAVEN_OPTS="-Xmx1536m -XX:MaxPermSize=512m" && mvn clean install -DskipTests
I have explained the complete solution here. Briefly: because the Maven plugins configured in pom.xml run test cases during the build, we need to add the -DskipTests argument when building Apache Atlas from scratch.
Do the following to build the project without errors:
git clone https://github.com/apache/atlas
mvn clean install -DskipTests
mvn clean package -Pdist -DskipTests
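If the earlier modules have already built successfully, the resume hint from the error output can be combined with the same flag instead of starting over (a sketch; swap install for package depending on which goal you were running):
export MAVEN_OPTS="-Xmx1536m -XX:MaxPermSize=512m"
mvn install -DskipTests -rf :atlas-webapp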
This is the error that I am getting when I am trying to install Oozie.
Hadoop - 2.5.1
Maven - 3.2.3
Oozie - 4.0.0
I am trying to install this in VirtualBox. I have edited the pom.xml file as well. Is there a problem with the versions of Hadoop and Oozie?
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Oozie Main .................................. SUCCESS [ 3.086 s]
[INFO] Apache Oozie Client ................................ SUCCESS [ 25.084 s]
[INFO] Apache Oozie Hadoop 1.1.1.oozie-4.0.0 .............. SUCCESS [ 1.705 s]
[INFO] Apache Oozie Hadoop Distcp 1.1.1.oozie-4.0.0 ....... SUCCESS [ 0.238 s]
[INFO] Apache Oozie Hadoop 1.1.1.oozie-4.0.0 Test ......... SUCCESS [ 0.517 s]
[INFO] Apache Oozie Hadoop 2.2.0.oozie-4.0.0 .............. SUCCESS [01:41 min]
[INFO] Apache Oozie Hadoop 2.2.0.oozie-4.0.0 Test ......... SUCCESS [ 35.359 s]
[INFO] Apache Oozie Hadoop Distcp 2.2.0.oozie-4.0.0 ....... SUCCESS [ 3.244 s]
[INFO] Apache Oozie Hadoop 0.23.5.oozie-4.0.0 ............. SUCCESS [ 5.024 s]
[INFO] Apache Oozie Hadoop 0.23.5.oozie-4.0.0 Test ........ SUCCESS [ 0.432 s]
[INFO] Apache Oozie Hadoop Distcp 0.23.5.oozie-4.0.0 ...... SUCCESS [ 0.275 s]
[INFO] Apache Oozie Hadoop Libs ........................... SUCCESS [ 3.906 s]
[INFO] Apache Oozie Hbase 0.94.2.oozie-4.0.0 .............. SUCCESS [ 0.763 s]
[INFO] Apache Oozie Hbase Libs ............................ SUCCESS [ 1.121 s]
[INFO] Apache Oozie HCatalog 0.5.0.oozie-4.0.0 ............ SUCCESS [ 5.821 s]
[INFO] Apache Oozie HCatalog 0.6.0.oozie-4.0.0 ............ SUCCESS [ 26.194 s]
[INFO] Apache Oozie HCatalog Libs ......................... SUCCESS [ 1.084 s]
[INFO] Apache Oozie Share Lib Oozie ....................... FAILURE [ 10.767 s]
[INFO] Apache Oozie Share Lib HCatalog .................... SKIPPED
[INFO] Apache Oozie Core .................................. SKIPPED
[INFO] Apache Oozie Docs .................................. SKIPPED
[INFO] Apache Oozie Share Lib Pig ......................... SKIPPED
[INFO] Apache Oozie Share Lib Hive ........................ SKIPPED
[INFO] Apache Oozie Share Lib Sqoop ....................... SKIPPED
[INFO] Apache Oozie Share Lib Streaming ................... SKIPPED
[INFO] Apache Oozie Share Lib Distcp ...................... SKIPPED
[INFO] Apache Oozie WebApp ................................ SKIPPED
[INFO] Apache Oozie Examples .............................. SKIPPED
[INFO] Apache Oozie Share Lib ............................. SKIPPED
[INFO] Apache Oozie Tools ................................. SKIPPED
[INFO] Apache Oozie MiniOozie ............................. SKIPPED
[INFO] Apache Oozie Distro ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:53 min
[INFO] Finished at: 2014-10-30T08:29:55+05:30
[INFO] Final Memory: 43M/105M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project oozie-sharelib-oozie: Could not resolve dependencies for project org.apache.oozie:oozie-sharelib-oozie:jar:4.0.0: The following artifacts could not be resolved: org.apache.oozie:oozie-hadoop:jar:2.5.0.oozie-4.0.0, org.apache.oozie:oozie-hadoop-test:jar:2.5.0.oozie-4.0.0: Could not find artifact org.apache.oozie:oozie-hadoop:jar:2.5.0.oozie-4.0.0 in central (http://repo1.maven.org/maven2) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :oozie-sharelib-oozie
I too faced this error while building Oozie 4.0.1 against Hadoop 2.5.1. For that build I had only changed the Hadoop versions in <OOZIE_BUILD_HOME>/pom.xml.
Then I also changed the Hadoop versions in the pom.xml files below, and the error was resolved (see the command-line sketch after the list):
<OOZIE_HOME>/hadooplibs/hadoop-2/pom.xml
<OOZIE_HOME>/hadooplibs/hadoop-distcp-2/pom.xml
<OOZIE_HOME>/hadooplibs/hadoop-test-2/pom.xml
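A command-line sketch of that edit (the $OOZIE_HOME variable and the 2.2.0 to 2.5.1 substitution are assumptions; match whatever versions appear in your main pom.xml and in these three files):
cd "$OOZIE_HOME"/hadooplibs
grep -n '2.2.0' hadoop-2/pom.xml hadoop-distcp-2/pom.xml hadoop-test-2/pom.xml    # confirm where the old version is declared
sed -i 's/2\.2\.0/2.5.1/g' hadoop-2/pom.xml hadoop-distcp-2/pom.xml hadoop-test-2/pom.xml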
+1 on the dependencies issue. Try running the following at the Oozie source root before running mvn:
find . -name pom.xml | xargs sed -ri 's/(2.2.0\-SNAPSHOT)/2.5.1/'
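After either change, rebuild from the Oozie source root so the hadooplibs artifacts are produced with the new version before oozie-sharelib-oozie tries to resolve them (a sketch; mkdistro.sh is the same wrapper script used in the first question above, and -DskipTests is optional):
bin/mkdistro.sh -DskipTests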