How to get sonar to analyze nested files? - maven

Every time I run mvn sonar:sonar it only analyzes the parent project and none of the nested modules.
[INFO] Reactor Summary:
[INFO]
[INFO] NAV :: XXXAppParent ........................... SUCCESS [ 51.557 s]
[INFO] NAV :: XXXNavAppWeb .............................. SKIPPED
[INFO] NAV :: XXX-navapp-web-config ..................... SKIPPED
[INFO] NAV :: XXX-navapp-assets-tar ..................... SKIPPED
How do I get Sonar to analyze all of the child modules and folders?
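One thing worth checking, hedged since the poms and SonarQube setup aren't shown: the sonar:sonar goal performs one aggregated analysis for the whole reactor rather than one per module, so the per-module SKIPPED lines in the reactor summary are not by themselves proof that the children were ignored; the scanner log and the SonarQube UI are the real indicators. The pattern recommended in the SonarScanner for Maven documentation is to build the whole reactor first and then run the analysis from the parent. In the sketch below the host URL and token are placeholders, not values from the question, and the exact property name (sonar.login vs sonar.token) depends on your SonarQube version:

```bash
# Build all modules first so compiled classes exist, then run the aggregated analysis
# from the parent directory. sonar.host.url and the token are hypothetical placeholders.
mvn clean verify sonar:sonar \
    -Dsonar.host.url=http://localhost:9000 \
    -Dsonar.login=<your-token>
```

It is also worth checking that none of the child poms set sonar.skip to true or narrow the analysis via sonar.sources / sonar.exclusions.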

Related

Cannot build Storm - Could not resolve dependencies for project org.apache.storm:storm-autocreds:jar:2.3.0

I have a problem building a raw project which I got from the Storm website.
I've cloned this repo: https://github.com/apache/storm/tree/v2.3.0/examples/storm-starter
and followed the instructions step by step. But it seems I have a problem with dependencies.
Can anybody try to build that repo? I'm using Maven 3.x.
I don't know if some dependencies changed their address in the repositories, or maybe I have something misconfigured.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Storm 2.3.0:
[INFO]
[INFO] Storm .............................................. FAILURE [ 10.277 s]
[INFO] Apache Storm - Checkstyle .......................... SKIPPED
[INFO] Shaded Deps for Storm Client ....................... SKIPPED
[INFO] multilang-javascript ............................... SKIPPED
[INFO] multilang-python ................................... SKIPPED
[INFO] multilang-ruby ..................................... SKIPPED
[INFO] maven-shade-clojure-transformer .................... SKIPPED
[INFO] storm-maven-plugins ................................ SKIPPED
[INFO] Storm Client ....................................... SKIPPED
[INFO] storm-server ....................................... SKIPPED
[INFO] storm-clojure ...................................... SKIPPED
[INFO] Storm Core ......................................... SKIPPED
[INFO] Storm Webapp ....................................... SKIPPED
[INFO] storm-clojure-test ................................. SKIPPED
[INFO] storm-submit-tools ................................. SKIPPED
[INFO] storm-autocreds .................................... SKIPPED
[INFO] storm-hdfs ......................................... SKIPPED
[INFO] storm-hdfs-blobstore ............................... SKIPPED
[INFO] storm-hdfs-oci ..................................... SKIPPED
[INFO] storm-hbase ........................................ SKIPPED
[INFO] storm-hive ......................................... SKIPPED
[INFO] storm-jdbc ......................................... SKIPPED
[INFO] storm-redis ........................................ SKIPPED
[INFO] storm-eventhubs .................................... SKIPPED
[INFO] storm-elasticsearch ................................ SKIPPED
[INFO] storm-solr ......................................... SKIPPED
[INFO] storm-metrics ...................................... SKIPPED
[INFO] storm-cassandra .................................... SKIPPED
[INFO] storm-mqtt ......................................... SKIPPED
[INFO] storm-mongodb ...................................... SKIPPED
[INFO] storm-kafka-client ................................. SKIPPED
[INFO] storm-kafka-migration .............................. SKIPPED
[INFO] storm-opentsdb ..................................... SKIPPED
[INFO] storm-kafka-monitor ................................ SKIPPED
[INFO] storm-kinesis ...................................... SKIPPED
[INFO] storm-jms .......................................... SKIPPED
[INFO] storm-pmml ......................................... SKIPPED
[INFO] storm-rocketmq ..................................... SKIPPED
[INFO] blobstore-migrator ................................. SKIPPED
[INFO] Storm Integration Test ............................. SKIPPED
[INFO] flux ............................................... SKIPPED
[INFO] flux-wrappers ...................................... SKIPPED
[INFO] flux-core .......................................... SKIPPED
[INFO] flux-examples ...................................... SKIPPED
[INFO] storm-sql-runtime .................................. SKIPPED
[INFO] storm-sql-core ..................................... SKIPPED
[INFO] storm-sql-kafka .................................... SKIPPED
[INFO] storm-sql-redis .................................... SKIPPED
[INFO] storm-sql-mongodb .................................. SKIPPED
[INFO] storm-sql-hdfs ..................................... SKIPPED
[INFO] sql ................................................ SKIPPED
[INFO] storm-starter ...................................... SKIPPED
[INFO] storm-loadgen ...................................... SKIPPED
[INFO] storm-mongodb-examples ............................. SKIPPED
[INFO] storm-redis-examples ............................... SKIPPED
[INFO] storm-opentsdb-examples ............................ SKIPPED
[INFO] storm-solr-examples ................................ SKIPPED
[INFO] storm-kafka-client-examples ........................ SKIPPED
[INFO] storm-jdbc-examples ................................ SKIPPED
[INFO] storm-hdfs-examples ................................ SKIPPED
[INFO] storm-hive-examples ................................ SKIPPED
[INFO] storm-elasticsearch-examples ....................... SKIPPED
[INFO] storm-mqtt-examples ................................ SKIPPED
[INFO] storm-pmml-examples ................................ SKIPPED
[INFO] storm-jms-examples ................................. SKIPPED
[INFO] storm-rocketmq-examples ............................ SKIPPED
[INFO] Storm Perf ......................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13.129 s
[INFO] Finished at: 2022-03-03T12:44:23+01:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project storm-autocreds: Could not resolve dependencies for project org.apache.storm:storm-autocreds:jar:2.3.0: Could not transfer artifact javax.jms:jms:jar:1.1 from/to maven-default-http-blocker (http://0.0.0.0/): Blocked mirror for repositories: [datanucleus (http://www.datanucleus.org/downloads/maven2, default, releases), glassfish-repository (http://maven.glassfish.org/content/groups/glassfish, default, disabled), glassfish-repo-archive (http://maven.glassfish.org/content/groups/glassfish, default, disabled), apache.snapshots (http://repository.apache.org/snapshots, default, snapshots)] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <args> -rf :storm-autocreds
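The key part of the error is "Blocked mirror for repositories": since Maven 3.8.1 the installation's default settings ship a maven-default-http-blocker mirror that blocks every external repository served over plain HTTP, and javax.jms:jms:1.1 is only reachable through such HTTP repositories in this dependency chain. One workaround that is commonly reported for this class of error, assuming you accept downloads over unencrypted HTTP, is to redefine that blocker in your own ~/.m2/settings.xml so it no longer matches HTTP repositories (user settings take precedence over the installation settings for a mirror with the same id); another is simply to build with a Maven version older than 3.8.1, or to install the missing artifact into your local repository by hand with mvn install:install-file. A hedged sketch of the settings.xml override:

```xml
<!-- ~/.m2/settings.xml: sketch that neutralizes Maven 3.8.1+'s default HTTP blocking by
     redefining the mirror with the same id but a mirrorOf that matches no real repository.
     Use at your own risk: artifacts will then be fetched over plain HTTP. -->
<settings>
  <mirrors>
    <mirror>
      <id>maven-default-http-blocker</id>
      <mirrorOf>dummy-repo-to-disable-blocking</mirrorOf>
      <name>Override of the default HTTP blocker</name>
      <url>http://0.0.0.0/</url>
    </mirror>
  </mirrors>
</settings>
```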

Maven: Full reactor summary after resuming a build with mvn goal -rf :module

I'm using Maven to manage the build of a large project that is divided into several modules built from the root. In this case, Maven uses the reactor functionality to build each module in the correct order. Something like this (a minimal aggregator pom for this layout is sketched just after it):
root/pom.xml
sp1/pom.xml
sp2/pom.xml
sp3/pom.xml
sp4/pom.xml
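For context, this is roughly how such a root pom declares its modules; the groupId, version, and the assumption that sp1..sp4 are subdirectories of root are made up for illustration:

```xml
<!-- root/pom.xml: minimal aggregator sketch (coordinates are hypothetical) -->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>root</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <modules>
    <module>sp1</module>
    <module>sp2</module>
    <module>sp3</module>
    <module>sp4</module>
  </modules>
</project>
```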
I build the project with Maven from the root directory using the command
mvn clean install
If the build is OK, maven will print the reactor summary:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Root Project ....................................... SUCCESS [ 18.271 s]
[INFO] sp1 ................................................ SUCCESS [ 2.034 s]
[INFO] sp2 ................................................ SUCCESS [ 22.770 s]
[INFO] sp3 ................................................ SUCCESS [03:39 min]
[INFO] sp4 ................................................ SUCCESS [04:39 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 46:34 min
[INFO] Finished at: 2017-10-04T17:08:11+00:00
[INFO] Final Memory: 256M/1599M
[INFO] ------------------------------------------------------------------------
But what happens if a build phase of one of the subprojects fails? For example, some tests in the sp2 subproject fail, so I fix them and relaunch the build from sp2 with this command
mvn clean install -rf :sp2
The project is built correctly. Then Maven prints the reactor summary, but not the full one, only from sp2 onward. Something like this:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] sp2 ................................................ SUCCESS [ 22.770 s]
[INFO] sp3 ................................................ SUCCESS [03:39 min]
[INFO] sp4 ................................................ SUCCESS [04:39 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 46:34 min
[INFO] Finished at: 2017-10-04T17:08:11+00:00
[INFO] Final Memory: 256M/1599M
[INFO] ------------------------------------------------------------------------
My question is: is it possible to obtain the full reactor summary (including the modules that were built in the previous execution)?
The Reactor Build Order is resolved during reactor sorting, which produces a deterministic execution sequence for the list of projects. The --resume-from (-rf) option, as used in:
mvn clean install -rf :sp2
resumes the reactor build from the specified project. The order stays the same as the one already decided; Maven simply traverses the remaining projects in the list, which becomes the updated Reactor Build Order, and the Reactor Summary therefore corresponds to this new, shorter build order.
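In other words, the summary always reflects the reactor that was actually built in the current invocation; I am not aware of a built-in flag that merges it with a previous run. For reference, a few invocation patterns (all flags are standard Maven 3.x options):

```bash
# Resume from sp2: the reactor, and therefore the summary, contains only sp2, sp3, sp4.
mvn clean install -rf :sp2

# Build only sp2 plus the modules that depend on it (also a partial reactor and summary).
mvn clean install -pl :sp2 -amd

# The only way to get a summary covering every module is to run the full reactor again.
mvn clean install
```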

Compilation error Spark 1.3.1

I tried to compile Spark 1.3.1 using the following flags
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
-Dscala-2.11 \
-Phive -Phive-0.13.1 -Phive-thriftserver \
-DskipTests clean package
The compilation failed with the following error.
Any suggestions?
Thanks
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [01:08 min]
[INFO] Spark Project Core ................................. SUCCESS [02:38 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 17.700 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 35.732 s]
[INFO] Spark Project ML Library ........................... SUCCESS [01:11 min]
[INFO] Spark Project Tools ................................ SUCCESS [ 6.718 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 6.837 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 3.534 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 43.771 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 48.411 s]
[INFO] Spark Project SQL .................................. SUCCESS [ 56.046 s]
[INFO] Spark Project Hive ................................. SUCCESS [01:01 min]
[INFO] Spark Project Assembly ............................. FAILURE [ 6.365 s]
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 09:45 min
[INFO] Finished at: 2015-05-15T11:25:53-06:00
[INFO] Final Memory: 77M/1176M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.3.1: Could not find artifact org.apache.spark:spark-hive-thriftserver_2.11:jar:1.3.1 in central (https://repo1.maven.org/maven2) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project spark-assembly_2.10: Could not resolve dependencies for project org.apache.spark:spark-assembly_2.10:pom:1.3.1: Could not find artifact org.apache.spark:spark-hive-thriftserver_2.11:jar:1.3.1 in central (https://repo1.maven.org/maven2)
It seems that no spark-hive-thriftserver_2.11 artifact exists in the Maven repository; I also searched for it manually in Maven Central and found nothing. I think the Thrift server is not ready for Scala 2.11 yet.
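If I remember the Spark 1.3 build documentation correctly, the Scala 2.11 build explicitly did not support the JDBC/Thrift server component, so the two options were to stay on the default Scala 2.10 build or to drop the thrift server profile. A hedged sketch of both, reusing the flags from the question (the change-version script name is taken from the Spark 1.x source tree and may differ between releases):

```bash
# Option 1: keep the Thrift server by building the default Scala 2.10 variant.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    -Phive -Phive-0.13.1 -Phive-thriftserver \
    -DskipTests clean package

# Option 2: keep Scala 2.11 but build without the thrift server profile.
dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Dscala-2.11 \
    -Phive -Phive-0.13.1 \
    -DskipTests clean package
```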

How to build Spark 1.2 with Maven (gives java.io.IOException: Cannot run program "javac")?

I am trying to build Spark 1.2 with Maven. My goal is to use PySpark with YARN on Hadoop 2.2.
I saw that this was only possible by building Spark with Maven. First, is this true?
If it is true, what is the problem in the log below? How do I correct this?
C:\Spark\spark-1.2.0>mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests
clean package
Picked up _JAVA_OPTIONS: -Djava.net.preferIPv4Stack=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Spark Project Parent POM
[INFO] Spark Project Networking
[INFO] Spark Project Shuffle Streaming Service
[INFO] Spark Project Core
[INFO] Spark Project Bagel
[INFO] Spark Project GraphX
[INFO] Spark Project Streaming
[INFO] Spark Project Catalyst
[INFO] Spark Project SQL
[INFO] Spark Project ML Library
[INFO] Spark Project Tools
[INFO] Spark Project Hive
[INFO] Spark Project REPL
[INFO] Spark Project YARN Parent POM
[INFO] Spark Project YARN Stable API
[INFO] Spark Project Assembly
[INFO] Spark Project External Twitter
[INFO] Spark Project External Flume Sink
[INFO] Spark Project External Flume
[INFO] Spark Project External MQTT
[INFO] Spark Project External ZeroMQ
[INFO] Spark Project External Kafka
[INFO] Spark Project Examples
[INFO] Spark Project YARN Shuffle Service
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Parent POM 1.2.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) # spark-parent ---
[INFO] Deleting C:\Spark\spark-1.2.0\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) # spark-parent
---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) # spark-
parent ---
[INFO] Source directory: C:\Spark\spark-1.2.0\src\main\scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) # spark-parent --
-
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) # spark-parent
---
[INFO] No sources to compile
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-scala-test-sources
) # spark-parent ---
[INFO] Test Source directory: C:\Spark\spark-1.2.0\src\test\scala added.
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) # spa
rk-parent ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-dependency-plugin:2.9:build-classpath (default) # spark-parent
---
[INFO] Wrote classpath file 'C:\Spark\spark-1.2.0\target\spark-test-classpath.tx
t'.
[INFO]
[INFO] --- gmavenplus-plugin:1.2:execute (default) # spark-parent ---
[INFO] Using Groovy 2.3.7 to perform execute.
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) # spark-p
arent ---
[INFO]
[INFO] --- maven-shade-plugin:2.2:shade (default) # spark-parent ---
[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO]
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (create-source-jar) # spark-par
ent ---
[INFO]
[INFO] --- scalastyle-maven-plugin:0.4.0:check (default) # spark-parent ---
[WARNING] sourceDirectory is not specified or does not exist value=C:\Spark\spar
k-1.2.0\src\main\scala
Saving to outputFile=C:\Spark\spark-1.2.0\scalastyle-output.xml
Processed 0 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 32 ms
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Networking 1.2.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) # spark-network-common_2
.10 ---
[INFO] Deleting C:\Spark\spark-1.2.0\network\common\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) # spark-networ
k-common_2.10 ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) # spark-
network-common_2.10 ---
[INFO] Source directory: C:\Spark\spark-1.2.0\network\common\src\main\scala adde
d.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) # spark-network-c
ommon_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # spark-netw
ork-common_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\Spark\spark-1.2.0\network\common\s
rc\main\resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) # spark-networ
k-common_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal increm
ental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null
)
[INFO] Compiling 42 Java sources to C:\Spark\spark-1.2.0\network\common\target\s
cala-2.10\classes...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 5.267 s]
[INFO] Spark Project Networking ........................... FAILURE [ 1.922 s]
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Parent POM ...................... SKIPPED
[INFO] Spark Project YARN Stable API ...................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 8.048 s
[INFO] Finished at: 2015-02-09T10:17:47+08:00
[INFO] Final Memory: 49M/331M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compi
le (scala-compile-first) on project spark-network-common_2.10: wrap: java.io.IOE
xception: Cannot run program "javac": CreateProcess error=2, The system cannot f
ind the file specified -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e swit
ch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please rea
d the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionE
xception
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-network-common_2.10
I had first installed the JRE instead of the JDK. My environment variables still referenced the JRE folder, and so the build couldn't find the javac.exe binary.
A quirk of Spark builds is that they can download their own copy of Maven if they decide it is required.
When you run ./build/mvn clean package you are not running Maven directly; you are running a Spark wrapper script. The first thing that script does is check whether your mvn --version is new enough for the version the project requires (which is set in the pom.xml file).
This matters because if you're running an old version of Maven, Spark may download an additional Maven version, install it, and use that instead.
Some key things to check (a quick shell sketch for verifying them follows this list):
When you run ./build/mvn clean package, check which version of Maven it is actually using.
When Maven runs, it does its own traversal to figure out which JAVA_HOME is used.
Before trying to run the Spark build, check that JAVA_HOME is set as an environment variable.
Check that JAVA_HOME points to a full JDK, not just a JRE.
Update your Maven to the latest version (or check that it is at least as new as the version set in the pom.xml in the root directory).
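A quick way to verify those points from a shell (Unix syntax shown; on Windows check %JAVA_HOME% and javac.exe the same way; the example path is only an assumption):

```bash
echo "$JAVA_HOME"                  # should point at a JDK install, e.g. /usr/lib/jvm/java-8-openjdk
"$JAVA_HOME/bin/javac" -version    # a JRE ships java but not javac, so this must succeed
mvn --version                      # prints both the Maven version and the Java home Maven resolved
```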
For this problem you need to set your Java environment path correctly in the .bashrc file. Then make sure Maven is set up correctly and on the path; check it with mvn -version.
After that it should build without this error.
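A minimal sketch of what that could look like in ~/.bashrc, assuming a Linux machine with an OpenJDK and a standalone Maven unpacked under the paths shown (both paths are assumptions; substitute your own):

```bash
# ~/.bashrc additions (paths are hypothetical examples)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

export M2_HOME=/opt/apache-maven-3.6.3
export PATH="$M2_HOME/bin:$PATH"
```

Then open a new shell (or run source ~/.bashrc) and confirm with javac -version and mvn -version.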

Maven build failing when deploying, build succeeds when compiling

I am using GitLab CI to trigger a Maven script when code is pushed.
The script executes:
mvn compile
[...]
[INFO] project ............................................ SUCCESS [1.312s]
[INFO] project-api ........................................ SUCCESS [1.416s]
[INFO] project-api-impl ................................... SUCCESS [0.329s]
[INFO] project-webapp ..................................... SUCCESS [0.192s]
[INFO] project-webapp-exec ................................ SUCCESS [0.026s]
[INFO] project-webapp-it .................................. SUCCESS [2.052s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
mvn tomcat7:deploy
[INFO] project............................................ SUCCESS [1.840s]
[INFO] project-api ........................................ SUCCESS [2.524s]
[INFO] project-api-impl ................................... FAILURE [0.257s]
[INFO] project-webapp ..................................... SKIPPED
[INFO] project-webapp-exec ................................ SKIPPED
[INFO] project-webapp-it .................................. SKIPPED
And I get the following error:
[ERROR] Failed to execute goal on project project-api-impl: Could not resolve dependencies for project eu.project:project-api-impl:jar:0.6-DEVELOPMENT: Failure to find eu.project:project-api:jar:0.6-DEVELOPMENT in http://repo.maven.apache.org/maven2
Since the project builds successfully (the WAR is created), I can't understand why the build fails when executing deploy.
[Also, I am not sure whether to mark this as homework. The assignment's goal is to create a Tomcat web application and manually deploy it to a server to test. Since Maven / continuous integration wasn't a requisite, I believe this should not be marked as a homework question.]
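A likely explanation, hedged since the poms are not shown: mvn compile never installs the module artifacts, and running a bare plugin goal such as tomcat7:deploy does not package the sibling modules either, so Maven has to resolve eu.project:project-api:0.6-DEVELOPMENT from a repository, where it does not exist. Installing the reactor artifacts into the local repository first usually resolves this:

```bash
# Install all reactor modules into the local repository, then run the plugin goal,
# so project-api:0.6-DEVELOPMENT resolves locally instead of from Maven Central.
mvn clean install
mvn tomcat7:deploy

# Or in a single invocation: each module runs the lifecycle up to install before the goal.
mvn clean install tomcat7:deploy
```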
