How to write customized XSLT reports with Maven in TestNG - maven

Instead of generating reports as the default pie charts, I want to customize the reports in my own style, for example bar charts or any other type of representation.
I'm using this code in pom.xml.
Can anyone suggest a way to customize this using XSLT in the POM?
<reporting>
  <plugins>
    <!-- TestNG-xslt related configuration. -->
    <plugin>
      <groupId>org.reportyng</groupId>
      <artifactId>reporty-ng</artifactId>
      <version>1.2</version>
      <configuration>
        <!-- Output directory for the testng xslt report -->
        <outputdir>/target/testng-xslt-report/index.html</outputdir>
        <sorttestcaselinks>true</sorttestcaselinks>
        <testdetailsfilter>FAIL,SKIP,PASS,CONF,BY_CLASS</testdetailsfilter>
        <showruntimetotals>true</showruntimetotals>
        <cssFile>myCustomStyle.css</cssFile>
      </configuration>
    </plugin>
  </plugins>
</reporting>

Instead of customizing ReportNG you can simply develop your own custom HTML result files; then you can have any structure you want.
All you have to do is write your own custom reporter class, keep logging information in it while your test methods are running, and after completion save and close the file in your project folder.
You will be writing more code, but you will have much more flexibility in the report's design.
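A minimal sketch of such a reporter, assuming TestNG is on the classpath; the class name, output file name, and HTML layout are purely illustrative:

import org.testng.IReporter;
import org.testng.ISuite;
import org.testng.ISuiteResult;
import org.testng.ITestContext;
import org.testng.xml.XmlSuite;

import java.io.FileWriter;
import java.io.IOException;
import java.util.List;

public class CustomHtmlReporter implements IReporter {
    @Override
    public void generateReport(List<XmlSuite> xmlSuites, List<ISuite> suites, String outputDirectory) {
        // Write one summary line per <test> in every suite that was run.
        try (FileWriter out = new FileWriter(outputDirectory + "/custom-report.html")) {
            out.write("<html><body><h1>Test Results</h1><ul>");
            for (ISuite suite : suites) {
                for (ISuiteResult result : suite.getResults().values()) {
                    ITestContext ctx = result.getTestContext();
                    out.write("<li>" + ctx.getName()
                            + ": passed=" + ctx.getPassedTests().size()
                            + ", failed=" + ctx.getFailedTests().size()
                            + ", skipped=" + ctx.getSkippedTests().size() + "</li>");
                }
            }
            out.write("</ul></body></html>");
        } catch (IOException e) {
            throw new RuntimeException("Could not write custom report", e);
        }
    }
}

Register the class as a listener (for example in testng.xml or via the @Listeners annotation) and TestNG will call it once all tests have finished; from there you can render bar charts or any other representation you like.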

Related

How to redirect unit test output in Maven

I'm working in a Java 8 project built using Maven. Whenever I run mvn install, the root of my project gets polluted with the output files produced by my unit tests.
How can I redirect that output somewhere else (maybe the target directory) rather than the root directory of my project?
I thought about rewriting the unit tests to point their output to target, but that seems a bit silly to me. Perhaps there is a plugin or a Maven directive which might do what I want to accomplish?
I tried configuring the Surefire plugin but this didn't help :(
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.19.1</version>
  <configuration>
    <!-- Set working directory for content -->
    <workingDirectory>target/test-classes</workingDirectory>
    <useFile>false</useFile>
    <!-- Just set to some large numbers for all tests to work -->
    <argLine>-Xmx1g -Xss1m -XX:MaxPermSize=128m</argLine>
    <skipTests>${skip.unit.tests}</skipTests>
  </configuration>
</plugin>
Looks like my initial attempt was almost there, just upgraded the version and it worked :)
<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.20.1</version>
  <configuration>
    <!-- Set working directory for content -->
    <workingDirectory>target/test-classes</workingDirectory>
    <useFile>false</useFile>
    <!-- Just set to some large numbers for all tests to work -->
    <argLine>-Xmx1g -Xss1m -XX:MaxPermSize=128m</argLine>
    <skipTests>${skip.unit.tests}</skipTests>
  </configuration>
</plugin>
There is also a configuration option, redirectTestOutputToFile, available since version 2.3 for exactly this purpose.
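A minimal sketch of that option in the Surefire configuration (the version shown is just an example):

<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.20.1</version>
  <configuration>
    <!-- Send the stdout/stderr produced by the tests to files under the
         Surefire reports directory instead of the console -->
    <redirectTestOutputToFile>true</redirectTestOutputToFile>
  </configuration>
</plugin>

Note that this redirects what the tests print, not files the tests create themselves; for the latter, the workingDirectory setting shown above is the relevant knob.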

The generic type parameters of 'Map' are missing in Flink-CEP

The code for detecting a pattern in Flink-CEP is shown below:
// Generate temperature warnings for each matched warning pattern
DataStream<TemperatureEvent> warnings = tempPatternStream.select(
    (Map<String, MonitoringEvent> pattern) -> {
        TemperatureEvent first = (TemperatureEvent) pattern.get("first");
        return new TemperatureEvent(first.getRackID(), first.getTemperature());
    }
);
If I build using Command + F9 on a Mac, the following error is shown:
Exception in thread "main" org.apache.flink.api.common.functions.InvalidTypesException: The generic type parameters of 'Map' are missing.
It seems that your compiler has not stored them into the .class file.
Currently, only the Eclipse JDT compiler preserves the type information necessary to use the lambdas feature type-safely.
See the documentation for more information about how to compile jobs containing lambda expressions.
at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameter(TypeExtractor.java:1316)
at org.apache.flink.api.java.typeutils.TypeExtractor.validateLambdaGenericParameters(TypeExtractor.java:1302)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:346)
at org.apache.flink.cep.PatternStream.select(PatternStream.java:64)
at org.stsffap.cep.monitoring.CEPMonitoring.main(CEPMonitoring.java:85
However, building using mvn clean install and then running via Control + R shows the output.
I am wondering why this is happening all the time. Is there any way around it?
PS: I am already using the Eclipse JDT plugin, yet it still shows the error in the log. The relevant contents of the pom.xml are:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.1</version>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
    <compilerId>jdt</compilerId>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>org.eclipse.tycho</groupId>
      <artifactId>tycho-compiler-jdt</artifactId>
      <version>0.21.0</version>
    </dependency>
  </dependencies>
</plugin>
Suggestions are most welcome. Thanks in advance.
I know that Java 8 lambdas are very convenient. However, they provide almost no type information via reflection, which is why Flink has problems generating the underlying serializers. In order to also run your Flink programs in the IDE, I would recommend using Java anonymous classes instead of lambdas whenever generic types are involved.
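A sketch of the select() call from the question rewritten with an anonymous PatternSelectFunction; the event types come from the question, and note that newer Flink versions (1.3+) pass a Map<String, List<MonitoringEvent>> instead, so adjust the signature to your version:

// The anonymous class keeps the generic types in the compiled .class file,
// so the type extractor no longer depends on the compiler preserving lambda generics.
DataStream<TemperatureEvent> warnings = tempPatternStream.select(
    new PatternSelectFunction<MonitoringEvent, TemperatureEvent>() {
        @Override
        public TemperatureEvent select(Map<String, MonitoringEvent> pattern) throws Exception {
            TemperatureEvent first = (TemperatureEvent) pattern.get("first");
            return new TemperatureEvent(first.getRackID(), first.getTemperature());
        }
    }
);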
First, check your JDK version: is it 1.8? Also upgrade the version of tycho-compiler-jdt to 1.0.0; you can refer to the plugin below:
<plugin>
  <!-- Use compiler plugin with tycho as the adapter to the JDT compiler. -->
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
    <compilerId>jdt</compilerId>
  </configuration>
  <dependencies>
    <!-- This dependency provides the implementation of compiler "jdt": -->
    <dependency>
      <groupId>org.eclipse.tycho</groupId>
      <artifactId>tycho-compiler-jdt</artifactId>
      <version>1.0.0</version>
    </dependency>
  </dependencies>
</plugin>
You can refer to the source: https://ci.apache.org/projects/flink/flink-docs-release-1.4/dev/java8.html
After this, all you have to do is build the project on the command line using Maven. Once the program has been built via Maven, you can also run it from within IntelliJ.

Varying plugin configuration for each child module

I have a multi-module build which contains modules that can target either Java 5 or Java 6. I want to allow modules to opt in to Java 6, leaving the default at Java 5.
To set Java 5 as a target I need to configure the following:
maven-compiler-plugin: source and target set to 1.5
maven-bundle-plugin: configure the Bundle-RuntimeExecutionEnvironment to J2SE-1.5
To set Java 6 as a target I need to configure the following:
maven-compiler-plugin: source and target set to 1.6
maven-bundle-plugin: configure the Bundle-RuntimeExecutionEnvironment to JavaSE-1.6
I considered having two properties, java.compiler.source and osgi.bree, which can be defined by each module, but this leaves room for error.
How can I override the configuration of these two plugins per module with a single switch?
I would personally structure your project so that Java 5 modules descend from one parent POM and Java 6 modules from another parent POM; a sketch of one such intermediate parent follows the layout below.
Global parent (majority of global settings)
  Java 5 parent (just defines source/bundle)
    module A
    module B
  Java 6 parent (just defines source/bundle)
    module C
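As a sketch, each intermediate parent would only pin the two settings that differ and leave everything else to the global parent. The values below mirror the question (the bundle header name is taken from it as well); the file path is illustrative:

<!-- java6-parent/pom.xml (illustrative): Java 6 modules declare this POM as their <parent> -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.6</source>
          <target>1.6</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.felix</groupId>
        <artifactId>maven-bundle-plugin</artifactId>
        <configuration>
          <instructions>
            <Bundle-RuntimeExecutionEnvironment>JavaSE-1.6</Bundle-RuntimeExecutionEnvironment>
          </instructions>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>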
How about allowing child modules to set a my.java.version property (or whatever you want it named) and embedding a Groovy script that sets version properties for the compiler and bundle plugins? Something like this in the parent pom:
<project ...>
  ...
  <properties>
    <my.java.version>1.5</my.java.version> <!-- default Java version -->
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>org.codehaus.groovy.maven</groupId>
        <artifactId>gmaven-plugin</artifactId>
        <version>1.0</version>
        <executions>
          <execution>
            <!-- set up properties in an early lifecycle phase -->
            <phase>initialize</phase>
            <goals>
              <goal>execute</goal>
            </goals>
            <configuration>
              <!-- this can be as simple or complex as you need it to be -->
              <source>
                if (project.properties['my.java.version'] == '1.6') {
                  project.properties['my.compiler.version'] = '1.6'
                  project.properties['my.execution.environment.version'] = 'JavaSE-1.6'
                }
                else {
                  project.properties['my.compiler.version'] = '1.5'
                  project.properties['my.execution.environment.version'] = 'J2SE-1.5'
                }
              </source>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
    <!-- now use the properties from above in the plugin configurations -->
    <!-- assume that both of these plugins will execute in a phase later than 'initialize' -->
    <pluginManagement>
      <plugins>
        <plugin>
          <artifactId>maven-compiler-plugin</artifactId>
          <configuration>
            <source>${my.compiler.version}</source>
            <target>${my.compiler.version}</target>
          </configuration>
        </plugin>
        <plugin>
          <groupId>org.apache.felix</groupId>
          <artifactId>maven-bundle-plugin</artifactId>
          <configuration>
            <!-- sorry if this part isn't correct; never used this plugin before -->
            <instructions>
              <Bundle-RuntimeExecutionEnvironment>${my.execution.environment.version}</Bundle-RuntimeExecutionEnvironment>
            </instructions>
          </configuration>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>
I don't think there is an elegant Maven way to solve this complex scenario; neither your nor Duncan's proposed solution is easily maintainable, IMO, once the number of sub-modules becomes large.
For maximum maintainability, I would write a shell script (and/or batch file on Windows) where Maven can't do the job well: for example, a set-version.sh (and set-version.bat) that loops over all sub-modules and resets the default java.compiler.source and osgi.bree properties based on a version-feed.txt. The version-feed.txt gives you a single central place for managing your version variations. The downside is that this is really not a Maven solution: it requires running set-version.sh before mvn ... every time version customization is required.
In addition, for build/release standardization, I would use the maven-enforcer-plugin to pause the build process based on a property version.set (which is flagged by set-version.sh) and print a warning/error message if a developer does not follow the correct procedure when building. The version.set property also gives you flexibility: if you prefer to use the default values defined in every sub-module instead of running set-version.sh, just set it to true directly in the parent pom.xml or via a command-line parameter.
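A minimal sketch of that enforcer check, assuming the version.set convention described above (the rule message and the plugin version are illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.4.1</version>
  <executions>
    <execution>
      <id>require-version-set</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <requireProperty>
            <!-- set-version.sh flips this to true; otherwise the build stops here -->
            <property>version.set</property>
            <regex>true</regex>
            <regexMessage>Run set-version.sh (or set-version.bat) before building.</regexMessage>
          </requireProperty>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>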
Sample directory structure:
parent/
  module-a/
  module-b/
  module-c/
  ... ...
  pom.xml
  set-version.sh
  set-version.bat
  version-feed.txt
Hope this makes sense.

Killing Javadoc warnings for specific tags

Is there an easy way / option to prevent Javadoc warnings when building with Maven? We use the Soyatec UML plugin for Eclipse and it inserts special tags which make our builds throw lots of annoying warnings; I've looked around, including on the Soyatec site, and have come up empty.
#uml.property is an unknown tag
The only answer I could find to this is Configuring Custom Javadoc Tags.
An example could look like this:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <version>2.8.1</version>
      <configuration>
        <tags>
          <tag>
            <name>uml.property</name>
            <!-- The value X makes javadoc ignore the tag -->
            <placement>X</placement>
          </tag>
          <tag>
            <name>some.other.property</name>
            <placement>X</placement>
          </tag>
          <tag>
            <name>some.third.property</name>
            <placement>X</placement>
          </tag>
        </tags>
      </configuration>
    </plugin>
  </plugins>
</build>
When running you will see this in the output:
mvn javadoc:javadoc
<lots of output>
Note: Custom tags that were not seen: #uml.property
<maybe more output>
And you can disable non-error and non-warning messages by using this command:
mvn javadoc:javadoc -Dquiet
It might be a hard job to define all these tags, but once it is done you will no longer see the warnings.
You should probably define these custom tags in a parent POM that everyone can use, so the whole team benefits from the work.

How to control project info generation only for the parent module

<reporting>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-project-info-reports-plugin</artifactId>
      <version>2.4</version>
    </plugin>
    ...
  </plugins>
</reporting>
I am on a multi-module project and would like to know how to generate this project information only for the parent and not for the child modules which inherit from the parent. Should I be setting inherited or aggregated to false?
This question does not really make sense, since the purpose of a parent project is to establish a link between modules. It is possible to select a set of reports to generate, but there is no option to run it only on a subset of modules.
If you are using inheritance, you may want to restructure your project using an aggregation pom.xml. This way you would be able to run reports on a given module (or on all of them if you run it on the aggregate).
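A rough sketch of such a standalone aggregator (all coordinates and module names are illustrative); the listed modules do not declare it as their <parent>, so they inherit nothing from it, and reports can be run on the aggregator itself or on any single module:

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>aggregator</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>

  <!-- Aggregation only: build/report these modules together -->
  <modules>
    <module>module-a</module>
    <module>module-b</module>
  </modules>

  <reporting>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-project-info-reports-plugin</artifactId>
        <version>2.4</version>
      </plugin>
    </plugins>
  </reporting>
</project>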
If this does not answer your question, can you clarify? Thanks.
