How does the Surefire plugin decide on the test framework?

I have been trying to understand how the Surefire plugin internally decides which testing framework to use (TestNG, JUnit Jupiter, JUnit 4, etc.).
Does it use reflection to check for the presence of each framework on the classpath?
(Looking at the dependencies, Surefire seems to come with JUnit 4 among its transitive dependencies: junit:junit:jar:4.12.)

It's possible to pass the provider (test-framework type) explicitly by setting an additional plugin dependency, e.g. for TestNG:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>3.0.0-M5</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.maven.surefire</groupId>
            <artifactId>surefire-testng</artifactId>
            <version>3.0.0-M5</version>
        </dependency>
    </dependencies>
</plugin>
If nothing is specified, Surefire automatically selects the test-framework provider based on the version of TestNG/JUnit present on your project's classpath. From this doc:
https://maven.apache.org/surefire/maven-surefire-plugin/examples/providers.html
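For example, if the project declares only a JUnit 5 engine on its test classpath, Surefire picks the JUnit Platform provider on its own (a minimal sketch; the version number is illustrative):
<dependency>
    <groupId>org.junit.jupiter</groupId>
    <artifactId>junit-jupiter-engine</artifactId>
    <version>5.7.2</version>
    <scope>test</scope>
</dependency>
With this in place and no provider configured on the plugin, the build log should report the auto-detected provider (see the logging code below).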
How the Surefire plugin internally decides which testing framework to use
Let's look at how it's implemented.
There is a ProviderInfo interface with the method boolean isApplicable();
ProviderInfo.java
I've found multiple implementations in the class AbstractSurefireMojo.java
AbstractSurefireMojo.java
for:
TestNgProviderInfo
JUnit3ProviderInfo
JUnit4ProviderInfo
JUnitPlatformProviderInfo
JUnitCoreProviderInfo
DynamicProviderInfo
And there is also a protected method protected List<ProviderInfo> createProviders( TestClassPath testClasspath ) which references all of these implementations:
protected List<ProviderInfo> createProviders( TestClassPath testClasspath )
    throws MojoExecutionException
{
    Artifact junitDepArtifact = getJunitDepArtifact();
    return providerDetector.resolve( new DynamicProviderInfo( null ),
                                     new JUnitPlatformProviderInfo( getJUnit5Artifact(), testClasspath ),
                                     new TestNgProviderInfo( getTestNgArtifact() ),
                                     new JUnitCoreProviderInfo( getJunitArtifact(), junitDepArtifact ),
                                     new JUnit4ProviderInfo( getJunitArtifact(), junitDepArtifact ),
                                     new JUnit3ProviderInfo() );
}
and the ProviderDetector class invokes isApplicable() on each ProviderInfo in its resolve method.
ProviderDetector.java
And it looks like the first applicable provider is selected:
private Optional<ProviderInfo> autoDetectOneWellKnownProvider( ProviderInfo... wellKnownProviders )
{
    Optional<ProviderInfo> providerInfo = stream( wellKnownProviders )
            .filter( ProviderInfo::isApplicable )
            .findFirst();
    providerInfo.ifPresent( p -> logger.info( "Using auto detected provider " + p.getProviderName() ) );
    return providerInfo;
}
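The individual isApplicable() implementations are mostly null checks on the artifacts resolved from the project's classpath. A simplified sketch of the TestNG case (paraphrased from AbstractSurefireMojo, not the verbatim source):
// Sketch: the TestNG provider applies only when a TestNG artifact
// was resolved from the project's test classpath.
class TestNgProviderInfo implements ProviderInfo
{
    private final Artifact testNgArtifact;

    TestNgProviderInfo( Artifact testNgArtifact )
    {
        this.testNgArtifact = testNgArtifact;
    }

    @Override
    public boolean isApplicable()
    {
        return testNgArtifact != null;
    }
}
Since findFirst() takes the first applicable provider, the order of the arguments to providerDetector.resolve(...) above is significant: with, say, both JUnit 4 and JUnit 5 on the classpath, the JUnit Platform provider wins because it is checked earlier.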

Related

How can I ensure that the right bytecode is available to my custom sonar plugin rule, so I don't get !unknown! for every type?

I've been attempting to write a custom rules plugin for SonarQube ~5.4, and while I've gotten a few rules implemented and working, the ones that involve types outside the standard libraries currently fall back on various kinds of acrobatic string matching.
I'm using the sonar-packaging-maven-plugin to do the packaging:
<plugin>
    <groupId>org.sonarsource.sonar-packaging-maven-plugin</groupId>
    <artifactId>sonar-packaging-maven-plugin</artifactId>
    <version>1.16</version>
    <configuration>
        <pluginClass>${project.groupId}.sonar.BravuraRulesPlugin</pluginClass>
        <pluginKey>SonarPluginBravura</pluginKey>
        <skipDependenciesPackaging>false</skipDependenciesPackaging>
        <basePlugin>java</basePlugin>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>sonar-plugin</goal>
            </goals>
        </execution>
    </executions>
</plugin>
And am running the various checks using the following helper extension (kotlin):
fun <T : JavaFileScanner> T.verify() {
    val workDir = System.getProperty("user.dir");
    val folder = Paths.get(workDir, "src/test/samples", this.javaClass.simpleName);
    Files.list(folder).forEach { sample ->
        try {
            if (sample.toString().endsWith(".clean.java")) {
                JavaCheckVerifier.verifyNoIssue(sample.toString(), this);
            } else {
                JavaCheckVerifier.verify(sample.toString(), this);
            }
        } catch (error: Exception) {
            throw VerificationFailedException(sample, error);
        }
    }
};

class VerificationFailedException(path: Path, error: Exception)
    : Exception("Failed to verify $path.", error);
I create an IssuableSubscriptionVisitor subclass for the rule, and visit Tree.Kind.METHOD_INVOCATION, looking for uses of a static MAX, MIN, ASC, or DESC sql builder method being passed an AutoLongColumn. This is to stop the identifier field being used for ordering purposes.
Unfortunately, even though I have the requisite library on the maven 'test' classpath, when I try and get any of the types, they just show as !unknown!.
override fun visitNode(tree: Tree) {
    if (tree !is MethodInvocationTree) {
        return;
    }
    val methodSelect = tree.methodSelect();
    if (methodSelect !is IdentifierTree || methodSelect.name() !in setOf("MAX", "MIN", "ASC", "DESC")) {
        return;
    }
    val firstArg = tree.arguments().first();
    if (firstArg !is MethodInvocationTree) {
        return;
    }
    val firstArgSelect = firstArg.methodSelect();
    if (firstArgSelect !is MemberSelectExpressionTree) {
        return;
    }
    if (firstArgSelect.type is UnknownType) {
        throw TableFlipException("(ノಥ益ಥ)ノ ┻━┻");
    }
    // It never gets here.
}
I'm sure I'm missing some vital piece of the puzzle, and I'd appreciate if someone can tell me where I'm going wrong.
EDIT: I'm using org.sonarsource.java:sonar-java-plugin:3.14 for the analyser, and while I can't release all the code for the analysis target (commercial IP and all that), here's something structurally identical to the key part:
import static com.library.UtilClass.MAX;
...
query.SELECT(biggestId = MAX(address._id())) // Noncompliant
     .FROM(address)
     .WHERE(address.user_id().EQ(userId)
         .AND(address.type_id().EQ(typeId)));
...
The type of address._id() is a com.library.Identifier that wraps a long. I'd like to be able to visit all the method invocations, check whether they match com.library.UtilClass.MAX, and if so, make sure that the first parameter isn't a com.library.Identifier. Without the type information, I have to do a regex match on _id method references, which is prone to missing things.
So, it turns out that the way to get the types available is to use Maven (or whatever tool you're using) to copy the needed jars into a directory, turn the lot into a list of files, and pass that to the test verifier.
For example, let's pretend we're trying to find usages of joda-time:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>2.10</version>
    <executions>
        <execution>
            <id>copy-libs</id>
            <phase>generate-test-resources</phase>
            <goals>
                <goal>copy</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>joda-time</groupId>
                        <artifactId>joda-time</artifactId>
                        <version>2.9.4</version>
                    </artifactItem>
                </artifactItems>
            </configuration>
        </execution>
    </executions>
</plugin>
This execution will put the joda-time jar into the target/dependency directory. Next, enumerate the jars in that directory and add them to your test verification (we're assuming you named your verifier 'JodaChecker'):
// Not at all necessary, but it makes the code later on a lot easier to read.
fun <T> Stream<T>.toList(): List<T> = this.collect({
    mutableListOf()
}, { list, item ->
    list.add(item)
}, { list, otherList ->
    list.addAll(otherList)
})
...
val workDir = System.getProperty("user.dir")
val sampleFile = Paths.get(workDir, "src/test/samples/JodaSample.java").toString()
val dependencies = Files.list(Paths.get(workDir, "target/dependency"))
.map { it.toFile() }.toList()
JavaCheckVerifier.verify(sampleFile, JodaChecker(), dependencies)
Once you've done that, debugging through the tests will show that the joda-time classes are available during analysis.
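And with the semantic model in place, the string matching from the question can become a real type check. A sketch (in Java; com.library.Identifier is the hypothetical type from the question, and symbolType() is the sonar-java semantic API):
@Override
public void visitNode(Tree tree) {
    MethodInvocationTree invocation = (MethodInvocationTree) tree;
    ExpressionTree firstArg = invocation.arguments().get(0);
    // symbolType() resolves only because the dependency jars were passed
    // to the verifier; without them it would report !unknown!.
    if (firstArg.symbolType().is("com.library.Identifier")) {
        reportIssue(firstArg, "Do not use the identifier column for ordering.");
    }
}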

How to make CodeNarc force a Maven build to fail

I'm trying to integrate CodeNarc into a Maven-based project and I've been running into problems.
I want to use a custom ruleset, and when a rule is violated, I'd like my Maven build to fail.
How can I configure CodeNarc so that rule violations lead to a failure when I run the following?
mvn clean install
Also, the documentation for configuring CodeNarc in a POM doesn't explain how to reference where my custom ruleset is. Any advice for how to set that up? Thanks!
When I run mvn clean install with the configurations below (I have a Groovy file with blatant violations of my ruleset), my build succeeds. :(
I tried referencing my own ruleset, and no violations were produced.
When I took the rulesetfiles property out of the POM, it started producing violations (but then I don't get to choose my own rules).
Does anyone know how to make it actually read a custom ruleset file? I tried with both XML and Groovy.
Here's my ruleset and the plugin config from my POM:
<ruleset xmlns="http://codenarc.org/ruleset/1.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://codenarc.org/ruleset/1.0 http://codenarc.org/ruleset-schema.xsd"
         xsi:noNamespaceSchemaLocation="http://codenarc.org/ruleset-schema.xsd">
    <description>Dummy rule set</description>
    <rule class='org.codenarc.rule.formatting.SpaceAfterIf'>
        <property name='priority' value='1'/>
    </rule>
    <rule class='org.codenarc.rule.basic.EmptyIfStatement'>
        <property name='priority' value='1'/>
    </rule>
</ruleset>
I referenced this ruleset in my POM like this:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>codenarc-maven-plugin</artifactId>
    <version>0.18-1</version>
    <configuration>
        <sourceDirectory>${basedir}/src/test/groovy</sourceDirectory>
        <maxPriority1Violations>0</maxPriority1Violations>
        <maxPriority2Violations>0</maxPriority2Violations>
        <maxPriority3Violations>0</maxPriority3Violations>
        <rulesetfiles>${basedir}/rulesets/ruleset.xml</rulesetfiles>
        <xmlOutputDirectory>${basedir}/</xmlOutputDirectory>
    </configuration>
    <executions>
        <execution>
            <id>execution1</id>
            <phase>install</phase>
            <goals>
                <goal>codenarc</goal>
            </goals>
        </execution>
    </executions>
</plugin>
I was struggling with the same thing some time ago. I remember it was possible to run it properly with Maven, but I don't have that config anymore. Why? Because CodeNarc needs to compile your sources for the purpose of executing some rules, but the CodeNarc Maven plugin doesn't pass the classpath, so compilation was failing.
So I went for a different approach, which is running CodeNarc from a test source with an Ant task. It looks like this:
import spock.lang.Specification

class GroovyCodeNarcStaticAnalysisRunner extends Specification {

    private static final GROOVY_FILES = '**/*.groovy'
    private static final ANALYSIS_SCOPE = 'src/main/groovy'
    private static final RULESET_LOCATION = 'file:tools/static-analysis/codenarc.xml'
    private static final HTML_REPORT_FILE = 'target/codenarc-result.html'
    private static final XML_REPORT_FILE = 'target/codenarc-result.xml'

    def 'Groovy code should meet coding standards'() {
        given:
        def ant = new AntBuilder()
        ant.taskdef(name: 'codenarc', classname: 'org.codenarc.ant.CodeNarcTask')

        expect:
        ant.codenarc(
                ruleSetFiles: RULESET_LOCATION,
                maxPriority1Violations: 0,
                maxPriority2Violations: 0,
                maxPriority3Violations: 0)
        {
            fileset(dir: ANALYSIS_SCOPE) {
                include(name: GROOVY_FILES)
            }
            report(type: 'text') {
                option(name: 'writeToStandardOut', value: true)
            }
            report(type: 'xml') {
                option(name: 'outputFile', value: XML_REPORT_FILE)
            }
            report(type: 'html') {
                option(name: 'outputFile', value: HTML_REPORT_FILE)
            }
        }
    }
}
You don't need to use Spock's Specification for that; any test runner will do. On the Maven side it's enough to configure the CodeNarc dependency with scope test.
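For example (a sketch; choose the CodeNarc version that matches your ruleset):
<dependency>
    <groupId>org.codenarc</groupId>
    <artifactId>CodeNarc</artifactId>
    <version>0.25.2</version>
    <scope>test</scope>
</dependency>
The Ant task then runs during mvn test, and exceeding any of the configured maxPriority*Violations thresholds fails the test, which in turn fails the build.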

How can I provide custom logic in a Maven archetype?

I'm interested in creating a Maven archetype, and I think I have most of the basics down. However, one thing I'm stuck on is that sometimes I want to use custom logic to fill in a template. For example, if somebody generates my archetype and specifies the artifactId as hello-world, I'd like to generate a class named HelloWorld that simply prints out "Hello World!" to the console. If another person generates it with artifactId = howdy-there, the generated class would be HowdyThere and it would print out "Howdy There!".
I know that under the covers, Maven's archetype mechanism leverages the Velocity Template Engine, so I read this article on creating custom directives. This seemed to be what I was looking for, so I created a class called HyphenatedToCamelCaseDirective that extends org.apache.velocity.runtime.directive.Directive. In that class, my getName() implementation returns "hyphenatedCamelCase". In my archetype-metadata.xml file, I have the following...
<requiredProperties>
    <requiredProperty key="userdirective">
        <defaultValue>com.jlarge.HyphenatedToCamelCaseDirective</defaultValue>
    </requiredProperty>
</requiredProperties>
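For reference, the directive class itself looks roughly like this (a simplified sketch, not the exact code):
import java.io.IOException;
import java.io.Writer;

import org.apache.velocity.context.InternalContextAdapter;
import org.apache.velocity.runtime.directive.Directive;
import org.apache.velocity.runtime.parser.node.Node;

public class HyphenatedToCamelCaseDirective extends Directive {

    @Override
    public String getName() {
        return "hyphenatedToCamelCase";
    }

    @Override
    public int getType() {
        return LINE; // an inline directive, no #end block
    }

    @Override
    public boolean render(InternalContextAdapter context, Writer writer, Node node) throws IOException {
        // Evaluate the directive's single argument, e.g. 'howdy-there'.
        String value = String.valueOf(node.jjtGetChild(0).value(context));
        StringBuilder result = new StringBuilder();
        boolean upperNext = true;
        for (char c : value.toCharArray()) {
            if (c == '-') {
                upperNext = true;
            } else {
                result.append(upperNext ? Character.toUpperCase(c) : c);
                upperNext = false;
            }
        }
        writer.write(result.toString()); // howdy-there -> HowdyThere
        return true;
    }
}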
My template class looks like this...
package ${package};

public class #hyphenatedToCamelCase('$artifactId') {
    // userdirective = $userdirective
    public static void main(String[] args) {
        System.out.println("#hyphenatedToCamelCase('$artifactId')");
    }
}
After I install my archetype and then do an archetype:generate by specifying artifactId = howdy-there and groupId = f1.f2, the resulting class looks like this...
package f1.f2;

public class #hyphenatedToCamelCase('howdy-there') {
    // userdirective = com.jlarge.HyphenatedToCamelCaseDirective
    public static void main(String[] args) {
        System.out.println("#hyphenatedToCamelCase('howdy-there')");
    }
}
The result shows that even though userdirective is being set the way I expected it to, it's not evaluating the #hyphenatedToCamelCase directives like I was hoping. In the directive class, I have the render method logging a message to System.out, but that message doesn't show up in the console, so that leads me to believe the method never gets executed during archetype:generate.
Am I missing something simple here, or is this approach just not the way to go?
The requiredProperties section of the archetype-metadata.xml is used to pass additional properties to the Velocity context; it is not meant for passing Velocity engine configuration. So setting a property called userdirective will only make the variable $userdirective available, and will not add a custom directive to the Velocity engine.
If you look at the source code, the Velocity engine used by the maven-archetype plugin does not depend on any external property source for its configuration. The code that generates the project relies on an autowired (by the Plexus container) implementation of VelocityComponent.
This is the code where the velocity engine is initialized:
public void initialize()
    throws InitializationException
{
    engine = new VelocityEngine();

    // avoid "unable to find resource 'VM_global_library.vm' in any resource loader."
    engine.setProperty( "velocimacro.library", "" );

    engine.setProperty( RuntimeConstants.RUNTIME_LOG_LOGSYSTEM, this );

    if ( properties != null )
    {
        for ( Enumeration e = properties.propertyNames(); e.hasMoreElements(); )
        {
            String key = e.nextElement().toString();
            String value = properties.getProperty( key );
            engine.setProperty( key, value );
            getLogger().debug( "Setting property: " + key + " => '" + value + "'." );
        }
    }

    try
    {
        engine.init();
    }
    catch ( Exception e )
    {
        throw new InitializationException( "Cannot start the velocity engine: ", e );
    }
}
There is a hacky way of adding your custom directive. The properties you see above are read from the components.xml file in plexus-velocity-1.1.8.jar, so open this file and add your configuration property:
<component-set>
    <components>
        <component>
            <role>org.codehaus.plexus.velocity.VelocityComponent</role>
            <role-hint>default</role-hint>
            <implementation>org.codehaus.plexus.velocity.DefaultVelocityComponent</implementation>
            <configuration>
                <properties>
                    <property>
                        <name>resource.loader</name>
                        <value>classpath,site</value>
                    </property>
                    ...
                    <property>
                        <name>userdirective</name>
                        <value>com.jlarge.HyphenatedToCamelCaseDirective</value>
                    </property>
                </properties>
            </configuration>
        </component>
    </components>
</component-set>
Next, add your custom directive class file to this jar and run archetype:generate.
As you can see, this is very fragile, and you will need to figure out a way to distribute the hacked plexus-velocity jar. Depending on what you are planning to use this archetype for, it might be worth the effort.

How to programmatically list all transitive dependencies, including overridden ones in Maven using DependencyGraphBuilder?

This is similar to other questions (like this), but I want to be able to do this with the latest API's. The maven-dependency-plugin:tree verbose option has been deprecated and does nothing in the latest (2.5.1) code, so there is no good example of how to do it.
I believe the Aether utility class from jcabi-aether can help you get a list of all dependencies of any Maven artifact, for example:
File repo = new File(this.session.getLocalRepository().getBasedir());
Collection<Artifact> deps = new Aether(this.getProject(), repo).resolve(
    new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
    JavaScopes.RUNTIME
);
If you're outside of a Maven plugin:
File repo = new File("/tmp/local-repository");
MavenProject project = new MavenProject();
project.setRemoteProjectRepositories(
    Arrays.asList(
        new RemoteRepository(
            "maven-central",
            "default",
            "http://repo1.maven.org/maven2/"
        )
    )
);
Collection<Artifact> deps = new Aether(project, repo).resolve(
    new DefaultArtifact("junit", "junit-dep", "", "jar", "4.10"),
    "runtime"
);
The only dependency you need is:
<dependency>
    <groupId>com.jcabi</groupId>
    <artifactId>jcabi-aether</artifactId>
    <version>0.7.5</version>
</dependency>
Including my approach here, as the additional steps may become part of your actual use case, especially if you're working on a composite or multi-module project.
(Maven 3; my runtime was 3.6; no direct dependency on Aether.)
In my case I wanted to resolve the dependency tree of a specific artifact foo-runtime from inside my plugin; however, some of the dependency versions were only available in its parent foo-parent POM (i.e. absent from foo-runtime's own POM). The parent POM also had additional details, such as exclusions for some of foo-runtime's dependencies - via dependencyManagement.
So I had to:
explicitly load the parent's model,
link the child model to it,
fill in the missing version numbers of the child model (still not sure why Maven didn't automatically resolve these after linking the parent), and then
run dependency resolution for the child.
To avoid building the model from scratch, I derived the model of foo-runtime using an existing artifact foo-api (which in my case is always guaranteed to be present in the Maven project being built). All these artifacts share the same groupId.
@Component
public LifecycleDependencyResolver resolver;
// ...
// `artifacts` contains all artifacts of current/reactor `MavenProject` obtained via `project.getArtifacts()`
private Set<Artifact> resolveRuntimeDeps(Set<Artifact> artifacts) throws MojoExecutionException {
    // foo-api will always be present; use it to derive coordinates for foo-runtime
    Artifact fooApi = artifacts.stream().filter(artifact -> "foo-api".equals(artifact.getArtifactId()))
            .findFirst().orElseThrow(() -> new MojoExecutionException("Unable to find foo-api"));
    Collection<String> scopes = Arrays.asList("compile", "runtime");
    MavenProject fooRoot = deriveProject(fooApi, "foo-parent");
    Model fooRootPom = fooRoot.getModel();
    MavenProject fooSrv = deriveProject(fooApi, "foo-runtime");
    fooSrv.setParent(fooRoot);
    // some foo-runtime deps depend on versions declared on parent pom; merge them
    Map<String, Artifact> depMgt = fooRootPom.getDependencyManagement().getDependencies().stream()
            .collect(Collectors.toMap(dep -> dep.getGroupId() + ":" + dep.getArtifactId() + ":" + dep.getType(), this::toArtifact));
    for (Dependency d : fooSrv.getDependencies()) {
        if (d.getVersion() == null) {
            Artifact managed = depMgt.get(d.getGroupId() + ":" + d.getArtifactId() + ":" + d.getType());
            if (managed != null) {
                d.setVersion(managed.getVersion());
            }
        }
    }
    try {
        resolver.resolveProjectDependencies(fooSrv, scopes, scopes, session, false, Collections.emptySet());
        return fooSrv.getArtifacts();
    } catch (LifecycleExecutionException e) {
        throw new MojoExecutionException("Error resolving foo-runtime dependencies", e);
    }
}
// load POM for another artifact based on foo-api JAR available in current project
private MavenProject deriveProject(Artifact fooApi, String artifactId) throws MojoExecutionException {
    Model pom;
    String pomPath = fooApi.getFile().getAbsolutePath().replaceAll("foo-api", artifactId).replaceAll("\\.jar$", ".pom");
    try (InputStream fooRootPomData = new FileInputStream(pomPath)) {
        pom = new MavenXpp3Reader().read(fooRootPomData);
        pom.setPomFile(new File(pomPath));
    } catch (IOException | XmlPullParserException e) {
        throw new MojoExecutionException("Error loading " + artifactId + " metadata", e);
    }
    // set these params to avoid skips/errors during resolution
    MavenProject proj = new MavenProject(pom);
    proj.setArtifact(toArtifact(pom));
    proj.setArtifactFilter(Objects::nonNull);
    proj.setRemoteArtifactRepositories(Collections.emptyList());
    return proj;
}

private Artifact toArtifact(Model model) {
    return new DefaultArtifact(
            Optional.ofNullable(model.getGroupId()).orElseGet(() -> model.getParent().getGroupId()), model.getArtifactId(),
            Optional.ofNullable(model.getVersion()).orElseGet(() -> model.getParent().getVersion()), "compile", model.getPackaging(), null,
            project.getArtifact().getArtifactHandler());
}

private Artifact toArtifact(Dependency dep) {
    return new DefaultArtifact(dep.getGroupId(), dep.getArtifactId(), dep.getVersion(), dep.getScope(), dep.getType(), dep.getClassifier(),
            project.getArtifact().getArtifactHandler());
}
(I tried almost all the other suggested approaches, but all of them ended up with some error or another. Now, looking back, I suspect many of those errors were due to the fact that my leaf POM was missing version numbers for some artifacts. It seems, acceptably so, that the "model enrichment" phase - propagating parent versions etc. - is carried out by some earlier component in Maven's flow, and the caller has to take care of this, at least partially, when invoking the dependency resolver from scratch.)

How to get the super POM basedir in a child module POM?

I want to define a local repository in my maven project.
I've got a super pom and several child modules. My file structure is :
/root
    /repository
    /child
        pom.xml
    pom.xml
in my super pom I define :
<repository>
    <id>my-local-repo</id>
    <url>file://${basedir}/repository</url>
</repository>
The problem is that in my child pom, the repository defined in my super pom refers to /root/child/repository and so, dependencies cannot be found...
Is there a way to define a path always relative to the super pom ?
If not, what's the best way to solve the problem ?
In this case, at first you could try ${project.parent.basedir}.
Since that doesn't seem to work, the simple (and native) way is to use the complete path (/root/...) or a relative path (../) instead of the ${basedir} variable.
But for me, a great solution would be to externalize this configuration into a properties file.
You can use the properties-maven-plugin (http://mojo.codehaus.org/properties-maven-plugin/plugin-info.html).
With this plugin, properties defined in the properties file can be read just like properties defined inside pom.xml.
From the plugin site:
If you have a properties file called teams.properties with this content:
toronto=raptors
miami=heat
it would be the same as declaring the following in your pom.xml:
<properties>
    <toronto>raptors</toronto>
    <miami>heat</miami>
</properties>
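To load such a file during the build, bind the plugin's read-project-properties goal to an early phase (a sketch; the version and file path are illustrative):
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>properties-maven-plugin</artifactId>
    <version>1.0.0</version>
    <executions>
        <execution>
            <phase>initialize</phase>
            <goals>
                <goal>read-project-properties</goal>
            </goals>
            <configuration>
                <files>
                    <file>${basedir}/../teams.properties</file>
                </files>
            </configuration>
        </execution>
    </executions>
</plugin>
Each module then reads the same physical file through a relative path, regardless of what its own ${basedir} resolves to.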
${project.parent.basedir} should do the job.
Or you can set the basedir path of the root in a property so it will be inherited. Something like this in the parent:
<properties>
    <rootPath>${basedir}</rootPath>
</properties>
And in the Child
<repository>
    <id>my-local-repo</id>
    <url>file://${rootPath}/repository</url>
</repository>
I solved this many times with the Groovy plugin. Add a file called "basepath_marker" to your super pom's directory and add the following to your pom. You can access the property like this: ${base-path}. Read this blog post for more details.
Example:
...
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.gmaven</groupId>
            <artifactId>groovy-maven-plugin</artifactId>
            <executions>
                <!-- set absolute base path from super pom -->
                <execution>
                    <id>find-basepath</id>
                    <phase>validate</phase>
                    <goals>
                        <goal>execute</goal>
                    </goals>
                    <configuration>
                        <source>
                            <![CDATA[
                            import java.io.File;
                            log.info('## define projects super pom absolute path through basepath_marker')
                            String p = "basepath_marker";
                            File f = null;
                            if( p != null ) {
                                def _max_child_poms = 0
                                while( _max_child_poms++ < 5 ) {
                                    f = new File( p );
                                    if( f.exists() ) {
                                        break;
                                    }
                                    p = "../" + p;
                                }
                            }
                            if( f != null ) {
                                String basePath = f.getCanonicalPath();
                                basePath = basePath.substring( 0, basePath.lastIndexOf( File.separator ) );
                                project.properties['base-path'] = basePath.replace( '\\' , '/');
                                log.info(' - used base path = ' + project.properties['base-path'] );
                            } else {
                                log.error( 'Could not find basepath_marker marker file!' );
                                System.exit( 0 );
                            }
                            ]]>
                        </source>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
...
I tried ${basedir}/../ in the child pom and it works.
${project.parent.basedir} could not be interpreted.
The following also did not work; it seems ${basedir} is resolved dynamically per module:
define a property <rootPath>${basedir}</rootPath> in your parent pom
use ${rootPath} in your child pom
In the parent POM -
Try to use a relative path (../) instead of using ${basedir}
