How can I ensure that the right bytecode is available to my custom sonar plugin rule, so I don't get !unknown! for every type? - sonarqube

I've been attempting to write a custom rules plugin for SonarQube ~5.4, and while I've gotten a few rules implemented and working, the ones that deal with types outside the standard libraries have to fall back on various kinds of acrobatic string matching.
I'm using the sonar-packaging-maven-plugin to do the packaging:
<plugin>
<groupId>org.sonarsource.sonar-packaging-maven-plugin</groupId>
<artifactId>sonar-packaging-maven-plugin</artifactId>
<version>1.16</version>
<configuration>
<pluginClass>${project.groupId}.sonar.BravuraRulesPlugin</pluginClass>
<pluginKey>SonarPluginBravura</pluginKey>
<skipDependenciesPackaging>false</skipDependenciesPackaging>
<basePlugin>java</basePlugin>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>sonar-plugin</goal>
</goals>
</execution>
</executions>
</plugin>
I'm running the various checks using the following helper extension function (Kotlin):
fun <T : JavaFileScanner> T.verify() {
val workDir = System.getProperty("user.dir");
val folder = Paths.get(workDir, "src/test/samples", this.javaClass.simpleName);
Files.list(folder).forEach { sample ->
try {
if (sample.toString().endsWith(".clean.java")) {
JavaCheckVerifier.verifyNoIssue(sample.toString(), this);
} else {
JavaCheckVerifier.verify(sample.toString(), this);
}
} catch (error: Exception) {
throw VerificationFailedException(sample, error);
}
}
};
class VerificationFailedException(path: Path, error: Exception)
: Exception("Failed to verify $path.", error);
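For what it's worth, a rule test then just calls this extension on a check instance, along these lines (the check and test class names here are made up):
import org.junit.Test

class SelectMaxIdentifierCheckTest {
    @Test
    fun verifySamples() {
        SelectMaxIdentifierCheck().verify()
    }
}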
I create an IssuableSubscriptionVisitor subclass for the rule and visit Tree.Kind.METHOD_INVOCATION, looking for uses of a static MAX, MIN, ASC, or DESC SQL builder method being passed an AutoLongColumn. This is to stop the identifier field from being used for ordering purposes.
Unfortunately, even though I have the requisite library on the Maven 'test' classpath, when I try to get any of the types, they just show up as !unknown!.
override fun visitNode(tree: Tree) {
if (tree !is MethodInvocationTree) {
return;
}
val methodSelect = tree.methodSelect();
if (methodSelect !is IdentifierTree || methodSelect.name() !in setOf("MAX", "MIN", "ASC", "DESC")) {
return;
}
val firstArg = tree.arguments().first();
if (firstArg !is MethodInvocationTree) {
return;
}
val firstArgSelect = firstArg.methodSelect();
if (firstArgSelect !is MemberSelectExpressionTree) {
return;
}
if (firstArgSelect.symbolType().isUnknown()) {
throw TableFlipException("(ノಥ益ಥ)ノ ┻━┻");
}
// It never gets here.
}
I'm sure I'm missing some vital piece of the puzzle, and I'd appreciate if someone can tell me where I'm going wrong.
EDIT: I'm using org.sonarsource.java:sonar-java-plugin:3.14 for the analyser, and while I can't release all the code for the analysis target (commercial IP and all that), here's something structurally identical to the key part:
import static com.library.UtilClass.MAX;
...
query.SELECT(biggestId = MAX(address._id())) // Noncompliant
.FROM(address)
.WHERE(address.user_id().EQ(userId)
.AND(address.type_id().EQ(typeId)));
...
The type of address._id() is a com.library.Identifier that wraps a long. I'd like to be able to visit all the method invocations, check whether they match com.library.UtilClass.MAX, and if so, make sure that the first parameter isn't a com.library.Identifier. Without the type information, I have to do a regex match on _id method references, which is prone to missing things.
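What I'd like to be able to write, once the types resolve, is roughly the following. This is only a sketch: it assumes the sonar-java semantic API (symbol(), owner(), type(), symbolType()) can actually see the library bytecode, and the issue message is made up.
override fun visitNode(tree: Tree) {
    val invocation = tree as? MethodInvocationTree ?: return
    val select = invocation.methodSelect() as? IdentifierTree ?: return
    // Only interested in the statically imported com.library.UtilClass.MAX
    if (select.name() != "MAX" || !select.symbol().owner().type().`is`("com.library.UtilClass")) {
        return
    }
    // Flag calls whose first argument is the identifier wrapper type
    val firstArg = invocation.arguments().firstOrNull() ?: return
    if (firstArg.symbolType().`is`("com.library.Identifier")) {
        reportIssue(firstArg, "Don't order by the identifier column.")
    }
}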

So, it turns out that the way to make the types available is to use Maven (or whatever build tool you're using) to copy the needed jars into a directory, turn the lot into a list of files, and pass that list to the test verifier.
For example, let's pretend we're trying to find usages of joda-time:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.10</version>
<executions>
<execution>
<id>copy-libs</id>
<phase>generate-test-resources</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.4</version>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
This execution will put the joda-time jar into the target/dependency directory. Next, you make sure to enumerate the jars in that directory and add them to your test verification (we're assuming you named your verifier 'JodaChecker'):
// Not at all necessary, but it makes the code later on a lot easier to read.
fun <T> Stream<T>.toList(): List<T> = this.collect({
mutableListOf()
}, { list, item ->
list.add(item)
}, { list, otherList ->
list.addAll(otherList)
})
...
val workDir = System.getProperty("user.dir")
val sampleFile = Paths.get(workDir, "src/test/samples/JodaSample.java").toString()
val dependencies = Files.list(Paths.get(workDir, "target/dependency"))
.map { it.toFile() }.toList()
JavaCheckVerifier.verify(sampleFile, JodaChecker(), dependencies)
Once you've done that, debugging through the tests will show that the joda-time classes are available during analysis.
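To make the payoff concrete, here's a minimal sketch of what a type-based check can now do, assuming JodaChecker is an IssuableSubscriptionVisitor (the rule logic and message are illustrative only, not from the original):
import org.sonar.plugins.java.api.IssuableSubscriptionVisitor
import org.sonar.plugins.java.api.tree.MethodInvocationTree
import org.sonar.plugins.java.api.tree.Tree

class JodaChecker : IssuableSubscriptionVisitor() {
    override fun nodesToVisit(): List<Tree.Kind> = listOf(Tree.Kind.METHOD_INVOCATION)

    override fun visitNode(tree: Tree) {
        val invocation = tree as MethodInvocationTree
        // With the joda-time jar passed to the verifier, this resolves to the real type
        // instead of !unknown!.
        if (invocation.symbolType().`is`("org.joda.time.DateTime")) {
            reportIssue(invocation, "Prefer java.time over Joda-Time.")
        }
    }
}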

Related

Correct way to use shaded jar as dependency

I have a foo project which shades protobuf
<relocation>
<pattern>com.google</pattern>
<shadedPattern>shade.foo.com.google</shadedPattern>
</relocation>
<relocation>
<pattern>com.google.protobuf</pattern>
<shadedPattern>shade.foo.com.google.protobuf</shadedPattern>
</relocation>
Then I want to include the generated uber jar as a dependency in another project, bar.
com.google.protobuf.CodedInputStream input = com.google.protobuf.CodedInputStream.newInstance(buf);
com.google.protobuf.ExtensionRegistryLite extensionRegistry = com.google.protobuf.ExtensionRegistryLite.newInstance();
com.google.protobuf.Parser parserOk = com.bar.something.parser();
shade.foo.com.google.protobuf.Parser shadedParser = com.foo.something.parser();
input.readMessage(parserOk, extensionRegistry); // This is OK.
input.readMessage(shadedParser, extensionRegistry); // This throws error: Cannot resolve method 'readMessage'
So what's the correct way to reference methods from the shaded jar? Should I name all the references starting with com.foo?

How does Surefire decide on the test framework?

I have been trying to understand how the Surefire plugin internally decides which testing framework to use (TestNG, Jupiter, JUnit4, etc.).
Does it use reflection to detect the presence of each framework on the classpath?
(Looking at the dependencies, Surefire seems to come with JUnit4 among its transitive dependencies: junit:junit:jar:4.12.)
It's possible to choose the provider (test-framework type) explicitly by adding a plugin dependency, e.g. for TestNG:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M5</version>
<dependencies>
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-testng</artifactId>
<version>3.0.0-M5</version>
</dependency>
</dependencies>
</plugin>
If nothing is specified, Surefire normally selects the test-framework provider automatically, based on the version of TestNG/JUnit present in your project's classpath.
From this doc:
https://maven.apache.org/surefire/maven-surefire-plugin/examples/providers.html
As for how the Surefire plugin internally decides which testing framework to use, let's look at how it's implemented.
There is a ProviderInfo interface with the method boolean isApplicable():
ProviderInfo.java
I've found multiple implementations in the class AbstractSurefireMojo.java
AbstractSurefireMojo.java
for:
TestNgProviderInfo
JUnit3ProviderInfo
JUnit4ProviderInfo
JUnitPlatformProviderInfo
JUnitCoreProviderInfo
DynamicProviderInfo
And there is also a protected method protected List<ProviderInfo> createProviders( TestClassPath testClasspath ) which references all of these implementations.
protected List<ProviderInfo> createProviders( TestClassPath testClasspath )
throws MojoExecutionException
{
Artifact junitDepArtifact = getJunitDepArtifact();
return providerDetector.resolve( new DynamicProviderInfo( null ),
new JUnitPlatformProviderInfo( getJUnit5Artifact(), testClasspath ),
new TestNgProviderInfo( getTestNgArtifact() ),
new JUnitCoreProviderInfo( getJunitArtifact(), junitDepArtifact ),
new JUnit4ProviderInfo( getJunitArtifact(), junitDepArtifact ),
new JUnit3ProviderInfo() );
}
and the ProviderDetector class invokes isApplicable() on each ProviderInfo in its resolve method.
ProviderDetector.java
And it looks like the first applicable provider is selected:
private Optional<ProviderInfo> autoDetectOneWellKnownProvider( ProviderInfo... wellKnownProviders )
{
Optional<ProviderInfo> providerInfo = stream( wellKnownProviders )
.filter( ProviderInfo::isApplicable )
.findFirst();
providerInfo.ifPresent( p -> logger.info( "Using auto detected provider " + p.getProviderName() ) );
return providerInfo;
}

How to make CodeNarc force maven build to fail

I'm trying to integrate CodeNarc into a maven based project and I've been running into problems.
I want to use a custom ruleset, and when a rule is violated, I'd like my maven build to fail.
How can I configure codenarc so that violations of rules lead to a failure when I run the following?
mvn clean install
Also, the documentation for configuring CodeNarc in a POM doesn't explain how to reference where my custom ruleset is. Any advice for how to set that up? Thanks!
When I run mvn clean install with the configuration below (I have a Groovy file with blatant violations of my ruleset), my build succeeds. :(
I tried referencing my own ruleset and no violations were produced.
When I took away the rulesetfiles property in the POM, it started producing violations (but then I don't get to choose my own rules).
Does anyone know how to make it actually read a custom ruleset file? I tried with both XML and Groovy.
Here's my ruleset and plugin config from my POM:
<ruleset xmlns="http://codenarc.org/ruleset/1.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://codenarc.org/ruleset/1.0 http://codenarc.org/ruleset-schema.xsd"
xsi:noNamespaceSchemaLocation="http://codenarc.org/ruleset-schema.xsd">
<description>Dummy rule set</description>
<rule class='org.codenarc.rule.formatting.SpaceAfterIf'>
<property name='priority' value='1'/>
</rule>
<rule class='org.codenarc.rule.basic.EmptyIfStatement'>
<property name='priority' value='1'/>
</rule>
</ruleset>
I referenced this ruleset in my POM like this:
<groupId>org.codehaus.mojo</groupId>
<artifactId>codenarc-maven-plugin</artifactId>
<version>0.18-1</version>
<configuration>
<sourceDirectory>${basedir}/src/test/groovy</sourceDirectory>
<maxPriority1Violations>0</maxPriority1Violations>
<maxPriority2Violations>0</maxPriority2Violations>
<maxPriority3Violations>0</maxPriority3Violations>
<rulesetfiles>${basedir}/rulesets/ruleset.xml</rulesetfiles>
<xmlOutputDirectory>${basedir}/</xmlOutputDirectory>
</configuration>
<executions>
<execution>
<id>execution1</id>
<phase>install</phase>
<goals>
<goal>codenarc</goal>
</goals>
</execution>
</executions>
I was struggling with the same thing some time ago. I remember it was possible to run it properly with Maven, but I no longer have that config. Why did I drop it? Because CodeNarc needs to compile your sources in order to execute some of its rules, but the codenarc maven plugin doesn't pass the classpath, so compilation was failing.
So I went for a different approach, which is running CodeNarc from a test source via an Ant task. It looks like this:
import spock.lang.Specification
class GroovyCodeNarcStaticAnalysisRunner extends Specification {
private static final GROOVY_FILES = '**/*.groovy'
private static final ANALYSIS_SCOPE = 'src/main/groovy'
private static final RULESET_LOCATION = 'file:tools/static-analysis/codenarc.xml'
private static final HTML_REPORT_FILE = 'target/codenarc-result.html'
private static final XML_REPORT_FILE = 'target/codenarc-result.xml'
def 'Groovy code should meet coding standards'() {
given:
def ant = new AntBuilder()
ant.taskdef(name: 'codenarc', classname: 'org.codenarc.ant.CodeNarcTask')
expect:
ant.codenarc(
ruleSetFiles: RULESET_LOCATION,
maxPriority1Violations: 0,
maxPriority2Violations: 0,
maxPriority3Violations: 0)
{
fileset(dir: ANALYSIS_SCOPE) {
include(name: GROOVY_FILES)
}
report(type: 'text') {
option(name: 'writeToStandardOut', value: true)
}
report(type: 'xml') {
option(name: 'outputFile', value: XML_REPORT_FILE)
}
report(type: 'html') {
option(name: 'outputFile', value: HTML_REPORT_FILE)
}
}
}
}
You don't need to use Spock's Specification for that; any test runner will do. On the Maven side, it's enough to add the CodeNarc dependency with scope test.

How can I use Jenkins to run my integration tests in parallel?

Right now we've got a project that builds in two jobs: 1) the standard build with unit tests, and 2) the integration tests. They work like this:
build the whole project, run unit tests, start integration test job
build the whole project, deploy it to the integration server, run client side integration tests against integration server
The problem is that step 2) now takes over an hour to run, and I'd like to parallelize the integration tests so that they take less time. But I'm not exactly sure how I can/should do this. My first thought is that I could have two step 2)s, like this:
build the whole project, run unit tests, start integration test job
build the whole project, deploy it to the integration server1, run client side integration tests against integration server1
build the whole project, deploy it to the integration server2, run client side integration tests against integration server2
But then, how do I run half the integration tests on integration server1 and the other half on integration server2? I am using Maven, so I could probably figure out something with failsafe and a complex includes/excludes pattern, but that sounds like something that would take a lot of effort to maintain. E.g. when someone adds a new integration test class, how do I ensure that it gets run on one of the two servers? Does the developer have to modify the Maven patterns?
I found this great article on how to do this, but it gives a way to do it in Groovy code. I pretty much followed these steps, but I haven't written the code to distribute the tests evenly by duration. But this is still a useful tool so I'll share it.
import junit.framework.JUnit4TestAdapter;
import junit.framework.TestSuite;
import org.junit.Ignore;
import org.junit.extensions.cpsuite.ClassesFinder;
import org.junit.extensions.cpsuite.ClasspathFinderFactory;
import org.junit.extensions.cpsuite.SuiteType;
import org.junit.runner.RunWith;
import org.junit.runners.AllTests;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
@RunWith(AllTests.class)
public class DistributedIntegrationTestRunner {
private static Logger log = LoggerFactory.getLogger(DistributedIntegrationTestRunner.class);
public static TestSuite suite() {
TestSuite suite = new TestSuite();
ClassesFinder classesFinder = new ClasspathFinderFactory().create(true,
new String[]{".*IntegrationTest.*"},
new SuiteType[]{SuiteType.TEST_CLASSES},
new Class[]{Object.class},
new Class[]{},
"java.class.path");
int nodeNumber = systemPropertyInteger("node.number", "0");
int totalNodes = systemPropertyInteger("total.nodes", "1");
List<Class<?>> allTestsSorted = getAllTestsSorted(classesFinder);
allTestsSorted = filterIgnoredTests(allTestsSorted);
List<Class<?>> myTests = getMyTests(allTestsSorted, nodeNumber, totalNodes);
log.info("There are " + allTestsSorted.size() + " tests to choose from and I'm going to run " + myTests.size() + " of them.");
for (Class<?> myTest : myTests) {
log.info("I will run " + myTest.getName());
suite.addTest(new JUnit4TestAdapter(myTest));
}
return suite;
}
private static int systemPropertyInteger(String propertyKey, String defaultValue) {
String slaveNumberString = System.getProperty(propertyKey, defaultValue);
return Integer.parseInt(slaveNumberString);
}
private static List<Class<?>> filterIgnoredTests(List<Class<?>> allTestsSorted) {
ArrayList<Class<?>> filteredTests = new ArrayList<Class<?>>();
for (Class<?> aTest : allTestsSorted) {
if (aTest.getAnnotation(Ignore.class) == null) {
filteredTests.add(aTest);
}
}
return filteredTests;
}
/*
TODO: make this algorithm less naive. Sort each test by run duration as described here: http://blog.tradeshift.com/just-add-servers/
*/
private static List<Class<?>> getAllTestsSorted(ClassesFinder classesFinder) {
List<Class<?>> allTests = classesFinder.find();
Collections.sort(allTests, new Comparator<Class<?>>() {
@Override
public int compare(Class<?> o1, Class<?> o2) {
return o1.getSimpleName().compareTo(o2.getSimpleName());
}
});
return allTests;
}
private static List<Class<?>> getMyTests(List<Class<?>> allTests, int nodeNumber, int totalNodes) {
List<Class<?>> myTests = new ArrayList<Class<?>>();
for (int i = 0; i < allTests.size(); i++) {
Class<?> thisTest = allTests.get(i);
if (i % totalNodes == nodeNumber) {
myTests.add(thisTest);
}
}
return myTests;
}
}
The ClasspathFinderFactory is used to find all test classes that match the .*IntegrationTest.* pattern.
I make N jobs that all run this Runner, but each uses a different value for the node.number system property, so each job runs a different set of tests. This is how the failsafe plugin configuration looks:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.12.4</version>
<executions>
<execution>
<id>integration-tests</id>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
<configuration>
<includes>
<include>**/DistributedIntegrationTestRunner.java</include>
</includes>
<skipITs>${skipITs}</skipITs>
</configuration>
</plugin>
The ClasspathFinderFactory comes from
<dependency>
<groupId>cpsuite</groupId>
<artifactId>cpsuite</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
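Each Jenkins job then runs the same build but passes a different value for node.number, something like the following (an assumed invocation; it relies on the -D properties being visible to the forked test JVM):
mvn clean verify -Dtotal.nodes=2 -Dnode.number=0
mvn clean verify -Dtotal.nodes=2 -Dnode.number=1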
I think there should be some Jenkins plugin for this, but I haven't been able to find one. Something that's close is the Parallel Test Executor, but I don't think this does the same thing I need. It looks like it runs all the tests on a single job/server instead of multiple servers. It doesn't provide an obvious way to say, "run these tests here, and those tests there".
I believe you've already found a solution by now, but I'll leave a pointer for others who open this page with the same question:
Parallel test executor plugin:
"This plugin adds a new builder that lets you easily execute tests defined in a separate job in parallel. This is achieved by having Jenkins look at the test execution time of the last run, split tests into multiple units of roughly equal size, then execute them in parallel."
https://wiki.jenkins-ci.org/display/JENKINS/Parallel+Test+Executor+Plugin
Yes, Parallel Test Executor is a cool plugin if you've got two slaves, or one slave with eight executors, because the plugin is based on "test splitting". E.g. if you split your JUnit tests into four groups, those groups will run on four different executors on the slave you specified. In short, it depends on the number of executors on the slave where you want to run the parallel tests; otherwise you should decrease the split count from four to two.

How to get the super pom basedir in a child module pom?

I want to define a local repository in my maven project.
I've got a super pom and several child modules. My file structure is:
/root
    /repository
    /child
        pom.xml
    pom.xml
In my super pom I define:
<repository>
<id>my-local-repo</id>
<url>file://${basedir}/repository</url>
</repository>
The problem is that in my child pom, the repository defined in my super pom refers to /root/child/repository and so, dependencies cannot be found...
Is there a way to define a path always relative to the super pom ?
If not, what's the best way to solve the problem ?
In this case, you could first try ${project.parent.basedir}.
Since that doesn't seem to work, the simple (and native) way is to use the complete path (/root/...) or a relative path (../) instead of the ${basedir} variable.
But for me, a better solution would be to externalize this configuration into a properties file.
You can use the properties-maven-plugin (http://mojo.codehaus.org/properties-maven-plugin/plugin-info.html).
With this plugin, properties defined in the properties file can be read just like properties defined inside pom.xml.
From the plugin site:
If you have a properties file called teams.properties with this content:
toronto=raptors
miami=heat
it would be the same as declaring the following in your pom.xml:
<properties>
<toronto>raptors</toronto>
<miami>heat</miami>
</properties>
${project.parent.basedir} should do the job.
Or you can put the root's basedir path in a property, so it will be inherited. Something like this in the parent:
<properties>
<rootPath>${basedir}</rootPath>
</properties>
And in the child:
<repository>
<id>my-local-repo</id>
<url>file://${rootPath}/repository</url>
</repository>
I have solved this many times with the groovy plugin. Add a file called "basepath_marker" to your super pom's directory and add the following to your pom. You can then access the property like this: ${base-path}. Read this blog post for more details.
Example:
...
<build>
<plugins>
<plugin>
<groupId>org.codehaus.gmaven</groupId>
<artifactId>groovy-maven-plugin</artifactId>
<executions>
<!-- set absolute base path from super pom -->
<execution>
<id>find-basepath</id>
<phase>validate</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<source>
<![CDATA[
import java.io.File;
log.info('## define projects super pom absolute path through basepath_marker')
String p = "basepath_marker";
File f = null;
if( p != null ) {
def _max_child_poms = 0
while( _max_child_poms++ < 5 ) {
f = new File( p );
if( f.exists() ) {
break;
}
p = "../" + p;
}
}
if( f != null ) {
String basePath = f.getCanonicalPath();
basePath = basePath.substring( 0, basePath.lastIndexOf( File.separator ) );
project.properties['base-path'] = basePath.replace( '\\' , '/');
log.info(' - used base path = ' + project.properties['base-path'] );
} else {
log.error( 'Could not find basepath_marker marker file!' );
throw new RuntimeException( 'Could not find basepath_marker marker file!' ); // fail the build (System.stop does not exist)
}
]]>
</source>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
...
I tried ${basedir}/../ in the child pom and it works.
${project.parent.basedir} could not be interpreted.
The following also does not work, since ${basedir} seems to be resolved dynamically per module:
define a <rootPath>${basedir}</rootPath> property in your parent pom
use ${rootPath} in your child pom
In the parent pom, try to use a relative path (../) instead of ${basedir}.
