How to make CodeNarc force a Maven build to fail

I'm trying to integrate CodeNarc into a Maven-based project and I've been running into problems. I want to use a custom ruleset, and when a rule is violated, I'd like my Maven build to fail.
How can I configure CodeNarc so that rule violations lead to a failure when I run the following?
mvn clean install
Also, the documentation for configuring CodeNarc in a POM doesn't explain how to reference my custom ruleset. Any advice on how to set that up? Thanks!
When I run mvn clean install with the configuration below (against a Groovy file containing blatant violations of my ruleset), my build succeeds. :(
When I referenced my own ruleset, no violations were produced. When I took the rulesetfiles property out of the POM, it started producing violations (but then I don't get to choose my own rules).
Does anyone know how to make it actually read a custom ruleset file? I tried with both XML and Groovy.
Here's my ruleset and plugin config from my POM:
<ruleset xmlns="http://codenarc.org/ruleset/1.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://codenarc.org/ruleset/1.0 http://codenarc.org/ruleset-schema.xsd"
xsi:noNamespaceSchemaLocation="http://codenarc.org/ruleset-schema.xsd">
<description>Dummy rule set</description>
<rule class='org.codenarc.rule.formatting.SpaceAfterIf'>
<property name='priority' value='1'/>
</rule>
<rule class='org.codenarc.rule.basic.EmptyIfStatement'>
<property name='priority' value='1'/>
</rule>
</ruleset>
I referenced this ruleset in my POM like this:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>codenarc-maven-plugin</artifactId>
<version>0.18-1</version>
<configuration>
<sourceDirectory>${basedir}/src/test/groovy</sourceDirectory>
<maxPriority1Violations>0</maxPriority1Violations>
<maxPriority2Violations>0</maxPriority2Violations>
<maxPriority3Violations>0</maxPriority3Violations>
<rulesetfiles>${basedir}/rulesets/ruleset.xml</rulesetfiles>
<xmlOutputDirectory>${basedir}/</xmlOutputDirectory>
</configuration>
<executions>
<execution>
<id>execution1</id>
<phase>install</phase>
<goals>
<goal>codenarc</goal>
</goals>
</execution>
</executions>
</plugin>

I was struggling with the same thing some time ago. I remember it was possible to run it properly with Maven, but I no longer have that config. Why? Because CodeNarc needs to compile your sources in order to execute some of its rules, but the CodeNarc Maven plugin doesn't pass the classpath along, so compilation was failing.
So I went for a different approach: running CodeNarc from a test source using an Ant task. It looks like this:
import spock.lang.Specification
class GroovyCodeNarcStaticAnalysisRunner extends Specification {
private static final GROOVY_FILES = '**/*.groovy'
private static final ANALYSIS_SCOPE = 'src/main/groovy'
private static final RULESET_LOCATION = 'file:tools/static-analysis/codenarc.xml'
private static final HTML_REPORT_FILE = 'target/codenarc-result.html'
private static final XML_REPORT_FILE = 'target/codenarc-result.xml'
def 'Groovy code should meet coding standards'() {
given:
def ant = new AntBuilder()
ant.taskdef(name: 'codenarc', classname: 'org.codenarc.ant.CodeNarcTask')
expect:
ant.codenarc(
ruleSetFiles: RULESET_LOCATION,
maxPriority1Violations: 0,
maxPriority2Violations: 0,
maxPriority3Violations: 0)
{
fileset(dir: ANALYSIS_SCOPE) {
include(name: GROOVY_FILES)
}
report(type: 'text') {
option(name: 'writeToStandardOut', value: true)
}
report(type: 'xml') {
option(name: 'outputFile', value: XML_REPORT_FILE)
}
report(type: 'html') {
option(name: 'outputFile', value: HTML_REPORT_FILE)
}
}
}
}
You don't need to use Spock's Specification for that; any test runner will do. On the Maven side it's enough to declare the CodeNarc dependency with scope test, as in the sketch below.
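A minimal sketch of that dependency (the version is an assumption; use whichever CodeNarc release matches your ruleset):
<dependency>
    <groupId>org.codenarc</groupId>
    <artifactId>CodeNarc</artifactId>
    <!-- assumed version; any recent release should work -->
    <version>0.25.2</version>
    <scope>test</scope>
</dependency>
Because the analysis runs as an ordinary test, any violation count above the configured maxima makes the Ant task throw, which fails the test and therefore fails mvn clean install.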

Related

How to Pass the features folder dynamically to a Jenkins->Maven->Testng Runner File

In my Jenkins pipeline I am invoking my tests like this:
"mvn test -Drun_location=US"
And my pom.xml file looks like this
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.16</version>
<configuration>
<systemPropertyVariables>
<run.location>${run_location}</run.location>
</systemPropertyVariables>
<suiteXmlFiles>
<suiteXmlFile>${basedir}/testng.xml</suiteXmlFile>
</suiteXmlFiles>
<skipTests>false</skipTests>
<testFailureIgnore>false</testFailureIgnore>
</configuration>
</plugin>
And my testng.xml file looks like this
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite>
<test>
<parameter name="location" value="${run_location}"/>
<classes>
<class name="com.test.Runner_Jenkins"/>
</classes>
</test>
</suite>
I need to pass this parameter to my runner file to set the feature directory dynamically. How do I accomplish this?
@CucumberOptions (
features = {"features/${run_location}"},
glue = "StepDefinitions"
)
There are two ways of achieving your requirement.
The first option is to use Jenkins build parameters to pass your run_location. Then, in an Execute Shell build step that runs before the Maven build, you can use the sed command to substitute this parameter into your pom.xml, feature file, and testng.xml; sed replaces existing text with new text. A sketch follows below.
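For example, a hedged sketch of such a substitution (the placeholder token @RUN_LOCATION@ and the file path are illustrative; this assumes testng.xml contains a literal token rather than ${run_location}):
# Jenkins "Execute shell" build step, run before the Maven build
sed -i "s/@RUN_LOCATION@/${run_location}/g" testng.xml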
The second option is to pass the run_location as a Jenkins build parameter, receive it inside your Java code, and add it as a parameter to your TestNG XML file dynamically. This is called programmatic execution of TestNG. The code below reads an existing testng.xml file and adds a parameter based on your Jenkins input; you can then receive this value inside any of your tests using the snippet at the end.
Note that this approach will not modify your pom.xml or feature file; during execution you still have to get that value into your code via Java.
Programmatic execution of TestNG with a dynamic Jenkins parameter:
import java.io.File;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import org.apache.commons.io.FileUtils;
import org.testng.TestNG;
import org.testng.xml.Parser;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlTest;

// Read the location passed from Jenkins, e.g. mvn test -Drun_location=US
String run_location = System.getProperty("run_location");
TestNG tng = new TestNG();
File initialFile = new File("testng.xml");
InputStream inputStream = FileUtils.openInputStream(initialFile);
// Parse the existing testng.xml into suite objects
Parser p = new Parser(inputStream);
List<XmlSuite> suites = p.parseToList();
List<XmlSuite> modifiedSuites = new ArrayList<>();
for (XmlSuite suite : suites) {
    // Copy each suite, carrying over its original settings
    XmlSuite modifiedSuite = new XmlSuite();
    modifiedSuite.setParallel(suite.getParallel());
    modifiedSuite.setThreadCount(suite.getThreadCount());
    modifiedSuite.setName(suite.getName());
    modifiedSuite.setListeners(suite.getListeners());
    for (XmlTest test : suite.getTests()) {
        // Recreate each <test> with the run_location parameter injected
        XmlTest modifiedTest = new XmlTest(modifiedSuite);
        modifiedTest.setName(test.getName());
        HashMap<String, String> parametersMap = new HashMap<>();
        parametersMap.put("run_location", run_location);
        modifiedTest.setParameters(parametersMap);
        modifiedTest.setXmlClasses(test.getXmlClasses());
    }
    modifiedSuites.add(modifiedSuite);
}
inputStream.close();
tng.setXmlSuites(modifiedSuites);
tng.run();
Accessing the parameter value inside your tests:
@Test
@Parameters(value = {"run_location"})
public void executeBeforeTest(String runLocation) {
    /* Your code goes here */
}

How can I ensure that the right bytecode is available to my custom sonar plugin rule, so I don't get !unknown! for every type?

I've been attempting to write a custom rules plugin for SonarQube ~5.4, and while I've gotten a few rules implemented and working, the ones that depend on types outside the standard libraries have to rely on various kinds of acrobatic string matching.
I'm using the sonar-packaging-maven-plugin to do the packaging:
<plugin>
<groupId>org.sonarsource.sonar-packaging-maven-plugin</groupId>
<artifactId>sonar-packaging-maven-plugin</artifactId>
<version>1.16</version>
<configuration>
<pluginClass>${project.groupId}.sonar.BravuraRulesPlugin</pluginClass>
<pluginKey>SonarPluginBravura</pluginKey>
<skipDependenciesPackaging>false</skipDependenciesPackaging>
<basePlugin>java</basePlugin>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>sonar-plugin</goal>
</goals>
</execution>
</executions>
</plugin>
and I'm running the various checks using the following helper extension (Kotlin):
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import org.sonar.java.checks.verifier.JavaCheckVerifier
import org.sonar.plugins.java.api.JavaFileScanner

fun <T : JavaFileScanner> T.verify() {
val workDir = System.getProperty("user.dir");
val folder = Paths.get(workDir, "src/test/samples", this.javaClass.simpleName);
Files.list(folder).forEach { sample ->
try {
if (sample.toString().endsWith(".clean.java")) {
JavaCheckVerifier.verifyNoIssue(sample.toString(), this);
} else {
JavaCheckVerifier.verify(sample.toString(), this);
}
} catch (error: Exception) {
throw VerificationFailedException(sample, error);
}
}
};
class VerificationFailedException(path: Path, error: Exception)
: Exception("Failed to verify $path.", error);
I create an IssuableSubscriptionVisitor subclass for the rule and visit Tree.Kind.METHOD_INVOCATION, looking for uses of a static MAX, MIN, ASC, or DESC SQL builder method being passed an AutoLongColumn. This is to stop the identifier field from being used for ordering purposes.
Unfortunately, even though I have the requisite library on the Maven 'test' classpath, when I try to get any of the types, they just show as !unknown!.
override fun visitNode(tree: Tree) {
if (tree !is MethodInvocationTree) {
return;
}
val methodSelect = tree.methodSelect();
if (methodSelect !is IdentifierTree || methodSelect.name() !in setOf("MAX", "MIN", "ASC", "DESC")) {
return;
}
val firstArg = tree.arguments().first();
if (firstArg !is MethodInvocationTree) {
return;
}
val firstArgSelect = firstArg.methodSelect();
if (firstArgSelect !is MemberSelectExpressionTree) {
return;
}
if (firstArgSelect.type is UnknownType) {
throw TableFlipException("(ノಥ益ಥ)ノ ┻━┻");
}
// It never gets here.
}
I'm sure I'm missing some vital piece of the puzzle, and I'd appreciate it if someone could tell me where I'm going wrong.
EDIT: I'm using org.sonarsource.java:sonar-java-plugin:3.14 for the analyser, and while I can't release all the code for the analysis target (commercial IP and all that), here's something structurally identical to the key part:
import static com.library.UtilClass.MAX;
...
query.SELECT(biggestId = MAX(address._id())) // Noncompliant
.FROM(address)
.WHERE(address.user_id().EQ(userId)
.AND(address.type_id().EQ(typeId)));
...
The type of address._id() is a com.library.Identifier that wraps a long. I'd like to be able to visit all the method invocations, check whether they match com.library.UtilClass.MAX, and if so, make sure that the first parameter isn't a com.library.Identifier. Without the type information, I have to do a regex match on _id method references, which is prone to missing things.
So it turns out that the way to get the types available is to use Maven (or whatever build tool you're using) to copy the needed jars into a directory, turn the lot into a list of files, and pass that to the test verifier.
For example, let's pretend we're trying to find usages of joda-time:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.10</version>
<executions>
<execution>
<id>copy-libs</id>
<phase>generate-test-resources</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.4</version>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
This execution will put the joda-time jar into the target/dependency directory. Next, you make sure to enumerate the jars in that directory and add them to your test verification (we're assuming you named your verifier 'JodaChecker'):
// Not at all necessary, but it makes the code later on a lot easier to read.
fun <T> Stream<T>.toList(): List<T> = this.collect({
mutableListOf()
}, { list, item ->
list.add(item)
}, { list, otherList ->
list.addAll(otherList)
})
...
val workDir = System.getProperty("user.dir")
val sampleFile = Paths.get(workDir, "src/test/samples/JodaSample.java").toString()
val dependencies = Files.list(Paths.get(workDir, "target/dependency"))
.map { it.toFile() }.toList()
JavaCheckVerifier.verify(sampleFile, JodaChecker(), dependencies)
Once you've done that, debugging through the tests will show that the joda-time classes are available during analysis.
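Back in the rule from the question: once the verifier gets those jars, symbolType() resolves real types, so the check can be written against the type instead of regex-matching identifiers. A hedged sketch (com.library.Identifier comes from the question; the message text is illustrative, and the exact sonar-java API may vary by version):
// inside visitNode, after confirming tree is a MethodInvocationTree
// named MAX/MIN/ASC/DESC (as in the question's code)
val firstArg = tree.arguments().first()
// With the library jars on the verifier's classpath, symbolType() now
// resolves to the real type instead of !unknown!.
if (firstArg.symbolType().`is`("com.library.Identifier")) {
    reportIssue(firstArg, "Do not order by the identifier column.")
}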

Gradle plugin for XML Beans

I am trying to write a Gradle plugin for XML Beans. I started with one of the 'Hello from Gradle' plugin examples, and also a plugin published by R. Artavia here. That plugin went straight to jar; I am trying to only generate source. The generated source must then be compiled with other project source and included in a single jar. Other goals include:
- full plugin - all I should need is "apply plugin: 'xmlbean'"
- I can configure source/code gen location and some features if I want to
- It detects whether it needs to be rebuilt. (well, eventually!!!)
I am off to a pretty good start, but am blocked on defining a new sourceSet. I am getting the error "No such property 'srcDirs'" (or 'srcDir'). It seems there is something I have to define somewhere to make a new sourceSet work, but I cannot find it. I have tried several different syntaxes (with/without an equals sign, brackets, srcDir/srcDirs, etc.) - nothing is working.
What do I need to do inside a plugin to make a new sourceSet entry be properly recognized?
Thank you!
JKE
File: xmlbean.gradle (includes greeting plugin for the moment for debugging)
apply plugin: xmlbean
apply plugin: 'java'
xmlbean {
message = 'Hi'
greeter = 'Gradle'
}
class xmlbean implements Plugin<Project> {
void apply(Project project) {
project.extensions.create("xmlbean", xmlbeanExtension)
Task xmlbeanTask = project.task('xmlbean')
xmlbeanTask << {
project.configurations {
xmlbeans
}
project.dependencies {
xmlbeans 'org.apache.xmlbeans:xmlbeans:2.5.0'
}
project.sourceSets {
main {
java {
srcDirs += '$project.buildDir/generated-source/xmlbeans'
}
}
xmlbeans {
srcDirs = ['src/main/xsd']
}
}
ant.taskdef(name: 'xmlbean',
classname: 'org.apache.xmlbeans.impl.tool.XMLBean',
classpath: project.configurations.xmlbeans.asPath)
ant.xmlbean(schema: project.sourceSets.xmlbean.srcDir,
srconly: true,
srcgendir: "$project.buildDir/generated-sources/xmlbeans",
classpath: project.configurations.xmlbeans.asPath)
println "${project.xmlbean.message} from ${project.xmlbean.greeter}"
}
project.compileJava.dependsOn(xmlbeanTask)
}
}
class xmlbeanExtension {
String message
String greeter
}
File: build.gradle
apply from: '../gradle/xmlbeans.gradle'
dependencies {
compile "xalan:xalan:$ver_xalan",
":viz-common:0.0.1",
":uform-repository:0.1.0"
}
Console: Error message:
:idk:xmlbean FAILED
FAILURE: Build failed with an exception.
* Where:
Script 'C:\jdev\cpc-maven\try.g2\comotion\gradle\xmlbeans.gradle' line: 32
* What went wrong:
Execution failed for task ':idk:xmlbean'.
> No such property: srcDirs for class: org.gradle.api.internal.tasks.DefaultSourceSet_Decorated
...
BUILD FAILED
Gradle info: version 2.5 / groovy 2.3.10 / JVM 7u55 on Windows 7 AMD64
You should try to become familiar with the Gradle DSL reference guide, because it's a huge help in situations like this. For example, if you click on the sourceSets { } link in the left navigation bar, you're taken to this section on source sets.
From there, you'll discover that the sourceSets {} block is backed by a class, SourceSetContainer. The next level of configuration nested inside is backed by a SourceSet object, and then within that you have one or more SourceDirectorySet configurations. When you follow the link to SourceDirectorySet, you'll see that there are getSrcDirs() and setSrcDirs() methods.
So how does this help? If you look closely at the exception, you'll see that Gradle is saying it can't find a srcDirs property on DefaultSourceSet_Decorated, which you can hopefully infer is an instance of SourceSet. That interface does not have a srcDirs property. That's because your xmlbeans {} block is configuring a SourceSet, not a SourceDirectorySet. You need to add another nested configuration to gain access to srcDirs, as in the sketch below.
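A minimal sketch of that extra nesting (using the java SourceDirectorySet that the Java plugin adds to every source set; the directory name comes from your build script):
project.sourceSets {
    xmlbeans {
        java {
            srcDirs = ['src/main/xsd']
        }
    }
}
// which you would then read back as:
// project.sourceSets.xmlbeans.java.srcDirs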
At this point, I'm wondering whether a new source set is the appropriate solution. Unfortunately it's not clear to me exactly what the plugin should be doing, so I can't offer any alternatives.

SONAR 3.7.3 - PMD XPath rule <my custom rule> can't be imported automatically. The rule must be created manually through the SonarQube web interface

I am trying to get some custom PMD rules onto our SONAR server so they will show up in our nightly tests. I have an XML file with a bunch of custom rules, like so:
<rule class="net.sourceforge.pmd.rules.XPathRule" dfa="false" externalInfoUrl="" message="System.out.print is used" name="MyOrganisation_SystemPrintln" typeResolution="true">
<description>System.(out|err).print is used, consider using a logger.</description>
<priority>5</priority>
<properties>
<property name="xpath">
<value><![CDATA[
//Name[
starts-with(@Image, 'System.out.print')
or
starts-with(@Image, 'System.err.print')
]
]]></value>
</property>
</properties>
<example><![CDATA[
class Foo{
Logger log = Logger.getLogger(Foo.class.getName());
public void testA () {
System.out.println("Entering test");
// Better use this
log.fine("Entering test");
}
}
]]></example>
</rule>
When I go to the Quality Profiles page and make a new profile, giving it the XML file, I get a bunch of errors like this:
PMD XPath rule 'MyOrganisation_SystemPrintln' can't be imported
automatically. The rule must be created manually through the SonarQube
web interface.
Which seems clear enough. However, when I try to create a new rule by copying the generic XPath rule that is already there and changing it, there is nowhere to put the "example" part (there is only name, message, xpathQuery, and description). I was wondering if I am missing something that might be the cause of this, and how I can get these rules onto the Sonar server?
Thanks very much.
Edit: The PMD version is 1.3, as is the java plugin
Edit2: Another example of a rule:
<rule class="net.sourceforge.pmd.rules.UnusedPrivateFieldRule" dfa="false" externalInfoUrl="" message="Avoid unused private fields such as ''{0}''" name="MyOrganisation_UnusedPrivateField" typeResolution="true">
<description>Detects when a private field is declared and/or assigned a value, but not used.</description>
<priority>5</priority>
<example><![CDATA[
public class Something {
private static int FOO = 2; // Unused
private int i = 5; // Unused
private int j = 6;
public int addOne() {
return j++;
}
}
]]></example>
</rule>
Indeed, there is no way in SonarQube custom rules to declare examples as in PMD custom rules; you will have to put the examples in the description, e.g. using blockquote elements, as sketched below.
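A hedged sketch of what that could look like in the rule definition (the HTML markup is illustrative; SonarQube renders rule descriptions as HTML):
<description><![CDATA[
System.(out|err).print is used, consider using a logger.
<blockquote><pre>
System.out.println("Entering test");   // avoid
log.fine("Entering test");             // prefer
</pre></blockquote>
]]></description>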

How can I use Jenkins to run my integration tests in parallel?

Right now we've got a project that builds in two jobs: 1) the standard build with unit tests, and 2) the integration tests. They work like this:
build the whole project, run unit tests, start integration test job
build the whole project, deploy it to the integration server, run client side integration tests against integration server
The problem is that step 2) now takes over an hour to run, and I'd like to parallelize the integration tests so that they take less time. But I'm not exactly sure how I can/should do this. My first thought is that I could have two step 2)s, like this:
build the whole project, run unit tests, start integration test job
build the whole project, deploy it to the integration server1, run client side integration tests against integration server1
build the whole project, deploy it to the integration server2, run client side integration tests against integration server2
But then, how do I run half the integration tests on integration server1 and the other half on integration server2? I am using Maven, so I could probably figure out something with failsafe and a complex includes/excludes pattern, but that sounds like something that would take a lot of effort to maintain. E.g.: when someone adds a new integration test class, how do I ensure that it gets run on one of the two servers? Does the developer have to modify the Maven patterns?
I found a great article on how to do this, though it shows the approach in Groovy code. I pretty much followed those steps, but I haven't yet written the code to distribute the tests evenly by duration. It's still a useful tool, so I'll share it.
import junit.framework.JUnit4TestAdapter;
import junit.framework.TestSuite;
import org.junit.Ignore;
import org.junit.extensions.cpsuite.ClassesFinder;
import org.junit.extensions.cpsuite.ClasspathFinderFactory;
import org.junit.extensions.cpsuite.SuiteType;
import org.junit.runner.RunWith;
import org.junit.runners.AllTests;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.List;
@RunWith(AllTests.class)
public class DistributedIntegrationTestRunner {
private static Logger log = LoggerFactory.getLogger(DistributedIntegrationTestRunner.class);
public static TestSuite suite() {
TestSuite suite = new TestSuite();
ClassesFinder classesFinder = new ClasspathFinderFactory().create(true,
new String[]{".*IntegrationTest.*"},
new SuiteType[]{SuiteType.TEST_CLASSES},
new Class[]{Object.class},
new Class[]{},
"java.class.path");
int nodeNumber = systemPropertyInteger("node.number", "0");
int totalNodes = systemPropertyInteger("total.nodes", "1");
List<Class<?>> allTestsSorted = getAllTestsSorted(classesFinder);
allTestsSorted = filterIgnoredTests(allTestsSorted);
List<Class<?>> myTests = getMyTests(allTestsSorted, nodeNumber, totalNodes);
log.info("There are " + allTestsSorted.size() + " tests to choose from and I'm going to run " + myTests.size() + " of them.");
for (Class<?> myTest : myTests) {
log.info("I will run " + myTest.getName());
suite.addTest(new JUnit4TestAdapter(myTest));
}
return suite;
}
private static int systemPropertyInteger(String propertyKey, String defaultValue) {
String slaveNumberString = System.getProperty(propertyKey, defaultValue);
return Integer.parseInt(slaveNumberString);
}
private static List<Class<?>> filterIgnoredTests(List<Class<?>> allTestsSorted) {
ArrayList<Class<?>> filteredTests = new ArrayList<Class<?>>();
for (Class<?> aTest : allTestsSorted) {
if (aTest.getAnnotation(Ignore.class) == null) {
filteredTests.add(aTest);
}
}
return filteredTests;
}
/*
TODO: make this algorithm less naive. Sort each test by run duration as described here: http://blog.tradeshift.com/just-add-servers/
*/
private static List<Class<?>> getAllTestsSorted(ClassesFinder classesFinder) {
List<Class<?>> allTests = classesFinder.find();
Collections.sort(allTests, new Comparator<Class<?>>() {
@Override
public int compare(Class<?> o1, Class<?> o2) {
return o1.getSimpleName().compareTo(o2.getSimpleName());
}
});
return allTests;
}
private static List<Class<?>> getMyTests(List<Class<?>> allTests, int nodeNumber, int totalNodes) {
List<Class<?>> myTests = new ArrayList<Class<?>>();
for (int i = 0; i < allTests.size(); i++) {
Class<?> thisTest = allTests.get(i);
if (i % totalNodes == nodeNumber) {
myTests.add(thisTest);
}
}
return myTests;
}
}
The ClasspathFinderFactory is used to find all test classes that match the .*IntegrationTest.* pattern.
I make N jobs that all run this Runner, but each uses a different value for the node.number system property, so each job runs a different subset of the tests. This is how the failsafe plugin looks:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<version>2.12.4</version>
<executions>
<execution>
<id>integration-tests</id>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
<configuration>
<includes>
<include>**/DistributedIntegrationTestRunner.java</include>
</includes>
<skipITs>${skipITs}</skipITs>
</configuration>
</plugin>
The ClasspathFinderFactory comes from
<dependency>
<groupId>cpsuite</groupId>
<artifactId>cpsuite</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
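Each Jenkins job then passes its own node index on the Maven command line, for example (a sketch assuming two jobs; -D system properties given to Maven are visible to the forked test JVM by default):
mvn clean verify -Dnode.number=0 -Dtotal.nodes=2    # job 1
mvn clean verify -Dnode.number=1 -Dtotal.nodes=2    # job 2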
I think there should be some Jenkins plugin for this, but I haven't been able to find one. The closest is the Parallel Test Executor plugin, but I don't think it does the same thing I need: it looks like it runs all the tests on a single job/server instead of multiple servers, and it doesn't provide an obvious way to say "run these tests here, and those tests there".
I believe you have already found a solution by now, but I'll leave a path for others who open this page asking the same question:
Parallel test executor plugin:
"This plugin adds a new builder that lets you easily execute tests defined in a separate job in parallel. This is achieved by having Jenkins look at the test execution time of the last run, split tests into multiple units of roughly equal size, then execute them in parallel."
https://wiki.jenkins-ci.org/display/JENKINS/Parallel+Test+Executor+Plugin
Yes, Parallel Test Executor is a handy plugin if you have two slaves, or one slave with several executors, because the plugin is based on test splitting: for example, it splits your JUnit tests into four roughly equal groups, and those groups run on four different executors on the slaves you specified. How far you can parallelize depends on the number of executors available; with fewer executors, decrease the split count accordingly (say, from 4 to 2).
