I have the code below:
package runner;

import java.util.stream.Stream;

public class PiccoloRunnerMain {

    private static String[] defaultOptions = {
            "--plugin", "pretty",
            "--plugin", "html:target/fix-api-results.html",
            "--plugin", "summary",
            "--glue", "cukestep",
            "--tags", "@T1",
            "src/main/resources/features/"
    };

    public static void main(String[] args) {
        try {
            Stream<String> cucumberOptions = Stream.concat(Stream.of(defaultOptions), Stream.of(args));
            io.cucumber.core.cli.Main.main(cucumberOptions.toArray(String[]::new));
        } catch (Exception e) {
            System.out.println(e);
        }
    }
}
But after generating the uber-jar with Gradle and running the tests, it throws this error:

java.lang.IllegalArgumentException: path must exist: /Users/abcd/repo/qa-orchestra/src/main/resources/features

How do I pass the feature files to the Cucumber CLI?
task uberJar(type: Jar) {
    manifest {
        attributes 'Main-Class': "$mainClassName"
    }
    from configurations.runtimeClasspath.findAll { it.name.endsWith('jar') }.collect { zipTree(it) }
    baseName(project.name)
    with jar
}
This is how I am generating the jar file. Or is this an issue with the jar itself?
Update: the answer is in one of the comments on Running Cucumber tests directly from executable jar.
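A likely fix (assuming src/main/resources ends up at the root of the uber-jar, so the features are packaged under features/) is to point Cucumber at the classpath instead of the file system:

private static String[] defaultOptions = {
        "--plugin", "pretty",
        "--plugin", "html:target/fix-api-results.html",
        "--plugin", "summary",
        "--glue", "cukestep",
        "--tags", "@T1",
        // "classpath:" lets Cucumber resolve the features from inside the jar
        // instead of requiring the source path to exist on disk
        "classpath:features"
};

With that, the jar no longer depends on the source tree being present on the machine that runs the tests.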
What I need
We package our products with Gradle and shadowJar. Some of the libraries we use rely on individual sections in their jar manifests, specifically attributes like Implementation-Title and
Implementation-Version. These sometimes show up in (the outputs of) our products, so I'd like them to survive the shadowJar process.
Example
lib1.jar/META-INF/MANIFEST.MF
Manifest-Version: 1.0
...
Name: org/some/lib
...
Implementation-Title: someLib
Implementation-Version: 2.3
...
lib2.jar/META-INF/MANIFEST.MF
Manifest-Version: 1.0
...
Name: org/some/other/lib
...
Implementation-Title: someOtherLib
Implementation-Version: 5.7-RC
...
=>
product.jar/META-INF/MANIFEST.MF
Manifest-Version: 1.0
...
Name: org/some/lib
...
Implementation-Title: someLib
Implementation-Version: 2.3
...
Name: org/some/other/lib
...
Implementation-Title: someOtherLib
Implementation-Version: 5.7-RC
...
What I found out
It is rather easy to manipulate the resulting Manifest with shadowJar:
project.shadowJar {
    manifest {
        attributes(["Implementation-Title": "someLib"], "org/some/lib")
        attributes(["Implementation-Title": "someOtherLib"], "org/some/other/lib")
    }
}
generates exactly what I want, statically.
shadowJar can provide me with a list of dependencies. However, when I iterate over the FileCollection like this
project.shadowJar {
    manifest {
        for (dependency in includedDependencies) {
            // read in jar file and set attributes
        }
    }
}
Gradle is not happy: "Cannot change dependencies of dependency configuration ':project:products:<ProductName>:compile' after it has been included in dependency resolution."
When I define a new task
def dependencies = []

project.tasks.register('resolveDependencies') {
    doFirst {
        gradleProject.configurations.compile.resolvedConfiguration.resolvedArtifacts.each {
            dependencies.add(it.file)
        }
    }
}

project.tasks['shadowJar'].dependsOn(project.tasks['resolveDependencies'])

project.shadowJar {
    manifest {
        // dependencies will be empty when this code is called
        for (dependency in dependencies) {
            // read in jar file and set attributes
        }
    }
}
The dependencies are not resolved in time.
What I'd like to know
How can I access the dependencies without upsetting Gradle? Alternatively, is there another way to merge the named individual sections with shadowJar?
According to https://github.com/johnrengelman/shadow/issues/369 the Transformer interface of shadowJar should be used to do this.
So here comes:
import com.github.jengelman.gradle.plugins.shadow.transformers.Transformer;
import com.github.jengelman.gradle.plugins.shadow.transformers.TransformerContext;

import java.io.ByteArrayOutputStream;
import java.util.jar.Attributes;
import java.util.jar.Manifest;
import java.util.Map.Entry;

import shadow.org.apache.tools.zip.ZipOutputStream;
import shadow.org.apache.tools.zip.ZipEntry;
import shadow.org.codehaus.plexus.util.IOUtil;

import org.gradle.api.file.FileTreeElement;

import static java.nio.charset.StandardCharsets.*;
import static java.util.jar.JarFile.*;

/**
 * ManifestMergeTransformer appends all version information sections from the
 * manifests of the merged jars to the resulting manifest file.
 *
 * @author Robert Lichtenberger
 */
public class ManifestMergeTransformer implements Transformer {

    String includePackages; // regular expression that must match a given package
    String excludePackages; // regular expression that must not match a given package

    private Manifest manifest;

    @Override
    public boolean canTransformResource(FileTreeElement element) {
        return MANIFEST_NAME.equalsIgnoreCase(element.relativePath.pathString);
    }

    @Override
    public void transform(TransformerContext context) {
        if (manifest == null) {
            manifest = new Manifest(context.is);
        } else {
            Manifest toMerge = new Manifest(context.is);
            for (Entry<String, Attributes> entry : toMerge.getEntries().entrySet()) {
                if (mustInclude(entry.getKey())) {
                    manifest.getEntries().put(entry.getKey(), entry.getValue());
                }
            }
        }
        IOUtil.close(context.is);
    }

    private boolean mustInclude(String packageName) {
        return (includePackages == null || packageName.matches(includePackages))
                && (excludePackages == null || !packageName.matches(excludePackages));
    }

    @Override
    public boolean hasTransformedResource() {
        return true;
    }

    @Override
    public void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
        ZipEntry entry = new ZipEntry(MANIFEST_NAME);
        entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time);
        os.putNextEntry(entry);
        if (manifest != null) {
            ByteArrayOutputStream manifestContents = new ByteArrayOutputStream();
            manifest.write(manifestContents);
            os.write(manifestContents.toByteArray());
        }
    }
}
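To wire the transformer in, it can be registered on the shadowJar task. A minimal sketch, assuming the class above is compiled under buildSrc so the build script can see it:

project.shadowJar {
    transform(ManifestMergeTransformer) {
        // example filter: only keep manifest sections for org/some packages
        includePackages = 'org/some/.*'
    }
}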
I'm trying to configure Jacoco to exclude some classes from analysis, but I can't find any working example.
I found some samples using afterEvaluate, but had no success.
src/main/java/org/example/A.java:
package org.example;
class A {
}
src/main/java/org/example/B.java:
package org.example;
class B {
}
src/test/java/org/example/ExampleTest.java:

package org.example;

public class ExampleTest {
    @org.junit.Test
    public void test() {
        new A();
        new B();
    }
}
build.gradle.kts:
plugins {
    java
    jacoco
}

repositories {
    mavenCentral()
}

dependencies {
    testCompile("junit:junit:4.12")
}
Using Gradle 5.4.1, running gradle test jacocoTestReport produces a report that covers both A and B.
After adding the following to build.gradle.kts:
tasks.withType<JacocoReport> {
    classDirectories.setFrom(
        sourceSets.main.get().output.asFileTree.matching {
            exclude("org/example/B.class")
        }
    )
}
running the same command now produces a report in which B is excluded.
Just to add to @Godin's awesome answer:
The way @Godin explained it, you would have to run gradle test jacocoTestReport, which isn't bad, but if you want Jacoco to run with just gradle test, add this to your build.gradle.kts:
tasks.test {
    finalizedBy("jacocoTestReport")
    doLast {
        println("View code coverage at:")
        println("file://$buildDir/reports/jacoco/test/html/index.html")
    }
}
I've managed to exclude this way:
tasks.jacocoTestReport {
    classDirectories.setFrom(
        files(classDirectories.files.map {
            fileTree(it) {
                exclude(
                    "com/example/integration/**",
                    "com/example/application/*Ext*"
                )
            }
        })
    )
}
Taken from here
Groovy/Gradle project here that uses Spock for unit testing.
Does Spock and/or Gradle support test suites or named sets of tests? For reasons outside the scope of this question, there are certain Spock tests (Specifications) that the CI server just can't run.
So it would be great to divide all my app's Spock tests into two groups:
"ci-tests"; and
"local-only-tests"
And then perhaps we could invoke them via:
./gradlew test --suite ci-tests
etc. Is this possible? If so, what does the setup/config look like?
You can annotate the tests that should not run on your CI server with the Spock annotation @IgnoreIf().
See the documentation here: https://spockframework.github.io/spock/docs/1.0/extensions.html#_ignoreif
All you need to do is let the CI server set an environment variable, and exclude the test class if that variable is set.
Spock even has properties inside the closure to make it easy:
@IgnoreIf({ sys.isCiServer })
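A minimal sketch of such a spec, assuming the CI server exports an environment variable named CI:

import spock.lang.IgnoreIf
import spock.lang.Specification

// skipped whenever the (assumed) CI environment variable is present
@IgnoreIf({ env.CI })
class LocalOnlySpec extends Specification {

    def "talks to a service that is only reachable locally"() {
        expect:
        true
    }
}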
I would set up a submodule my-app-ci-test, with the following in build.gradle:
test {
    enabled = false
}

task functionalTest(type: Test) {
}
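Depending on your Gradle version, a bare Test task may also need to be pointed at the compiled test classes; a sketch, assuming the default test source set:

task functionalTest(type: Test) {
    // not verified: wire the task to the standard test output and runtime classpath
    testClassesDirs = sourceSets.test.output.classesDirs
    classpath = sourceSets.test.runtimeClasspath
}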
Then you place your tests in src/test/groovy and run ./gradlew functionalTest.
Alternatively, you could include them in the same module and configure the test and functionalTest tasks with includes / excludes
test {
    exclude '**/*FunctionalTest.class'
}

task functionalTest(type: Test) {
    include '**/*FunctionalTest.class'
}
If you use the JUnit test runner for Spock tests, you can use the @Category annotation. Example from the article and the official documentation:
public interface FastTests {
}

public interface SlowTests {
}

public interface SmokeTests {
}

public static class A {
    @Test
    public void a() {
        fail();
    }

    @Category(SlowTests.class)
    @Test
    public void b() {
    }

    @Category({FastTests.class, SmokeTests.class})
    @Test
    public void c() {
    }
}

@Category({SlowTests.class, FastTests.class})
public static class B {
    @Test
    public void d() {
    }
}
test {
    useJUnit {
        includeCategories 'package.FastTests'
    }

    testLogging {
        showStandardStreams = true
    }
}
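Since Spock 1.x specifications run on the JUnit 4 runner, the same marker interfaces can also be placed on a Specification. A sketch, assuming the SlowTests interface above is on the test classpath:

import org.junit.experimental.categories.Category
import spock.lang.Specification

@Category(SlowTests)
class DatabaseMigrationSpec extends Specification {

    def "runs the full migration"() {
        expect:
        true
    }
}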
You can use the following SpockConfiguration.groovy to allow passing includes/excludes via system properties:
runner {
    exclude {
        System.properties['spock.exclude.annotations']
                ?.split(',')
                *.trim()
                ?.each {
                    try {
                        annotation Class.forName(it)
                        println "Excluding ${it}"
                    } catch (ClassNotFoundException e) {
                        println "Can't load ${it}: ${e.message}"
                    }
                }
    }
    include {
        System.properties['spock.include.annotations']
                ?.split(',')
                *.trim()
                ?.each {
                    try {
                        annotation Class.forName(it)
                        println "Including ${it}"
                    } catch (ClassNotFoundException e) {
                        println "Can't load ${it}: ${e.message}"
                    }
                }
    }
}
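Note that system properties given on the Gradle command line are not automatically visible to the test JVM, so they have to be forwarded; a sketch using the property names assumed above:

test {
    ['spock.include.annotations', 'spock.exclude.annotations'].each { name ->
        if (System.getProperty(name) != null) {
            systemProperty name, System.getProperty(name)
        }
    }
}

Then something like ./gradlew test -Dspock.exclude.annotations=com.example.LocalOnly would skip everything annotated with that (hypothetical) annotation.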
I'm trying to use the Play Gradle Plugin to compile/package a Play 2.3.x app that uses Ebean.
Everything works fine during compilation and packaging, but when I run the app I get the well known error
Entity type class SomeEntity is not an enhanced entity bean.
Subclassing is not longer supported in Ebean
So how can I make Gradle run the enhancer during compilation?
This is how I have done it. I am using Play 2.4, but it should work for you as well.
First, add a configuration in your build.gradle as follows:
configurations {
    enhance
}
Next add a dependency on ebeanorm agent as shown below:
dependencies {
    enhance group: 'org.avaje.ebeanorm', name: 'avaje-ebeanorm-agent', version: '4.5.3'
}
Ensure you have the required play dependencies in your build.gradle as shown below:
dependencies {
    play 'org.avaje:avaje-agentloader:2.1.2'
    play "org.avaje.ebeanorm:avaje-ebeanorm-agent:4.5.3"
}
Finally add the following to do the enhancement after the compile task has executed:
model {
    components {
        play {
            binaries.all { binary ->
                tasks.withType(PlatformScalaCompile) {
                    doLast {
                        ant.taskdef(name: 'ebean', classname: 'com.avaje.ebean.enhance.ant.AntEnhanceTask', classpath: project.configurations.enhance.asPath)
                        ant.ebean(classSource: "${project.buildDir}/playBinary/classes", packages: 'models.package.name', transformArgs: 'debug=1')
                    }
                }
            }
        }
    }
}
@koolrich, I had tried the solution and, when it didn't compile, I moved on, only to find later that the only problem was the expected dbmodels/* path while my path was different.
Enhancement initially seemed like magic and confusing jargon; the following helped me understand what is going on:
https://openjpa.apache.org/builds/1.2.3/apache-openjpa/docs/ref_guide_pc_enhance.html
Essentially, enhancement adds some methods and properties to the classes to make them work with persistence.
I was converting a Play 2.5.2 (Java) project from sbt to Gradle and facing the same problem, so I tried the solution given by @koolrich, but it did not work well. Everything was fine except that it failed to return data for related objects (it returned null for relational objects). I then compared the enhanced bytecode generated by sbt and by Gradle, found the delta, and worked out how Play enhances the bytecode. Play enhances bytecode in three steps:
1. It generates getters and setters for fields if there aren't any in place yet; this is done by the Play enhancer plugin (play.core.enhancers.PropertiesEnhancer.generateAccessors).
2. It rewrites classes that directly access fields to use the accessors instead; this is also done by the Play enhancer plugin (play.core.enhancers.PropertiesEnhancer.rewriteAccess).
3. If Ebean is used, the Ebean enhancer is applied to the classes configured via application.conf (the ebean-enhancement plugin).
Example:

Employee employee = Employee.find.byId(1);
Company company = employee.company;
After steps 1 & 2, this will be converted to

Company company = employee.getCompany();
With Employee#getCompany() being something like
@PropertiesEnhancer.GeneratedAccessor
public Company getCompany() {
    return this.company;
}
After step 3, the getter will be modified to be something like
@PropertiesEnhancer.GeneratedAccessor
public Company getCompany() {
    return _ebean_get_company();
}

protected Company _ebean_get_company() {
    this._ebean_intercept.preGetter("company");
    return this.company;
}
So when converting from sbt to Gradle, you have to perform these three steps yourself, as the Gradle Play plugin does not support them. For step 3, Ebean has an enhancement class (an Ant task) that can be used (the solution given by @koolrich); for steps 1 & 2, I wrote another enhancement Ant task which adds accessors and rewrites field access. Here is what the build.gradle file looks like:
configurations {
    enhance
    playEnhance
}

dependencies {
    enhance "org.avaje.ebeanorm:avaje-ebeanorm-agent:4.9.1"
    playEnhance 'com.typesafe.play:play-enhancer:1.1.0'
}

model {
    components {
        play {
            binaries.all { binary ->
                tasks.withType(PlatformScalaCompile) {
                    doLast {
                        ant.taskdef(name: "playenhancetask", classname: "com.xxx.gradlehelper.PlayGradleEnhancherTask", classpath: "${project.buildDir}/playBinary/classes/:${project.configurations.playEnhance.asPath}")
                        ant.playenhancetask(classSource: "${project.buildDir}/playBinary/classes", packages: 'com.xxx.xxx.*', classpath: "${project.configurations.play.asPath}")
                        ant.taskdef(name: 'ebean', classname: 'com.avaje.ebean.enhance.ant.AntEnhanceTask', classpath: project.configurations.enhance.asPath)
                        ant.ebean(classSource: "${project.buildDir}/playBinary/classes", packages: 'com.xxx.xxx.xxx.*', transformArgs: 'debug=1')
                    }
                }
            }
        }
    }
}

dependencies {
    play 'org.avaje:avaje-agentloader:2.1.2'
    play 'org.avaje.ebeanorm:avaje-ebeanorm:6.18.1'
    play 'com.typesafe.play:play-ebean_2.11:3.0.0'
    play 'com.typesafe.play:play-enhancer:1.1.0'
    play "org.avaje.ebeanorm:avaje-ebeanorm-agent:4.9.1"
    play group: 'org.apache.ant', name: 'ant', version: '1.8.2'
}
Here is my Ant task, PlayGradleEnhancherTask.java:
package com.xxx.gradlehelper;

import org.apache.tools.ant.BuildException;
import org.apache.tools.ant.Task;
import play.core.enhancers.*;

import java.io.File;
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class PlayGradleEnhancherTask extends Task {

    String classpath;
    String classSource;
    String transformArgs;
    String packages;

    public String getClasspath() {
        return classpath;
    }

    public void setClasspath(String classpath) {
        this.classpath = classpath;
    }

    public void setClassSource(String source) {
        this.classSource = source;
    }

    public void setTransformArgs(String transformArgs) {
        this.transformArgs = transformArgs;
    }

    public void setPackages(String packages) {
        this.packages = packages;
    }

    public void execute() {
        if (packages == null) {
            throw new BuildException("No packages set.");
        }
        log("classSource: " + classSource + ", packages: " + packages + "\n classPath: " + classpath);

        // 'com.xxx.xxx.*' -> 'com/xxx/xxx/' relative to the class output directory
        String dir = packages.trim().replace('.', '/');
        dir = dir.substring(0, dir.length() - 1);
        String dirPath = classSource + "/" + dir;

        File d = new File(dirPath);
        if (!d.exists()) {
            throw new RuntimeException("File not found " + dirPath + " currentDir:" + new File(".").getAbsolutePath());
        }

        Path path = Paths.get(dirPath);
        List<File> fileNames = new ArrayList<>();
        List<File> files = getFiles(fileNames, path);

        // generate property accessors
        generateAccessors(files);
        // rewrite field access
        rewriteAccess(files);
    }

    private void rewriteAccess(List<File> files) {
        for (File file : files) {
            try {
                PropertiesEnhancer.rewriteAccess(classSource + ":" + classpath, file);
            } catch (Exception e) {
                String fileName = file == null ? "null" : file.getName() + ", e: " + e.getMessage();
                System.err.println("Could not enhance [rewriteAccess]: " + fileName);
            }
        }
    }

    private void generateAccessors(List<File> files) {
        for (File file : files) {
            try {
                PropertiesEnhancer.generateAccessors(classSource + ":" + classpath, file);
            } catch (Exception e) {
                e.printStackTrace();
                String fileName = file == null ? "null" : file.getName();
                System.err.println("Could not enhance [generateAccessors]: " + fileName);
            }
        }
    }

    private List<File> getFiles(List<File> files, Path dir) {
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path path : stream) {
                if (path.toFile().isDirectory()) {
                    getFiles(files, path);
                } else {
                    File file = path.toFile();
                    if (!file.getName().startsWith("Reverse") && file.getName().endsWith(".class")) {
                        files.add(file);
                    }
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return files;
    }
}
I have implemented a Gradle plugin using the Gradle DSL style. The plugin adds multiple aspects, such as adding a custom task and configuring other tasks. Overall, the plugin generates a metadata property file in a source folder that must be configurable through a plugin extension.
apply plugin: 'artifactMetadata'

// configure the path for the metadata
artifactMetadata {
    destinationDirectory = "src/main/generated/otherlocation/resources"
}
I have been able to figure out how to configure the task using the extension properties, but the remaining parts are tripping me up. What is a good approach to configure the source set, the clean task and the idea plugin (see the #n: TODO comments in the plugin code below)? The implementation below always uses the default value, not the one injected through the plugin extension.
class ArtifactMetadataPlugin implements Plugin<Project> {

    public static final String EXTENSION_NAME = 'artifactMetadata'
    public static final String TASK_NAME = 'generateArtifactMetadata'

    void apply(Project project) {
        createExtension(project)

        project.configure(project) {
            task(TASK_NAME, type: GenerateArtifactMetadata) {
                group = project.group
                artifact = project.name
                version = project.version.toString()
            }

            sourceSets {
                main {
                    // #1: TODO get the plugin extension property's current value here
                    output.dir(project.artifactMetadata.destinationDirectory, builtBy: TASK_NAME)
                    resources.srcDirs += file(project.artifactMetadata.destinationDirectory)
                }
            }

            clean {
                // #2: TODO get the plugin extension property here
                delete file(project.artifactMetadata.destinationDirectory)
            }

            if (project.plugins.hasPlugin(IdeaPlugin)) {
                idea {
                    module {
                        // #3: TODO get the plugin extension property here
                        sourceDirs += file(project.artifactMetadata.destinationDirectory)
                    }
                }
            }
        }

        project.afterEvaluate {
            def extension = project.extensions.findByName(EXTENSION_NAME)
            project.tasks.withType(GenerateArtifactMetadata).all { task ->
                task.destinationDirectory = project.file(extension.destinationDirectory)
            }
        }
    }

    private static void createExtension(Project project) {
        def extension = project.extensions.create(EXTENSION_NAME, ArtifactMetadataPluginExtension)
        extension.with {
            destinationDirectory = "src/main/generated/artifactinfo/resources"
        }
    }
}
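For reference, one direction would be to move the remaining wiring (#1 to #3) into the same afterEvaluate block that already works for the task. A sketch, not verified:

project.afterEvaluate {
    def extension = project.extensions.findByName(EXTENSION_NAME)
    def metadataDir = project.file(extension.destinationDirectory)

    // #1: register the generated directory as task output and as a resource dir
    project.sourceSets.main.output.dir(metadataDir, builtBy: TASK_NAME)
    project.sourceSets.main.resources.srcDirs += metadataDir

    // #2: let clean remove the generated directory
    project.tasks.clean.delete(metadataDir)

    // #3: only configure the idea module when the plugin is applied
    if (project.plugins.hasPlugin(IdeaPlugin)) {
        project.idea.module.sourceDirs += metadataDir
    }
}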