Add a dependency of the project into my custom Gradle plugin - gradle

I'm new to the Gradle world and I'm writing a personal plugin for executing operations on a database, for example:
creating a database, deleting a database, creating a table, and inserting a value into a database. But I have a problem with importing a dependency from the project that uses my plugin. For example, to create a database over JDBC I need the JDBC driver for the database, and this driver is contained in the main project.
My question is: how do I get the database's dependency jar into my Gradle plugin?
This is my code:
package io.vincentpalazzo.gradledatabase.task;
import io.vincentpalazzo.gradledatabase.exstension.GradleDatabaseExstension;
import io.vincentpalazzo.gradledatabase.persistence.DataSurce;
import org.gradle.api.DefaultTask;
import org.gradle.api.artifacts.Configuration;
import org.gradle.api.tasks.TaskAction;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.Iterator;
import java.io.File;
import java.util.Set;
/**
* @author https://github.com/vincenzopalazzo
*/
public class CreateDatabaseTask extends DefaultTask {
@TaskAction
public void createAction() {
GradleDatabaseExstension project = getProject().getExtensions().findByType(GradleDatabaseExstension.class);
String url = project.getUrl();
String driverClass = project.getDriver(); // The driver class name differs for each database
String username = project.getUsername();
String password = project.getPassword();
String nameDatabase = project.getNameDatabase();
String nameJar = project.getNameJar();
if (findDependecyFileJarForDriver(nameJar)) {
System.out.println("Jar findend");
} else {
System.out.println("Jar not found");
}
DataSurce dataSource = new DataSurce();
if (dataSource.connectionDatabase(driverClass, url, username, password)) {
if (dataSource.createDatabese(nameDatabase)) {
System.out.println("Database " + nameDatabase + " created");
}
}
}
private boolean findDependecyFileJarForDriver(String nameJar) {
if (nameJar == null || nameJar.isEmpty()) {
throw new IllegalArgumentException("The nameJar parameter is null or empty");
}
Iterator<Configuration> iterable = getProject().getConfigurations().iterator();
boolean finded = false;
while (!finded && iterable.hasNext()) {
Configuration configuration = iterable.next();
Set<File> filesSet = configuration.resolve();
for (File file : filesSet) {
String nameFile = file.getName();
if (nameFile.contains(nameJar)) {
//Now?;
finded = true;
}
}
}
return finded;
}
}
And this is my project, and this is the reference for my post on the Gradle forum.
Sorry for my terrible English, but I'm learning.

I want to add an answer to this post.
The best solution I found is using this plugin.
I used the plugin inside my code; this is an example:
public abstract class AbstractTaskGradleDatabase extends DefaultTask {
protected JarHelper jarHelper;
protected Optional<File> jar;
protected void init(){
jarHelper = new JarHelper(getProject());
jar = jarHelper.fetch("nameDependence");
}
}
Inside the build.gradle:
dependencies {
implementation gradleApi()
implementation 'com.lingocoder:jarexec.plugin:0.3'
}
P.S.: this answer may change over time because the plugin is still in beta.
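For completeness (this is not part of the original answer): once the driver jar has been located, whether through the configuration scan in the question or through JarHelper above, one possible way to use it at task execution time is to load the driver class in a child URLClassLoader and connect through the java.sql.Driver API directly, because DriverManager ignores drivers loaded by a foreign class loader. This is only a sketch; DriverLoader and its parameters are illustrative names, not part of the plugin.
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.sql.Connection;
import java.sql.Driver;
import java.util.Properties;
public final class DriverLoader {
    // Loads the JDBC driver class from the jar found in the project's configurations
    // and opens a connection without going through DriverManager, which would ignore
    // a driver loaded by this child class loader.
    public static Connection connect(File driverJar, String driverClassName,
                                     String url, String user, String password) throws Exception {
        URLClassLoader loader = new URLClassLoader(
                new URL[] { driverJar.toURI().toURL() },
                DriverLoader.class.getClassLoader());
        Driver driver = (Driver) Class.forName(driverClassName, true, loader)
                .getDeclaredConstructor()
                .newInstance();
        Properties props = new Properties();
        props.setProperty("user", user);
        props.setProperty("password", password);
        return driver.connect(url, props);
    }
}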

Related

Wiremock request templating in standalone mode: can I use an XML file as a response template and inject values with XPath?

I know that request templating supports XPath, so I can get a value from the request like {{xPath request.body '/outer/inner/text()'}}. I already have an XML file as the response, and I want to inject this value taken from the request while keeping the other parts of the response XML intact. For example, I want to inject it at the XPath /svc_result/slia/pos/msid.
And I need to use it in standalone mode.
I have seen another question (Wiremock Stand alone - How to manipulate response with request data), but that one was about JSON; I have XML requests/responses.
How can it be done? Thanks.
For example, I have this definition of mapping:
{
"request": {
"method": "POST",
"bodyPatterns": [
{
"matchesXPath": {
"expression": "/svc_init/slir/msids/msid[#type='MSISDN']/text()",
"equalTo": "200853000105614"
}
},
{
"matchesXPath": "/svc_init/hdr/client[id and pwd]"
}
]
},
"response": {
"status": 200,
"bodyFileName": "slia.xml",
"headers": {
"Content-Type": "application/xml;charset=UTF-8"
}
}
}
And this request:
<?xml version="1.0"?>
<!DOCTYPE svc_init>
<svc_init ver="3.2.0">
<hdr ver="3.2.0">
<client>
<id>dummy</id>
<pwd>dummy</pwd>
</client>
</hdr>
<slir ver="3.2.0" res_type="SYNC">
<msids>
<msid type="MSISDN">200853000105614</msid>
</msids>
</slir>
</svc_init>
I expect this response, with xxxxxxxxxxx replaced with the <msid> in the request.
<?xml version="1.0" ?>
<!DOCTYPE svc_result SYSTEM "MLP_SVC_RESULT_320.DTD">
<svc_result ver="3.2.0">
<slia ver="3.0.0">
<pos>
<msid type="MSISDN" enc="ASC">xxxxxxxxxxx</msid>
<pd>
<time utc_off="+0800">20111122144915</time>
<shape>
<EllipticalArea srsName="www.epsg.org#4326">
<coord>
<X>00 01 01N</X>
<Y>016 31 53E</Y>
</coord>
<angle>0</angle>
<semiMajor>2091</semiMajor>
<semiMinor>2091</semiMinor>
<angularUnit>Degrees</angularUnit>
</EllipticalArea>
</shape>
<lev_conf>90</lev_conf>
</pd>
<gsm_net_param>
<cgi>
<mcc>100</mcc>
<mnc>01</mnc>
<lac>2222</lac>
<cellid>10002</cellid>
</cgi>
<neid>
<vmscid>
<vmscno>00004946000</vmscno>
</vmscid>
<vlrid>
<vlrno>99994946000</vlrno>
</vlrid>
</neid>
</gsm_net_param>
</pos>
</slia>
</svc_result>
My first thought was to use transformerParameters to change the response file by inserting the value from the body. Unfortunately, WireMock doesn't resolve the helpers before inserting them into the response body. So while we can reference that MSID value via an xPath helper like
{{xPath request.body '/svc_init/slir/msids/msid/text()'}}
if we try to insert that as a custom transformer parameter, it won't resolve. (I've written up an issue on the WireMock github about this.)
Unfortunately, I think this leaves us with having to write a custom extension that takes the request, finds the value, and then modifies the response file. More information on creating a custom transformer extension can be found here.
At last I created my own transformer:
package com.company.department.app.extensions;
import com.github.tomakehurst.wiremock.common.FileSource;
import com.github.tomakehurst.wiremock.extension.Parameters;
import com.github.tomakehurst.wiremock.extension.ResponseTransformer;
import com.github.tomakehurst.wiremock.http.Request;
import com.github.tomakehurst.wiremock.http.Response;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.xml.sax.EntityResolver;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.StringReader;
import java.io.StringWriter;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.List;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerException;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathExpressionException;
import javax.xml.xpath.XPathFactory;
public class NLGResponseTransformer extends ResponseTransformer {
private static final Logger LOG = LoggerFactory.getLogger(NLGResponseTransformer.class);
private static final String SLIA_FILE = "/stubs/__files/slia.xml";
private static final String REQ_IMSI_XPATH = "/svc_init/slir/msids/msid";
private static final String[] RES_IMSI_XPATHS = {
"/svc_result/slia/pos/msid",
"/svc_result/slia/company_mlp320_slia/company_netinfo/company_ms_netinfo/msid"
};
private static final String[] RES_TIME_XPATHS = {
// for slia.xml
"/svc_result/slia/company_mlp320_slia/company_netinfo/company_ms_netinfo/time",
// for slia_poserror.xml
"/svc_result/slia/pos/poserror/time"
};
private static final DocumentBuilderFactory DOCUMENT_BUILDER_FACTORY = DocumentBuilderFactory.newInstance();
private static final DateTimeFormatter TIME_FORMAT = DateTimeFormatter.ofPattern("yyyyMMddHHmmss");
private static final String UTC_OFF = "utc_off";
private static final String TRANSFORM_FACTORY_ATTRIBUTE_INDENT_NUMBER = "indent-number";
protected static final String COMPANY_MLP_320_SLIA_EXTENSION_DTD = "company_mlp320_slia_extension.dtd";
protected static final String MLP_SVC_RESULT_320_DTD = "MLP_SVC_RESULT_320.DTD";
@Override
public String getName() {
return "inject-request-values";
}
@Override
public Response transform(Request request, Response response, FileSource fileSource, Parameters parameters) {
Document responseDocument = injectValuesFromRequest(request);
String transformedResponse = transformToString(responseDocument);
if (transformedResponse == null) {
return response;
}
return Response.Builder.like(response)
.but()
.body(transformedResponse)
.build();
}
private Document injectValuesFromRequest(Request request) {
// NOTE: according to quickscan:
// "time" element in the MLP is the time MME reports cell_id to GMLC (NLG), NOT the time when MME got the cell_id.
LocalDateTime now = LocalDateTime.now();
Document responseTemplate = readDocument(SLIA_FILE);
Document requestDocument = readDocumentFromBytes(request.getBody());
if (responseTemplate == null || requestDocument == null) {
return null;
}
try {
injectIMSI(responseTemplate, requestDocument);
injectTime(responseTemplate, now);
} catch (XPathExpressionException e) {
LOG.error("Cannot parse XPath expression {}. Cause: ", REQ_IMSI_XPATH, e);
}
return responseTemplate;
}
private Document readDocument(String inputStreamPath) {
try {
DocumentBuilder builder = DOCUMENT_BUILDER_FACTORY.newDocumentBuilder();
// ignore missing dtd
builder.setEntityResolver((publicId, systemId) -> {
if (systemId.contains(COMPANY_MLP_320_SLIA_EXTENSION_DTD) ||
systemId.contains(MLP_SVC_RESULT_320_DTD)) {
return new InputSource(new StringReader(""));
} else {
return null;
}
});
return builder.parse(this.getClass().getResourceAsStream(inputStreamPath));
} catch (Exception e) {
LOG.error("Cannot construct document from resource path. ", e);
return null;
}
}
private Document readDocumentFromBytes(byte[] array) {
try {
DocumentBuilder builder = DOCUMENT_BUILDER_FACTORY.newDocumentBuilder();
// ignore missing dtd
builder.setEntityResolver((publicId, systemId) -> {
if (systemId.contains(COMPANY_MLP_320_SLIA_EXTENSION_DTD) ||
systemId.contains(MLP_SVC_RESULT_320_DTD)) {
return new InputSource(new StringReader(""));
} else {
return null;
}
});
return builder.parse(new ByteArrayInputStream(array));
} catch (Exception e) {
LOG.error("Cannot construct document from byte array. ", e);
return null;
}
}
private XPath newXPath() {
return XPathFactory.newInstance().newXPath();
}
private void injectTime(Document responseTemplate, LocalDateTime now) throws XPathExpressionException {
for (String timeXPath: RES_TIME_XPATHS) {
Node timeTarget = (Node) (newXPath().evaluate(timeXPath, responseTemplate, XPathConstants.NODE));
if (timeTarget != null) {
// set offset in attribute
Node offset = timeTarget.getAttributes().getNamedItem(UTC_OFF);
offset.setNodeValue(getOffsetString());
// set value
timeTarget.setTextContent(TIME_FORMAT.format(now));
}
}
}
private void injectIMSI(Document responseTemplate, Document requestDocument) throws XPathExpressionException {
Node imsiSource = (Node) (newXPath().evaluate(REQ_IMSI_XPATH, requestDocument, XPathConstants.NODE));
String imsi = imsiSource.getTextContent();
for (String xpath : RES_IMSI_XPATHS) {
Node imsiTarget = (Node) (newXPath().evaluate(xpath, responseTemplate, XPathConstants.NODE));
if (imsiTarget != null) {
imsiTarget.setTextContent(imsi);
}
}
}
private String transformToString(Document document) {
if (document == null) {
return null;
}
document.setXmlStandalone(true); // make the document standalone, so we avoid outputting standalone="no" in the first line
TransformerFactory tf = TransformerFactory.newInstance();
Transformer trans;
try {
trans = tf.newTransformer();
trans.setOutputProperty(OutputKeys.INDENT, "no"); // no extra indent; the file already has an indent of 4
// cannot find a workaround to inject dtd in doctype line. TODO
//trans.setOutputProperty(OutputKeys.DOCTYPE_SYSTEM, "MLP_SVC_RESULT_320.DTD [<!ENTITY % extension SYSTEM \"company_mlp320_slia_extension.dtd\"> %extension;]");
StringWriter sw = new StringWriter();
trans.transform(new DOMSource(document), new StreamResult(sw));
// Spaces between tags are treated as text nodes, so when outputting we need to remove the extra empty lines
return sw.toString().replaceAll("\\n\\s*\\n", "\n");
} catch (TransformerException e) {
LOG.error("Cannot transform response document to String. ", e);
return null;
}
}
/**
* Compare system default timezone with UTC and get zone offset in form of (+/-)XXXX.
* Dependent on the machine default timezone/locale.
* @return the zone offset in the form (+/-)XXXX
*/
private String getOffsetString() {
// getting offset in (+/-)XX:XX format, or "Z" if is UTC
String offset = ZonedDateTime.ofInstant(Instant.now(), ZoneId.systemDefault()).getOffset().toString();
if (offset.equals("Z")) {
return "+0000";
}
return offset.replace(":", "");
}
}
And use it like this:
Package it with mvn package as a (non-runnable) JAR and put it next to the WireMock standalone jar, for example under libs/.
Run this:
java -cp "libs/*" com.github.tomakehurst.wiremock.standalone.WireMockServerRunner --extensions com.company.department.app.extensions.NLGResponseTransformer --https-port 8443 --verbose
Put the whole command on the same line.
Note that the app jar which contains this transformer and the WireMock standalone jar should both be on the classpath, and the other dependencies under libs are needed as well. (I use the jib Maven plugin, which copies all dependencies under libs/; I also move the app and WireMock jars to libs/, so I can just pass -cp "libs/*".) If that does not work, try specifying the location of these two jars in -cp explicitly. Be aware that WireMock will run fine even when the extension class is not found, so it may help to add some logging.
You can use --root-dir to point to the stub files root, for example --root-dir resources/stubs in my case. By default it points to . (where java runs).
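A note beyond the original answer: while developing such an extension it can be convenient to exercise the same transformer without standalone mode by embedding WireMock in a small main class or test; a minimal sketch, where the port and stub directory are assumptions.
import com.github.tomakehurst.wiremock.WireMockServer;
import com.company.department.app.extensions.NLGResponseTransformer;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.options;
public class EmbeddedWireMock {
    public static void main(String[] args) {
        // Register the custom transformer directly instead of passing --extensions on the command line.
        WireMockServer server = new WireMockServer(options()
                .port(8080)                                  // assumed port
                .usingFilesUnderDirectory("resources/stubs") // assumed stub root, mirrors --root-dir
                .extensions(new NLGResponseTransformer()));
        server.start();
    }
}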

How to merge Manifest sections with Gradle and shadowJar

What I need
We package our products with Gradle and shadowJar. Some of the libraries we use utilize individual sections in jar manifests, specifically attributes like Implementation-Title and
Implementation-Version. These sometimes show up in (the outputs of) our products, so I'd like them to survive the shadowJar process.
Example
lib1.jar/META-INF/MANIFEST.MF
Manifest-Version: 1.0
...
Name: org/some/lib
...
Implementation-Title: someLib
Implementation-Version: 2.3
...
lib2.jar/META-INF/MANIFEST.MF
Manifest-Version: 1.0
...
Name: org/some/other/lib
...
Implementation-Title: someOtherLib
Implementation-Version: 5.7-RC
...
=>
product.jar/META-INF/MANIFEST.MF
Manifest-Version: 1.0
...
Name: org/some/lib
...
Implementation-Title: someLib
Implementation-Version: 2.3
...
Name: org/some/other/lib
...
Implementation-Title: someOtherLib
Implementation-Version: 5.7-RC
...
What I found out
It is rather easy to manipulate the resulting Manifest with shadowJar:
project.shadowJar {
manifest {
attributes(["Implementation-Title" : "someLib"], "org/some/lib")
attributes(["Implementation-Title" : "someOtherLib"], "org/some/other/lib")
}
}
generates exactly what I want, statically.
shadowJar can provide me with a list of dependencies. However, when I iterate over the FileCollection like this
project.shadowJar {
manifest {
for (dependency in includedDependencies) {
// read in jar file and set attributes
}
}
}
Gradle is not happy: "Cannot change dependencies of dependency configuration ':project:products:<ProductName>:compile' after it has been included in dependency resolution."
When I define a new task
def dependencies = [];
project.tasks.register('resolveDependencies') {
doFirst {
gradleProject.configurations.compile.resolvedConfiguration.resolvedArtifacts.each {
dependencies.add(it.file)
}
}
}
project.tasks['shadowJar'].dependsOn(project.tasks['resolveDependencies']);
project.shadowJar {
manifest {
// dependencies will be empty when this code is called
for (dependency in dependencies) {
// read in jar file and set attributes
}
}
}
The dependencies are not resolved in time.
What I'd like to know
How can I access the dependencies without upsetting Gradle? Alternatively, is there another way to merge the named individual sections with shadowJar?
According to https://github.com/johnrengelman/shadow/issues/369 the Transformer interface of shadowJar should be used to do this.
So here comes:
import com.github.jengelman.gradle.plugins.shadow.transformers.Transformer;
import com.github.jengelman.gradle.plugins.shadow.transformers.TransformerContext;
import java.io.ByteArrayOutputStream;
import java.util.jar.Attributes;
import java.util.jar.Manifest;
import java.util.Map.Entry;
import shadow.org.apache.tools.zip.ZipOutputStream;
import shadow.org.apache.tools.zip.ZipEntry;
import shadow.org.codehaus.plexus.util.IOUtil;
import org.gradle.api.file.FileTreeElement;
import static java.nio.charset.StandardCharsets.*
import static java.util.jar.JarFile.*;
/** ManifestMergeTransformer appends all named version-information sections from the merged manifest files to the resulting manifest file.
* @author Robert Lichtenberger
*/
public class ManifestMergeTransformer implements Transformer {
String includePackages; // regular expression that must match a given package
String excludePackages; // regular expression that must not match a given package
private Manifest manifest;
@Override
public boolean canTransformResource(FileTreeElement element) {
return MANIFEST_NAME.equalsIgnoreCase(element.relativePath.pathString);
}
@Override
public void transform(TransformerContext context) {
if (manifest == null) {
manifest = new Manifest(context.is);
} else {
Manifest toMerge = new Manifest(context.is);
for (Entry<String, Attributes> entry : toMerge.getEntries().entrySet()) {
if (mustInclude(entry.getKey())) {
manifest.getEntries().put(entry.getKey(), entry.getValue());
}
}
}
IOUtil.close(context.is);
}
private boolean mustInclude(String packageName) {
return (includePackages == null || packageName.matches(includePackages)) && (excludePackages == null || !packageName.matches(excludePackages));
}
@Override
public boolean hasTransformedResource() {
return true;
}
@Override
public void modifyOutputStream(ZipOutputStream os, boolean preserveFileTimestamps) {
ZipEntry entry = new ZipEntry(MANIFEST_NAME);
entry.time = TransformerContext.getEntryTimestamp(preserveFileTimestamps, entry.time);
os.putNextEntry(entry);
if (manifest != null) {
ByteArrayOutputStream manifestContents = new ByteArrayOutputStream();
manifest.write(manifestContents);
os.write(manifestContents.toByteArray());
}
}
}
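The transformer still needs to be registered on the shadowJar task in the build script (roughly: shadowJar { transform(ManifestMergeTransformer) }). As a quick sanity check, which is not part of the original answer, the named sections of the resulting jar can be read back with the standard java.util.jar API; a small sketch, with the jar path as a placeholder.
import java.util.Map;
import java.util.jar.Attributes;
import java.util.jar.JarFile;
import java.util.jar.Manifest;
public class ManifestCheck {
    public static void main(String[] args) throws Exception {
        // "build/libs/product.jar" is a placeholder for the shadowed artifact.
        try (JarFile jar = new JarFile("build/libs/product.jar")) {
            Manifest mf = jar.getManifest();
            // getEntries() holds the per-package "Name:" sections the transformer merged.
            for (Map.Entry<String, Attributes> section : mf.getEntries().entrySet()) {
                System.out.println(section.getKey() + " -> "
                        + section.getValue().getValue("Implementation-Title") + " "
                        + section.getValue().getValue("Implementation-Version"));
            }
        }
    }
}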

BDD JBehave stories result in Pending when executed

Recently I started working on BDD using JBehave.
So far, if I run using Maven, my Maven project builds successfully. It then gets into the story file but doesn't proceed any further.
I tried running with JUnit but I get the same result.
I think my problem is with the runner file.
I searched many sites, including jbehave.org and many Stack Overflow questions, but in vain.
Help me get out of this problem. Let me know if you need any additional information.
I spent a lot of time trying to rectify this but couldn't find the solution.
Here is my runner file:
package runnerFile;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.jbehave.core.configuration.Configuration;
import org.jbehave.core.configuration.MostUsefulConfiguration;
import org.jbehave.core.io.CodeLocations;
import org.jbehave.core.io.LoadFromClasspath;
import org.jbehave.core.io.StoryFinder;
import org.jbehave.core.junit.JUnitStories;
import org.jbehave.core.junit.JUnitStory;
import org.jbehave.core.reporters.Format;
import org.jbehave.core.reporters.StoryReporterBuilder;
import org.jbehave.core.steps.InjectableStepsFactory;
import org.jbehave.core.steps.InstanceStepsFactory;
import org.jbehave.core.steps.ScanningStepsFactory;
import org.jbehave.core.steps.Steps;
public class TestRunner extends JUnitStories {
@Override
public Configuration configuration() {
return new MostUsefulConfiguration()
.useStoryLoader(
new LoadFromClasspath(this.getClass().getClassLoader()))
.useStoryReporterBuilder(
new StoryReporterBuilder()
.withDefaultFormats()
.withFormats(Format.HTML, Format.CONSOLE)
.withRelativeDirectory("jbehave-report")
);
}
@Override
public InjectableStepsFactory stepsFactory() {
// ArrayList<Object> stepFileList = new ArrayList<Object>();
ArrayList<Steps> stepFileList = new ArrayList<Steps>();
stepFileList.add(new Steps(configuration()));
return new InstanceStepsFactory(configuration(), stepFileList);
//return new ScanningStepsFactory(configuration(), "org.jbehave.examples.core.steps", "my.other.steps").matchingNames(".*Steps").notMatchingNames(".*SkipSteps");
}
@Override
protected List<String> storyPaths() {
return new StoryFinder().
findPaths(CodeLocations.codeLocationFromClass(
this.getClass()),
Arrays.asList("**/TC_2.story"),
Arrays.asList(""));
}
}
I kept my story file inside src/test/resources and the step definitions inside src/test/java.
Story (src/test/resources):
Narrative:
In order to communicate effectively to the business some functionality
As a development team
I want to use Behaviour-Driven Development
Scenario: A scenario is a collection of executable steps of different type
Given I launch the url
When I login with username <Username> and password <Password>
Then I should see the homepage
Examples:
|Username|Password|
|test@gmail.com|test1234|
Step definition (src/test/java):
package definition;
import org.jbehave.core.annotations.Given;
import org.jbehave.core.annotations.Named;
import org.jbehave.core.annotations.Then;
import org.jbehave.core.annotations.When;
import pages.Homepage_Pages;
public class HomePage {
Homepage_Pages home;
#Given("I launch the url")
public void url()
{
home.launchUrl();
}
#When("I login with username <Username> and password <Password>")
public void login(#Named("Username") String Username, #Named("Password") String Password)
{
System.out.println(Username);
}
#Then("I should see the homepage")
public void homePageVerification()
{
System.out.println("Heello");
}
}
Maven Console:
Try the following code, which is a stripped-down, simple test runner that does nothing fancy but simply runs all stories found in sub-folders of the main folder and includes all step classes in the defined steps-file location. My original had a lot of those things hard-coded, but I changed them to final Strings, so it should be easy enough to adapt to your situation and run with this file. Obviously, change "com.yourpackage.steps" to whatever package folder you place your step files in. Hope this helps.
package testrunner;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import org.jbehave.core.configuration.Configuration;
import org.jbehave.core.configuration.MostUsefulConfiguration;
import org.jbehave.core.embedder.EmbedderControls;
import org.jbehave.core.io.CodeLocations;
import org.jbehave.core.io.StoryFinder;
import org.jbehave.core.junit.JUnitStories;
import org.jbehave.core.reporters.CrossReference;
import org.jbehave.core.reporters.Format;
import org.jbehave.core.reporters.StoryReporterBuilder;
import org.jbehave.core.steps.InjectableStepsFactory;
import org.jbehave.core.steps.InstanceStepsFactory;
import org.junit.runner.RunWith;
import de.codecentric.jbehave.junit.monitoring.JUnitReportingRunner;
@RunWith(JUnitReportingRunner.class)
public class TestRunner extends JUnitStories {
private Configuration configuration;
public TestRunner() {
super();
CrossReference crossReference = new CrossReference();
configuration = new MostUsefulConfiguration();
configuration.useStoryReporterBuilder(
new StoryReporterBuilder().withFormats(Format.HTML, Format.STATS, Format.CONSOLE)
.withCodeLocation(CodeLocations.codeLocationFromPath("target/."))
.withCrossReference(crossReference));
EmbedderControls embedderControls = configuredEmbedder().embedderControls();
embedderControls.doBatch(false);
embedderControls.doGenerateViewAfterStories(true);
embedderControls.doSkip(false);
embedderControls.doVerboseFailures(true);
embedderControls.doVerboseFiltering(true);
embedderControls.useThreads(1);
embedderControls.useStoryTimeouts("1800");
}
@Override
protected List<String> storyPaths()
{
return new StoryFinder().findPaths(CodeLocations.codeLocationFromClass(this.getClass()), "**/*.story", "");
}
@Override
public Configuration configuration() {
return configuration;
}
@Override
public InjectableStepsFactory stepsFactory() {
final String stepsPackage = "com.yourpackage.steps";
final String stepsLoc = "src/test/java/" + stepsPackage.replace(".", "/");
List<Object> stepList = new ArrayList<Object>();
File steps = new File(stepsLoc);
File[] fileList = steps.listFiles();
int size = fileList.length;
for (int i = 0; i < size; i++) {
if (fileList[i].isFile()) { // also returns folders (directories)
String value = fileList[i].getName().replace(".java", ""); // strip extensions
if (!value.toLowerCase().contains("testrunner")) { // ignore testrunner itself
try {
Object stepObject = Class.forName((stepsPackage + "." + value)).newInstance();
stepList.add(stepObject);
} catch (InstantiationException e) {
e.printStackTrace();
} catch (IllegalAccessException e) {
e.printStackTrace();
} catch (ClassNotFoundException e) {
e.printStackTrace();
}
}
}
}
return new InstanceStepsFactory(configuration(), stepList);
}
}
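A side note, not part of the original answer: the runner in the question registers a plain org.jbehave.core.steps.Steps instance instead of the actual step classes, which typically leaves every step PENDING. Instead of listing class files on disk as above, the step classes can also be picked up with JBehave's ScanningStepsFactory (already imported in the question's runner); a brief sketch, assuming the steps live in the question's definition package.
@Override
public InjectableStepsFactory stepsFactory() {
    // Scan the "definition" package (where the HomePage steps class lives)
    // instead of instantiating the step classes one by one.
    return new ScanningStepsFactory(configuration(), "definition")
            .matchingNames(".*")
            .notMatchingNames(".*TestRunner");
}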

CXF InInterceptor not firing

I have created a web service. It works fine. Now I'm trying to add authentication to it, using CXF interceptors for that purpose. For some reason the interceptors won't fire. What am I missing? This is my first web service.
import javax.annotation.Resource;
import javax.inject.Inject;
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.WebServiceContext;
import org.apache.cxf.interceptor.InInterceptors;
@WebService
@InInterceptors(interceptors = "ws.BasicAuthAuthorizationInterceptor")
public class Service {
@WebMethod
public void test(@WebParam(name = "value") Integer value) throws Exception {
System.out.println("Value = " + value);
}
}
-
package ws;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.cxf.binding.soap.interceptor.SoapHeaderInterceptor;
import org.apache.cxf.configuration.security.AuthorizationPolicy;
import org.apache.cxf.endpoint.Endpoint;
import org.apache.cxf.interceptor.Fault;
import org.apache.cxf.message.Exchange;
import org.apache.cxf.message.Message;
import org.apache.cxf.transport.Conduit;
import org.apache.cxf.ws.addressing.EndpointReferenceType;
public class BasicAuthAuthorizationInterceptor extends SoapHeaderInterceptor {
@Override
public void handleMessage(Message message) throws Fault {
System.out.println("**** GET THIS LINE TO CONSOLE TO SEE IF INTERCEPTOR IS FIRING!!!");
AuthorizationPolicy policy = message.get(AuthorizationPolicy.class);
// If the policy is not set, the user did not specify credentials.
// 401 is sent to the client to indicate that authentication is required.
if (policy == null) {
sendErrorResponse(message, HttpURLConnection.HTTP_UNAUTHORIZED);
return;
}
String username = policy.getUserName();
String password = policy.getPassword();
// CHECK USERNAME AND PASSWORD
if (!checkLogin(username, password)) {
System.out.println("handleMessage: Invalid username or password for user: "
+ policy.getUserName());
sendErrorResponse(message, HttpURLConnection.HTTP_FORBIDDEN);
}
}
private boolean checkLogin(String username, String password) {
if (username.equals("admin") && password.equals("admin")) {
return true;
}
return false;
}
private void sendErrorResponse(Message message, int responseCode) {
Message outMessage = getOutMessage(message);
outMessage.put(Message.RESPONSE_CODE, responseCode);
// Set the response headers
#SuppressWarnings("unchecked")
Map<String, List<String>> responseHeaders = (Map<String, List<String>>) message
.get(Message.PROTOCOL_HEADERS);
if (responseHeaders != null) {
responseHeaders.put("WWW-Authenticate", Arrays.asList(new String[] { "Basic realm=realm" }));
responseHeaders.put("Content-Length", Arrays.asList(new String[] { "0" }));
}
message.getInterceptorChain().abort();
try {
getConduit(message).prepare(outMessage);
close(outMessage);
} catch (IOException e) {
e.printStackTrace();
}
}
private Message getOutMessage(Message inMessage) {
Exchange exchange = inMessage.getExchange();
Message outMessage = exchange.getOutMessage();
if (outMessage == null) {
Endpoint endpoint = exchange.get(Endpoint.class);
outMessage = endpoint.getBinding().createMessage();
exchange.setOutMessage(outMessage);
}
outMessage.putAll(inMessage);
return outMessage;
}
private Conduit getConduit(Message inMessage) throws IOException {
Exchange exchange = inMessage.getExchange();
EndpointReferenceType target = exchange.get(EndpointReferenceType.class);
Conduit conduit = exchange.getDestination().getBackChannel(inMessage, null, target);
exchange.setConduit(conduit);
return conduit;
}
private void close(Message outMessage) throws IOException {
OutputStream os = outMessage.getContent(OutputStream.class);
os.flush();
os.close();
}
}
I've been fighting with this for a few days now and don't know what to google any more. Help is appreciated.
I've found the solution. I was missing the following line in the MANIFEST.MF file of the war project:
Dependencies: org.apache.cxf
Maven wasn't including this line by itself, so I had to find a workaround. I found out about it here. It says: When using annotations on your endpoints / handlers such as the Apache CXF ones (@InInterceptor, @GZIP, ...) remember to add the proper module dependency in your manifest. Otherwise your annotations are not picked up and added to the annotation index by JBoss Application Server 7, resulting in them being completely and silently ignored.
This is where I found out how to change MANIFEST.MF file.
In short, I added a custom manifest file to my project and referenced it in pom.xml. Hope this helps someone.
The answer provided by Felix is accurate. I managed to solve the problem using his instructions. Just for completeness, here is the Maven config that lets you use your own MANIFEST.MF file placed in the META-INF folder.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<archive>
<manifestFile>src/main/resources/META-INF/MANIFEST.MF</manifestFile>
</archive>
</configuration>
</plugin>
And here is the relevant content of the MANIFEST.MF file I was using:
Manifest-Version: 1.0
Description: yourdescription
Dependencies: org.apache.ws.security,org.apache.cxf
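Beyond the original answers: if changing the MANIFEST.MF is not an option, the interceptor can also be attached programmatically when the endpoint is published with the CXF JAX-WS frontend. A minimal sketch; the address is a placeholder and the imports for the service classes depend on their packages.
import org.apache.cxf.jaxws.JaxWsServerFactoryBean;
import ws.BasicAuthAuthorizationInterceptor;
public class ServicePublisher {
    public static void main(String[] args) {
        JaxWsServerFactoryBean factory = new JaxWsServerFactoryBean();
        // Service is the @WebService class from the question; adjust the import to its package.
        factory.setServiceClass(Service.class);
        factory.setServiceBean(new Service());
        factory.setAddress("http://localhost:8080/ws/test"); // placeholder address
        // Attach the interceptor in code instead of relying on the @InInterceptors annotation.
        factory.getInInterceptors().add(new BasicAuthAuthorizationInterceptor());
        factory.create();
    }
}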

Felix lists OSGI Bundle as Active but Gogo Shell Command Not accessible (dependency related)

This basic code succeeds at making the command scopeA:test accessible in the shell:
package com.A;
import org.apache.felix.ipojo.annotations.Component;
import org.apache.felix.ipojo.annotations.Instantiate;
import org.apache.felix.ipojo.annotations.Provides;
import org.apache.felix.ipojo.annotations.Requires;
import org.apache.felix.ipojo.annotations.ServiceProperty;
import org.apache.felix.service.command.Descriptor;
@Component(immediate = true)
@Instantiate
@Provides(specifications = Commands.class)
public final class Commands {
@ServiceProperty(name = "osgi.command.scope", value = "scopeA")
String scope;
@ServiceProperty(name = "osgi.command.function", value = "{}")
String[] function = new String[] {
"test"
};
@Descriptor("Example")
public void test() {
System.out.println("hello");
}
}
However, if I add a constructor that depends on another OSGi component, the command is no longer accessible and "help" doesn't list it. Yet the bundle still loads into an active state.
package com.A;
import org.apache.felix.ipojo.annotations.Component;
import org.apache.felix.ipojo.annotations.Instantiate;
import org.apache.felix.ipojo.annotations.Provides;
import org.apache.felix.ipojo.annotations.Requires;
import org.apache.felix.ipojo.annotations.ServiceProperty;
import org.apache.felix.service.command.Descriptor;
import com.B;
@Component(immediate = true)
@Instantiate
@Provides(specifications = Commands.class)
public final class Commands {
public Commands(@Requires B b) {
}
@ServiceProperty(name = "osgi.command.scope", value = "scopeA")
String scope;
@ServiceProperty(name = "osgi.command.function", value = "{}")
String[] function = new String[] {
"test"
};
@Descriptor("Example")
public void test() {
System.out.println("hello");
}
}
The contents of B is simply:
import org.apache.felix.ipojo.annotations.Component;
import org.apache.felix.ipojo.annotations.Instantiate;
import org.apache.felix.ipojo.annotations.Provides;
@Component(immediate = true)
@Instantiate
@Provides
final class B {
}
Any ideas why the command is no longer listed? Any tips for finding more information on its state so that I can debug this better?
The problem is that the command component needs the @Requires to be on a field rather than on the constructor:
@Requires
B b;
The constructor also must be removed.
This is because gogo has a special method of invoking the component.
Also, for me this needed to be changed from
@ServiceProperty(name = "osgi.command.function", value = "{}")
String[] function = new String[] {
"test"
};
to
#ServiceProperty(name = "osgi.command.function", value = "{test}")
String[] function;
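Putting both corrections together, the command component would look roughly like this (a sketch assembled from the answer above, not verified against a running framework).
package com.A;
import org.apache.felix.ipojo.annotations.Component;
import org.apache.felix.ipojo.annotations.Instantiate;
import org.apache.felix.ipojo.annotations.Provides;
import org.apache.felix.ipojo.annotations.Requires;
import org.apache.felix.ipojo.annotations.ServiceProperty;
import org.apache.felix.service.command.Descriptor;
@Component(immediate = true)
@Instantiate
@Provides(specifications = Commands.class)
public final class Commands {
    // Field injection instead of a constructor parameter, so Gogo can invoke the component.
    // B is the component from the question (assumed to be in the same package or imported accordingly).
    @Requires
    B b;
    @ServiceProperty(name = "osgi.command.scope", value = "scopeA")
    String scope;
    // Declare the command name in the annotation value rather than in a field initializer.
    @ServiceProperty(name = "osgi.command.function", value = "{test}")
    String[] function;
    @Descriptor("Example")
    public void test() {
        System.out.println("hello");
    }
}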
