How to set up test suite files for execution across environments - Maven

I have a test suite file that needs to be runnable both from the mvn command line (from Jenkins) and on demand from Eclipse.
The test suite file must support parameters, i.e.:
<suite name="test run1">
<parameter name="testEnv" value="dev"></parameter>
<parameter name="proxyServer" value="x"></parameter>
<parameter name="proxyPort" value="y"></parameter>
If I leave it as is, the mvn command-line parameters don't work, because the values in the test suite file override them. I.e. this will not work:
mvn test ... -DtestEnv=E1QA -DproxyServer= -DproxyPort=
How can I write the test suite file so it supports both ad-hoc execution from Eclipse and mvn command line execution?

If you want a configurable test property, use a @DataProvider instead of hardcoding values in the suite XML.
Provider class:
public class EnvProvider {
    @DataProvider(name = "envProvider")
    public static Object[][] createData() {
        return new Object[][] {
            new Object[] { System.getProperty("testEnv", "eclipse-default") }
        };
    }
}
Test method:
@Test(dataProvider = "envProvider", dataProviderClass = EnvProvider.class)
public void myTest(String currentEnv) {
    System.out.println("Current env is : " + currentEnv);
}
pom.xml
<properties>
<testEnv>default-pom</testEnv>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.19.1</version>
<configuration>
<systemPropertyVariables>
<testEnv>${testEnv}</testEnv>
</systemPropertyVariables>
...
Result from an Eclipse right-click run:
Current env is : eclipse-default
Result from mvn test:
Current env is : default-pom
Result from mvn test -DtestEnv=jenkins:
Current env is : jenkins
References: http://testng.org/doc/documentation-main.html#parameters-dataproviders

You can override the suite XML params with system-level args when those are available, which allows you to run from the XML as well as from mvn.
All the params are basically assigned to constant properties that are used across the tests. The properties are initialized in the suite listener's onStart (before-suite) method, something to this effect:
public class SuiteList implements ISuiteListener {
    @Override
    public void onStart(ISuite suite) {
        LProperties.loadProperties(suite);
    }
}

// loadProperties implementation
public class LProperties {
    public static void loadProperties(ISuite suite) {
        // This can read from a properties file and load properties,
        // or read from suite.getParameter() - pick the params you need
        // (via reflection or a list) and assign them to the constants.
        // For each property, call System.getProperty() and override the value if found.
    }
}
Use the LProperties constants in your tests.
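For illustration, a fleshed-out sketch of that loadProperties logic, assuming the suite parameters are named testEnv, proxyServer and proxyPort and that LProperties simply exposes them as static fields (the field and method names here are illustrative, not from the original answer):

import org.testng.ISuite;

public class LProperties {
    public static String testEnv;
    public static String proxyServer;
    public static String proxyPort;

    public static void loadProperties(ISuite suite) {
        testEnv = resolve(suite, "testEnv");
        proxyServer = resolve(suite, "proxyServer");
        proxyPort = resolve(suite, "proxyPort");
    }

    private static String resolve(ISuite suite, String name) {
        // Value from the suite XML (may be null if the parameter is not defined there)
        String fromSuite = suite.getParameter(name);
        // A -Dname=value system property, when present, wins over the suite XML value
        return System.getProperty(name, fromSuite);
    }
}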

Based on a combination of the above answers, I've figured it out.
First, remove the hard-coded parameters from the test suite file.
Next, ensure the parameters are supported in the POM file.
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.4.2</version>
<configuration>
<systemPropertyVariables>
<testEnv>${testEnv}</testEnv>
<proxyServer>${proxyServer}</proxyServer>
<proxyPort>${proxyPort}</proxyPort>
</systemPropertyVariables>
</configuration>
</plugin>
</plugins>
</build>
Create constants file to contain default values for the parameters
public class Constants {
public static final String DEFAULT_TEST_ENV = "dev";
public static final String DEFAULT_PROXY_SERVER = "a-dev.phx.com";
public static final String DEFAULT_PROXY_PORT = "8585";
}
In the TestNG setUp() method, use System.getProperty():
@BeforeClass(alwaysRun = true)
protected static void setUp() throws Exception {
    String testEnv = System.getProperty("testEnv", Constants.DEFAULT_TEST_ENV);
    String proxyServer = System.getProperty("proxyServer", Constants.DEFAULT_PROXY_SERVER);
    String proxyPort = System.getProperty("proxyPort", Constants.DEFAULT_PROXY_PORT);
    System.out.println("testEnv: " + testEnv);
    System.out.println("proxyServer: " + proxyServer);
    System.out.println("proxyPort: " + proxyPort);
}
I could put the default values in a config file, but for now a constants class seems easiest.
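If you do later move the defaults into a config file, a minimal sketch could look like the following; the defaults.properties file name and the get() helper are illustrative assumptions, not part of the original setup:

import java.io.InputStream;
import java.util.Properties;

public class Defaults {
    private static final Properties PROPS = new Properties();

    static {
        // Load defaults bundled on the classpath, e.g. src/test/resources/defaults.properties (assumed name)
        try (InputStream in = Defaults.class.getClassLoader().getResourceAsStream("defaults.properties")) {
            if (in != null) {
                PROPS.load(in);
            }
        } catch (Exception e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    // A -D system property wins, then the file value, then the hard-coded fallback
    public static String get(String key, String fallback) {
        return System.getProperty(key, PROPS.getProperty(key, fallback));
    }
}

The setUp() method above would then call Defaults.get("testEnv", Constants.DEFAULT_TEST_ENV) instead of calling System.getProperty() directly.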

Related

Accessing files in a Jar using ClassPathResource

I have a Spring application that I must convert to a jar. In this application I have a unit test:
@BeforeEach
void setUp() throws IOException {
    //facturxHelper = new FacturxHelper();
    facturxService = new FacturxService();
    // String pdf = "facture.pdf"; // invalid pdfa1
    String pdf = "resources/VALID PDFA1.pdf";
    // InputStream sourceStream = new FileInputStream(pdf); //
    InputStream sourceStream = getClass().getClassLoader().getResourceAsStream(pdf);
    byte[] sourceBytes = IOUtils.toByteArray(sourceStream);
    this.b64Pdf = Base64.getEncoder().encodeToString(sourceBytes);
}
@Test
void createFacturxMin() throws Exception {
    // create a FacturX from the request object
FacturxRequestMin request = FacturxRequestMin.builder()
.pdf(this.b64Pdf)
.chorusPro(Boolean.FALSE)
.invoiceNumber("FA-2017-0010")
.issueDate("13/11/2017")
.buyerReference("SERVEXEC")
.seller(TradeParty.builder()
.name("Au bon moulin")
.specifiedLegalOrganization(LegalOrganization.builder()
.id("99999999800010") .scheme(SpecifiedLegalOrganizationScheme.FR_SIRENE.getSpecifiedLegalOrganizationScheme())
.build())
.postalAddress(PostalAddress.builder()
.countryId(CountryIso.FR.name())
.build())
.vatId("FR11999999998")
.build())
.buyer(TradeParty.builder()
.name("Ma jolie boutique")
.specifiedLegalOrganization(LegalOrganization.builder()
.id("78787878400035")
.scheme(SpecifiedLegalOrganizationScheme.FR_SIRENE.getSpecifiedLegalOrganizationScheme())
.build())
.build())
.headerMonetarySummation(HeaderMonetarySummation.builder()
.taxBasisTotalAmount("624.90")
.taxTotalAmount("46.25")
.prepaidAmount("201.00")
.grandTotalAmount("671.15")
.duePayableAmount("470.15")
.build())
.build();
FacturXAppManager facturXAppManager = new FacturXAppManager(facturxService);
FacturxResponse facturxResponse = facturXAppManager.createFacturxMin(request);
Gson gson = new GsonBuilder().setPrettyPrinting().create();
String json = gson.toJson(facturxResponse);
System.out.println(json);
}
The aim of the application is to create an XML file and embed it into the PDF.
My issue concerns XML validation against an XSD.
Here is an excerpt of the code:
public static boolean xmlValidator(String fxGuideLine, String xmlString) throws Exception {
System.out.println("xmlValidator() called");
File xsdFile = null;
Source source = new StreamSource(new StringReader(xmlString));
// I removed a lot of if/else statements that select the XSD file used to validate the XML
try {
xsdFile = new ClassPathResource(FacturxConstants.FACTUR_X_MINIMUM_XSD).getFile();
} catch (IOException e) {
throw new FacturxException(e.getMessage());
}
// validate the XML content
try {
SchemaFactory schemaFactory = SchemaFactory
.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Schema schema = schemaFactory.newSchema(xsdFile);
Validator validator = schema.newValidator();
validator.validate(source);
return true;
} catch (SAXException | IOException e) {
throw new FacturxException(e.getLocalizedMessage());
}
...
}
In the constants class, I added the path to the XSD file:
public static final String FACTUR_X_MINIMUM_XSD = "resources/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
In my POM file I want to include the resource files in the built jar.
<build>
<finalName>${project.artifactId}</finalName>
<resources>
<resource>
<directory>src/main/resources</directory>
<includes>
<include>*</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.3.0</version>
<configuration>
<outputDirectory>${project.build.outputDirectory}/resources</outputDirectory>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.4.2</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
When I do a simple mvn clean package, everything runs perfectly.
So far so good.
The next step is where my problem comes in. Let's say I want to use this dependency in another application (a Spring Boot application). The previously compiled jar is a high-level API that I want to integrate.
I launched the following command line:
mvn install:install-file -Dfile=myapi.jar -DgroupId=fr.myapi -DartifactId=graph-api-sharepoint -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar
I add my dependency correctly to my new project; that's perfect.
To check that the import worked correctly, I created a simple unit test with the same code (I do have a VALID PDFA1.pdf in my resources folder). So far so good.
When running the test I get the following error:
class path resource [resources/xsd/BASIC-WL_XSD/FACTUR-X_BASIC-WL.xsd] cannot be resolved to absolute file path because it does not reside in the file system: jar:file:/.m2/repository/fr/myapi/1.1.0/myapi-1.1.0.jar!/resources/xsd/BASIC-WL_XSD/FACTUR-X_BASIC-WL.xsd
How can I fix this issue? I have read many posts, but none of the fixes solved my problem. I also think I will run into the same issue when packaging the Spring Boot app as a jar.
As mentioned, using a File won't work.
In the current code, I updated it to use an InputStream:
InputStream is = new ClassPathResource(FacturxConstants.FACTUR_X_MINIMUM_XSD).getInputStream();
xsdSource = new StreamSource(is);
If my XSD path doesn't have the resources prefix:
public static final String FACTUR_X_MINIMUM_XSD = "xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
I have the following exception:
class path resource [xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd] cannot be opened because it does not exist
If I put
public static final String FACTUR_X_MINIMUM_XSD = "resources/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
the response is the following:
src-resolve: Cannot resolve the name 'ram:ExchangedDocumentContextType' to a(n) 'type definition' component.
I also updated the SchemaFactory and Schema implementation:
SchemaFactory schemaFactory =
SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Schema schema = schemaFactory.newSchema(xsdSource);
Validator validator = schema.newValidator();
validator.validate(source);
return true;
public static final String FACTUR_X_MINIMUM_XSD = "resources/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
is wrong; it should be the following (assuming src/main/resources/xsd is the actual location you are using):
public static final String FACTUR_X_MINIMUM_XSD = "/xsd/MINIMUM_XSD/FACTUR-X_MINIMUM.xsd";
Then, your code is using a java.io.File, which won't work: a java.io.File must be a physical file on the file system, and this resource isn't one, since it lives inside a jar file. You need to use an InputStream.
public static boolean xmlValidator(String fxGuideLine, String xmlString) throws Exception {
System.out.println("xmlValidator() called");
Source source = new StreamSource(new StringReader(xmlString));
// I removed a lot of if/else statements that select the XSD file used to validate the XML
try {
InputStream xsd = new ClassPathResource(FacturxConstants.FACTUR_X_MINIMUM_XSD).getInputStream();
StreamSource xsdSource = new StreamSource(xsd);
SchemaFactory schemaFactory = SchemaFactory
.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Schema schema = schemaFactory.newSchema(xsdSource);
Validator validator = schema.newValidator();
validator.validate(source);
return true;
} catch (SAXException | IOException e) {
throw new FacturxException(e.getLocalizedMessage());
}
...
}
This loads the schema using an InputStream.
Thanks to M. Deinum, I was able to find a solution. I did indeed have to use a StreamSource. However, that alone didn't solve the following issue:
src-resolve: Cannot resolve the name 'ram:ExchangedDocumentContextType' to a(n) 'type definition' component.
As I use several XSD files, I implemented a way to retrieve a list of sources using PathMatchingResourcePatternResolver (from Spring):
private static Source[] buildSources(String fxGuideLine, String pattern) throws SAXException, IOException {
List<Source> sources = new ArrayList<>();
PathMatchingResourcePatternResolver patternResolver = new PathMatchingResourcePatternResolver();
Resource[] resources = patternResolver.getResources(pattern);
for (Resource resource : resources) {
StreamSource dtd = new StreamSource(resource.getInputStream());
dtd.setSystemId(resource.getURI().toString());
sources.add(dtd);
}
return sources.toArray(new Source[sources.size()]);
}
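For illustration, inside xmlValidator the helper might be invoked along these lines; the classpath*: pattern shown here is an assumption (the original post does not give the exact pattern used), and setting the systemId in buildSources is what lets the schemas resolve their relative imports:

// Hypothetical usage: collect every XSD of the MINIMUM profile from the classpath
Source[] xsdSources = buildSources(fxGuideLine, "classpath*:xsd/MINIMUM_XSD/*.xsd");
SchemaFactory schemaFactory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
Schema schema = schemaFactory.newSchema(xsdSources);
Validator validator = schema.newValidator();
validator.validate(new StreamSource(new StringReader(xmlString)));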

OpenAPI generator returns 501 for implemented method

I've generated a REST API with the OpenAPI Generator Maven plugin and overridden the default method from the MyApiDelegate interface, but a POST request to /endpoint returns 501 NOT IMPLEMENTED, as if I hadn't provided my own implementation of that method in MyApiDelegateImpl.
Maven plugin configuration:
<plugin>
<groupId>org.openapitools</groupId>
<artifactId>openapi-generator-maven-plugin</artifactId>
<version>4.3.1</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<configOptions>
<inputSpec>${project.basedir}/src/main/resources/latest.yaml</inputSpec>
<generatorName>spring</generatorName>
<apiPackage>my.rest.api</apiPackage>
<modelPackage>my.rest.model</modelPackage>
<supportingFilesToGenerate>ApiUtil.java</supportingFilesToGenerate>
<delegatePattern>true</delegatePattern>
<useBeanValidation>false</useBeanValidation>
</configOptions>
</configuration>
</execution>
</executions>
</plugin>
/* code generated by plugin */
package my.rest;
public interface MyApiDelegate {
default Optional<NativeWebRequest> getRequest() {
return Optional.empty();
}
default ResponseEntity<Void> doSmth(Smth smth) {
return new ResponseEntity<>(HttpStatus.NOT_IMPLEMENTED);
}
}
package my.rest.api;
public interface MyApi {
default MyApiDelegate getDelegate() {
return new MyApiDelegate() {};
}
/*...Api operations annotations...*/
@RequestMapping(value = "/endpoint",
produces = { "application/json" },
consumes = { "application/json", "application/xml" },
method = RequestMethod.POST)
default ResponseEntity<Void> doSmth(@ApiParam(value = "", required = true) @RequestBody Smth smth) {
return getDelegate().doSmth(smth);
}
}
My implementation:
package my.rest.api;
@Service
@RequiredArgsConstructor
public class MyApiDelegateImpl implements MyApiDelegate {
private final MyService s;
@Override
public ResponseEntity<Void> doSmth(Smth smth) {
s.doIt(smth);
return ResponseEntity.ok().build();
}
}
How can I make my program use my own implementation of the method in the concrete class, rather than the default implementation provided in the interface?
Implementing the MyApi interface directly, and hence the doSmth method in it, is one way of doing that. Your concrete class does not need all the web-related annotations; it just needs the parameters and return value, like a normal method (see the sketch below).
Note that getDelegate() returns an (anonymous) implementation of MyApiDelegate, so the default implementation of doSmth is the one being called, which is what returns HttpStatus.NOT_IMPLEMENTED.
One more thing to take care of is making sure the deployment knows to use the implementation class. If you're using Spring Web, then just marking your concrete class with @RestController should suffice.
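A minimal sketch of that first option, based on the generated MyApi interface and the service from the question; the class name MyApiImpl is illustrative, and note this is an assumption-laden sketch (with delegatePattern=true the generator may also emit its own controller, so this style is usually paired with the delegate pattern turned off):

package my.rest.api;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RestController;

// The @RequestMapping metadata is inherited from the interface's default method,
// so the concrete class only carries the business logic.
@RestController
public class MyApiImpl implements MyApi {

    private final MyService s;

    public MyApiImpl(MyService s) {
        this.s = s;
    }

    @Override
    public ResponseEntity<Void> doSmth(Smth smth) {
        s.doIt(smth);
        return ResponseEntity.ok().build();
    }
}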

.jar created from maven shade plugin throws error when accessing resources under src/main/resources, but running main from exploded .jar works?

Updated Executive Summary of the Solution
Following up on the answer provided by Victor, I implemented a Java class that lists the contents of a folder resource on the classpath. Most critical for me was that this had to work whether the classpath resource is discovered when executing from the IDE, from an exploded uberjar, or from within an unexploded uberjar (which I typically create with the maven-shade-plugin). The class and its associated unit test are available here.
Original Question
I am seeing strange behavior with the maven-shade-plugin and classpath resources when I run a very simple
Java test program that accesses a directory structure in a standard Maven project like this:
src/main
  Tester.java
  resources/
    resource-directory
      spark
        junk1
      zeppelin
        junk2
When run from the IDE or from the exploded maven-shaded .jar (please see below),
it works correctly, meaning it prints this:
result of directory contents as classpath resource:[spark, zeppelin]
The source is as follows:
import org.apache.commons.io.IOUtils;
import java.io.IOException;
import java.io.InputStream;
public class Tester {
public void test(String resourceName) throws IOException {
InputStream in = this.getClass().getClassLoader().getResourceAsStream(resourceName);
System.out.println("input stream: " + in);
Object result = IOUtils.readLines(in);
System.out.println("result of directory contents as classpath resource:" + result);
}
public static void main(String[] args) throws IOException {
new Tester().test("resource-directory");
}
}
Now, if I run mvn clean install in my project and run the
maven-shaded .jar under ${project.dir}/target, I see the following exception:
> java -jar target/sample.jar
Exception in thread "main" java.lang.NullPointerException
at java.io.FilterInputStream.read(FilterInputStream.java:133)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at org.apache.commons.io.IOUtils.readLines(IOUtils.java:1030)
at org.apache.commons.io.IOUtils.readLines(IOUtils.java:987)
at org.apache.commons.io.IOUtils.readLines(IOUtils.java:968)
at Tester.test(Tester.java:16)
at Tester.main(Tester.java:24)
Running with Exploded .jar
> mkdir explode/
> cd explode/
> jar xvf ../sample.jar
......
inflated: META-INF/MANIFEST.MF
created: META-INF/
etc etc.
> ls # look at contents of exploded .jar:
logback.xml META-INF org resource-directory Tester.class
#
# now run class with CLASSPATH="."
(master) /tmp/maven-shade-non-working-example/target/explode > java Tester
input stream: java.io.ByteArrayInputStream#70dea4e
result of directory contents as classpath resource:[spark, zeppelin] # <<<- works !
I have the whole project here: https://github.com/buildlackey/maven-shade-non-working-example
but for convenience, here is the pom.xml (below), with the two maven-shade configurations that I tried.
Note: I don't think the IncludeResourceTransformer would be of any use, because my resources already appear
at the appropriate levels in the .jar file.
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.foo.core</groupId>
<artifactId>sample</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<name>sample</name>
<url>http://maven.apache.org</url>
<properties>
<jdk.version>1.8</jdk.version>
<junit.version>4.11</junit.version>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency><!-- commons-io: Easy conversion from stream to string list, etc.-->
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.4</version>
</dependency>
</dependencies>
<build>
<finalName>sample</finalName>
<plugins>
<!-- Set a compiler level -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>${jdk.version}</source>
<target>${jdk.version}</target>
</configuration>
</plugin>
<!-- Maven Shade Plugin -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<!-- add Main-Class to manifest file -->
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>Tester</mainClass>
</transformer>
<!-- tried with the stanza below enabled, and also disabled: in both cases, got exceptions from runs -->
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>src/main/resources/</resource>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Anyway, thanks in advance for any help you can provide ~
Chris
UPDATE
This didn't work for me in Spring when I tried it (but I'd be interested if anyone has success with a Spring approach). I have a working alternative which I will post shortly. But if you care to comment on how to fix this broken Spring attempt, I'd be very interested.
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.io.support.ResourcePatternResolver;
import java.io.IOException;
public class Tester {
public void test(String resourceName) throws IOException {
ResourcePatternResolver resourceResolver = new PathMatchingResourcePatternResolver();
Resource[] resources = resourceResolver.getResources("resource-directory/*");
for (Resource resource : resources) {
System.out.println("resource: " + resource.getDescription());
}
}
public static void main(String[] args) throws IOException {
new Tester().test("resource-directory/*");
}
}
The problem is that getResourceAsStream can only read files, not folders, as a stream from a jar file.
To read folder contents from a jar file, you might need to use an approach like the one described in the accepted answer to this question:
How can I get a resource "Folder" from inside my jar File?
To supplement the answer from my good friend Victor, here is a full code solution below. The full project is available here.
import java.io.File;
import java.io.IOException;
import java.util.*;
import java.util.zip.ZipEntry;
import java.util.zip.ZipException;
import java.util.zip.ZipFile;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* List entries of a subfolder of an entry in the class path, which may consist of file system folders and .jars.
*/
public class ClassPathResourceFolderLister {
private static final Logger LOGGER = LoggerFactory.getLogger(ClassPathResourceFolderLister.class);
/**
* For each entry in the classpath, verify that (a) "folder" exists, and (b) "folder" has child content, and if
* these conditions hold, return the child entries (be they files, or folders). If neither (a) nor (b) are true for
* a particular class path entry, move on to the next entry and try again.
*
* @param folder the folder to match within the class path entry
*
* @return the subfolder items of the first matching class path entry, with a no-duplicates guarantee
*/
public static Collection<String> getFolderListing(final String folder) {
final String classPath = System.getProperty("java.class.path", ".");
final String[] classPathElements = classPath.split(System.getProperty("path.separator"));
List<String> classPathElementsList = new ArrayList<String> ( Arrays.asList(classPathElements));
return getFolderListingForFirstMatchInClassPath(folder, classPathElementsList);
}
private static Collection<String>
getFolderListingForFirstMatchInClassPath(final String folder, List<String> classPathElementsList) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("getFolderListing for " + folder + " with classpath elements " + classPathElementsList);
}
Collection<String> retval = new HashSet<String>();
String cleanedFolder = stripTrailingAndLeadingSlashes(folder);
for (final String element : classPathElementsList) {
System.out.println("class path element:" + element);
retval = getFolderListing(element, cleanedFolder);
if (retval.size() > 0) {
if (LOGGER.isDebugEnabled()) {
LOGGER.debug("found matching folder in class path list. returning: " + retval);
}
return retval;
}
}
return retval;
}
private static String stripTrailingAndLeadingSlashes(final String folder) {
String stripped = folder;
if (stripped.equals("/")) { // handle degenerate case:
return "";
} else { // handle cases for strings starting or ending with "/", confident that we have at least two characters
if (stripped.endsWith("/")) {
stripped = stripped.substring(0, stripped.length()-1);
}
if (stripped.startsWith("/")) {
stripped = stripped.substring(1, stripped.length());
}
if (stripped.startsWith("/") || stripped.endsWith("/")) {
throw new IllegalArgumentException("too many consecutive slashes in folder specification: " + stripped);
}
}
return stripped;
}
private static Collection<String> getFolderListing( final String element, final String folderName) {
final File file = new File(element);
if (file.isDirectory()) {
return getFolderContentsListingFromSubfolder(file, folderName);
} else {
return getResourcesFromJarFile(file, folderName);
}
}
private static Collection<String> getResourcesFromJarFile(final File file, final String folderName) {
final String leadingPathOfZipEntry = folderName + "/";
final HashSet<String> retval = new HashSet<String>();
ZipFile zf = null;
try {
zf = new ZipFile(file);
final Enumeration e = zf.entries();
while (e.hasMoreElements()) {
final ZipEntry ze = (ZipEntry) e.nextElement();
final String fileName = ze.getName();
if (LOGGER.isTraceEnabled()) {
LOGGER.trace("zip entry fileName:" + fileName);
}
if (fileName.startsWith(leadingPathOfZipEntry)) {
final String justLeafPartOfEntry = fileName.replaceFirst(leadingPathOfZipEntry,"");
final String initSegmentOfPath = justLeafPartOfEntry.replaceFirst("/.*", "");
if (initSegmentOfPath.length() > 0) {
LOGGER.trace(initSegmentOfPath);
retval.add(initSegmentOfPath);
}
}
}
} catch (Exception e) {
throw new RuntimeException("getResourcesFromJarFile failed. file=" + file + " folder=" + folderName, e);
} finally {
if (zf != null) {
try {
zf.close();
} catch (IOException e) {
LOGGER.error("getResourcesFromJarFile close failed. file=" + file + " folder=" + folderName, e);
}
}
}
return retval;
}
private static Collection<String> getFolderContentsListingFromSubfolder(final File directory, String folderName) {
final HashSet<String> retval = new HashSet<String>();
try {
final String fullPath = directory.getCanonicalPath() + "/" + folderName;
final File subFolder = new File(fullPath);
System.out.println("fullPath:" + fullPath);
if (subFolder.isDirectory()) {
final File[] fileList = subFolder.listFiles();
for (final File file : fileList) {
retval.add(file.getName());
}
}
} catch (final IOException e) {
throw new Error(e);
}
return retval;
}
}
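For completeness, a minimal usage sketch (the folder name matches the sample project above; the expected output is taken from the question):

import java.util.Collection;

public class ClassPathResourceFolderListerDemo {
    public static void main(String[] args) {
        // Works the same from the IDE, from an exploded jar, and from inside an uberjar
        Collection<String> entries = ClassPathResourceFolderLister.getFolderListing("resource-directory");
        System.out.println(entries); // expected for the sample project: [spark, zeppelin] (in some order)
    }
}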

TestNG: need to run a specific method before all tests and a specific test after all tests

I use Selenium + TestNG + Maven.
I want to automate vulnerability testing using OWASP ZAP. For this I need to start the ZAProxyScanner before all tests, i.e. execute a method before all tests:
public void initZap(){
zapScanner = new ZAProxyScanner(ZAP_PROXYHOST,ZAP_PROXYPORT,ZAP_APIKEY);
zapScanner.clear(); //Start a new session
zapSpider = (Spider)zapScanner;
}
and, when all functional tests have been executed, run a test that scans for vulnerabilities:
@Test
public void scanning() throws ClientApiException{
spiderWithZap();
setAlertAndAttackStrength();
zapScanner.setEnablePassiveScan(true);
scanWithZap();
}
The method and the test are located in one class, e.g. public class TestSecurity.
Here is a sample of my testng.xml with the packages containing the functional tests:
<suite name="Chrome" thread-count="1" parallel="tests" configfailurepolicy="continue">
<test name="chrome">
<parameter name="browser" value="chrome"/>
<packages>
<package name="tests.suiteLogIn"></package>
<package name="tests.suiteSettings"></package>
<package name="tests.suiteSearch"></package>
</packages>
</test>
</suite>
UPD: I have posted the modified code below, with the before/after-suite handling in it. Otherwise I use only @BeforeMethod/@AfterMethod annotations:
@BeforeMethod(alwaysRun = true)
@Parameters({"browser", "environment"})
public void setUp(@Optional("firefox") String browser, @Optional("local") String environment, Method method) throws IOException {
    System.out.println("Test name: " + method.getName());
    WebDriver driver = getMyDriver(browser, environment);
    System.setProperty(ESCAPE_PROPERTY, "false");
}

@AfterMethod(alwaysRun = true)
@Parameters("browser")
public void tearDown(@Optional("firefox") String browser) {
    DriverMaster.stopDriver();
}

@BeforeSuite
@Parameters("browser")
public void startZap(@Optional("firefox") String browser) {
    if (browser.equals("firefox")) {
        sec.initZap();
    }
}

@AfterSuite
@Parameters("browser")
public void scanZap(@Optional("firefox") String browser) throws ClientApiException {
    if (browser.equals("firefox")) {
        LoginPage lp = new LoginPage(getDriverInstance()).load();
        lp.login("name", "pass");
        sec.scanning();
    }
}
You basically have two options:
Use @BeforeSuite and @AfterSuite, and either include that class in the files to run or make all your classes extend it.
Use ITestListener or ISuiteListener and put the setup and teardown code in their before and after methods (see the sketch below).
With listeners, one advantage that I can see is that if you want to do conditional teardown (scanning) based on some test results, you can control that too.
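A minimal sketch of the listener option, reusing the initZap()/scanning() methods from the question's TestSecurity class (the listener class name and the exception handling are illustrative assumptions):

import org.testng.ISuite;
import org.testng.ISuiteListener;

public class ZapSuiteListener implements ISuiteListener {

    private final TestSecurity sec = new TestSecurity();

    @Override
    public void onStart(ISuite suite) {
        // Runs once before any test in the suite
        sec.initZap();
    }

    @Override
    public void onFinish(ISuite suite) {
        // Runs once after all tests; suite.getResults() could be inspected here
        // to make the scan conditional on the functional test results.
        try {
            sec.scanning();
        } catch (Exception e) {
            throw new RuntimeException("ZAP scan failed", e);
        }
    }
}

Register the listener in testng.xml with a listeners element (or via the @Listeners annotation) so it applies to the whole suite.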

ECMASCRIPT 5 with wro4j and Google Closure Compiler

We are using wro4j with Google Closure Compiler and Maven to minify our JS. By default it does not support strict mode in the JS ("use strict";); it just strips it out. Is there any configuration I can do in pom.xml, or somewhere else, to get it to leave "use strict" in there?
This is the Google Closure Compiler option that does it:
--language_in=ECMASCRIPT5_STRICT
Not sure how to plug that into wro4j. Any ideas?
Create a custom implementation of the manager factory which adds ECMAScript5:
public class MyCustomWroManagerFactory
        extends DefaultStandaloneContextAwareManagerFactory {

    @Override
    protected ProcessorsFactory newProcessorsFactory() {
        final SimpleProcessorsFactory factory = new SimpleProcessorsFactory();
        factory.addPreProcessor(
            new GoogleClosureCompressorProcessor(
                CompilerOptions.LanguageMode.ECMASCRIPT5_STRICT
            )
        );
        return factory;
    }
}
Reference it in the pom.xml as the value of the wroManagerFactory node:
<configuration>
<wroManagerFactory>com.mycompany.MyCustomWroManagerFactory</wroManagerFactory>
</configuration>
According to John Lenz from the Closure Compiler project, if you are using the Compiler API directly, you should specify a CodingConvention.
References
GoogleClosureCompressorProcessor.java - setCompilerOptions method
GoogleClosureCompressorProcessor.java - optionsPool method
Closure Compiler Service API Reference - language | Closure Compiler | Google Developers
It's a bit more complicated in wro4j-maven-plugin 1.8, but not that bad.
You need to add two Java classes. First override newCompilerOptions of GoogleClosureCompressorProcessor like so:
package com.example.package.wro;
import com.google.javascript.jscomp.CheckLevel;
import com.google.javascript.jscomp.ClosureCodingConvention;
import com.google.javascript.jscomp.CompilerOptions;
import com.google.javascript.jscomp.DiagnosticGroups;
import java.nio.charset.Charset;
import org.apache.commons.lang3.CharEncoding;
import ro.isdc.wro.extensions.processor.js.GoogleClosureCompressorProcessor;
/**
* Custom processor overriding `newCompilerOptions` to add custom compiler options.
*
* Original author: Alex Objelean.
*/
public class CustomGoogleClosureCompressorProcessor extends GoogleClosureCompressorProcessor {
/**
* Encoding to use.
*/
public static final String ENCODING = CharEncoding.UTF_8;
@Override
protected CompilerOptions newCompilerOptions() {
final CompilerOptions options = new CompilerOptions();
// Set the language_in option on the Google Closure Compiler to prevent errors like:
// "JSC_TRAILING_COMMA. Parse error. IE8 (and below)"
options.setLanguageIn(CompilerOptions.LanguageMode.ECMASCRIPT5);
/**
* According to John Lenz from the Closure Compiler project, if you are using the Compiler API directly, you should
* specify a CodingConvention. {@link http://code.google.com/p/wro4j/issues/detail?id=155}
*/
options.setCodingConvention(new ClosureCodingConvention());
// use the wro4j encoding by default
//options.setOutputCharset(Charset.forName(getEncoding()));
setEncoding(ENCODING);
options.setOutputCharset(Charset.forName(ENCODING));
// set it to warning, otherwise compiler will fail
options.setWarningLevel(DiagnosticGroups.CHECK_VARIABLES, CheckLevel.WARNING);
return options;
}
}
You'll notice I've commented out the line that calls getEncoding(), because that method is private. I also added a setEncoding() call just in case.
Then we need the custom manager:
package com.example.package.wro;
import ro.isdc.wro.manager.factory.standalone.DefaultStandaloneContextAwareManagerFactory;
import ro.isdc.wro.model.resource.processor.factory.ProcessorsFactory;
import ro.isdc.wro.model.resource.processor.factory.SimpleProcessorsFactory;
/**
* Custom manger adding custom processor.
*/
public class CustomWroManagerFactory extends DefaultStandaloneContextAwareManagerFactory {
@Override
protected ProcessorsFactory newProcessorsFactory() {
final SimpleProcessorsFactory factory = new SimpleProcessorsFactory();
factory.addPreProcessor(
new CustomGoogleClosureCompressorProcessor()
);
return factory;
}
}
Then use it in your pom.xml as the wroManagerFactory, something like this:
<plugin>
<groupId>ro.isdc.wro4j</groupId>
<artifactId>wro4j-maven-plugin</artifactId>
<version>1.8.0</version>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<!-- Google Closure Compiler -->
<!-- http://www.gzfs020.com/using-google-closure-compiler-with-wro4j-maven-plugin.html -->
<configuration>
<contextFolder>${basedir}/src/main</contextFolder>
<wroFile>${basedir}/src/main/config/wro.xml</wroFile>
<destinationFolder>${project.build.directory}/${project.build.finalName}/min</destinationFolder>
<!--
<wroManagerFactory>ro.isdc.wro.extensions.manager.standalone.GoogleStandaloneManagerFactory</wroManagerFactory>
-->
<wroManagerFactory>com.example.package.wro.CustomWroManagerFactory</wroManagerFactory>
</configuration>
</plugin>
