How to get dynamic suiteXmlFile name from pom.xml in java file - maven

My pom.xml reads as follows (plugin coordinates omitted):
<plugins>
    <plugin>
        ...
        <configuration>
            <properties>
                <property>
                    <name>haltonfailure</name>
                    <value>false</value>
                </property>
                <property>
                    <name>delegateCommandSystemProperties</name>
                    <value>true</value>
                </property>
            </properties>
            <reportsDirectory>test-output\${buildTag}</reportsDirectory>
            <suiteXmlFiles>
                <suiteXmlFile>${inputXML}</suiteXmlFile>
            </suiteXmlFiles>
        </configuration>
    </plugin>
</plugins>
All the dynamic parameters are passed in from Jenkins. But how can I read the dynamic inputXML name in my test base class, so that I can apply some conditions based on this XML file? Whenever I try to read it, I always get the literal ${inputXML}, but I need the value that I passed from Jenkins.
Please help here.

It's been a long time, so I'm not sure you are still looking for a solution.
But...
You will need to create a suite XML file dynamically before running your tests:
Create a Java program (it doesn't really need to be Java, though) to generate the suite XML file.
Run Maven with parameters.
Your Java program would look like this (imports and class scaffolding added for completeness):
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

import org.testng.xml.XmlClass;
import org.testng.xml.XmlSuite;
import org.testng.xml.XmlTest;

public class SuiteXmlGenerator {

    public static void main(String[] args) {
        // Get the argument value passed to the VM (-DsuiteName=...)
        String suiteName = System.getProperty("suiteName", "");

        XmlSuite suite1 = new XmlSuite();
        suite1.setName(suiteName);

        // XmlTest
        XmlTest test1 = new XmlTest(suite1);
        test1.setName("TmpTest1");

        Map<String, String> parameterMap = new HashMap<String, String>();
        parameterMap.put("datasheetPathListPath", "This is the parameter");
        test1.setParameters(parameterMap);

        List<XmlClass> classes1 = new ArrayList<XmlClass>();
        XmlClass xmlClass1 = new XmlClass("your.package.ClassName");
        classes1.add(0, xmlClass1);
        test1.setXmlClasses(classes1);

        // Serialize the suite and write it to disk
        String fileContent = suite1.toXml();
        FileWriter fileWriter;
        try {
            fileWriter = new FileWriter("filepath/generated_testNg_file.xml");
            fileWriter.write(fileContent);
            fileWriter.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        System.out.println(suite1.toXml());
    }
}
Then pass the values to mvn like this:
mvn clean compile test-compile exec:java \
    -Dexec.cleanupDaemonThreads=false \
    -Dexec.classpathScope="test" \
    -Dexec.mainClass="package.YourJavaClassNameThatGeneratesSuiteXmlFile" \
    -DsuiteName=YourSuiteName \
    test
If your Java program is not under the test sources (src/test/), remove the -Dexec.classpathScope line.
The mvn command will compile your code before running your program, your program will create the suite XML file, and then Maven will run the tests with that suite XML file.
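For reference, since the pom above resolves the suite file from ${inputXML}, the generated file can then be fed to Surefire through that same property (a sketch; the path matches the FileWriter path in the example program):
mvn test -DinputXML=filepath/generated_testNg_file.xml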

Related

Passing -J parameters programmatically to JMeter

I'm using the programmatic way to run JMeter described in step 4 of this post.
The code looks as follows:
final StandardJMeterEngine jmeter = new StandardJMeterEngine();
JMeterUtils.setJMeterHome(getAbsolutePath("/jmeter"));
JMeterUtils.loadJMeterProperties(getAbsolutePath("/jmeter/bin/jmeter.properties"));
JMeterUtils.initLocale();
try {
    SaveService.loadProperties();
    final File jmeterConfig = new File(getAbsolutePath(pathToJmx));
    final HashTree testPlanTree = SaveService.loadTree(jmeterConfig);
    jmeter.configure(testPlanTree);
} catch (final IOException e) {
    throw new JMeterConfigurationException(e);
}
jmeter.run();
I want to provide values for the ${__P(parameter_name)} parameters I specified in the .jmx file; on the command line this can be done with the -J parameter.
How can I pass values for these parameters in the code above?
Given that you already use the JMeterUtils class, you should be able to call the JMeterUtils.setProperty() function like:
JMeterUtils.setProperty("parameter_name", "foo");
Then, in your script, refer to the property using the __P() function as ${__P(parameter_name,)}.
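A minimal sketch of where that call fits in the runner from the question (everything else unchanged):
JMeterUtils.loadJMeterProperties(getAbsolutePath("/jmeter/bin/jmeter.properties"));
JMeterUtils.initLocale();
// Properties set here override jmeter.properties, just like -J does on the
// command line; ${__P(parameter_name)} picks the value up at run time.
JMeterUtils.setProperty("parameter_name", "foo");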
You can also add the following line:
parameter_name=foo
to the jmeter.properties file which you're loading with the JMeterUtils.loadJMeterProperties function.
Don't forget to add ApacheJMeter_functions.jar to your project classpath, otherwise the __P() function will not be resolved.
More information: Apache JMeter Properties Customization Guide

How to set the project root directory as the destination directory for a custom Maven Mojo plugin?

I have written a custom Mojo plugin to extract a zip file to a directory in the project.
This is my destination directory in the Mojo class:
String destinationDirectory = "lib";
It works properly: it creates the directory and extracts the content of the zip into it.
Now what I want is to extract that content into the project's root directory.
If I set it like the following, it gives an error, since the destination directory is empty:
String destinationDirectory = "";
How can I get this done?
The following is the method I created to extract the zip:
public void unpack(String zipFilePath, String destDirectory) throws IOException {
    File destDir = new File(destDirectory);
    // make the destination directory if it does not exist
    if (!destDir.exists()) {
        destDir.mkdir();
    }
    ZipInputStream zipIn = new ZipInputStream(new FileInputStream(zipFilePath));
    ZipEntry entry = zipIn.getNextEntry();
    // iterate over the entries in the zip file
    while (entry != null) {
        String filePath = destDirectory + File.separator + entry.getName();
        if (!entry.isDirectory()) {
            // if the entry is a file, extract it
            extractFile(zipIn, filePath);
        } else {
            // if the entry is a directory, make the directory
            File dir = new File(filePath);
            dir.mkdir();
        }
        zipIn.closeEntry();
        entry = zipIn.getNextEntry();
    }
    zipIn.close();
}
/**
 * Extracts a zip entry
 * @param zipIn zip input stream positioned at the entry to extract
 * @param filePath destination path of the zip entry
 * @throws IOException
 */
private void extractFile(ZipInputStream zipIn, String filePath) throws IOException {
    BufferedOutputStream bos = new BufferedOutputStream(new FileOutputStream(filePath));
    byte[] bytesIn = new byte[BUFFER_SIZE];
    int read = 0;
    while ((read = zipIn.read(bytesIn)) != -1) {
        bos.write(bytesIn, 0, read);
    }
    bos.close();
}
How can I get the existing project's root directory at run time, while the Mojo is executing?
I solved the problem by adding the plugin to the pom.xml in the src/main/resources/archetype-resources directory.
A Maven plugin cannot be included in the archetype itself; on the other hand, the project generated from the archetype is the one that should have the capability of extracting the zip file. So the generated project's pom, shown below, is the correct place to include the plugin:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>truezip-maven-plugin</artifactId>
    <version>1.2</version>
    <executions>
        <execution>
            <id>copy-package</id>
            <goals>
                <goal>copy</goal>
            </goals>
            <phase>package</phase>
            <configuration>
                <verbose>true</verbose>
                <fileset>
                    <!-- This directory is to be modified -->
                    <directory>${path-to-the-zip-file}</directory>
                    <outputDirectory>${project.basedir}</outputDirectory>
                </fileset>
            </configuration>
        </execution>
    </executions>
</plugin>
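As an alternative, if you do need the project root inside the Mojo itself at run time, Maven can inject it as a plugin parameter. A minimal sketch, assuming the standard maven-plugin-annotations API (the mojo name and class are placeholders):
import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

@Mojo(name = "unpack-zip")
public class UnpackZipMojo extends AbstractMojo {

    // Maven injects the base directory of the project the plugin is
    // currently executing in (the directory holding its pom.xml).
    @Parameter(defaultValue = "${project.basedir}", readonly = true)
    private File basedir;

    @Override
    public void execute() throws MojoExecutionException {
        getLog().info("Project root: " + basedir.getAbsolutePath());
        // e.g. unpack("path/to/archive.zip", basedir.getAbsolutePath());
    }
}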

Append to file in HDFS (CDH 5.4.5)

Brand new to HDFS here.
I've got this small section of code to test out appending to a file:
val path: Path = new Path("/tmp", "myFile")
val config = new Configuration()
val fileSystem: FileSystem = FileSystem.get(config)
val outputStream = fileSystem.append(path)
outputStream.writeChars("what's up")
outputStream.close()
It is failing with this message:
Not supported
java.io.IOException: Not supported
at org.apache.hadoop.fs.ChecksumFileSystem.append(ChecksumFileSystem.java:352)
at org.apache.hadoop.fs.FileSystem.append(FileSystem.java:1163)
I looked at the source for ChecksumFileSystem.java, and it seems to be hardcoded to not support appending:
@Override
public FSDataOutputStream append(Path f, int bufferSize,
        Progressable progress) throws IOException {
    throw new IOException("Not supported");
}
How to make this work? Is there some way to change the default file system to some other implementation that does support append?
It turned out that I needed to actually run a real Hadoop namenode and datanode. I am new to Hadoop and did not realize this. Without them, FileSystem.get falls back to your local filesystem, which is a ChecksumFileSystem and does not support append. So I followed the blog post here to get Hadoop up and running on my system, and now I am able to append.
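In other words, the Configuration needs to point at the running HDFS rather than the local filesystem. A minimal sketch in Java (the namenode host and port are assumptions that depend on your setup; an equivalent core-site.xml on the classpath achieves the same thing):
Configuration config = new Configuration();
// Without this setting (or a core-site.xml), FileSystem.get() returns the
// local ChecksumFileSystem, whose append() throws "Not supported".
config.set("fs.defaultFS", "hdfs://localhost:8020");
FileSystem fileSystem = FileSystem.get(config);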
The writing is done through the FSDataOutputStream returned by append; FileSystem.get() is just used to connect to your HDFS. First, set dfs.support.append to true in hdfs-site.xml:
<property>
    <name>dfs.support.append</name>
    <value>true</value>
</property>
Stop all your daemon services using stop-all.sh and restart them using start-all.sh. Then put this in your main method:
String fileuri = "hdfs/file/path";
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create(fileuri), conf);
// append() returns a stream positioned at the end of the existing file
FSDataOutputStream out = fs.append(new Path(fileuri));
PrintWriter writer = new PrintWriter(out);
writer.append("I am appending this to my file");
writer.close();
fs.close();

How to get the JUnit surefire report programmatically

I need to use JUnitCore to run unit tests from a standalone jar test tool. I have added the Surefire jars as dependencies, and there I found the class org.apache.maven.surefire.junitcore.JUnitCoreRunListener. In this GitHub project there is a sample of its usage.
My intention is to get the XML report that running Maven would produce, but without Maven at runtime.
RunListener reporter = new JUnitCoreRunListener(new MockReporter(),
        new HashMap<String, TestSet>());
JUnitCore jc = new JUnitCore();
jc.addListener(reporter);
Result r = jc.run(new Computer(), AppTest.class);
jc.removeListener(reporter);
System.out.println(r);
This is not working. What is the correct approach to get the XML output from that reporter?

Maven module dependency and resources

I have a Maven project, say ProjectA, and there is a class, say ClassA, in this project which uses files from the resources folder. I am using the following way to read from resources:
String fileName = IPToGeoMapper.class.getResource("/resourceFile.txt").getFile();
File resourceFile = new File(fileName);
and it works like a charm.
Now, when I create an artifact out of this project (I have tried extracting the created jar, and it has resourceFile.txt packed inside), and use it as a dependency in another project, say ProjectB, ClassA no longer finds resourceFile.txt, as it tries to browse through the resources folder of ProjectB.
I want a global solution which will work in all the projects where I import the artifact created from ProjectA.
What is the best way to handle this?
Try this way: read the resource as a stream from the class loader instead of converting it to a File. A stream works even when the file is packed inside a jar, whereas new File(...) only works while the resource is a plain file on disk. Taking an example of reading a property file:
Properties propfile = new Properties();
propfile.load(PropertyUtils.class.getClassLoader()
        .getResourceAsStream("applicationprops.properties"));
And for a plain text file, the same class-loader approach works:
BufferedReader reader = new BufferedReader(new InputStreamReader(
        ClassA.class.getClassLoader().getResourceAsStream("file.txt")));
StringBuilder out = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
    out.append(line);
}
System.out.println(out.toString()); // prints the string content read from the input stream
reader.close();
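Applied to the original snippet from the question, the fix is to drop the File round-trip entirely (a sketch reusing the question's IPToGeoMapper class and resourceFile.txt):
// Works whether resourceFile.txt sits in target/classes during development
// or inside ProjectA's jar when used as a dependency in ProjectB.
InputStream in = IPToGeoMapper.class.getResourceAsStream("/resourceFile.txt");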
