Maven: How to manage test code module dependencies

We've migrated an Ant project (Codename One) to a Maven project using a migration tool. Unfortunately, there is still a problem: somehow the test source directory (set via testSourceDirectory in the pom.xml) ends up belonging to the wrong module (at least that is how it is shown in the project view of IntelliJ IDEA). As a result, the test source code is missing necessary core dependencies. The core code (the actual implementation) lives in the "common" module. Even though the test code is located (in IntelliJ) under the "common" module, it is itself marked as being part of the "cn1libs" module. We have no idea how the IDE or Maven concludes this from the pom configuration.
snippet from the pom:
<testSourceDirectory>${project.basedir}/common/src/test/java</testSourceDirectory>
The dependencies in the test sources can't be resolved.
How can we fix this?

Sometimes IntelliJ will give you erroneous labels like this if more than one module references that directory in its pom file. Check your cn1libs/pom.xml file and make sure it doesn't specify <testSourceDirectory> anywhere. Its packaging type should also be pom.
Codename One projects are set up to run unit tests using the Codename One test runner, which has its own "test" goal. You are importing JUnit's Test class, which may be problematic here unless you really know what you are doing: the Codename One test runner will set up the test environment and run the tests in a simulated environment, while JUnit will just run them raw.
If you need to add test dependencies, however, you should be able to add them in the dependencies section of common/pom.xml; just make sure you set the dependency scope to "test".
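For example, a test-scoped dependency in common/pom.xml might look like the sketch below (the group, artifact, and version are only placeholders, not actual Codename One coordinates):

```xml
<dependencies>
  <!-- available on the test classpath only; not packaged with the app -->
  <dependency>
    <groupId>org.example</groupId>
    <artifactId>some-test-library</artifactId>
    <version>1.0.0</version>
    <scope>test</scope>
  </dependency>
</dependencies>
```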

Related

IntelliJ + Gradle Multiproject: How to tell how the target source set is called?

We are currently checking whether we can switch from Eclipse to IntelliJ as our IDE.
In this project we are using a Gradle multi-project build whose structure looks something like this:
Project
|- ProjectA
|  |- ProjectAImpl
|  |  |- main*
|  |- ProjectATest
|     |- test*
|- ProjectB
   |- ProjectBImpl
   |  |- main*
   |- ProjectBTest
      |- test*
* = source set (or, as it appears in IntelliJ, a module)
ProjectBTest has a dependency on ProjectATest, which is configured as
compile project(":ProjectA:ProjectATest")
This always worked properly in Eclipse, but in IntelliJ I have the problem that ProjectBTest is configured to look for a module named "Project.ProjectA.ProjectATest.main" instead of "Project.ProjectA.ProjectATest.test".
This module can obviously not be found, leading to a lot of compiler errors.
Can somebody maybe give me a hint how to tell IntelliJ or Gradle to pick the proper module?
Thank you very much.
This is standard Gradle functionality. Unless you have other Gradle customizations (like feature variants or changing the source directories for a source set), project dependencies will naturally target the main source set.
There are several ways to solve this, but two primary ones that stand out to me:
Use Gradle's Java test fixtures.
The "test" source set is not inheritable in Gradle: there is no built-in consumable configuration that exposes test classes to downstream projects. Java test fixtures, however, give you a separate testFixtures source set which is shareable. To use them, you would do the following:
Add the java-test-fixtures plugin to all projects which need to produce shared test sources
Move your shared test sources to <project directory>/src/testFixtures (ideally this would include as few actual test classes as possible, but rather just shareable test logic instead)
Change your dependency references to point to the upstream project's test fixtures artifact: testImplementation(testFixtures(project(":ProjectA:ProjectATest")))
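A minimal sketch of that setup (untested; the project paths follow the structure above):

```kotlin
// ProjectA/ProjectATest/build.gradle.kts
plugins {
    `java-test-fixtures`
}
// shared test helpers now live under src/testFixtures/java

// ProjectB/ProjectBTest/build.gradle.kts
dependencies {
    testImplementation(testFixtures(project(":ProjectA:ProjectATest")))
}
```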
Register a tests configuration which includes the test classes as an output.
project.configurations.register("tests") {
    extendsFrom(project.configurations[JavaPlugin.TEST_RUNTIME_CONFIGURATION_NAME])
}

tasks.register("testJar", Jar::class) {
    archiveClassifier.set("test-classes")
    from(project.the<SourceSetContainer>()[SourceSet.TEST_SOURCE_SET_NAME].output)
}

project.artifacts.add("tests", project.tasks.named("testJar"))

Downstream projects:

dependencies {
    compile(project(":ProjectA:ProjectATest", "tests"))
}
None of the above code is tested. It may require some adjustments.
Java test fixtures are a supported way to produce shareable test sources, so they should be preferred, but the tests configuration may be quicker to implement, depending on your use case.

IntelliJ TestNG Maven Test automation project structure

I'm thinking about a test automation structure using Selenium, the IntelliJ IDE, TestNG, and Maven. What do you think about the layout below?
I used one project with many directories because I want to have just one pom file. Could you help me with the TestNG file? How should it look if I want to run all tests that are available in all "Tests" directories? That is: click run, and fire up all tests carrying TestNG's "Test" annotation. The helpers, pages, and tests directories exist because I want to do this with POM & Page Factory.
#Sid: below is my pom. My testng.xml is currently empty because I do not know how to configure it to run everything in the "Tests" directories.
Thank you for the reply.
My tests are just examples with before-test, test, and after-test annotations. Nothing to admire ;)
Too long for a comment:
I would assume your helper class is going to have common functions. Also, depending on the size of your modules, you may want to create more sub-module folders. You can also add a commons folder containing generic steps and methods.
Now, if your modules are deployed completely independent of each other, you want to take a call on whether the code should reside with the app code or in one place like you have.
The structure would work out fine either way. To run all the tests, you need to include the folder/class paths in your testng.xml files. The IDE, Maven, and TestNG don't care about your folder structure as long as you include all the paths correctly. Check out https://www.mkyong.com/unittest/testng-tutorial-5-suite-test/ for how to do that.
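For instance, a testng.xml that includes everything under a package tree could look roughly like this (the package name is a placeholder for wherever your "Tests" directories actually live):

```xml
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="AllTests">
  <test name="All">
    <packages>
      <!-- the trailing .* includes all sub-packages -->
      <package name="com.example.tests.*"/>
    </packages>
  </test>
</suite>
```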

Maven module dependency source instead of repository jars

I have a multi-module project, i.e.
parent
|- module1
|- module2
In one dev cycle, I added a class mod1.A to module1. Class mod2.B in module2 depends on it.
I do not have the artifacts in my local .m2/repository. Running this:
$ cd prj/module2
$ mvn -o exec:java -Dexec.mainClass=mod2.B
results in an error along the lines of:
The following artifacts could not be resolved: com.example:module1:jar:1.0-SNAPSHOT
After I install the artifacts via mvn install while in the prj folder, it all works as expected.
However, this presents an issue in at least two ways:
I have to go through the slower install phase instead of the faster compile phase
I have two versions of the same project with conflicting modifications. I cannot run the same Java class with their respective modifications, only with the currently installed modifications, since both copies share the same SNAPSHOT version
There are workarounds for both (skipping parts of the build for the first, different snapshot versions for the second), but they are far from usable in practice.
Is there a way to make maven use the local modules, instead of using artifacts from local maven repository?
If I understand your question correctly, it seems like you are living a bit outside the norm here: you have two local "copies" of the project with different modifications, that you want to work with alternately when running "exec:java". And Maven is getting in your way: it expects your local .m2 repository area to be in play, but the version strings in each copy are the same, so you end up with the changes interfering among the copies.
To me, it sounds like what you are trying to do is test your changes. I suggest you write an actual JUnit or TestNG test in module2 that tests what you want (it can just call mod2.B's main method if you want). Then, from your chosen project directory, you can run mvn test -Dtest=MyTestName. It won't "install" anything, and it will find the dependencies the way you want it to.
Otherwise, I can see three options.
Change the version string locally in one of the copies (mvn versions:set -DnewVersion=B-SNAPSHOT can do this for you). That way any "installed" jars from your work on that copy will not be considered by the other copy, and vice-versa. You refer to this as being "far from usable" ... I think it should be fine? These are different versions of the project! They should have different version strings! I strongly recommend this option out of the three. (You can do mvn versions:revert when done if you used :set, or you can rely on version control to undo the change.)
Select a different local repository used by Maven when working on one of the projects, with a command-line flag as per https://stackoverflow.com/a/7071791/58549. I don't really think this is a good solution, since you would have to be very careful about using the right flags every time with both projects. Also you'd end up having to re-download Maven plugins and any other dependencies into your new local repository anyway, which is kind of a waste of time.
Try to avoid using any local repository at all. You seem to be trying to make this option work. I don't think this is a great approach either; you're fighting against Maven's expectations, and it limits your flexibility a lot. Maven will indeed find dependencies from the "reactor" (i.e., the executing mvn process) first, but this means all of the required modules must be available in the reactor to be found, which means you can only run mvn at the top level. So if instead you want to just do "mvn exec:java" inside a single module, mvn needs to find that module's dependencies somewhere ... and that's what the local repo is generally used for.
If you're dead set on going with option 3 (instead of option 1), then I suggest you follow the comments on your question and create a profile that runs your exec selectively against module2 and binds it to a lifecycle phase. But this is in practice very close to just wrapping it with a test.
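A rough sketch of such a profile in module2/pom.xml (untested; the phase binding and main class are only illustrative):

```xml
<profiles>
  <profile>
    <id>run-b</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>exec-maven-plugin</artifactId>
          <executions>
            <execution>
              <!-- runs after the module has been compiled and tested -->
              <phase>verify</phase>
              <goals><goal>java</goal></goals>
              <configuration>
                <mainClass>mod2.B</mainClass>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

Invoked from the parent directory (e.g. mvn verify -Prun-b), the whole reactor is in play, so module1 is resolved from the build rather than from the local repository.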
For IntelliJ users:
I solved this problem using IntelliJ's Run configuration. It has the options Resolve workspace artifacts and Add before launch task -> Build. See this picture for clarification:
Run configuration example
The whole point of modules in Maven is to create decoupling between them. You either build each module independently, so that you can work on one module without touching the other, or include both modules as sub-modules in the parent pom and build the parent, which will resolve dependencies between its sub-modules and trigger their builds.
It looks like you have two options here:
Review the structure of your project. Do you really need to split it into two separate modules, if you change code in both of them simultaneously?
Import the project into a Maven-aware IDE (IntelliJ IDEA is very good at working with Maven) and let the IDE handle the compilation. Once the code base is finished and stabilized, build normally with Maven.

Difference between the main and the test folder in Maven

I am a bit confused about the difference between the main folder and the test folder in Maven. As of now, I just copy and paste my source code into both of them and it works fine. I don't get the point of having another folder with exactly the same content as the main folder. Can someone please explain this to me?
Also:
What is the difference between install and compile?
So for this command: mvn archetype:generate, is generate the goal? Then what is archetype?
Thanks
The main folder contains your application code and resources, and the test folder contains, well, test code and resources. So don't copy your application code there, but only the tests. The test sources are then automatically added to the classpath in the test phases.
For the difference between install and compile, have a look at https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html. Basically, install also runs compile and a lot more goals (like execution of tests, packaging, and installing into the local repository).
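To illustrate: the default lifecycle runs its phases in a fixed order (abridged here; there are more intermediate phases), so invoking install executes everything before it as well:

```
validate -> compile -> test -> package -> verify -> install -> deploy
```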
generate would be the goal, correct. archetype is the short form for maven-archetype-plugin and means the plugin, which contains the goal. By default plugins with the name pattern maven-*-plugin or *-maven-plugin can be shortened that way.
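So the short form is simply an abbreviation for addressing the plugin by its full coordinates; both of these invocations run the same goal:

```
mvn archetype:generate
mvn org.apache.maven.plugins:maven-archetype-plugin:generate
```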
Separation between the src/main and src/test folders is a standard practice, where the same package structure under both guarantees that your com.some.Class finds its way onto the classpath and is visible when the com.some.ClassTest unit test runs.
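Concretely, the conventional layout mirrors the package structure on both sides (com.some is just an example package):

```
src/main/java/com/some/Class.java       <- application code, packaged into the artifact
src/test/java/com/some/ClassTest.java   <- test code, compiled only for the test phases
```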
For the difference between install and compile, read the documentation on the Maven lifecycle. Essentially, every time you invoke one build phase, every build phase defined before it in the lifecycle gets called in the defined order.
See the documentation about what an Archetype is.

Maven: Dealing with a truly circular dependency

I have a somewhat complex situation and I'm not sure what the best way to set up my Maven environment is.
I'm writing a framework to allow the creation of tests for a particular system. The system can host a variety of applications, and I have a set of tests for each application, which I'd like to keep separate from the framework (which handles the general concept of a "test", and message sending/receiving etc). Also, the framework provides a generic UI, so it can be built as a war and deployed allowing you to run and configure tests.
What I'm currently doing is building the framework both as a jar and war, listing the framework as a dependency in each application test suite, and pulling in all the framework code using an overlay so each suite can build its own war to deploy. This is gross.
What I'd like is to be able to specify (probably via profiles) which test suites to build in the framework, and end up with a single framework.war file with the specified suites included. Trying to build the poms for this I keep running into a circular dependency because:
To build the tests, the test projects must depend on the framework
To pull in the specified test jars, the framework must depend on those test projects
Things I've tried:
Make the test suites sub-projects of the framework:
This doesn't work, as (I think) I can't package the final result as a war (aggregator projects only allow pom packaging)
List the test .jars as system dependencies:
This works, but it's gross to have to manually specify a path to the jar
Put the tests as Java packages inside the framework and compile only what you want via filters:
Technically possible, but I would really prefer the logical separation into separate Maven projects, as each test suite can be configured individually, and I'd like to keep all that configuration out of the framework pom
What would be ideal would be a parent project pom that would:
compile the framework with no tests
compile the specified test suites
rebuild the framework .war, including the specified test suite jars
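One way this might be sketched is with optional per-suite dependencies behind profiles in the framework pom (untested, and the suite coordinates are placeholders; whether this actually breaks the cycle depends on the suites being built first):

```xml
<profiles>
  <profile>
    <id>suite-app1</id>
    <dependencies>
      <!-- pulls the pre-built suite jar into the framework war -->
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>app1-tests</artifactId>
        <version>${project.version}</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```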
I'm not sure if this is possible and/or even advisable, but it seems the best solution to me. Thanks in advance for any suggestions for organizing this project.
