Multiple @Before/@After with Cucumber and Spring? - spring

I don't understand the Cucumber configuration and cannot find any example on this.
Folder structure:
src/
|-test/
  |-resources/
  | |-cucumber/
  |   |-auth/
  |   | |-Login.feature
  |   |-contributions/
  |     |-ClearanceCertificates.feature
  |-java/
    |-de/
      |-vbg/
        |-other/
          |-again/
            |-important/
              |-cucumber/
              | |-auth/
              | | |-LoginStepDefinitions.java
              | |-contributions/
              |   |-ClearanceCertificatesStepDefinitions.java
              |-CucumberBase.java
              |-SeleniumApplicationTests.java
CucumberBase.java:
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources",
        plugin = {"pretty", "html:build/reports/tests/test/cucumber.html"})
public class CucumberBase {}
SeleniumApplicationTests.java:
@CucumberContextConfiguration
@SpringBootTest
class SeleniumApplicationTests {}
Both *StepDefinitions.java have a @Before and @After defined. My assumption was that the step definitions are matched to the *.feature files based on location/package.
In my tests, though, Login.feature executes the @After defined in ClearanceCertificatesStepDefinitions.java.
How should this be configured properly?

All step definitions and hooks on the glue path are global. So all scenarios can access all step definitions on the glue path, and all hooks on the glue path are executed before/after each scenario.
If you have hooks that should only be executed for particular scenarios, you can use conditional hooks: associate a Before or After hook with a tag expression so it runs only for certain scenarios.
Feature: Example

  @browser
  Scenario: Open a browser window
    ...

  @headless
  Scenario: Make a http call
    ...

@After("@browser and not @headless")
public void doSomethingAfter(Scenario scenario){
    // only executed after "Open a browser window"
}
https://cucumber.io/docs/cucumber/api/#conditional-hooks
Alternatively you can change the organization of your code.
|- src/test/java/com/example/one/RunCucumberTest1.java
|- src/test/resources/com/example/one/example-1.feature
|- src/test/java/com/example/two/RunCucumberTest2.java
|- src/test/resources/com/example/two/example-2.feature
|- src/test/java/com/example/common/AbstractCucumberTest.java
|- src/test/java/com/example/common/SeleniumApplicationTests.java
@RunWith(Cucumber.class)
@CucumberOptions(extraGlue = "com.example.common")
public abstract class AbstractCucumberTest {}
public class RunCucumberTest1 extends AbstractCucumberTest {}
public class RunCucumberTest2 extends AbstractCucumberTest {}
Cucumber will scan the package of the runner for glue and features, so the glue and feature paths can be omitted. By setting the extraGlue property, the common configuration doesn't have to be duplicated.
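For completeness, a minimal sketch of the shared glue class that would live in com.example.common, mirroring the SeleniumApplicationTests from the question (the imports assume Cucumber 5+ with cucumber-spring):

package com.example.common;

import io.cucumber.spring.CucumberContextConfiguration;
import org.springframework.boot.test.context.SpringBootTest;

// Picked up by every runner through extraGlue = "com.example.common",
// so the Spring context configuration is defined exactly once.
@CucumberContextConfiguration
@SpringBootTest
public class SeleniumApplicationTests {}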

Related

How to make SpringBoot application and tests work with same path resolving service?

A custom class PathConfigurationService sets the current path of the running application to C:\project. But since the test is inside moduleX, its user.dir system property differs from the main application's, returning C:\project\moduleX and making PathConfigurationService generate wrong paths to the external data folder. How can the test pick up the correct path of the data folder?
- data
  - fileA
  - fileB
- core
  - src/main/java
    - PathConfigurationService
  - src/test/java
    - AbstractTest
- moduleX
  - src/test/java
    - Test (extending AbstractTest)
ModuleX test:
@ActiveProfiles("test")
@SpringBootTest(classes = SpringBootApplication11.class)
@ContextConfiguration(classes = SpringBootApplication11.class,
        initializers = ConfigFileApplicationContextInitializer.class)
class Test extends AbstractTest {
    @Autowired
    SpringBootApplication11 app;
}
Abstract test:
@ExtendWith(SpringExtension.class)
public abstract class AbstractTest {
    @Test
    void test() {
    }
}
You should make use of the resources folder.
In fact, this will let you access your data folder the same way from your sources and your tests.
The structure of your project may look like this:
Project
|-- pom.xml
`-- src
    |-- main
    |   |-- java
    |   `-- resources
    `-- test
        |-- java
        `-- resources
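For example, a file placed under src/main/resources/data (or src/test/resources/data for test-only data) can then be read from the classpath; the path and class name below are just an illustration:

import java.io.InputStream;

import org.springframework.core.io.ClassPathResource;

public class DataAccessExample {

    public InputStream openFileA() throws Exception {
        // Resolved from the classpath, so the same code works from main sources and
        // from tests in any module, regardless of the working directory (user.dir).
        return new ClassPathResource("data/fileA").getInputStream();
    }
}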
Removing $MODULE_WORKING_DIR$ from Working directory input box in IntelliJ Run/Debug Configuration solved the problem.

Separating integration vs. unit tests in gradle based on abstract class

I'm trying to separate my gradle/spock tests into two groups:
unit tests
integration tests
My attempt was with JUnit's @Category. In build.gradle I created a task for integration/e2e tests:
task e2eTest(type: Test) {
    useJUnit {
        includeCategories 'com.foo.bar.baz.E2ESpec'
    }
}
And marked my base abstract class with @Category(E2ESpec), but it doesn't work.
I've noticed that inheritance works, but only with single level inheritance:
@Category(E2ESpec)
abstract class AbstractSpec {...}
class ActualSpec extends AbstractSpec {...}
but doesn't work for cases like:
@Category(E2ESpec)
abstract class AbstractSpec {...}
abstract class AnotherAbstractSpec extends AbstractSpec {...}
class ActualSpec extends AnotherAbstractSpec {...}
Any idea how to fix it?
PS. I have many classes extending AbstractSpec and new classes keep appearing, so I don't want @Category on each spec. Maybe a pure Gradle solution exists?
Create a new sourceset for integration tests, with a corresponding task. See How do I add a new sourceset to Gradle?
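A rough sketch of what that could look like in build.gradle (directory names and task wiring are assumptions, not taken from the question):

sourceSets {
    integrationTest {
        groovy.srcDir 'src/integrationTest/groovy'
        resources.srcDir 'src/integrationTest/resources'
        compileClasspath += sourceSets.main.output + sourceSets.test.output
        runtimeClasspath += sourceSets.main.output + sourceSets.test.output
    }
}

configurations {
    integrationTestImplementation.extendsFrom testImplementation
    integrationTestRuntimeOnly.extendsFrom testRuntimeOnly
}

task integrationTest(type: Test) {
    description = 'Runs the integration/e2e specs from the integrationTest source set.'
    testClassesDirs = sourceSets.integrationTest.output.classesDirs
    classpath = sourceSets.integrationTest.runtimeClasspath
    shouldRunAfter test
}

With this layout the specs extending AbstractSpec simply live under src/integrationTest/groovy, so no @Category annotation is needed at all.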

How to handle a modular spring project with flyway and single db [duplicate]

This question already has answers here:
Flyway multiple metadata tables in one schema
(2 answers)
Closed 4 years ago.
Situation
I have a modular Spring Boot project. As a database schema manager, I would like to use Flyway.
As already stated, the project is modular. This is because there will be different configurations which use different modules. This means that I would like to pack everything related to a module into its specific project. With Flyway this doesn't seem that simple.
Issue
What I ideally imagine:
ApplicationA
|
|_Module1
|  |
|  |_db/migration
|     |
|     |_V1__InitModule1Tables
|     |_V2__MigrateSomeTable
|
|_Module2
   |
   |_db/migration
      |
      |_V1__InitModule2Tables
      |_V2__MigrateSomeTable
Each module defines its own Flyway scripts independently, as they don't know of each other's existence anyway. Each module would obviously need its own Flyway history table inside the shared DB. This way the whole system is decoupled, and configuring the next application ApplicationB with Module1 and Module3 won't be a hassle.
Well, I didn't find any configuration option in Flyway to reach this solution.
What I've tried
Doing something like this within each module is obviously a bad idea, as the initialization/execution order of beans is rather random, resulting in the tables not being created when I need them for other configurations. It also seems messy.
@Configuration
public class Module1Config {
    @Autowired
    public void flyway(DataSource dataSource) {
        Flyway flyway = new Flyway();
        flyway.setBaselineOnMigrate(true);
        flyway.setTable(flyway.getTable() + "-module1");
        flyway.setDataSource(dataSource);
        flyway.migrate();
    }
}
I don't think I'm the first person who tries to achieve this. How could I reach the desired modular Flyway configuration?
* Update *
Solution
The following solution, which works in the way the duplicate topic suggests, is working for me:
I created a configuration template in my base module, which is used by every other module as it provides global functions such as logging and journaling.
public abstract class FlywayConfig {

    private final String baseScriptLocation = "classpath:db.migration.";
    private final String moduleName;

    public FlywayConfig(String moduleName) {
        this.moduleName = moduleName;
    }

    @Autowired
    public void migrate(DataSource dataSource) {
        Flyway flyway = new Flyway();
        flyway.setDataSource(dataSource);
        flyway.setSchemas(moduleName.toUpperCase());
        flyway.setLocations(baseScriptLocation + moduleName.toLowerCase());
        flyway.migrate();
    }
}
In each module, I simply extend this configuration class:
@Configuration
public class BaseConfig extends FlywayConfig {

    public static final String MODULE_NAME = "base";

    public BaseConfig() {
        super(MODULE_NAME);
    }
}
My Flyway scripts are saved under db.migration.*MODULE_NAME*.
There are really only two possible scenarios here:
Your modules are fully independent and there are no relationships in the database between objects belonging to different modules: use a separate schema history table per module (a sketch follows below).
Your modules do have database objects with relationships to objects belonging to other modules: you now have in effect a single global lifecycle for the database and should therefore use a single schema history table for all modules.
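A minimal sketch of the first scenario, using the same pre-6.x Flyway API as the snippets above (the module name, table name and script location are only illustrations):

@Autowired
public void migrateModule1(DataSource dataSource) {
    Flyway flyway = new Flyway();
    flyway.setDataSource(dataSource);
    // One schema history table per module keeps the modules' migrations independent.
    flyway.setTable("schema_version_module1");
    flyway.setLocations("classpath:db/migration/module1");
    flyway.migrate();
}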
More info about modular applications: https://speakerdeck.com/axelfontaine/majestic-modular-monoliths

Spring profiles on integration tests class

We have Selenium tests which are run by a Java test class.
On the local environment everything is OK, but I want to switch off those tests when they run on Jenkins.
So I use:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@WebIntegrationTest("server.port=1234")
@Profile("!jenkins")
@ActiveProfiles("integrationtests")
public class LoginAndEditProfileSeleniumTest {
...
What works:
running mvn clean test runs all tests locally, with the integrationtests profile active. I don't want to pass any additional parameter.
What I want to achieve:
running mvn clean test -Dspring.profiles.active=jenkins switches off this test.
Can I somehow merge the profile passed by parameter with the ActiveProfiles annotation, and take the Profile annotation into consideration? :)
//update:
It's possible to use a class implementing ActiveProfilesResolver:
public class ActiveProfileResolver implements ActiveProfilesResolver {

    @Override
    public String[] resolve(Class<?> testClass) {
        final String profileFromConsole = System.getProperty("spring.profiles.active", "");
        List<String> activeProfiles = new ArrayList<>();
        activeProfiles.add("integrationtests");
        if (profileFromConsole.contains("jenkins")) {
            activeProfiles.add("jenkins");
        }
        return activeProfiles.toArray(new String[activeProfiles.size()]);
    }
}
but it does not seem to cooperate with @Profile anyway (the jenkins profile is active but the test is still running).
@Profile has zero effect on test classes. Thus, you should simply remove that annotation.
If you want to enable a test class only if a given system property is present with a specific value, you could use @IfProfileValue.
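For illustration, a class-level @IfProfileValue could look like this (the property name and value are made up; by default the value is matched against a system property):

import org.springframework.test.annotation.IfProfileValue;

// The tests in this class run only when the JVM is started with -Dtest.env=ci.
@IfProfileValue(name = "test.env", value = "ci")
public class CiOnlySeleniumTest {
    // ...
}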
However, in your scenario, you want to disable a test class if a given system property is present with a specific value (i.e., if spring.profiles.active contains jenkins).
Instead of implementing a custom ActiveProfileResolver, a more elegant solution would be to use a JUnit assumption to cause the entire test class to be ignored if the assumption fails.
This should work nicely for you:
import static org.junit.Assume.*;

// ...

@BeforeClass
public static void disableTestsOnCiServer() {
    String profilesFromConsole = System.getProperty("spring.profiles.active", "");
    assumeFalse(profilesFromConsole.contains("jenkins"));
}
Regards,
Sam (author of the Spring TestContext Framework)

Spring #Transactional not starting a transaction

I have a multi-module Maven project with the following structure:
project
|
-- data
|   |
|   -- DepartmentRepository.java
|
-- domain
|   |
|   -- Department.java
|
-- service
|   |
|   -- DepartmentService.java
|   -- DepartmentServiceImpl.java
|
-- web
    |
    -- DepartmentController.java
The project uses Spring 4.1.4, Spring Data JPA 1.7.2 and Maven 3.1.0. The following classes are included:
@Entity
class Department {}

interface DepartmentRepository extends JpaRepository<Department, Long> {}

interface DepartmentService {
    List<Department> getAll();
}

@Service
@Transactional(readOnly = true)
class DepartmentServiceImpl implements DepartmentService {

    @Autowired
    private DepartmentRepository departmentRepository;

    @Transactional
    public List<Department> getAll() {
        return departmentRepository.findAll();
    }
}
I was hoping that as soon as the code enters DepartmentServiceImpl.getAll, a transaction would be started. However, I am finding that this is not the case: no transaction is started. I have checked this by examining TransactionSynchronizationManager.isActualTransactionActive() inside this method, which prints false. I have also checked by putting breakpoints in TransactionAspectSupport.invokeWithinTransaction. However, as soon as departmentRepository.findAll is invoked, a transaction is correctly started (since SimpleJpaRepository, the class which provides the implementation of the JPA repository interface, is also annotated with @Transactional).
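For illustration, the check described above sits roughly like this inside the service method (a sketch based on the classes shown earlier):

import org.springframework.transaction.support.TransactionSynchronizationManager;

@Transactional
public List<Department> getAll() {
    // Expected to print true once a transaction has been started for this method,
    // but it actually prints false.
    System.out.println(TransactionSynchronizationManager.isActualTransactionActive());
    return departmentRepository.findAll();
}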
A complete sample application demonstrating the problem is available on Github.
I noticed your annotation-driven mode is set to aspectj
<transaction:annotation-driven mode="aspectj"/>
but you don't seem to have a load-time weaver defined anywhere in your context.
That may or may not be the issue, as I only took a quick look. Also, I don't see why you would need AspectJ vs. the default proxy mode with what you have, so you might be fine just removing mode="aspectj" altogether and defaulting to proxy.
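As a sketch, the two options in XML would look roughly like this (the context namespace prefix is an assumption; the transaction prefix mirrors the snippet above):

<!-- Option 1: keep AspectJ mode, but register a load-time weaver -->
<context:load-time-weaver/>
<transaction:annotation-driven mode="aspectj"/>

<!-- Option 2: drop the mode and fall back to the default proxy-based transactions -->
<transaction:annotation-driven/>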
