JBehave with Serenity - report can't find test scenarios - maven

Using JBehave, with my runner class extending JUnitStories, I can generate the plain-style report with the following:
@Override
public Configuration configuration() {
    Class<? extends Embeddable> embeddableClass = this.getClass();
    return new MostUsefulConfiguration()
            .useStoryLoader(new LoadFromClasspath(embeddableClass))
            .useStoryControls(new StoryControls()
                    .doResetStateBeforeScenario(false)
                    .useStoryMetaPrefix("story_")
                    .useScenarioMetaPrefix("scenario_"))
            .useStoryReporterBuilder(new StoryReporterBuilder()
                    .withCodeLocation(CodeLocations.codeLocationFromClass(embeddableClass))
                    .withDefaultFormats()
                    .withFormats(CONSOLE, HTML)
                    .withFailureTrace(true)
                    .withFailureTraceCompression(true));
}
Now I want to integrate JBehave with Serenity for better-looking reports ^_^. So I changed my runner class to inherit from SerenityStories instead. After adding the dependencies and running via Maven, the tests pass. However, the Serenity-generated report always shows '0 test scenarios'.
I saw that SerenityStories inherits from JUnitStories and overrides the configuration() method as well.
How can I make Serenity see my test scenarios? Do I need to override the configuration() method differently, and if so, how?
Thank you very much!

I was able to make it work by creating a new Maven project instead.
I used the archetype 'serenity-jbehave-archetype'.
It generates a pre-created, empty runner class that inherits from SerenityStories.
I then merged my files into this new project.
As for the runner class, I've overridden the methods stepsFactory() and storyPaths() to match my steps and stories.
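A rough sketch of such a runner (the steps class MyDefinitionSteps and the story location are assumptions for illustration, not taken from the original project):

import java.util.List;

import net.serenitybdd.jbehave.SerenityStories;
import org.jbehave.core.io.CodeLocations;
import org.jbehave.core.io.StoryFinder;
import org.jbehave.core.steps.InjectableStepsFactory;
import org.jbehave.core.steps.InstanceStepsFactory;

public class AcceptanceTestSuite extends SerenityStories {

    @Override
    public InjectableStepsFactory stepsFactory() {
        // Register the step definition classes explicitly (MyDefinitionSteps is a placeholder).
        return new InstanceStepsFactory(configuration(), new MyDefinitionSteps());
    }

    @Override
    public List<String> storyPaths() {
        // Pick up all *.story files on the test classpath.
        return new StoryFinder().findPaths(
                CodeLocations.codeLocationFromClass(getClass()), "**/*.story", "");
    }
}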
Hope that made sense. Thanks!

Related

Run integration tests only if specific spring profile is set

We have multiple test classes in our spring boot application. Some of the classes contain integration tests, some contain unit tests.
This means that if I let all tests be executed (e.g. with Maven), it will run all tests in all classes.
What I would like to achieve is that the integration tests are executed only if a specific Spring profile is set, e.g. via application.yml.
For example, I would like to annotate the whole test class to define that the tests in this class are only executed if the specified Spring profile is set.
If it is not set, these tests shall be ignored.
The topic How can I use @IfProfileValue to test if a Profile is active? goes in exactly this direction. @IfProfileValue looks at first glance like exactly what I need.
But as is pointed out there, it is not. I could use it if I set a specific system property, but I need to use a real Spring profile (and not the system property spring.profiles.active - that would ignore a profile set via application.yml).
@Profile also looks like it could be what I need, but as the topic Use @Profile to decide to execute test class shows, we should not use it.
So what can be done to achieve this?
Note that there are a lot of questions about tests and Spring profiles on Stack Overflow, but most of them explain how to set configurations in tests specific to Spring profiles. That is not what I am looking for.
I would like to execute or ignore the tests.
I don't know exactly how you want to achieve it, but if you are using JUnit, here is a way to conditionally ignore some tests at runtime simply by using a configuration property:
application.properties:
test.enabled=true
Then in your test code you can use org.junit.Assume and the property like the following:
@Value("${test.enabled}")
private Boolean testEnabled;

@Test
public void test() {
    org.junit.Assume.assumeTrue(testEnabled);
    // your test code
}
Now if you set the property test.enabled to true, the test will run; otherwise it will be ignored.
Source: Conditionally ignoring tests in JUnit 4
Using JUnit 5, you can use an @Autowired Environment to check whether a profile is active @BeforeEach test is run:
Assumptions.assumeTrue(Arrays.asList(this.environment.getActiveProfiles()).contains("integration"));
This checks for a profile named "integration" and works regardless of how the profile was set (system property, environment variable, application.yml, etc.).
If the profile is not active, the test will be ignored, which is similar to using the @Disabled annotation.
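A minimal sketch of that approach (assuming a Spring Boot test class; the class and method names are illustrative):

import java.util.Arrays;

import org.junit.jupiter.api.Assumptions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.core.env.Environment;

@SpringBootTest
class IntegrationOnlyTest {

    @Autowired
    private Environment environment;

    @BeforeEach
    void requireIntegrationProfile() {
        // Skip every test in this class unless the "integration" profile is active.
        Assumptions.assumeTrue(
                Arrays.asList(environment.getActiveProfiles()).contains("integration"));
    }

    @Test
    void someIntegrationTest() {
        // test code that only runs when the profile is active
    }
}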
It is very easy. Here is my solution in Kotlin:
Create annotation
import org.springframework.test.context.junit.jupiter.EnabledIf
import kotlin.annotation.AnnotationRetention.RUNTIME
import kotlin.annotation.AnnotationTarget.CLASS
import kotlin.annotation.AnnotationTarget.FUNCTION

@Target(CLASS, FUNCTION)
@Retention(RUNTIME)
@EnabledIf(
    expression = "#{environment.acceptsProfiles('integration')}",
    reason = "🏋🏻‍ Because spring.profiles.active = integration",
    loadContext = true)
annotation class Integration
Use it:
import by.package.Integration

@Integration
internal class IntegrationTest {

    @Test
    // @Integration
    fun test() {
        assertEquals(4, 2 + 2)
    }
}
The @DisabledIf annotation has the opposite logic.
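For a plain Java project the same idea works directly with Spring's @EnabledIf (a sketch; the class name and the use of @SpringBootTest are assumptions):

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit.jupiter.EnabledIf;

@SpringBootTest
@EnabledIf(
        expression = "#{environment.acceptsProfiles('integration')}",
        loadContext = true)
class IntegrationProfileTest {
    // tests in this class only run when the "integration" profile is active
}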

Can I use OSGi Mocks with Declarative Services Annotations

I'm trying to test an OSGi service annotated with Declarative Services annotations (org.osgi.service.component.annotations). I have generated my project based on the AEM Multi-Project Example.
public class PostServiceTest {

    @Rule
    public AemContext context = new AemContext((AemContextCallback) context -> {
        context.registerInjectActivateService(new PostService());
    }, ResourceResolverType.RESOURCERESOLVER_MOCK);

    @Test
    public void shouldFetchRandomPosts() {
        final PostService postsService = context.getService(PostService.class);
        final List<Post> posts = postsService.randomPosts(100);
        assertEquals(100, posts.size());
    }
}
Whenever I run this test in IntelliJ, OSGi Mocks complains about the absence of SCR metadata on the tested class:
org.apache.sling.testing.mock.osgi.NoScrMetadataException: No OSGi SCR metadata found for class com.example.PostServiceTest
at org.apache.sling.testing.mock.osgi.OsgiServiceUtil.injectServices(OsgiServiceUtil.java:381)
at org.apache.sling.testing.mock.osgi.MockOsgi.injectServices(MockOsgi.java:148)
at org.apache.sling.testing.mock.osgi.context.OsgiContextImpl.registerInjectActivateService(OsgiContextImpl.java:153)
at org.apache.sling.testing.mock.osgi.context.OsgiContextImpl.registerInjectActivateService(OsgiContextImpl.java:168)
at com.example.PostServiceTest.shouldReturnTheEnpointNamesWithValidConfigurationAsTheListOfAcceptableKeys(PostServiceTest.java:23)
Does this mean I can only test classes annotated with the older SCR annotations that come with Apache Felix? The documentation for OSGi Mocks suggests that Declarative Services annotations are supported in version 2.0.0 and higher, and the version I'm using meets this criterion.
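For completeness, the class under test is assumed to be a plain DS component along these lines (a simplified, hypothetical sketch; the real PostService is not shown here):

import java.util.Collections;
import java.util.List;

import org.osgi.service.component.annotations.Component;

// Hypothetical service annotated with the org.osgi.service.component.annotations (DS) annotations.
@Component(service = PostService.class)
public class PostService {

    public List<Post> randomPosts(int count) {
        // ... fetch up to `count` random posts ...
        return Collections.emptyList();
    }
}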
Interestingly enough, this only happened when I ran the test directly from the IDE. It turns out that IntelliJ was not generating the SCR metadata when compiling my classes for tests.
When I compile the class under test with Gradle, the 'com.cognifide.aem.bundle' plugin is used to generate the SCR descriptor and place it in the resulting Java Archive. That's why unit tests executed by Gradle work fine. Just clicking the Run button in IntelliJ caused this step to be missed.
In order to get this working, I ended up setting up IntelliJ to allow me to run unit tests via Gradle.
I went to Settings > Build, Execution, Deployment > Gradle > Runner and used the dropdown menu so that I could decide whether to use Gradle on a test-by-test basis.

Intellij, cucumber test and two spring Context Configuration

My project has two different Cucumber tests and each one needs a different Spring context configuration.
The problem I have is that when I run each test individually from IntelliJ, they load the right Spring context and the tests pass, but when I run all tests, none of them pass.
Running the tests with Maven, both pass.
This is my code:
@RunWith(FCSCucumber.class)
@Cucumber.Options(strict = true
        // , tags = {"~@ignore"}
        // , tags = {"@Only"}
        , glue = "feature.scenario1"
        , features = "src/test/resources/feature/scenario1/"
)
@FCSApplicationProperties(commonProps = "config/scenario1/exec.common.properties",
        environmentProps = "src/test/resources/scenario1-test.properties")
public class TesScenario1Features {
}

@ContextConfiguration("/cucumber-scenario1.xml")
public class scenario1Steps {
    ......
}

@RunWith(FCSCucumber.class)
@Cucumber.Options(strict = true
        // , tags = {"~@ignore"}
        // , tags = {"@Only"}
        , glue = "feature.scenario2"
        , features = "src/test/resources/feature/scenario2/"
)
@FCSApplicationProperties(commonProps = "config/scenario2/exec.common.properties",
        environmentProps = "src/test/resources/scenario2-test.properties")
public class TesScenario2Features {
}

@ContextConfiguration("/cucumber-scenario2.xml")
public class scenario2Steps {
    ......
}
Thank you very much for your help
The issue is that the IntelliJ Cucumber plugin uses the Cucumber CLI to run tests, without using the JUnit runner at all. This causes several limitations, like requiring the Spring annotations on the step definition classes instead of on the runner, or by default requiring the step definitions to be in the same package as the scenario files.
In your example I would actually expect running a single test to fail as well, unless the correct application properties are also referenced by the /cucumber-scenario{1,2}.xml files.
The only option I see with the standard cucumber implementation would be to extract the tests into separate projects.
I'm actually working on an alternative implementation of cucumber with an improved spring integration that you might want to try. It's not fully integrated with IntelliJ yet though.

maven: Running the same tests for different configurations

In my Spring + Maven app, I have created some tests for the data access layer that I would now like to run against multiple datasources. I have something like:
#ContextConfiguration(locations={"file:src/test/resources/testAppConfigMysql.xml"})
public class TestFooDao extends AbstractTransactionalJUnit38SpringContextTests {
public void testFoo(){
...
}
}
It currently has the config location hardcoded, so it can be used against only one datasource.
What is the best way to invoke the test twice and pass two different configs (say testAppConfigMysql.xml and testMyConfigHsqlDb.xml)?
I've seen suggestions to do this via system properties. How can I tell maven to invoke the tests twice, with different values of a system property?
I don't know if there is a fancy solution for this that is also simple. I would just implement a base class with all the testing code and then derive two classes from it with different annotation-based configuration, like this:
#ContextConfiguration(locations={"firstDs.xml"})
public class TestFooDaoUsingFirstDs extends TestFooDao {
}
#ContextConfiguration(locations={"secondDs.xml"})
public class TestFooDaoUsingSecondDs extends TestFooDao {
}
Unless you have to handle a really high number of different datasources this way, that is OK for me.
Rather than file:..., you can use classpath:... (remove the src/test/resources, it's implicit if you use classpath). Then you can have a single master context with the line:
<import resource="dao-${datasource}.xml" />
If you run the Maven build with the option -Ddatasource=foo, it will replace ${datasource} in the master context with whatever you specify. So you can have dao-foo.xml, dao-bar.xml etc. for your different configurations.
(You need to enable Maven resource filtering in the POM for this to work).
Alternatively, check out the new stuff in Spring 3.1: http://www.baeldung.com/2012/03/12/project-configuration-with-spring/
Edit: A third option would be to have all the test classes extend some superclass, and use
JUnit's @Parameterized runner, where the parameters are the different Spring context locations. You couldn't use @ContextConfiguration in that case, but you can always create the Spring context manually and then autowire the test class using org.springframework.beans.factory.config.AutowireCapableBeanFactory.autowireBean().
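A rough sketch of that third option, assuming JUnit 4 and a hypothetical FooDao bean (the context file names are the ones from the question):

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.support.ClassPathXmlApplicationContext;

@RunWith(Parameterized.class)
public class TestFooDaoAgainstAllDatasources {

    @Parameterized.Parameters
    public static Collection<Object[]> contexts() {
        // One test run per Spring context location.
        return Arrays.asList(new Object[][] {
                {"testAppConfigMysql.xml"},
                {"testMyConfigHsqlDb.xml"}
        });
    }

    @Autowired
    private FooDao fooDao; // hypothetical DAO under test

    public TestFooDaoAgainstAllDatasources(String contextLocation) {
        // Build the Spring context by hand and autowire this test instance from it.
        ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext(contextLocation);
        ctx.getAutowireCapableBeanFactory().autowireBean(this);
    }

    @Test
    public void testFoo() {
        // run the DAO assertions against the current datasource
    }
}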
Check out the Maven Invoker Plugin. It also supports profiles.

Integration testing against an http server - junit?

I want to do integration tests against an HTTP server. So far I only have experience with JUnit for unit testing.
I have two requirements: the framework must have a Maven plugin, and the test case code must be clean - so no dirty hacks and no boilerplate code.
Plain JUnit is good for unit testing, where @Test methods are independent. But for integration testing I have to process several dependent steps which must exchange some kind of state (variables).
I already read:
Can we use JUNIT for Automated Integration Testing? and Passing JUnit data between tests, and came to the conclusion that I don't like static fields in unit tests, I don't want to use TestNG and add dependency annotations on tests, and I don't want to put my test into one long unreadable test method.
I thought more of some syntax like:
public class MyIntegrationTest {
    @Step
    public void testCreate(Context context) { context.put("foo"); }

    @Step
    public void testUpdate(Context context) { context.get(); }

    @Step
    public void testDelete(Context context) { context.get(); }
}
So I want to enhance/use ?Unit in a way that it executes the @Step methods with a context instance as an argument. The methods must be called by the framework in order and cannot be called individually. In a perfect world, all ?Unit GUIs would show an @Step like an @Test, but this is optional...
Any hints how to do this?
Jan
The first point is to check out the Maven Failsafe Plugin, which is intended for running integration tests with Maven. Second, you have to name your integration tests based on the conventions used by the Failsafe Plugin; after that you should be able to run your integration tests simply with Maven (via mvn clean verify).
This means you have to name your integration tests like MyIntegrationIT.java... To define the order of execution you would have to use a framework other than JUnit, maybe TestNG, which supports this kind of need, but you have already excluded it. So the question is what kind of tests you would like to do. For page flows etc., JWebUnit might be worth a look...
You might also want to consider http://httpunit.sourceforge.net/. It's useful for checking whether responses come back from the HTTP server.
However, it doesn't provide the @Step functionality. Normally I'd do that with:
@Test
public void masterTest() {
    step1(..);
    step2(..);
    ....
}

public void step1(...) { ... }
public void step2(...) { ... }
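A slightly more concrete sketch of that pattern, holding the shared state the question asks about in a plain map (all names and endpoints are illustrative):

import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

public class MyIntegrationIT {

    // State exchanged between the steps of this single test run.
    private final Map<String, Object> context = new HashMap<>();

    @Test
    public void masterTest() {
        createResource();
        updateResource();
        deleteResource();
    }

    private void createResource() {
        // e.g. POST to the server and remember the id of the created resource
        context.put("id", "42");
    }

    private void updateResource() {
        String id = (String) context.get("id");
        // e.g. PUT /resource/{id} and assert on the response
    }

    private void deleteResource() {
        String id = (String) context.get("id");
        // e.g. DELETE /resource/{id} and assert on the response
    }
}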
