Acceptance testing: preloading data into the GAE dev server datastore (Spring)

In my application I have a set of DAOs which I inject into my application layer. For an acceptance test I'm writing, I want to preload the dev_server datastore with data, so I use the same Spring config in my JUnit test (via the @ContextConfiguration annotation) to inject an instance of the relevant DAO into my test. When I actually go to store some data, e.g.:
dao.add(entity)
I get the dreaded "No API environment is registered for this thread."
Caused by: java.lang.NullPointerException: No API environment is registered for this thread.
at com.google.appengine.api.datastore.DatastoreApiHelper.getCurrentAppId(DatastoreApiHelper.java:108)
at com.google.appengine.api.datastore.DatastoreApiHelper.getCurrentAppIdNamespace(DatastoreApiHelper.java:118)
....
This is probably because my test case hasn't read in the GAE appengine-web.xml with the app details (although I'm guessing here; I could really be wrong), so it doesn't know to write to the same datastore that the app running on the dev_server is reading from and writing to.
How can I get my test to "point" to the same datastore as the app? Is there some "datasource" mechanism that I can inject both into the app and the test? Is there a way to get my test to force the datastore api to read the needed config?

Here is a page that talks about how to write unit tests that connect to a dev datastore. Is this the kind of thing you're looking for? Basically it describes two classes, LocalServiceTestHelper and LocalDatastoreServiceTestConfig, that you can use to set up an environment for testing. While the example given is for unit tests, I believe it will also work for your situation.
You can then configure things like whether the dev datastore is written to disk or just kept in memory (for faster tests). If you want this data to go to the same place as your dev server, you will probably want to adjust this, as I think the default is the "in memory" option. If you look at the javadoc, there is a setBackingStoreLocation method where you can point to whatever file you want.
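For illustration, a minimal sketch of that setup; the backing-store path below is an assumption, not something from your project, so point it at wherever your dev_server actually keeps its datastore file:

LocalDatastoreServiceTestConfig datastoreConfig = new LocalDatastoreServiceTestConfig()
        .setNoStorage(false)  // persist entities to disk instead of keeping them in memory
        .setBackingStoreLocation("target/appengine-generated/local_db.bin");  // assumed path

LocalServiceTestHelper helper = new LocalServiceTestHelper(datastoreConfig);

helper.setUp();     // registers an API environment for the current thread
// ... exercise your DAOs here ...
helper.tearDown();  // unregisters the environment again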

I've found the solution!!!!
For some reason the Namespace, AppId and AuthDomain fields of the test datastore have to match those of the dev_server; only then can the dev_server see the entities inserted by the test.
You can print the values for either environment (dev_server or test code) with the following statements:
System.out.println(NamespaceManager.get());
System.out.println(ApiProxy.getCurrentEnvironment().getAppId());
System.out.println(ApiProxy.getCurrentEnvironment().getAuthDomain());
In your instance of LocalServiceTestHelper (e.g. gaeHelper), you can set the values for the test environment:
// the NamespaceManager is thread local.
NamespaceManager.set(NamespaceManager.getGoogleAppsNamespace());
gaeHelper.setEnvAppId(<the name of your app in appengine-web.xml>);
gaeHelper.setEnvAuthDomain("gmail.com");
Then the dev_server will see your entities. However, because of synchronisation issues, if the test writes to the datastore after the dev_server has been started, the dev_server won't see it unless it can be forced to reread the file (which I haven't figured out yet); otherwise the server has to be restarted.

I've found a workaround, although it's not very nice: each test method doesn't clean up the Datastore, as explained in the article Local Unit Testing for Java. However, the Datastore starts clean each time the test class is run, so it's not too bad, provided you keep that in mind.
The problem is that when using SpringJUnit4ClassRunner, the Spring environment is created before any @Before method can run. The solution is to use @BeforeClass and a static LocalServiceTestHelper, so the helper is created before the Spring environment is set up.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:META-INF/spring/context-test.xml")
@Transactional
public class MyTest {

    @Inject
    private MyService myService;

    private static final LocalServiceTestHelper helper =
            new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

    @BeforeClass
    public static void beforeClass() {
        helper.setUp();
    }

    @AfterClass
    public static void afterClass() {
        helper.tearDown();
    }

    // ... test methods ...
}
If anyone has a better solution, I'll be glad to hear it!

Related

Start a Testcontainers container during each fresh application context (@DirtiesContext), with injection of properties (@DynamicPropertySource)

I am using Testcontainers to execute integration tests in a Spring project. I am using JUnit 4.x and Cucumber 7.4.1.
I want to start a docker-compose.yml-based set of containers before each test, as that makes it easy to start from scratch. Hence, I am using @DirtiesContext.
The challenge here is that certain application properties, such as spring.rabbitmq.host, are needed before the actual application context can start, so I need to inject them beforehand. There is @DynamicPropertySource for that, but then I also need access to my context-scoped docker containers. The best I have come up with so far is the following:
@CucumberContextConfiguration
@SpringBootTest(classes = TestConfig.class)
@DirtiesContext
public class CucumberITConfig {

    @DynamicPropertySource
    private static void properties(DynamicPropertyRegistry registry) {
        DockerComposeContainer container = new DockerComposeContainer(new File("docker-compose.yml"))
                .withExposedService("rabbitmq", 5672);
        container.start();
        registry.add("spring.rabbitmq.host", () -> container.getServiceHost("rabbitmq", 5672));
    }
}
This constructs new docker containers locally and waits for the host to be passed to the registry. While this seems to work, it looks like a rather hackish approach to me. Also, a problem here is that the containers stack up after each test: in the 7th test, for example, the containers from all previous 6 cycles are still running.
Are there better approaches to start Testcontainers-based docker containers before each application context, while also being able to destroy them afterwards?
If you are using @DirtiesContext after all, you can set these values as system properties and omit @DynamicPropertySource altogether. However, as others have pointed out, solving test pollution by re-creating the Spring context and all dependent services for every test class will be very slow and is generally considered an anti-pattern.
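A minimal sketch of that idea, reusing the names from your question (TestConfig, docker-compose.yml and the rabbitmq service are yours; the static initializer is just one possible place to do it):

import java.io.File;
import org.testcontainers.containers.DockerComposeContainer;

@CucumberContextConfiguration
@SpringBootTest(classes = TestConfig.class)
@DirtiesContext
public class CucumberITConfig {

    // Started when this class is loaded; since @DirtiesContext recreates the Spring
    // context anyway, plain system properties are picked up when each new context starts.
    static {
        DockerComposeContainer container = new DockerComposeContainer(new File("docker-compose.yml"))
                .withExposedService("rabbitmq", 5672);
        container.start();
        System.setProperty("spring.rabbitmq.host", container.getServiceHost("rabbitmq", 5672));
        System.setProperty("spring.rabbitmq.port",
                String.valueOf(container.getServicePort("rabbitmq", 5672)));
    }
}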

Prevent AOP (AspectJ) code from being triggered while running tests

In my Spring Boot project, I am using MockMVC to test the controller (web) layer. But I also have AOP (AspectJ) logic in my project, and when I run a unit test for a controller with MockMVC, the test also triggers the AOP code. How can I prevent the AOP code from being triggered while running unit tests for the controller?
@Test
public void testMyControllerMethod() {
    ...
    // myRequest hits an endpoint of my controller, and AOP also intercepts the call.
    // How can I disable AOP while running this test?
    mockMVC.perform(myRequest).andExpect(okStatus);
}
Question is in my code comment :)
I have checked this answer, and I understand using the if() expression, but I don't get TestMode.ACTIVE; there is no such thing in Spring Boot. If someone could let me know how to check at runtime whether the code is running in a unit test, I would know how to prevent the AOP logic from running as well.
What I meant in the other answer, as Simon already tried to explain to you, is something like this:
package de.scrum_master.app;

public class TestMode {
    public static boolean ACTIVE = false;
}
But actually there I also listed a few other options such as environment variables and system properties. If I were you I would use one of those because in your Maven or Gradle build it would be very easy to set properties or environment variables via configuration. Your if() pointcut could access those variables.
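For example, a native (code-style) AspectJ sketch gating on a system property; the flag name "test.mode" and the pointcut pattern are made up for illustration:

// Requires the AspectJ compiler/weaver (code-style syntax), not plain proxy-based Spring AOP.
public aspect MyInterceptor {

    // Matches controller executions only while the test-mode flag is NOT set;
    // run tests with -Dtest.mode=true to switch the advice off.
    pointcut interceptedCalls() :
        execution(* de.scrum_master.app..*Controller.*(..)) && if(!Boolean.getBoolean("test.mode"));

    before() : interceptedCalls() {
        // aspect logic that should not run during controller unit tests
        System.out.println("Intercepted: " + thisJoinPoint);
    }
}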
Especially in the context of Spring there is an even simpler option: a test application configuration. Just provide a configuration without aspects to your tests. That way you can have different configurations for
production environment,
unit tests (no aspects),
integration tests (e.g. with aspects but different from unit test and production).
et cetera.
The advantage here is that you don't need any if() pointcuts or build any other knowledge about test/production environments into your aspects, which is quite ugly. My other answer only shows what you can do, it does not say it is the best solution.
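As a rough illustration, assuming your aspects are Spring-managed @Aspect beans (proxy-based Spring AOP) rather than compile-time woven; the class names are invented:

// Production configuration registers the aspect bean:
@Configuration
@EnableAspectJAutoProxy
public class ProductionAopConfig {

    @Bean
    public MyLoggingAspect myLoggingAspect() {
        return new MyLoggingAspect();
    }
}

// Test configuration simply leaves the aspect out, so controllers run un-advised:
@Configuration
public class ControllerTestConfig {
    // no aspect beans registered here
}

// In the MockMVC test, import only the test configuration, e.g.:
// @WebMvcTest(MyController.class)
// @Import(ControllerTestConfig.class)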

Executing extension before SpringExtension

I'm trying to implement integration testing in my app and have a test class like this:
@ExtendWith(MyDockerExtension.class)
@ExtendWith(SpringExtension.class)
@WebAppConfiguration
@ContextConfiguration(classes = {...})
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class TestClass { ... }
Is there any way to make MyDockerExtension execute some code before SpringExtension starts working and generates the whole context from the configuration classes?
I've heard that the order in which we declare extensions is the key, but sadly MyDockerExtension, which implements BeforeAllCallback and AfterAllCallback, executes right before the test method and after the whole context is loaded. At that point it's too late to start the Docker containers, because by the time the context is loaded my app has already tried to connect to the container.
At first I was skeptical about the order being fixed but you're correct:
Extensions registered declaratively via #ExtendWith will be executed in the order in which they are declared in the source code.
Regarding MyDockerExtension, you may want to look at the extension point TestInstancePostProcessor, which is called before @BeforeAll. SpringExtension implements it, and I guess that's where it sets up the application context. If you also implement it, you should be able to act before it does.
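A rough sketch of that approach (the container start-up call is only a placeholder):

import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.api.extension.TestInstancePostProcessor;

public class MyDockerExtension implements TestInstancePostProcessor {

    private static boolean containersStarted = false;

    @Override
    public void postProcessTestInstance(Object testInstance, ExtensionContext context) {
        // Declared before SpringExtension, this runs before Spring's own
        // TestInstancePostProcessor, i.e. before the application context is built.
        if (!containersStarted) {
            startContainers();  // placeholder for the real docker start-up
            containersStarted = true;
        }
    }

    private void startContainers() {
        // e.g. Testcontainers or docker-compose calls would go here
    }
}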

Spring Batch Step Integration Testing

I'm looking for some general opinions and advice on testing a Spring batch step and step execution.
My basic step reads in from an api, processes into an entity object and then writes to a DB. I have tested the happy path, that the step completes successfully. What I now want to do is test the exception handling when data is missing at the processor stage. I could test the processor class in isolation, but I'd rather test the step as a whole to ensure the process failure is reflected correctly at step/job level.
I've read the Spring Batch testing guidelines and, if I'm honest, I'm slightly lost in them. Is it possible to use StepScopeTestUtils.doInStepScope or to update the StepExecution to test this scenario? Ideally I'd force the reader to return faulty data before the processor kicks in.
Any advice would be greatly appreciated.
The best approach depends on the scope of your test. Reading a little between the lines here, I assume you are using a Spring IT, setting up a Spring context and using the JobLauncherTestUtils to start a job or a step.
I think the easiest way is to replace one of your beans with a mock that triggers the error scenario. Using Mockito, this can be done by adding something like this to your test configuration:
@Bean
public ReaderDataRepository dataApi() {
    return mock(ReaderDataRepository.class);
}
This bean then overrides the actual implementation. In the test setup you can then configure the mock very explicitly:
@Autowired
private ReaderDataRepository mockedRepository;

@Before
public void setUp() {
    when(mockedRepository.getData()).thenReturn(faultyData());
}
This involves very little manipulation of Spring 'magic' and very explicitly defines the error from within the test.
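With the mock in place, the step can then be launched and the failure asserted at step/job level. A hedged sketch, assuming you use JobLauncherTestUtils and that your step is called "apiImportStep" (both assumptions on my part):

@Autowired
private JobLauncherTestUtils jobLauncherTestUtils;

@Test
public void stepFailsWhenProcessorReceivesFaultyData() throws Exception {
    JobExecution jobExecution = jobLauncherTestUtils.launchStep("apiImportStep");

    // the exception thrown by the processor should be reflected in the step/job status
    assertEquals(BatchStatus.FAILED, jobExecution.getStatus());
}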

Testing spring repositories

In Spring Data I have found a very helpful interface called JpaRepository. Because I need more functionality, I decided to create my own repository interface:
public interface BaseRepository<T, ID extends Serializable>
        extends JpaRepository<T, ID> {

    public <TA, TV> int deleteBy(SingularAttribute<T, TA> attr, TV val);
}
As you can see, this is a generic interface. It works fine, but I would like to know how I can test it. Of course I could write an integration test for each concrete repository, but I am looking for a better way.
As usual with testing, you should make sure you know what you're testing. Find answers to these questions:
Do you want to test the underlying database?
Do you want to test the Spring Data repository connector for this repository?
Do you want to test whether your code calls the correct methods on the interface?
Doing #1 is useless: The database vendor has already run thousands of tests on its product. There is rarely a reason to do this effort again.
Doing #2 is useless unless you suspect a bug in the code for Spring Data.
Which leaves us with #3. Use a mocking framework to make sure the method is called at the appropriate places (and maybe check the arguments, too).
That way, you can make sure your code behaves correctly.
If you notice the framework throwing errors or you notice that objects aren't deleted correctly, you can add more tests. But most of the time, this won't happen because of bugs in the database or Spring Data. Instead, your code won't call deleteBy() or it will call the method with the wrong arguments.
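A minimal sketch of option #3 with Mockito; the service, entity and metamodel names (CustomerService, Customer, Customer_.email) are invented for illustration:

@RunWith(MockitoJUnitRunner.class)
public class CustomerServiceTest {

    @Mock
    private BaseRepository<Customer, Long> repository;

    @InjectMocks
    private CustomerService service;

    @Test
    public void deletesCustomersByEmail() {
        service.removeByEmail("foo@example.com");

        // verify the service called the repository with the expected attribute and value
        verify(repository).deleteBy(Customer_.email, "foo@example.com");
    }
}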
