Testing Spring repositories

In Spring Data I have found a very helpful interface called JpaRepository. Because I need more functionality, I decided to create my own repository interface:
public interface BaseRepository<T, ID extends Serializable>
        extends JpaRepository<T, ID> {

    <TA, TV> int deleteBy(SingularAttribute<T, TA> attr, TV val);
}
As you can see, this is a generic interface. It works fine, but I would like to know how I can test it. Of course I can write an integration test for each concrete repository, but I am looking for a better way.

As usual with testing, you should make sure you know what you're testing. Find answers to these questions:
Do you want to test the underlying database?
Do you want to test the Spring Data repository connector for this repository?
Do you want to test whether your code calls the correct methods on the interface?
Doing #1 is useless: the database vendor has already run thousands of tests on its product. There is rarely a reason to repeat that effort.
Doing #2 is useless unless you suspect a bug in the code for Spring Data.
Which leaves us with #3. Use a mocking framework to make sure the method is called at the appropriate places (and maybe check the arguments, too).
That way, you can make sure your code behaves correctly.
If you notice the framework throwing errors or you notice that objects aren't deleted correctly, you can add more tests. But most of the time, this won't happen because of bugs in the database or Spring Data. Instead, your code won't call deleteBy() or it will call the method with the wrong arguments.
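For example, a mock-based test along these lines (a rough sketch using Mockito; User, the generated User_ metamodel, UserService and deleteInactive() are hypothetical stand-ins for whatever code calls the repository):
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.ArgumentMatchers.eq;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class UserServiceTest {

    @Test
    void deleteInactiveCallsRepositoryWithExpectedArguments() {
        // Mock the repository instead of hitting a real database.
        @SuppressWarnings("unchecked")
        BaseRepository<User, Long> repository = mock(BaseRepository.class);
        when(repository.deleteBy(User_.active, Boolean.FALSE)).thenReturn(3);

        // UserService and deleteInactive() are hypothetical callers of the repository.
        UserService service = new UserService(repository);
        int deleted = service.deleteInactive();

        // Verify the interaction: correct method, correct attribute, correct value.
        verify(repository).deleteBy(eq(User_.active), eq(Boolean.FALSE));
        assertEquals(3, deleted);
    }
}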

Related

inconsistent bean validation initialization of ConstraintValidator defined via ServiceLoader

This question asks for some specifics about a more general topic regarding modularization of bean validation that I asked about before.
In the question linked above, following this documentation and this post, I split the annotation and the ConstraintValidator definition into two Java modules and linked them together using a ServiceLoader, as shown in the documentation here. It works, mostly. But there is one unsolved issue: it does not work for validation defined via XML, which I again set up according to the documentation. What does not work: the pairing between annotation and ConstraintValidator is never established; the ServiceLoader mechanism is not used at all.
To recap: I have a working setup using this ServiceLoader approach, and it works when validating data coming through the REST layer. Everything is paired correctly.
BUT! We also receive these DTOs through Kafka, and here we have two different flows. There is some initialization of common ConstraintValidators on startup, and then:
if we first get a REST message, the ServiceLoader registration is discovered at that request time, some further initialization seemingly takes place, and after that even Kafka messages work, meaning the pairing for the custom validator is available everywhere. (Great!)
if a Kafka message arrives first though (the typical case), the ServiceLoader is not consulted at all, and this somehow 'destroys' the configuration in such a way that even if a REST request comes later it won't work either, complaining that there is no ConstraintValidator for the given annotation. The initialization completes, but defectively.
validation.xml is as easy as:
<validation-config
        xmlns="http://xmlns.jcp.org/xml/ns/validation/configuration"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/validation/configuration validation-configuration-2.0.xsd"
        version="2.0">
    <constraint-mapping>/META-INF/validation-constraints.xml</constraint-mapping>
</validation-config>
Notes:
The 2.0 version is because of hibernate-validator 6.2.0, which comes from Spring dependency management.
Why not use annotations and dump this XML altogether? The file is not mine and cannot be modified.
If there is some trivial newbie mistake, please advise. Maybe there is some way to kick the ServiceLoader functionality into action from the validation.xml file that I'm not aware of and cannot find anywhere.
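For reference, the ServiceLoader wiring mentioned above boils down to a provider-configuration file, roughly like this if I recall the Hibernate Validator 6 docs correctly (the validator class name is only a placeholder):
META-INF/services/javax.validation.ConstraintValidator:
com.example.validation.MyCustomConstraintValidator
Each line lists the fully qualified name of one ConstraintValidator implementation; the annotation it applies to is derived from the validator's generic type parameters.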
EDITS/suggestions:
A: try to inject the validator on startup to make sure it's loaded:
@Autowired
private Validator validator;

@EventListener(ApplicationReadyEvent.class)
public void logReady() {
    System.out.println(validator.toString());
}
This did print an initialized validator, but it did not help.

Using .Net Core Identity with generic repository

I've been trying to get my head around this for a while now, but I don't seem to be succeeding. In my application I'm using a generic repository with Entity Framework Core.
Hence my repository always expects to be accessed with a class that is a BaseEntity or inherits from that class.
Now I want to implement .NET Core Identity with it, but my User class inherits from BaseEntity, and I'd also need it to inherit from Identity in order to make it work, I guess. How am I able to still use Identity?
C# only supports single inheritance; you cannot inherit from two different classes. Additionally, your Identity user class must inherit from IdentityUser. You have no choice in that. As a result, the best you can do is make your user class and the rest of your entity classes implement the same interface, e.g. IEntity. Then, instead of constraining your generic type to BaseEntity, use IEntity instead.
Of course, this means you will incur a bit of code duplication as you'll have to implement IEntity separately on both BaseEntity and your user class. However, that is unavoidable.

Executing extension before SpringExtension

I'm trying to implement integration testing in my app and have a test class like this:
@ExtendWith(MyDockerExtension.class)
@ExtendWith(SpringExtension.class)
@WebAppConfiguration
@ContextConfiguration(classes = {...})
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class TestClass { ... }
Is there any way to make MyDockerExtension execute some code before SpringExtension starts working and builds the whole context from the configuration classes?
I've heard that the order in which we declare extensions is the key, but sadly MyDockerExtension, which implements BeforeAllCallback and AfterAllCallback, executes right before the test method and after the whole context is loaded. At that point it's too late to start the Docker containers, because the whole context is already loaded and my app has already tried to connect to the container.
At first I was skeptical about the order being fixed but you're correct:
Extensions registered declaratively via @ExtendWith will be executed in the order in which they are declared in the source code.
Regarding MyDockerExtension, you may want to look at the extension point TestInstancePostProcessor, which is called before @BeforeAll. SpringExtension implements it, and I guess that's where it sets up the application context. If you also implement it, you should be able to act before it does.
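A rough sketch of that idea (the startContainers() method is a hypothetical placeholder; the actual Docker plumbing is up to you):
import org.junit.jupiter.api.extension.ExtensionContext;
import org.junit.jupiter.api.extension.TestInstancePostProcessor;

public class MyDockerExtension implements TestInstancePostProcessor {

    private static boolean started = false;

    @Override
    public void postProcessTestInstance(Object testInstance, ExtensionContext context) {
        // This callback runs when the test instance is created, so if this
        // extension is registered before SpringExtension it should get the
        // chance to start the containers before the application context is built.
        if (!started) {
            started = true;
            startContainers();
        }
    }

    private void startContainers() {
        // hypothetical: start and wait for your Docker containers here
    }
}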

Use protected instead of private for member variables

I always run into problems with private variable declarations.
For example FlatFileItemWriter: I would like to extend this class and override the 'doRead' method. This does not work because some of the variables used are declared private, which leads to copying the complete code into my own class just to override one method.
Sometimes even that does not work, because the class extends another class whose variables are visible only within the same package. Then you need to copy that class as well.
Then I will miss updates to the original classes in new versions. So would it not be better to use protected instead?
I can imagine only very few reasons to use private instead of protected. For my own programs this is not an issue; I can change it on demand. But for a framework it is a pain.
with kind regards
Torsten
If something is declared private within the Spring framework (or any framework for that matter), it's not considered part of the public API. Because of that, you really shouldn't be looking to work with it directly. Doing so really means you're forking the framework and risking not being able to upgrade seamlessly.
As the project lead for Spring Batch, I'd be interested in hearing what you had to do with the FlatFileItemWriter that required you to change things that are marked private.
If the idea behind the framework was for these methods to be overridden or extended, they should have been made public. (Be careful if a framework does not expose these methods or properties as public, since it might depend on them working in a specific way; that is the primary reason I can think of for them being private. The secondary reason is that they simply don't matter outside that class.)
In some cases, you might not need to copy the entire class, but simply inheriting or extending it might be enough.
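For example, wrapping the writer via composition often avoids touching private fields at all. A rough sketch, assuming the pre-Spring-Batch-5 ItemWriter signature and a hypothetical transform() hook:
import java.util.List;
import java.util.stream.Collectors;

import org.springframework.batch.item.ItemWriter;

// Decorates an existing writer: items are adjusted here and then handed to
// the delegate's public write() method, so none of its private fields are touched.
public class TransformingItemWriter<T> implements ItemWriter<T> {

    private final ItemWriter<T> delegate;

    public TransformingItemWriter(ItemWriter<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void write(List<? extends T> items) throws Exception {
        List<T> transformed = items.stream()
                .map(this::transform)
                .collect(Collectors.toList());
        delegate.write(transformed);
    }

    private T transform(T item) {
        // hypothetical hook: encrypt, enrich, or otherwise adjust the item
        return item;
    }
}
In many cases an ItemProcessor placed in front of the writer achieves the same thing without any wrapping at all.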
I'm also looking to extend certain ItemReaders/ItemWriters to support decryption/encryption on I/O. For example, I'd like to extend StaxEventItemReader in order to read an encrypted stream from the resource, but the FragmentEventReader is private, so I'm unable to wrap its XMLEventReader's InputStream in a decrypter.
I faced the same issue with FlatFileItemWriter.

Acceptance testing preloading of data into GAE dev server datastore

In my application I have a set of DAOs which I inject into my application layer. For an acceptance test I'm writing, I want to preload the dev_server datastore with data, so I use the same Spring config in my JUnit test (using the @ContextConfiguration annotation) to inject an instance of the relevant DAO into my test. When I actually go to store some data, e.g.:
dao.add(entity)
I get the dreaded "No API environment is registered for this thread."
Caused by: java.lang.NullPointerException: No API environment is registered for this thread.
at com.google.appengine.api.datastore.DatastoreApiHelper.getCurrentAppId(DatastoreApiHelper.java:108)
at com.google.appengine.api.datastore.DatastoreApiHelper.getCurrentAppIdNamespace(DatastoreApiHelper.java:118)
....
This is probably because my test case hasn't read in the GAE appengine-web.xml with the app details (although I'm guessing here; I could really be wrong), so it doesn't know to write to the same datastore that the app running on the dev_server is reading from and writing to.
How can I get my test to "point" to the same datastore as the app? Is there some "datasource" mechanism that I can inject both into the app and the test? Is there a way to get my test to force the datastore api to read the needed config?
Here is a page that talks about how to do unit tests that connect to a dev datastore. Is this the kind of thing you're looking for? Basically it talks about two classes, LocalServiceTestHelper and LocalDatastoreServiceTestConfig that you can use to set up an environment for testing. While the example given is for unit tests, I believe it will also work for your situation.
You can then configure things like whether the dev datastore is written to disk or just kept in memory (for faster tests). If you want this data to go to the same place as your dev server, you will probably want to adjust this, as I think the default is the "in memory" option. If you look at the javadoc there is a "setBackingStoreLocation" method where you can point to whatever file you want.
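Something along these lines (a rough sketch; the backing-store path is only a placeholder, and it's worth double-checking setNoStorage/setBackingStoreLocation against the javadoc):
import org.junit.After;
import org.junit.Before;

import com.google.appengine.tools.development.testing.LocalDatastoreServiceTestConfig;
import com.google.appengine.tools.development.testing.LocalServiceTestHelper;

public class DatastorePreloadTest {

    // Persist the test datastore to disk instead of keeping it in memory,
    // so its contents can outlive the test run.
    private final LocalServiceTestHelper helper = new LocalServiceTestHelper(
            new LocalDatastoreServiceTestConfig()
                    .setNoStorage(false)
                    .setBackingStoreLocation("/path/to/local_db.bin"));

    @Before
    public void setUp() {
        helper.setUp();
    }

    @After
    public void tearDown() {
        helper.tearDown();
    }

    // ... tests that preload data through the injected DAOs ...
}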
I've found the solution!
For some reason the namespace, app ID and auth domain of the test datastore have to match those of the dev_server; then the dev_server can see the entities inserted by the test.
You can see the values for the environment (dev_server or test code) with the following statements:
System.out.println(NamespaceManager.get());
System.out.println(ApiProxy.getCurrentEnvironment().getAppId());
System.out.println(ApiProxy.getCurrentEnvironment().getAuthDomain());
In your instance of LocalServiceTestHelper (e.g. gaeHelper), you can set the values for the test environment:
// the NamespaceManager is thread local.
NamespaceManager.set(NamespaceManager.getGoogleAppsNamespace());
gaeHelper.setEnvAppId(<the name of your app in appengine-web.xml>);
gaeHelper.setEnvAuthDomain("gmail.com");
Then the dev_server will see your entities. However, because of synchronisation issues, if the test writes to the datastore after the dev_server has been started, the dev_server won't see it unless it can be forced to re-read the file (which I haven't figured out how to do yet). Otherwise the server has to be restarted.
I've found a workaround, although it's not very nice, because each test method doesn't clean up the datastore, as explained in the article Local Unit Testing for Java. However, the datastore starts clean each time the test class is run, so it's not so bad, provided you're careful about that.
The problem is that when using SpringJUnit4ClassRunner, the Spring environment is created before any @Before method can run. The solution is to use @BeforeClass and a static variable for the LocalServiceTestHelper, so that it is created and set up before the Spring environment.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:META-INF/spring/context-test.xml")
@Transactional
public class MyTest {

    @Inject
    private MyService myService;

    private static final LocalServiceTestHelper helper =
            new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

    @BeforeClass
    public static void beforeClass() {
        helper.setUp();
    }

    @AfterClass
    public static void afterClass() {
        helper.tearDown();
    }
}
If anyone has a better solution, I'll be glad to hear!
