Micronaut-Data JDBC - Multiple Dialects for Test and Production

The Micronaut docs on JDBC repositories clearly tell us we have to create a test repository to test against another dialect. I think this will be manageable (e.g. Postgres for production and H2 for test).
The problem is I have to repeat my methods (e.g. find()) in the test repository. I have a book repository and a test repository:
@JdbcRepository(dialect = Dialect.POSTGRES)
interface BookRepository extends CrudRepository<Book, Long> {
Optional<Book> find(String title);
}
@JdbcRepository(dialect = Dialect.H2)
@Replaces(bean = BookRepository)
@Requires(env = ["test"])
interface TestBookRepository extends BookRepository {
// Optional<Book> find(String title);
// Required to make the find() method appear in the TestBookRepository
}
To make the find() method available in the TestBookRepository, I had to repeat the method (see commented line above).
Is there a better way to avoid repeating myself? The methods from the CrudRepository interface are available in the TestBookRepository without problems. Why is the find() method not treated the same?
BTW, I don't want to mock the test repository. I want to test the repository 'logic' injected by Micronaut-Data against an SQL database.
This is for Micronaut Data 1.0.0.M5, using Groovy for the source.

To make the find() method available in the TestBookRepository, I had
to repeat the method (see commented line above).
I cannot reproduce that behavior. For that to be the case, I think the Java compiler would need to have a bug that causes it.
See the project at https://github.com/jeffbrown/mikehoustonrepository.
https://github.com/jeffbrown/mikehoustonrepository/blob/82b8af568042c762a86cef9965e52fdc61053421/src/main/java/mikehoustonrepository/BookRepository.java
// src/main/java/mikehoustonrepository/BookRepository.java
package mikehoustonrepository;
import io.micronaut.data.jdbc.annotation.JdbcRepository;
import io.micronaut.data.model.query.builder.sql.Dialect;
import io.micronaut.data.repository.CrudRepository;
import java.util.Optional;
@JdbcRepository(dialect = Dialect.POSTGRES)
public interface BookRepository extends CrudRepository<Book, Long> {
Optional<Book> find(String title);
}
https://github.com/jeffbrown/mikehoustonrepository/blob/82b8af568042c762a86cef9965e52fdc61053421/src/test/java/mikehoustonrepository/TestBookRepository.java
// src/test/java/mikehoustonrepository/TestBookRepository.java
package mikehoustonrepository;
import io.micronaut.context.annotation.Replaces;
import io.micronaut.data.jdbc.annotation.JdbcRepository;
import io.micronaut.data.model.query.builder.sql.Dialect;
@JdbcRepository(dialect = Dialect.H2)
@Replaces(BookRepository.class)
public interface TestBookRepository extends BookRepository {}
https://github.com/jeffbrown/mikehoustonrepository/blob/82b8af568042c762a86cef9965e52fdc61053421/src/main/java/mikehoustonrepository/BookController.java
package mikehoustonrepository;
import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.annotation.Post;
import java.util.Optional;
#Controller("/books")
public class BookController {
private final BookRepository bookRepository;
public BookController(BookRepository bookRepository) {
this.bookRepository = bookRepository;
}
#Get("/")
public Iterable<Book> index() {
return bookRepository.findAll();
}
#Post("/{title}/{author}")
public Book create(String title, String author) {
return bookRepository.save(new Book(title, author));
}
#Get("/find/{title}")
public Optional<Book> findByTitle(String title) {
return bookRepository.find(title);
}
}
https://github.com/jeffbrown/mikehoustonrepository/blob/82b8af568042c762a86cef9965e52fdc61053421/src/test/java/mikehoustonrepository/BookControllerTest.java
package mikehoustonrepository;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.annotation.Post;
import io.micronaut.http.client.annotation.Client;
import io.micronaut.test.annotation.MicronautTest;
import org.junit.jupiter.api.Test;
import javax.inject.Inject;
import java.util.List;
import java.util.Optional;
import static org.junit.jupiter.api.Assertions.*;
@MicronautTest
public class BookControllerTest {
@Inject
BookClient bookClient;
@Test
public void testFind() throws Exception {
Optional<Book> book = bookClient.find("The Nature Of Necessity");
assertFalse(book.isPresent());
bookClient.create("The Nature Of Necessity", "Alvin Plantinga");
book = bookClient.find("The Nature Of Necessity");
assertTrue(book.isPresent());
}
}
#Client(value="/", path = "/books")
interface BookClient {
#Post("/{title}/{author}")
Book create(String title, String author);
#Get("/")
List<Book> list();
#Get("/find/{title}")
Optional<Book> find(String title);
}
That test passes.
You can see that a different repository is used for tests (TestBookRepository) than is used for other environments (BookRepository).
I hope that helps.

You can utilise Micronaut environments to create different configuration for test and production: configure the respective datasource in application-test.yml and that datasource will be used for tests when the test environment is active. See Micronaut Environments in the docs.
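As a rough sketch (not part of the original answer), such a test datasource could look something like this; the property names follow the standard Micronaut Data JDBC datasource configuration, so adjust the URL, credentials and schema generation to your own setup.
# src/test/resources/application-test.yml (illustrative only)
datasources:
  default:
    url: jdbc:h2:mem:testDb;DB_CLOSE_DELAY=-1;DB_CLOSE_ON_EXIT=FALSE
    driverClassName: org.h2.Driver
    username: sa
    password: ''
    schema-generate: CREATE_DROP
    dialect: H2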

After some more work, I found another way to solve the original problem. You can define a base repository interface that has just the methods you need, then declare concrete interfaces for the dialect(s) you need. This allows one type of DB for test and one for production.
interface OrderRepository extends BaseRepository, CrudRepository<Order, UUID> {
@Join(value = "product", type = Join.Type.LEFT_FETCH)
Optional<Order> findById(UUID uuid)
}
@JdbcRepository(dialect = Dialect.H2)
@Requires(env = ["test"])
interface OrderRepositoryH2 extends OrderRepository, CrudRepository<Order, UUID> {
}
@JdbcRepository(dialect = Dialect.POSTGRES)
@Requires(env = ["dev"])
interface OrderRepositoryPostgres extends OrderRepository, CrudRepository<Order, UUID> {
}
No methods are needed in the OrderRepositoryH2 interface. Micronaut Data picks up the methods from the parent interface without problems. The trick is to not put the @JdbcRepository annotation on the parent interface.
You can create any other dialects needed, but you have to make sure the @Requires annotation results in only one bean for any given environment.
I plan to use H2 for testing, with an option to use the Postgres dialect for special test runs when needed.
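As an illustrative sketch (not part of the original answer), the active environment decides which of the repository beans above @Requires lets through, so a special test run only needs to start the context with a different environment. The demo class name below is made up, and it assumes the interfaces above plus a matching datasource are on the classpath:
import io.micronaut.context.ApplicationContext;

// Hypothetical demo: running with the "test" environment activates OrderRepositoryH2,
// while running with "dev" would activate OrderRepositoryPostgres.
public class OrderRepositoryEnvironmentDemo {
    public static void main(String[] args) {
        try (ApplicationContext ctx = ApplicationContext.run("test")) {
            OrderRepository orders = ctx.getBean(OrderRepository.class);
            System.out.println(orders.getClass().getName()); // the H2-backed implementation
        }
    }
}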
Sorry for any confusion on the question and comments.
(I decided to mark this as the answer since it solves the original problem).

Related

How do I reset @Repository after every test in SpringBootTest?

I'm trying to put together an integration test for my newly created @Repository class, but I'm not having any luck running all the tests together. If I run each test separately they pass, but if I run the whole test class, two tests fail: the ones that attempt to find one row by id in H2 (a separate db for testing) and find none.
This is my test class below:
package com.vaidas.development.gradecalculatorbackend.repositories;
import com.vaidas.development.gradecalculatorbackend.models.Module;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;
import static org.junit.Assert.assertEquals;
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext(classMode = DirtiesContext.ClassMode.BEFORE_EACH_TEST_METHOD)
public class ModuleRepositoryTest {
#Autowired
private ModuleRepository moduleRepository;
@Test
@Transactional
public void shouldInsertAndFindAll() {
// GIVEN
Module module = new Module("Test module", null);
int count = moduleRepository.findAll().size();
// WHEN
moduleRepository.insertModule(module);
int countAfter = moduleRepository.findAll().size();
// THEN
assertEquals(count,countAfter - 1);
}
@Test
@Transactional
public void shouldInsertAndFindOne() {
// GIVEN
Module module = new Module("Test module", null);
module = moduleRepository.insertModule(module);
// WHEN
Module storedModule = moduleRepository.findOne(module.getId());
// THEN
assertEquals(storedModule.toString(), module.toString());
}
@Test
@Transactional
public void shouldUpdate() {
// GIVEN
Module module = new Module("Test module", null);
module = moduleRepository.insertModule(module);
Module updatedModule = new Module("Test module updated", null);
updatedModule.setId(module.getId());
// WHEN
moduleRepository.updateModule(updatedModule);
// THEN
Module foundModule = moduleRepository.findOne(updatedModule.getId());
assertEquals(foundModule.getName(), updatedModule.getName());
}
@Test
@Transactional
public void shouldDelete() {
// GIVEN
Module module = new Module("Test module", null);
module = moduleRepository.insertModule(module);
// WHEN
moduleRepository.deleteModule(module);
// THEN
assertEquals(0, moduleRepository.findAll().size());
}
}
What I've tried:
using @DirtiesContext and @Transactional annotations, which I expected to reset the DB content
using @Before and @After annotations for every test; however, it seemed like they acted asynchronously and didn't wait for the DB to finish adding/removing instances
The failing tests are:
shouldInsertAndFindOne()
shouldUpdate()
Each of the above throw the following error:
org.springframework.dao.EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0
Can someone explain why this would happen and what's the correct way to reset my H2 testing db before each test?
I don't know if this is the correct approach, but I usually just add a repositoryName.deleteAll() in a @Before-annotated method.
@Before
public void before() {
moduleRepository.deleteAll();
moduleRepository.flush();
}
This method will be run before each of your @Test-annotated methods and will ensure that the moduleRepository is empty.
Also, that @Transactional annotation might be the source of your problem. Have you tried the @Before approach without that annotation? Or how about adding @Transactional at the class level?
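For completeness, a minimal sketch (not from the original answer) of the class-level variant, reusing the types from the question; the test class name here is made up:
package com.vaidas.development.gradecalculatorbackend.repositories;

import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringRunner.class)
@SpringBootTest
@Transactional // each @Test method runs in its own transaction that is rolled back afterwards
public class ModuleRepositoryTransactionalTest {
    // same test methods as in the question, minus the method-level @Transactional
}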

Spring + QueryDsl + Mockito: How to write a unit test case for a simple function

I have the following function in my service.
public boolean checkNameUnique(String name) {
QEntity qEntity = QEntity.entity;
BooleanExpression nameUniquePredicate = qEntity.name.eq(name);
long count = entityReadRepository.count(nameUniquePredicate);
return count == 0;
}
It just checks if the name already exists in db. That needs to be unique, so it returns true if does not already exist and false if it does.
Now how do I write a mockito unit test case for this? I am new to Mockito and writing unit test cases, hence the question.
My reading on Mockito has led me to write something along the lines of
when(entityReadRepository.count(nameUniquePredicate)).thenReturn(1);
and then call the function to be tested. But that doesn't make any sense.
Entity is Hibernate entity which corresponds to a table in the DB
entityReadRepository extends JpaRepository and QueryDslPredicateExecutor. QEntity is the Q object generated by QueryDsl's plugin.
A unit test would normally mock out any external dependencies, in your case entityReadRepository. If you want to make an actual db call, it would be classed as an integration test.
Your method should return two different values depending on the entityReadRepository response and this is what you would stub in order to unit test it. You were on a good path trying:
when(entityReadRepository.count(any(BooleanExpression.class))).thenReturn(1L);
The problem you have is that there are a lot of static calls and objects in your method, and that can't be handled gracefully. One option is to use a tool like PowerMockito, which can mock the behaviour of static methods. If you prefer to stick with Mockito, you could extract the static piece of code into a separate method and create a spy of your class under test:
package com.slavpilus;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Spy;
import org.mockito.runners.MockitoJUnitRunner;
import static org.mockito.Matchers.any;
import static org.mockito.Mockito.doReturn;
import static org.mockito.Mockito.when;
@RunWith(MockitoJUnitRunner.class)
public class TPresenterTest {
@InjectMocks
@Spy
private ClassUnderTest target = new ClassUnderTest();
@Before
public void setUp() {
doReturn(null).when(target).getUniqueNamePredicate(any());
}
@Mock
private YourRepositoryDependency entityReadRepository;
@Test
public void checkNameUniqueShouldBeTrueIfNameNotInDatabase() {
when(entityReadRepository.count(any())).thenReturn(0L);
boolean isUnique = target.checkNameUnique("anyName");
Assert.assertTrue(isUnique);
}
@Test
public void checkNameUniqueShouldBeFalseIfNameFoundInDatabase() {
when(entityReadRepository.count(any())).thenReturn(1L);
boolean isUnique = target.checkNameUnique("anyName");
Assert.assertFalse(isUnique);
}
}
and your production code would look something like this:
public boolean checkNameUnique(String name) {
BooleanExpression nameUniquePredicate = getUniqueNamePredicate(name);
long count = entityReadRepository.count(nameUniquePredicate);
return count == 0;
}
protected BooleanExpression getUniqueNamePredicate(String name) {
QEntity qEntity = QEntity.entity;
return qEntity.name.eq(name);
}
This approach, however, leaves some code untested, as the getUniqueNamePredicate method is skipped entirely during test execution.
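If you want to close that gap, one possible follow-up (not part of the original answer) is a separate test that exercises getUniqueNamePredicate directly. It assumes the QueryDsl-generated QEntity from the question is on the test classpath with QueryDsl 4.x package names, and that the test lives in the same package as ClassUnderTest so the protected method is visible:
import com.querydsl.core.types.dsl.BooleanExpression;

@Test
public void getUniqueNamePredicateShouldBuildEqualityOnName() {
    // Calls the real (unstubbed) implementation; comparing toString() keeps the
    // assertion independent of QueryDsl's equals() semantics.
    BooleanExpression predicate = new ClassUnderTest().getUniqueNamePredicate("someName");
    Assert.assertEquals(QEntity.entity.name.eq("someName").toString(), predicate.toString());
}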

Why is this method in a Spring Data repository considered a query method?

We have implemented an application that should be able to use either JPA, Couchbase or MongoDB (for now; this may increase in the future). We successfully implemented JPA and Couchbase by separating the repositories for each, e.g. the JPA ones come from org.company.repository.jpa while the Couchbase ones come from org.company.repository.cb. All repository interfaces extend a common repository found in org.company.repository. We are now targeting MongoDB by creating a new package org.company.repository.mongo. However, we are encountering this error:
No property updateLastUsedDate found for type TokenHistory!
Here are our codes:
@Document
public class TokenHistory extends BaseEntity {
private String subject;
private Date lastUpdate;
// Getters and setters here...
}
Under org.company.repository.TokenHistoryRepository.java
@NoRepositoryBean
public interface TokenHistoryRepository<ID extends Serializable> extends TokenHistoryRepositoryCustom, BaseEntityRepository<TokenHistory, ID> {
// No problem here. Handled by Spring Data
TokenHistory findBySubject(@Param("subject") String subject);
}
// The custom method
interface TokenHistoryRepositoryCustom {
void updateLastUsedDate(@Param("subject") String subject);
}
Under org.company.repository.mongo.TokenHistoryMongoRepository.java
@RepositoryRestResource(path = "/token-history")
public interface TokenHistoryMongoRepository extends TokenHistoryRepository<String> {
TokenHistory findBySubject(@Param("subject") String subject);
}
class TokenHistoryMongoRepositoryCustomImpl {
public void updateLastUsedDate(String subject) {
//TODO implement this
}
}
And for Mongo Configuration
@Configuration
@Profile("mongo")
@EnableMongoRepositories(basePackages = {
"org.company.repository.mongo"
}, repositoryImplementationPostfix = "CustomImpl",
repositoryBaseClass = BaseEntityRepositoryMongoImpl.class
)
public class MongoConfig {
}
The setup is the same for both JPA and Couchbase, but we didn't encounter that error there. It was able to use the inner class with the "CustomImpl" suffix, which should be the case based on the documentation.
Is there a problem in my setup or configuration for MongoDB?
Your TokenHistoryMongoRepositoryCustomImpl doesn't actually implement the TokenHistoryRepositoryCustom interface, which means there's no way for Spring Data to detect that updateLastUsedDate(…) in that class is intended to be the implementation of the interface method. Hence it's considered a query method, which then triggers query derivation.
I highly doubt that this works for the other stores as claimed, as the code inspecting query methods is shared in DefaultRepositoryInformation.
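A minimal sketch of the fix that follows from this (not spelled out in the original answer): make the custom implementation actually implement the custom interface, so Spring Data uses it instead of deriving a query for updateLastUsedDate.
class TokenHistoryMongoRepositoryCustomImpl implements TokenHistoryRepositoryCustom {
    @Override
    public void updateLastUsedDate(String subject) {
        // TODO implement this, e.g. via MongoTemplate
    }
}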

Deserialise JSON fields based on user role

I have some fields in a model that I only want to be returned when the logged-in user has the role ROLE_ADMIN. I can use @JsonIgnore, but that hides them for everyone. How can I hide them dynamically?
You should use Jackson JSON Views to achieve this - they let you choose a different set of fields to be serialized programmatically. They are also supported by Spring.
Say you have a class Model with two properties: commonField, which should be available to everyone, and secretField, which should be available only to certain users. Create a hierarchy of views (any classes will work) and specify which field is available in which view using the @JsonView annotation:
package com.stackoverflow.jsonview;
import com.fasterxml.jackson.annotation.JsonView;
public class Model {
public static class Public {}
public static class Secret extends Public {}
@JsonView(Public.class)
private String commonField;
@JsonView(Secret.class)
private String secretField;
public Model() {
}
public Model(String commonField, String secretField) {
this.commonField = commonField;
this.secretField = secretField;
}
public String getCommonField() {
return commonField;
}
public void setCommonField(String commonField) {
this.commonField = commonField;
}
public String getSecretField() {
return secretField;
}
public void setSecretField(String secretField) {
this.secretField = secretField;
}
}
Now you can specify the view you want to use on a concrete ObjectMapper:
package com.stackoverflow.jsonview;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.Test;
import static org.junit.Assert.*;
/**
*/
public class ModelTest {
@Test
public void testSecretField() throws JsonProcessingException {
Model model = new Model("commonField","secretField");
assertEquals("{\"commonField\":\"commonField\",\"secretField\":\"secretField\"}", new ObjectMapper().writerWithView(Model.Secret.class).writeValueAsString(model));
assertEquals("{\"commonField\":\"commonField\"}", new ObjectMapper().writerWithView(Model.Public.class).writeValueAsString(model));
}
}
I am not sure if you can use a declarative approach to make Spring choose the right view based on the user role out of the box, so you will probably have to write some code like this:
#RequestMapping("/data")
public String getData(HttpServletRequest request) {
Model model = service.getModel();
ObjectMapper objectMapper = new ObjectMapper();
objectMapper = request.isUserInRole("ROLE_ADMIN") ? objectMapper.writerWithView(Model.Secret.class) : objectMapper.writerWithView(Model.Public.class);
return objectMapper.writeValueAsString(model);
}
I solved this after literally a full month of trying various things. I'm working with Spring 4.3.1 and Boot, with data being returned as HAL using a paged repository.
Extend RepositoryRestMvcConfiguration as MyRepositoryRestMvcConfiguration and add @Configuration to the class; make sure your starter class has @EnableWebMvc.
In MyRepositoryRestMvcConfiguration, extend TypeConstrainedMappingJackson2HttpMessageConverter as MyResourceSupportHttpMessageConverter.
Then add this method to MyRepositoryRestMvcConfiguration:
@Override
@Bean
public TypeConstrainedMappingJackson2HttpMessageConverter halJacksonHttpMessageConverter() {
ArrayList<MediaType> mediaTypes = new ArrayList<MediaType>();
mediaTypes.add(MediaTypes.HAL_JSON);
if (config().useHalAsDefaultJsonMediaType()) {
mediaTypes.add(MediaType.APPLICATION_JSON);
}
int order = config().useHalAsDefaultJsonMediaType() ? Ordered.LOWEST_PRECEDENCE - 10
: Ordered.LOWEST_PRECEDENCE - 1;
TypeConstrainedMappingJackson2HttpMessageConverter converter = new MyResourceSupportHttpMessageConverter(
order);
converter.setObjectMapper(halObjectMapper());
converter.setSupportedMediaTypes(mediaTypes);
converter.getObjectMapper().addMixIn(Object.class, MyFilteringMixin.class);
final FilterProvider myRestrictionFilterProvider = new SimpleFilterProvider()
.addFilter("MyFilteringMixin", new MyPropertyFilter()).setFailOnUnknownId(false);
converter.getObjectMapper().setFilterProvider(myRestrictionFilterProvider);
return converter;
}
Create an empty Mixin
package filters;
import com.fasterxml.jackson.annotation.JsonFilter;
#JsonFilter("MyFilteringMixin")
public class MyFilteringMixin {}
Create a class MyPropertyFilter extending SimpleBeanPropertyFilter and override the method serializeAsField(Object, JsonGenerator, SerializerProvider, PropertyWriter). In it you need to call either super.serializeAsField(pPojo, pJgen, pProvider, pWriter) or pWriter.serializeAsOmittedField(pPojo, pJgen, pProvider), depending on whether you wish to include or discard the particular field.
I added an annotation to the particular fields I wanted to alter and interrogated that annotation when deciding which of these two to call. I injected the security role and stored permitted roles in the annotation.
This alters what HAL shares with the caller, not what is held in the repository, so you can shape the response depending on who the caller is.
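A rough sketch of what such a filter might look like (this is not the poster's actual code; @VisibleTo is an invented annotation, the role lookup is left as a stub, and how the annotation is read may need adjusting for your Jackson version):
package filters;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.PropertyWriter;
import com.fasterxml.jackson.databind.ser.impl.SimpleBeanPropertyFilter;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Hypothetical annotation carrying the roles allowed to see a field.
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.FIELD, ElementType.METHOD })
@interface VisibleTo {
    String[] value();
}

public class MyPropertyFilter extends SimpleBeanPropertyFilter {

    @Override
    public void serializeAsField(Object pPojo, JsonGenerator pJgen, SerializerProvider pProvider, PropertyWriter pWriter) throws Exception {
        VisibleTo visibleTo = pWriter.getAnnotation(VisibleTo.class);
        if (visibleTo == null || currentUserHasAnyRole(visibleTo.value())) {
            super.serializeAsField(pPojo, pJgen, pProvider, pWriter); // include the field
        } else {
            pWriter.serializeAsOmittedField(pPojo, pJgen, pProvider); // discard the field
        }
    }

    private boolean currentUserHasAnyRole(String[] roles) {
        // Look up the current user's roles here (e.g. from the injected security context).
        return false;
    }
}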

How to write annotation processor to raise a warning message if a java source is calling an annotated method

Here is my requirement, in Java 6 (I am using Eclipse Juno):
Annotate a method with a custom annotation.
During compilation, raise a warning message if a method calls the annotated method.
I am looking for something like the @Deprecated annotation.
This is what I have done:
Wrote a custom annotation.
Wrote an annotation processor to read and process the methods with
the annotation.
Created a jar and added it to the annotation processor path. My sample code (see below) raises the warning message on the annotated method itself, but that is not my requirement.
What I couldn’t do:
I could not get the calling methods. I want to raise the warning
message in those calling methods.
My sample code:
Custom annotation:
package tool.apichecks;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
@Retention(RetentionPolicy.SOURCE)
@Target({ ElementType.METHOD })
public @interface HighCostMethod {
String altMethod();
}
Annotation Processor:
package tool.apichecks;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.ProcessingEnvironment;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic.Kind;
@SupportedAnnotationTypes({ "tool.apichecks.HighCostMethod" })
public class MethodProcessor extends AbstractProcessor {
private enum MethodType {
HIGH_COST(HighCostMethod.class.getName());
private String name;
private MethodType(String name) {
this.name = name;
}
private static MethodType getMethodType(String name) {
MethodType methodType = null;
for (MethodType methodType2 : MethodType.values()) {
if (methodType2.name.equals(name)) {
methodType = methodType2;
break;
}
}
return methodType;
}
}
private ProcessingEnvironment processingEnvironment;
@Override
public synchronized void init(ProcessingEnvironment processingEnvironment) {
this.processingEnvironment = processingEnvironment;
}
@Override
public boolean process(Set<? extends TypeElement> annotations,
RoundEnvironment roundEnvironment) {
if (!roundEnvironment.processingOver()) {
for (TypeElement annotation : annotations) {
final Set<? extends Element> elements = roundEnvironment
.getElementsAnnotatedWith(annotation);
MethodType methodType = MethodType.getMethodType(annotation
.toString());
for (Element element : elements) {
switch (methodType) {
case HIGH_COST: {
processHighCostMethod(element);
break;
}
}
}
}
}
return true;
}
protected void processHighCostMethod(Element element) {
HighCostMethod highCostMethod = element
.getAnnotation(HighCostMethod.class);
/* TODO This warns the annotated method itself. I don't want this. I want to warn the methods that calls this method */
processingEnvironment
.getMessager()
.printMessage(
Kind.WARNING,
String.format(
"Do not use high cost method %s. Instead use %s method.",
element, highCostMethod.altMethod()), element);
}
}
Using an annotation processor will only work on the files containing the annotations or overriding methods, not on the calling methods. Maybe there's a way around this, but then you will probably be limited to a single project, because the processor only looks at one project at a time.
I guess you need to write an Eclipse plugin with a builder that analyses the code in all files and checks called methods for annotations.
That's a lot more work than an annotation processor, but you also have more options, e.g. you could implement a quick fix for the error markers.
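That said, within a single compilation there is a javac-specific route that is sometimes used for this: the com.sun.source Trees API lets a processor scan method bodies for invocations. A rough, untested sketch (not from the original answer) of how that could be bolted onto the MethodProcessor above; the helper name is made up, and it only sees sources that are part of the current compilation:
// Additions to MethodProcessor, relying on the javac-specific com.sun.source API.
import com.sun.source.tree.MethodInvocationTree;
import com.sun.source.util.TreePath;
import com.sun.source.util.TreePathScanner;
import com.sun.source.util.Trees;

private Trees trees; // in init(): trees = Trees.instance(processingEnvironment);

private void warnCallersOfHighCostMethods(RoundEnvironment roundEnvironment) {
    for (Element root : roundEnvironment.getRootElements()) {
        TreePath rootPath = trees.getPath(root);
        if (rootPath == null) {
            continue;
        }
        new TreePathScanner<Void, Void>() {
            @Override
            public Void visitMethodInvocation(MethodInvocationTree node, Void p) {
                // Resolve the invoked method and check it for @HighCostMethod.
                Element invoked = trees.getElement(new TreePath(getCurrentPath(), node.getMethodSelect()));
                if (invoked != null && invoked.getAnnotation(HighCostMethod.class) != null) {
                    trees.printMessage(Kind.WARNING,
                            "Do not use high cost method " + invoked.getSimpleName() + ".",
                            node, getCurrentPath().getCompilationUnit());
                }
                return super.visitMethodInvocation(node, p);
            }
        }.scan(rootPath, null);
    }
}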
