Rest Controllers vs spring-data-rest RepositoryRestResource - spring

I know this might feel like a duplicate of this:
When to use @RestController vs @RepositoryRestResource
But I have a few things which were not addressed in that question.
With @RepositoryRestResource, every method is exposed by default, which I find a bit annoying (correct me if I am wrong here). For example, in the case below:
@RepositoryRestResource
public interface ProductRepository extends MongoRepository<Product, String> {}
Suppose I want only findAll() and findOne() to be exposed and no other methods, especially not delete. To achieve this, I need to do something like this:
@RepositoryRestResource
public interface ProductRepository extends MongoRepository<Product, String> {

    @RestResource(exported = false)
    @Override
    default void delete(String s) {
    }

    @RestResource(exported = false)
    @Override
    default void delete(Product product) {
    }

    @RestResource(exported = false)
    @Override
    default void delete(Iterable<? extends Product> iterable) {
    }

    @RestResource(exported = false)
    @Override
    default void deleteAll() {
    }
}
This, I feel, is really a lot of unwanted code. It is much easier to handle with the REST controller approach.
I believe it is better to return values from REST endpoints wrapped in a ResponseEntity, but with the spring-data-rest approach I am not sure how to do that.
I could not find any way to unit test (not integration test) the REST endpoints exposed by a RepositoryRestResource. With the REST controller approach, I can test my REST endpoints using MockServletContext, MockMvc and MockMvcBuilders.
Given all this, is it still advantageous to use spring-data-rest (apart from HATEOAS)?
Please clarify.

spring-data-rest is about providing REST endpoints for data repositories, and it does provide solid REST support with all the bells and whistles, including ALPS metadata, search endpoints, etc. This usually covers most use cases and provides a basis for customisations.
Here are some hints.
Regarding p.1) - Customising exported resources and methods.
You do not need to put @RestResource(exported = false) on all the delete(...) methods, because only one of them is actually exported: void delete(Product entity). Look into the relevant documentation chapter and the source code. Unless I am missing something, you just need to provide these:
findAll(Pageable)
findOne(id)
save(Entity)
delete(Entity)
A note about exported repository methods. Sometimes it is easier to extend a very basic (empty) Repository<Product, String> interface and declare only the methods you allow on the repository, for example:
@RepositoryRestResource
public interface ProductRepository extends Repository<Product, String> {

    long count();

    Page<Product> findAll(Pageable pageable);

    Product findOne(String id);

    <S extends Product> S save(S entity);
}
Regarding a custom controller. To customise the default behaviour, the easiest way is to annotate the controller with @RepositoryRestController. Check out the docs and look into RepositoryEntityController.java - that's the default controller.
Regarding p.2) - Returning ResponseEntity from controllers
It's very straightforward. You can wrap the entity into a Resource<T> (e.g. using a PersistentEntityResourceAssembler) and create a ResponseEntity with it. See RepositoryEntityController.java and some examples, like spring-restbucks.
Regarding p.3) - Testing REST endpoints
The REST endpoints exposed for a RepositoryRestResource are implemented in RepositoryEntityController (part of spring-data-rest).
If you implement your own custom controller, you can add unit tests as usual, but things get more complex if you use a PersistentEntityResourceAssembler.
Unit test example:
public class FooControllerTests {

    @Mock
    PersistentEntityResourceAssembler assembler;

    @Mock
    PersistentEntityResourceAssemblerArgumentResolver assemblerResolver;

    @Mock
    PersistentEntity<Foo, ?> entity;

    @InjectMocks
    FooController fooController;

    @Mock
    FooService fooService;

    private MockMvc mockMvc;

    @Rule
    public MockitoRule rule = MockitoJUnit.rule();

    @Before
    public void setup() {
        this.mockMvc = MockMvcBuilders.standaloneSetup(fooController)
                .setCustomArgumentResolvers(assemblerResolver)
                .build();
    }

    @Test
    public void test_GetItem_Success() throws Exception {
        final Foo foo = new Foo();
        when(fooService.findOne(1)).thenReturn(foo);
        when(assemblerResolver.supportsParameter(any())).thenReturn(true);
        when(assemblerResolver.resolveArgument(any(), any(), any(), any())).thenReturn(assembler);
        when(assembler.toResource(foo))
                .thenReturn(PersistentEntityResource.build(foo, entity).build());
        this.mockMvc.perform(get("/foo/1")).andExpect(status().isOk());
    }
}
See also "Building REST services with Spring" tutorial.
Hope this helps.

Related

Custom Serializer Not Called For Array Or Collection Types

I am writing a spring boot command line tool that is supposed to interface with an API backend I already implemented. That API backend is built with spring data rest with the hateoas package, so it produces HAL message types.
In my CLI tool, I want to POST an entity that contains a list of other entities (a one-to-many relation). For easier use, I wanted to use Resource types in the models to express relations and have a JSON serializer transform the Resources into only their self hrefs.
My serializer works fine for one-to-one relations, but never gets called for arrays or any collection types.
This is what the API accepts when I POST an entity:
{
    "property1": "value1",
    "myrelation": "http://localhost:8080/relatedentities/1",
    "mycollection": [
        "http://localhost:8080/otherrelatedentities/2",
        "http://localhost:8080/otherrelatedentities/3"
    ]
}
On the CLI side, I created a model entity in the CLI application like this:
@Getter @Setter
public class MyEntity {

    private String property1;

    @JsonSerialize(using = HateoasResourceIdSerializer.class)
    private Resource<RelatedEntity> myrelation;

    @JsonSerialize(using = HateoasResourceIdSerializer.class)
    private List<Resource<OtherRelatedEntity>> mycollection;
}
I wrote this HateoasResourceIdSerializer to transform any Resource type into only its self href:
public class HateoasResourceIdSerializer extends StdSerializer<Resource<?>> {

    private static final long serialVersionUID = 1L;

    public HateoasResourceIdSerializer() {
        this(null);
    }

    public HateoasResourceIdSerializer(Class<Resource<?>> t) {
        super(t);
    }

    @Override
    public void serialize(Resource<?> value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException, JsonProcessingException {
        jgen.writeString(value.getId().getHref());
    }
}
Looking at the payload sent to the API backend, I can see that the "myrelation" property is set to the URL of the target entity while the "mycollection" property is always null.
I tried writing a second serializer that would accept Collection<Resource<?>>, but that didn't get called either.
My expectation would be that the serializer above for Resource would be applied to arrays as well as any collection type.
EDIT:
I was asked to provide the code that registers the serializers, so here it is. I added the two mixins as suggested in one of the answers below (hope I did it right) but did not see the expected behavior. I also assumed that, due to the registration, I could remove the @JsonSerialize(using = HateoasResource(s)IdSerializer.class) annotations from the properties. The current behavior is that those properties do not get rendered at all.
@SuppressWarnings("deprecation")
@SpringBootApplication
@EnableHypermediaSupport(type = EnableHypermediaSupport.HypermediaType.HAL)
public class Application extends WebMvcConfigurerAdapter implements ApplicationRunner {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Override
    public void run(ApplicationArguments args) throws IllegalAccessException, IllegalArgumentException, InvocationTargetException {
        // ...
    }

    @Autowired
    private HalHttpMessageConverter halHttpMessageConverter;

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(halHttpMessageConverter);
        super.configureMessageConverters(converters);
    }
}
@Configuration
public class HalHttpMessageConverter extends AbstractJackson2HttpMessageConverter {

    public HalHttpMessageConverter() {
        super(new ObjectMapper(), new MediaType("application", "hal+json", DEFAULT_CHARSET));
        objectMapper.registerModule(new Jackson2HalModule());
        objectMapper
                .setHandlerInstantiator(new Jackson2HalModule.HalHandlerInstantiator(new DefaultRelProvider(), null, null));
        objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
        objectMapper.addMixIn(Resource.class, ResourceMixIn.class);
        objectMapper.addMixIn(Resources.class, ResourcesMixIn.class);
    }

    @Override
    protected boolean supports(Class<?> clazz) {
        return ResourceSupport.class.isAssignableFrom(clazz);
    }
}
You need to properly register your custom serialiser using the MixIn feature. Instead of annotating the property, you annotate the class, which tells Jackson you want the serialiser used in all scenarios, not only for the MyEntity class.
Create a MixIn interface:
@JsonSerialize(using = HateoasResourceIdSerializer.class)
interface ResourceMixIn {
}
And register it:
ObjectMapper mapper = JsonMapper.builder()
        .addMixIn(Resource.class, ResourceMixIn.class)
        .build();
See other questions on how to configure the Jackson mapper in Spring:
How do i use Jackson Json parsing in a spring project without any annotations?
Different JSON configuration in a Spring application for REST and Ajax serialization
Spring Boot custom serializer for Collection class
Spring Boot Jackson date and timestamp Format
You are not including the code that registers the serializer for your type, and that could give the clue. But custom serializers definitely should be called for array, Collection and Map values as well as simple property values.
Registering a separate serializer for Collection<Type> is not needed (and is actually a bit trickier to do: possible, but more work due to the nested type). It is not meant to be used just to support a specific element type in a collection, but rather to support more specialised Collections, if any.
So please include the code related to registration, as well as the version of Jackson used (and obviously, if it is not a recent one, consider upgrading first to see whether the problem persists).

How to mock context.getBeansWithAnnotations with Mockito

I have created an interface Client with two concrete implementations,
clientA and clientB, and annotated them with my custom annotation.
public interface Client {
    public void doSomething();
}

@Component
@MyAnnotation
public class clientA implements Client {
    public void doSomething() {
        System.out.println("Client A do something");
    }
}

@Component
@MyAnnotation
public class clientB implements Client {
    public void doSomething() {
        System.out.println("Client B do something");
    }
}
Now I am calling the overridden methods of both clientA and clientB from the Alien class.
@Component
class Alien {

    @Autowired
    private ApplicationContext context;

    public void performOperation() {
        Map<String, Object> beans =
                context.getBeansWithAnnotation(MyAnnotation.class);
        for (Map.Entry<String, Object> entry : beans.entrySet()) {
            Client c = (Client) entry.getValue();
            c.doSomething();
        }
    }
}
I am facing a problem writing a test method for performOperation.
@RunWith(MockitoJUnitRunner.class)
class AlienTest {

    @InjectMocks
    Alien a;

    @Test
    public void testperformOperation() {
        // how to mock the beans?
        assertEquals(expected, a.performOperation());
    }
}
1) How should I write the testperformOperation method (I am allowed to change the return type of performOperation from void to any other type)?
2) Is there a better way to get the list of all implementations of the Client interface without creating a custom annotation?
I would suggest first refactoring Alien to make it more testable, using the Dependency Injection idea: its dependencies (i.e. Client) can be injected from outside rather than hard-coded inside a method which always fetches them from the Spring context:
@Component
public class Alien {

    private List<Client> clients = new ArrayList<>();

    @Autowired
    public Alien(List<Client> clients) {
        this.clients = clients;
    }

    public void performOperation() {
        for (Client c : clients) {
            c.doSomething();
        }
    }
}
If you simply want to inject all Client implementations into the Alien, you just need to @Autowired a List<Client> into Alien; Spring will inject all the Client implementations out of the box. There is no need to create @MyAnnotation.
Once you make the Alien's dependencies injectable (i.e. a list of clients), you can simply inject mocks into it and verify that performOperation() really invokes every Client's doSomething():
@RunWith(MockitoJUnitRunner.class)
class AlienTest {

    @Mock
    private Client mockClientA;

    @Mock
    private Client mockClientB;

    @Test
    public void testperformOperation() {
        List<Client> clients = new ArrayList<>();
        clients.add(mockClientA);
        clients.add(mockClientB);
        Alien alien = new Alien(clients);
        alien.performOperation();
        verify(mockClientA).doSomething();
        verify(mockClientB).doSomething();
    }
}
I’ll answer both parts of your question, but I believe the first approach is inferior and the second is the go-to approach.
If you want to stick with your custom annotation approach, you need a @Mock ApplicationContext applicationContext in your test class. In the test method (or setup method) you need to mock the call to applicationContext.getBeansWithAnnotation and return an appropriate map containing your bean (possibly also a mock).
You can easily inject all beans of a given type into a class by injecting a List of that type. In your case:
get rid of the @Autowired ApplicationContext
add an @Autowired List<Client> (or, preferably, use constructor injection)
This will also make the tests simpler, no need to mock ApplicationContext.
For example, see https://dzone.com/articles/load-all-implementors
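To see why the List-injection route tests so easily, here is a plain-Java sketch with no Spring or Mockito at all; the RecordingClient test double is made up for illustration, while the other names follow the question:

```java
import java.util.List;

interface Client {
    void doSomething();
}

// Alien no longer reaches into the ApplicationContext; its collaborators
// are handed to it, so a test can pass in whatever doubles it likes.
class Alien {
    private final List<Client> clients;

    Alien(List<Client> clients) {
        this.clients = clients;
    }

    void performOperation() {
        for (Client c : clients) {
            c.doSomething();
        }
    }
}

// A hand-rolled test double that records whether it was invoked.
class RecordingClient implements Client {
    boolean called = false;

    public void doSomething() {
        called = true;
    }
}
```

With this shape, the test reduces to constructing Alien with two recording doubles and asserting both were invoked; no ApplicationContext mocking is needed.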

SpringBoot: Testing the Service layer

Let's assume that I have two classes:
TodoRepository
TodoService
the TodoRepository is a simple CRUD repository:
public interface TodoRepository extends CrudRepository<Todo, Long> {
}
the TodoService is just a class which calls this Repository:
@Service
public class TodoService {

    private final TodoRepository todoRepository;

    @Autowired
    public TodoService(TodoRepository todoRepository) {
        this.todoRepository = todoRepository;
    }

    public void createTodo(Todo todo) {
        todoRepository.save(todo);
    }
}
Should I bother testing the service layer?
Edit:
Thanks to the explanation from @Dherik, I created a test class which looks like this:
Note: I am using JUnit 5, Mockito and the Spring Framework
@ExtendWith(SpringExtension.class)
class TodoServiceTest {

    @MockBean
    private TodoRepository todoRepository;

    private TodoService todoService;

    @BeforeEach
    void setUp() {
        todoService = new TodoService(todoRepository);
    }

    @AfterEach
    void tearDown() {
        clearInvocations(todoRepository);
    }

    @Test
    public void createTodo() {
        todoService.createTodo(new Todo());
        // verify that the save method is called when createTodo is called
        verify(todoRepository, times(1)).save(any(Todo.class));
    }
}
Yes, it's important.
Even though it is a very simple class now, some developer in the future could add a weird condition to this createTodo method so that the Todo is no longer saved.
If you write a test for the current method verifying that save is called, any developer who makes a change that affects the Todo save will be warned about the situation.
See a pseudo test example:
@Test
public void createTodo() {
    TodoRepository todoRepository = mock(TodoRepository.class);
    TodoService todoService = new TodoService(todoRepository);
    todoService.createTodo(new Todo());
    // verify that the save method is called when createTodo is called
    verify(todoRepository, times(1)).save(any(Todo.class));
}
I've seen this kind of thing tested with JUnit using a mock framework, injecting a mock repo into the service and then checking that the mock repo was called. That seems really pointless to me, as the test knows too much about the implementation. If you change the implementation you have to rewrite the test, so it's no use for refactoring.
I would test this kind of thing with an integration test that treats the app like a black box: start the app, trigger whatever creates a todo, and check that it was created. I'd probably use cucumber-jvm and have a scenario with a step to create a todo and another to retrieve it.
I think you should create tests that
help you write reliable code
allow refactoring without the need to rewrite tests
prove the application does what it's supposed to

Where should we use @Transactional and where is the Service layer?

I have a REST-style controller in Spring. In the controller I have injected DAO interfaces, and from the controller I persist data. In other words, I have something like a REST web service: people send me data, and I persist it.
/**
 * Payment rest controller which receives
 * JSON data
 */
@Controller
@RequestMapping("/data")
public class PaymentTransaction {

    @Autowired
    private TestDao dao;

    @RequestMapping(value = "/test", method = RequestMethod.POST)
    @ResponseBody
    public String test(HttpServletRequest request) {
        ...
    }
}
At the moment I have the @Transactional annotation on the DAO classes. For instance:
import org.springframework.transaction.annotation.Transactional;

@Component
@Transactional
public interface TestDao {

    @Transactional(propagation = Propagation.REQUIRED)
    public void first();
}
I have read that this is very bad style. According to this answer on Stack Overflow, with its explanation and examples of why this is bad, we should not put this annotation on the DAO, nor on the controller; we should put it in the service layer.
But I don't understand what the service layer is, or where it is. I do not have anything like that.
Where should I write the @Transactional annotation?
Best regards,
According to the cited post, you should design your classes somewhat like this (rather pseudocode):
controller (responsible for handling clients' requests/responses)
@Controller
@RequestMapping("/data")
public class TestREST {

    @Autowired
    private TestService service;

    public void storePayment(PaymentDTO dto) {
        service.storePayment(dto); // request from a client
    }

    public PaymentDTO getPayment(int paymentId) {
        return service.getPayment(paymentId); // response to a client
    }
}
service layer (also called the business layer; responsible for business logic - it knows what to do with incoming messages, but does not know where they come from):
public class TestServiceImpl {

    @Autowired
    private TestDao dao;

    @Transactional(propagation = Propagation.REQUIRED) // force transaction
    public void storePayment(PaymentDTO paymentDto) {
        // transform dto -> entity
        dao.storePayment(paymentEntity); // read-write, hence transaction is on
    }

    @Transactional(propagation = Propagation.NOT_SUPPORTED) // avoid transaction
    public Payment getPayment(int paymentId) {
        return dao.findPayment(paymentId); // read-only, hence no transaction
    }
}
data access layer (also called the persistence layer; responsible for accessing the database - it knows how to use the entity model / ORM, but does not know anything about the service layer above it):
public class TestDAOImpl {

    @PersistenceContext
    private EntityManager em;

    public void storePayment(PaymentEntity paymentEntity) {
        em.persist(paymentEntity);
    }

    public PaymentEntity getPayment(int paymentId) {
        return em.find(PaymentEntity.class, paymentId);
    }
}
With this approach you get the separation of concerns mentioned in the post. On the other hand, this approach (business layer vs data access layer) has received a small dose of criticism from Adam Bien on his blog ("JPA/EJB3 killed the DAO"). As you can see, there is no single solution to the problem, but I encourage you to read some other opinions and apply the solution you find most suitable for your needs.
When you call two DAO methods, first and second, from the controller, two transactions are used: one starts before the first method and ends after its execution, and the other starts before the second method and ends after its execution. If instead you create an additional class between the controller and the DAO (usually called the service layer), annotate it with @Transactional and call multiple DAO methods from it, a single transaction is started at the beginning of the service method, all the DAO calls are executed within it, and the transaction is closed at the end - which is what you require. Then inject the Service into the Controller.
Controller -> Service -> Dao
@Controller
@RequestMapping("/data")
public class PaymentTransaction {

    @Autowired
    private TestService service;

    @RequestMapping(value = "/test", method = RequestMethod.POST)
    @ResponseBody
    public String test(HttpServletRequest request) {
        ...
    }
}

@Service
@Transactional
public class TestService {

    @Autowired
    private TestDao dao;

    @Transactional
    public void serviceCall() {
        dao.first();
        dao.second();
    }
}
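The difference in transaction boundaries can be sketched without Spring. Below, a made-up TxLog helper stands in for the proxy that Spring generates around a @Transactional method; it is purely illustrative and not Spring's API:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for the transaction manager: records begin/commit so the
// transaction boundaries become visible in the event log.
class TxLog {
    final List<String> events = new ArrayList<>();

    void inTransaction(Runnable work) {
        events.add("begin");
        work.run();
        events.add("commit");
    }
}

class Dao {
    private final TxLog tx;

    Dao(TxLog tx) {
        this.tx = tx;
    }

    void first() {
        tx.events.add("first");
    }

    void second() {
        tx.events.add("second");
    }
}

class TestService {
    private final TxLog tx;
    private final Dao dao;

    TestService(TxLog tx, Dao dao) {
        this.tx = tx;
        this.dao = dao;
    }

    // Mirrors serviceCall(): both DAO calls run inside ONE transaction,
    // which is what annotating the service method buys you.
    void serviceCall() {
        tx.inTransaction(() -> {
            dao.first();
            dao.second();
        });
    }
}
```

Calling serviceCall() yields the event sequence begin, first, second, commit: one transaction spans both DAO calls, instead of one transaction per DAO method.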

@Autowired in static classes

This is a Spring MVC project with Hibernate.
I'm trying to make a Logger class that is responsible for writing logs into the database.
Other classes just call the proper methods with some attributes, and this class should do all the magic.
By nature it should be a class with static methods, but that causes problems with autowiring the DAO object.
public class StatisticLogger {

    @Autowired
    static Dao dao;

    public static void AddLoginEvent(LogStatisticBean user) {
        //TODO code it god damn it
    }

    public static void AddDocumentEvent(LogStatisticBean user, Document document, DocumentActionFlags actionPerformed) {
        //TODO code it god damn it
    }

    public static void addErrorLog(Exception e, String page, HashMap<String, Object> parameters) {
        ExceptionLogBean elb = new ExceptionLogBean();
        elb.setStuntDescription(e);
        elb.setSourcePage(page);
        elb.setParameters(parameters);
        if (dao != null) { // BUT DAO IS NULL
            dao.saveOrUpdateEntity(elb);
        }
    }
}
How do I make this right? What should I do so that the dao object is not null?
I know that I could pass it as a method parameter, but that isn't very good.
I'm guessing that @Autowired can't work on static fields because they are initialized too early, before the autowiring mechanism is ready.
You can't @Autowired a static field, but there is a trick to deal with this:
@Component
public class StatisticLogger {

    private static Dao dao;

    @Autowired
    private Dao dao0;

    @PostConstruct
    private void initStaticDao() {
        dao = this.dao0;
    }
}
In short: @Autowired an instance field, and assign its value to the static field when your object is constructed. By the way, the StatisticLogger object must be managed by Spring as well.
Classical autowiring probably won't work, because a static class is not a bean and hence can't be managed by Spring. There are ways around this, for example by using the factory-method approach in XML, or by loading the beans from a Spring context in a static initializer block, but what I'd suggest is to change your design:
Don't use static methods; use services that you inject where you need them. If you use Spring, you might as well use it correctly. Dependency Injection is an object-oriented technique, and it only makes sense if you actually embrace OOP.
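To make that suggestion concrete, here is a minimal plain-Java sketch of the logger as an injectable service; the Dao interface and the String log entry are simplified stand-ins for the question's types, and in a real Spring app the class would carry @Service and be injected wherever it is needed:

```java
interface Dao {
    void saveOrUpdateEntity(Object entity);
}

// An instance service instead of a static utility: the Dao arrives via the
// constructor, so Spring can wire it and a test can pass in a fake.
class StatisticLogger {
    private final Dao dao;

    StatisticLogger(Dao dao) {
        this.dao = dao;
    }

    void addErrorLog(Exception e, String page) {
        // Build a log entry; dao is guaranteed non-null by construction,
        // so no null check is needed.
        String entry = page + ": " + e.getMessage();
        dao.saveOrUpdateEntity(entry);
    }
}
```

Because the dependency is handed in rather than looked up, the "BUT DAO IS NULL" problem from the question cannot occur: the object cannot even be constructed without a Dao.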
I know this is an old question, but I just wanted to share what I did.
The solution by @Weibo Li is OK, but it raises a Sonar critical alert about assigning a non-static variable to a static variable.
The way I resolved it with no Sonar alerts is the following.
I changed StatisticLogger to a singleton class (no longer static),
like this:
public class StatisticLogger {

    private static StatisticLogger instance = null;

    private Dao dao;

    public static StatisticLogger getInstance() {
        if (instance == null) {
            instance = new StatisticLogger();
        }
        return instance;
    }

    protected StatisticLogger() {
    }

    public void setDao(Dao dao) {
        this.dao = dao;
    }

    public void AddLoginEvent(LogStatisticBean user) {
        //TODO code it god damn it
    }

    public void AddDocumentEvent(LogStatisticBean user, Document document, DocumentActionFlags actionPerformed) {
        //TODO code it god damn it
    }

    public void addErrorLog(Exception e, String page, HashMap<String, Object> parameters) {
        ExceptionLogBean elb = new ExceptionLogBean();
        elb.setStuntDescription(e);
        elb.setSourcePage(page);
        elb.setParameters(parameters);
        if (dao != null) {
            dao.saveOrUpdateEntity(elb);
        }
    }
}
Then I created a service (or component) that autowires the service I want and sets it on the singleton class.
This is safe, since Spring initializes all managed beans before doing anything else, which means the @PostConstruct method below is always called before anything can access the StatisticLogger.
Something like this:
@Component
public class DaoSetterService {

    @Autowired
    private Dao dao0;

    @PostConstruct
    private void setDaoValue() {
        StatisticLogger.getInstance().setDao(dao0);
    }
}
Instead of using StatisticLogger as a static class, I just use StatisticLogger.getInstance(), and I can access all the methods inside it.
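One caveat about the getInstance() shown above: the lazy null check is not thread-safe, so two threads may race and construct two different instances. If the singleton route is kept, the initialization-on-demand holder idiom avoids that without synchronized blocks; a minimal sketch (the dao field and logging methods are omitted for brevity):

```java
class StatisticLogger {

    // The JVM guarantees that Holder is initialized exactly once, on first
    // use of getInstance(), so no explicit locking is needed.
    private static class Holder {
        static final StatisticLogger INSTANCE = new StatisticLogger();
    }

    private StatisticLogger() {
    }

    static StatisticLogger getInstance() {
        return Holder.INSTANCE;
    }
}
```

Every call to getInstance() returns the same instance, even under concurrent first access.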
You can pass the DAO to StatisticLogger from wherever you call it.
public static void AddLoginEvent(LogStatisticBean user, Dao dao) {
    dao.callMethod();
}
It might be too late to add an answer to this question, especially as it already has an accepted answer, but it might help others who face the same issue.
Inside the StatisticLogger class, create an instance of the Dao service:
public static Dao daoService = new Dao();
Then auto-wire the service instance through the constructor of the StatisticLogger class:
@Autowired
public StatisticLogger(Dao daoService0) {
    StatisticLogger.daoService = daoService0;
}

// use this service as usual in the static class
daoService.fun();
I think this is the simplest solution for the problem.
