How to convert a DatabaseClient result into an object? - converters

I have a task to insert an entity through the R2DBC DatabaseClient and convert the result (a map) back into the entity.
I want to do it this way:
databaseClient.insert().into(ApplicationData.class)
        .using(applicationData)
        .map(converter.populateIdIfNecessary(applicationData))
        .first();
But the problem is that the MappingR2dbcConverter bean isn't created by Spring, so I decided to create it myself:
@Bean
public MappingR2dbcConverter converter(RelationalMappingContext mappingContext,
                                       R2dbcCustomConversions r2dbcCustomConversions) ....
My question is: is this the correct way to convert the result map into an entity?
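For completeness, the bean body I have in mind just delegates to the two-argument constructor (a sketch; MappingR2dbcConverter does expose a constructor taking a mapping context plus custom conversions):

@Bean
public MappingR2dbcConverter converter(RelationalMappingContext mappingContext,
                                       R2dbcCustomConversions r2dbcCustomConversions) {
    // Build the converter from the same mapping context and custom
    // conversions that Spring Data R2DBC would otherwise use internally.
    return new MappingR2dbcConverter(mappingContext, r2dbcCustomConversions);
}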

The R2DBC DatabaseClient will be part of Spring Framework 5.3; see my example for Spring 5.3 M2:
public static final BiFunction<Row, RowMetadata, Post> MAPPING_FUNCTION = (row, rowMetaData) -> Post.builder()
        .id(row.get("id", UUID.class))
        .title(row.get("title", String.class))
        .content(row.get("content", String.class))
        .status(row.get("status", Post.Status.class))
        .metadata(row.get("metadata", Json.class))
        .createdAt(row.get("created_at", LocalDateTime.class))
        .build();
public Flux<Post> findAll() {
    return this.databaseClient
            .sql("SELECT * FROM posts")
            .filter((statement, executeFunction) -> statement.fetchSize(10).execute())
            .map(MAPPING_FUNCTION)
            .all();
}

public Mono<Post> findById(UUID id) {
    return this.databaseClient
            .sql("SELECT * FROM posts WHERE id=:id")
            .bind("id", id)
            .map(MAPPING_FUNCTION)
            .one();
}
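Applying the same style to the original insert-and-populate-the-id scenario, a sketch with the 5.3 DatabaseClient could look like this (the posts columns and the withId wither are assumptions for illustration, not from the original post):

public Mono<Post> save(Post post) {
    return this.databaseClient
            .sql("INSERT INTO posts (title, content) VALUES (:title, :content)")
            .bind("title", post.getTitle())
            .bind("content", post.getContent())
            // ask the driver to return the generated key
            .filter(statement -> statement.returnGeneratedValues("id"))
            .map((row, rowMetadata) -> row.get("id", UUID.class))
            .one()
            .map(post::withId); // hypothetical wither returning a copy with the id set
}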


Dozer seeks XML configs instead of Java configs

I am working on a Spring Boot project with Spring Data REST, Gradle, and an Oracle Express DB, in which I use DozerBeanMapper to map entities to DTOs and vice versa. I use no XML configurations for Dozer, just Java ones:
@Slf4j
@Configuration
public class DozerConfig {

    @Bean
    public DozerBeanMapper getDozerMapper() {
        log.info("Initializing DozerBeanMapper bean");
        return new DozerBeanMapper();
    }
}
Also, for clarity, I have explicitly configured all the fields that have to be mapped, although they all have the same names. For example, my Client mapper:
@Slf4j
@Component
public class ClientMapper extends BaseMapper {

    private BeanMappingBuilder builder = new BeanMappingBuilder() {
        @Override
        protected void configure() {
            mapping(Client.class, ClientDTO.class)
                    .fields("id", "id")
                    .fields("name", "name")
                    .fields("midName", "midName")
                    .fields("surname", "surname")
                    .exclude("password")
                    .fields("phone", "phone")
                    .fields("email", "email")
                    .fields("address", "address")
                    .fields("idCardNumber", "idCardNumber")
                    .fields("idCardIssueDate", "idCardIssueDate")
                    .fields("idCardExpirationDate", "idCardExpirationDate")
                    .fields("bankAccounts", "bankAccounts")
                    .fields("accountManager", "accountManager")
                    .fields("debitCardNumber", "debitCardNumber")
                    .fields("creditCardNumber", "creditCardNumber")
                    .fields("dateCreated", "dateCreated")
                    .fields("dateUpdated", "dateUpdated");
        }
    };

    @Autowired
    public ClientMapper(DozerBeanMapper mapper) {
        super(mapper);
        mapper.addMapping(builder);
    }

    public ClientDTO toDto(Client entity) {
        log.info("Mapping Client entity to DTO");
        return mapper.map(entity, ClientDTO.class);
    }

    public Client toEntity(ClientDTO dto) {
        log.info("Mapping Client DTO to entity");
        return mapper.map(dto, Client.class);
    }

    public List<ClientDTO> toDtos(List<Client> entities) {
        log.info("Mapping Client entities to DTOs");
        return entities.stream()
                .map(entity -> toDto(entity))
                .collect(Collectors.toList());
    }

    public List<Client> toEntities(List<ClientDTO> dtos) {
        log.info("Mapping Client DTOs to entities");
        return dtos.stream()
                .map(dto -> toEntity(dto))
                .collect(Collectors.toList());
    }

    public EmployeeDTO toEmployeeDto(Employee entity) {
        log.info("Mapping Employee entity to DTO");
        return mapper.map(entity, EmployeeDTO.class);
    }

    public Employee toEmployeeEntity(EmployeeDTO dto) {
        log.info("Mapping Employee DTO to entity");
        return mapper.map(dto, Employee.class);
    }

    public List<EmployeeDTO> toEmployeeDtos(List<Employee> entities) {
        log.info("Mapping Employee entities to DTOs");
        return entities.stream()
                .map(entity -> toEmployeeDto(entity))
                .collect(Collectors.toList());
    }

    public List<Employee> toEmployeeEntities(List<EmployeeDTO> dtos) {
        log.info("Mapping Employee DTOs to entities");
        return dtos.stream()
                .map(dto -> toEmployeeEntity(dto))
                .collect(Collectors.toList());
    }
}
Despite this I get the following exception:
"httpStatus": "500 Internal Server Error",
"exception": "java.lang.IllegalArgumentException",
"message": "setAttribute(name, value):\n name: "http://apache.org/xml/features/validation/schema\"\n value: \"true\"",
"stackTrace": [
"oracle.xml.jaxp.JXDocumentBuilderFactory.setAttribute(JXDocumentBuilderFactory.java:289)",
"org.dozer.loader.xml.XMLParserFactory.createDocumentBuilderFactory(XMLParserFactory.java:71)",
"org.dozer.loader.xml.XMLParserFactory.createParser(XMLParserFactory.java:50)",
"org.dozer.loader.xml.MappingStreamReader.<init>(MappingStreamReader.java:43)",
"org.dozer.loader.xml.MappingFileReader.<init>(MappingFileReader.java:44)",
"org.dozer.DozerBeanMapper.loadFromFiles(DozerBeanMapper.java:219)",
"org.dozer.DozerBeanMapper.loadCustomMappings(DozerBeanMapper.java:209)",
"org.dozer.DozerBeanMapper.initMappings(DozerBeanMapper.java:315)",
"org.dozer.DozerBeanMapper.getMappingProcessor(DozerBeanMapper.java:192)",
"org.dozer.DozerBeanMapper.map(DozerBeanMapper.java:120)",
"com.rosenhristov.bank.exception.mapper.ClientMapper.toDto(ClientMapper.java:52)",
"com.rosenhristov.bank.service.ClientService.lambda$getClientById$0(ClientService.java:27)",
"java.base/java.util.Optional.map(Optional.java:265)",
"com.rosenhristov.bank.service.ClientService.getClientById(ClientService.java:27)",
"com.rosenhristov.bank.controller.ClientController.getClientById(ClientController.java:57)",
"java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
"java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
"java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
"java.base/java.lang.reflect.Method.invoke(Method.java:566)",
"org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:197)",
"org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:141)",
"org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:106)",
"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:893)",
"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:807)",
"org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87)",
"org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1061)",
"org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:961)",
"org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)",
"org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898)",
"javax.servlet.http.HttpServlet.service(HttpServlet.java:626)",
"org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883)",
"javax.servlet.http.HttpServlet.service(HttpServlet.java:733)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:93)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201)",
"org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)",
"org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193)",
"org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166)",
"org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:202)",
"org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97)",
"org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:542)",
"org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:143)",
"org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:92)",
"org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78)",
"org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:343)",
"org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:374)",
"org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65)",
"org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:868)",
"org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1590)",
"org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)",
"java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)",
"java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)",
"org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)",
"java.base/java.lang.Thread.run(Thread.java:834)"
]
It seems Dozer is trying to find some XML config file, judging by this frame:
"oracle.xml.jaxp.JXDocumentBuilderFactory.setAttribute(JXDocumentBuilderFactory.java:289)"
It seems it is searching for an XML validation schema (the original post showed a debugger screenshot of var1 here).
When I start the application I just see the following in the IntelliJ console:
Trying to find Dozer configuration file: dozer.properties
2020-12-23 11:46:09.855 WARN 17488 --- [ restartedMain] org.dozer.config.GlobalSettings: Dozer configuration file not found: dozer.properties. Using defaults for all Dozer global properties.
Probably I have to look into dozer.properties and find out how to make Dozer use the Java configurations?
Can someone help me, please? I searched for a solution on the internet but still haven't found a suitable one. I am new to Dozer; I have used MapStruct before.
You can try my BeanKnife to generate the DTO class automatically. It will have a read method to convert the entity to a DTO. Although there is no converter from DTO to entity, I think you don't need one in most situations.
@ViewOf(value = Client.class, genName = "ClientDto", includePattern = ".*")
class ClientDtoConfiguration {}
Then it will generate a DTO class named ClientDto with all the properties of Client.
Client client = ...
ClientDto clientDto = ClientDto.read(client);
List<Client> clients = ...
List<ClientDto> clientDtos = ClientDto.read(clients);
Then serialize the DTOs instead of the entities.

Getting a fixed number of records using Spring Data JPA

I am trying to create a Spring Boot application where I need to fetch records from the database and make a call to a REST API for each record fetched. But instead of fetching all the records at once, I want to retrieve them in batches of, say, 10, make the REST calls for them, then fetch another 10, and so on until the last record. I am using spring-data-jpa. How can I achieve that?
P.S.: It's a multi-threaded call and the DB is Amazon DynamoDB.
My code till now:
Controller:
// Multi-threaded call to REST
@GetMapping("/myApp-multithread")
public String getQuoteOnSepThread() throws InterruptedException, ExecutionException {
    System.out.println("#################################################Multi Threaded Post Call######################");
    ExecutorService executor = Executors.newFixedThreadPool(10);
    List<Future<String>> myFutureList = new ArrayList<Future<String>>();
    long startTime = System.currentTimeMillis() / 1000;
    // Here, instead of fetching everything and calling MyCallable for
    // each record, I want to do it in batches of 10
    Iterable<Customer> customerIterable = repo.findAll();
    List<Customer> customers = new ArrayList<Customer>();
    customerIterable.forEach(customers::add);
    for (Customer c : customers) {
        MyCallable myCallable = new MyCallable(restTemplate, c);
        Future<String> future = executor.submit(myCallable);
        myFutureList.add(future);
    }
    for (Future<String> fut : myFutureList) {
        fut.get();
    }
    executor.shutdown();
    long timeElapsed = (System.currentTimeMillis() / 1000) - startTime;
    System.out.println("->>>>>>>>>>>>>>>Time Elapsed In Multi Threaded Post Call<<<<<<<<<<<<<<<-" + timeElapsed);
    return "Success";
}
My Callable:
@Scope("prototype")
@Component
public class MyCallable implements Callable<String> {

    private RestTemplate restTemplate;
    private Customer c;

    public MyCallable(RestTemplate rt, Customer cust) {
        this.restTemplate = rt;
        this.c = cust;
    }

    @Override
    public String call() throws Exception {
        System.out.println("Customer no" + c.getId() + "On thread Number" + Thread.currentThread().getId());
        restTemplate.postForObject("http://localhost:3000/save", c, String.class);
        return "Done";
    }
}
Spring Data JPA offers Page and Slice to handle this case (see PagingAndSortingRepository and Pageable):
public interface PagingAndSortingRepository<T, ID extends Serializable>
        extends CrudRepository<T, ID> {

    Iterable<T> findAll(Sort sort);

    Page<T> findAll(Pageable pageable);
}
You can create a Pageable request as:
Pageable firstPageWithTwoElements = PageRequest.of(0, 2);
and pass it to your custom repository (which should extend PagingAndSortingRepository):
Page<T> pageResult = customRepository.findAll(firstPageWithTwoElements);
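To connect this back to the batch-of-10 requirement in the question, the fetch loop could look roughly like this (a sketch, assuming repo extends PagingAndSortingRepository<Customer, Long>):

int pageIndex = 0;
Page<Customer> page;
do {
    // fetch the next batch of 10 customers
    page = repo.findAll(PageRequest.of(pageIndex++, 10));
    for (Customer c : page.getContent()) {
        // submit the REST call for each record in this batch
        myFutureList.add(executor.submit(new MyCallable(restTemplate, c)));
    }
} while (page.hasNext());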

Spring Batch: How to Insert multiple key-value pairs into Database table for each item

After processing some XML files with a Spring Batch ItemProcessor, the processor returns items like this:
MetsModsDef {
    int id;
    String title;
    String path;
    Properties identifiers;
    ....
}
Now I need to save these items into a database, so that (id, title, path) goes into the "Work" table and all the Properties stored in the "identifiers" field go into a "Key/Value" table called "Identifier" (work, identitytype, identityValue).
How can I achieve this?
Currently I am using a CompositeItemWriter to split the object and write it into two tables like this:
public ItemWriter<MetsModsDef> MultiTableJdbcWriter(@Qualifier("dataSource") DataSource dataSource) {
    CompositeItemWriter<MetsModsDef> cWriter = new CompositeItemWriter<MetsModsDef>();

    JdbcBatchItemWriter hsqlWorkWriter = new JdbcBatchItemWriterBuilder()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO work (id, title, path, enabled) VALUES (:id, :title, :path, 1)")
            .dataSource(dataSource)
            .build();

    JdbcBatchItemWriter hsqlIdentifierWriter = new JdbcBatchItemWriterBuilder()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO identity (work, identitytype, identityValue) VALUES (:work, :identitytype, :identityValue)")
            .dataSource(dataSource)
            .build();

    List<ItemWriter<? super MetsModsDef>> mWriter = new ArrayList<ItemWriter<? super MetsModsDef>>();
    mWriter.add(hsqlWorkWriter);
    mWriter.add(hsqlIdentifierWriter);
    cWriter.setDelegates(mWriter);
    return cWriter; // return added so the snippet compiles
}
But this will not work for a property list, since (work, identitytype, identityValue) are not part of my domain object MetsModsDef, which only contains one map of properties that are supposed to go into the "Identifier" table.
I have found advice on how to do it when writing to a file, and even on using a splitter pattern from Spring Integration (Read one record/item and write multiple records/items using spring batch), but I am still not sure how to actually do it when writing out via JDBC or Hibernate (which I assume would be similar).
Thanks for your advice!
In case somebody is interested: after a while I came up with my own solution.
I found one extending HibernateItemWriter (for Hibernate writes) on the internet: Spring-Batch Multi-line record Item Writer with variable number of lines per record. But I did not want to extend classes, so I had to come up with my own (based on what I could research on the internet).
I am not sure how good it is, or how it handles transactions or rollback (probably badly), but for now it is the only one I have. So if you need one too, or have comments on how to improve it, or even have a better one, you are very welcome.
I created my own IdentifierListWriter, which creates the key/value-pair-like objects (here each pair is called Identifier) for each MetsModsDef item and writes them all out using the JdbcBatchItemWriter identifierWriter, which is passed to it from the configuration:
public class IdentifierListWriter implements ItemWriter<MetsModsDef> {

    private ItemWriter<Identifier> _identifierWriter;

    public IdentifierListWriter(JdbcBatchItemWriter<Identifier> identifierWriter) {
        _identifierWriter = identifierWriter;
    }

    @Transactional(readOnly = false, propagation = Propagation.REQUIRED)
    public void write(List<? extends MetsModsDef> items) throws Exception {
        // For each item, turn its properties map into Identifier objects
        // and delegate them to the identifier writer
        for (MetsModsDef item : items) {
            ArrayList<Identifier> ids = new ArrayList<Identifier>();
            for (String key : item.getAllIds().stringPropertyNames()) {
                ids.add(new Identifier(item.getAllIds().getProperty(key), key, item.getId()));
            }
            _identifierWriter.write(ids);
        }
    }
}
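The Identifier class itself is not shown in the post; a minimal value object consistent with the constructor call above and with the named parameters in the identifierWriter SQL below (:identifier, :type, :work) might look like this (a sketch):

// Hypothetical value object; field names match the named parameters
// used by BeanPropertyItemSqlParameterSourceProvider in the SQL below.
public class Identifier {

    private String identifier;
    private String type;
    private int work;

    public Identifier(String identifier, String type, int work) {
        this.identifier = identifier;
        this.type = type;
        this.work = work;
    }

    public String getIdentifier() { return identifier; }
    public String getType() { return type; }
    public int getWork() { return work; }
}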
In the Java configuration I create two JdbcBatchItemWriter beans, one for the "Work" table and one for the "Identifier" table, plus an IdentifierListWriter bean and a CompositeItemWriter MultiTableJdbcWriter bean which uses them all to write out the object:
@Bean
@Primary
public ItemWriter<MetsModsDef> MultiTableJdbcWriter(@Qualifier("dataSource") DataSource dataSource) {
    IdentifierListWriter identifierListWriter = new IdentifierListWriter(identifierWriter(dataSource));
    CompositeItemWriter cWriter = new CompositeItemWriter();
    cWriter.setDelegates(Arrays.asList(hsqlWorkWriter(dataSource), identifierListWriter));
    return cWriter;
}

@Bean
public JdbcBatchItemWriter<MetsModsDef> hsqlWorkWriter(@Qualifier("dataSource") DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<MetsModsDef>()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO work (id, title, path, enabled) VALUES (:id, :title, :path, 1)")
            .dataSource(dataSource)
            .build();
}

@Bean
public JdbcBatchItemWriter<Identifier> identifierWriter(@Qualifier("dataSource") DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO identifier (identifier, type, work_id) VALUES ( :identifier, :type, :work)")
            .dataSource(dataSource)
            //.afterPropertiesSet()
            .build();
}
Then the multiTableJdbcWriter is called from a Step:
@Bean
public Step step1(ItemWriter<MetsModsDef> multiTableJdbcWriter) {
    return stepBuilderFactory.get("step1")
            .<StreamSource, MetsModsDef>chunk(1)
            .reader(new MetsModsReader())
            .processor(metsModsFileProcessor())
            .writer(multiTableJdbcWriter)
            .build(); // closing call added; the original snippet was cut off here
}

How to make AuditorAware work with Spring Data Mongo Reactive

Spring Security 5 provides a ReactiveSecurityContextHolder to fetch the SecurityContext from a Reactive context, but when I want to implement AuditorAware so that auditing works automatically, it does not work. Currently I cannot find a Reactive variant of AuditorAware.
@Bean
public AuditorAware<Username> auditor() {
    return () -> ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .log()
            .filter(a -> a != null && a.isAuthenticated())
            .map(Authentication::getPrincipal)
            .cast(UserDetails.class)
            .map(auth -> new Username(auth.getName()))
            .switchIfEmpty(Mono.empty())
            .blockOptional();
}
I have added @EnableMongoAuditing on my Boot application class.
On the Mongo document class I added the audit-related annotations:
@CreatedDate
private LocalDateTime createdDate;

@CreatedBy
private Username author;
When I add a post, the createdDate is filled in, but author is null:
{"id":"5a49ccdb9222971f40a4ada1","title":"my first post","content":"content of my first post","createdDate":"2018-01-01T13:53:31.234","author":null}
The complete code is here, based on Spring Boot 2.0.0.M7.
Update: Spring Boot 2.4.0-M2 / Spring Data Commons 2.4.0-M2 / Spring Data Mongo 3.1.0-M2 includes a ReactiveAuditorAware; check this new sample. Note: use @EnableReactiveMongoAuditing to activate it.
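For reference, a ReactiveAuditorAware bean along those lines could look like this (a sketch against the 2.4 milestone API; the Username/UserDetails mapping mirrors the blocking bean above):

@Bean
public ReactiveAuditorAware<Username> reactiveAuditorAware() {
    // Same pipeline as the blocking AuditorAware above, but returning
    // the Mono directly instead of calling blockOptional().
    return () -> ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .filter(a -> a != null && a.isAuthenticated())
            .map(Authentication::getPrincipal)
            .cast(UserDetails.class)
            .map(userDetails -> new Username(userDetails.getUsername()));
}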
I am posting another solution, which takes the input id into account in order to support update operations:
@Component
@RequiredArgsConstructor
public class AuditCallback implements ReactiveBeforeConvertCallback<AuditableEntity> {

    private final ReactiveMongoTemplate mongoTemplate;

    private Mono<?> exists(Object id, Class<?> entityClass) {
        if (id == null) {
            return Mono.empty();
        }
        return mongoTemplate.findById(id, entityClass);
    }

    @Override
    public Publisher<AuditableEntity> onBeforeConvert(AuditableEntity entity, String collection) {
        var securityContext = ReactiveSecurityContextHolder.getContext();
        return securityContext
                .zipWith(exists(entity.getId(), entity.getClass()))
                .map(tuple2 -> {
                    var auditableEntity = (AuditableEntity) tuple2.getT2();
                    auditableEntity.setLastModifiedBy(tuple2.getT1().getAuthentication().getName());
                    auditableEntity.setLastModifiedDate(Instant.now());
                    return auditableEntity;
                })
                .switchIfEmpty(Mono.zip(securityContext, Mono.just(entity))
                        .map(tuple2 -> {
                            var auditableEntity = (AuditableEntity) tuple2.getT2();
                            String principal = tuple2.getT1().getAuthentication().getName();
                            Instant now = Instant.now();
                            auditableEntity.setLastModifiedBy(principal);
                            auditableEntity.setCreatedBy(principal);
                            auditableEntity.setLastModifiedDate(now);
                            auditableEntity.setCreatedDate(now);
                            return auditableEntity;
                        }));
    }
}
Deprecated: see the updated solution in the original post.
Before the official reactive AuditorAware is provided, there is an alternative: implement this via the Spring Data Mongo specific ReactiveBeforeConvertCallback.
Do not use @EnableMongoAuditing.
Implement your own ReactiveBeforeConvertCallback; here I use a PersistentEntity interface for those entities that need to be audited:
public class PersistentEntityCallback implements ReactiveBeforeConvertCallback<PersistentEntity> {

    @Override
    public Publisher<PersistentEntity> onBeforeConvert(PersistentEntity entity, String collection) {
        var user = ReactiveSecurityContextHolder.getContext()
                .map(SecurityContext::getAuthentication)
                .filter(it -> it != null && it.isAuthenticated())
                .map(Authentication::getPrincipal)
                .cast(UserDetails.class)
                .map(userDetails -> new Username(userDetails.getUsername()))
                .switchIfEmpty(Mono.empty());

        var currentTime = LocalDateTime.now();
        if (entity.getId() == null) {
            entity.setCreatedDate(currentTime);
        }
        entity.setLastModifiedDate(currentTime);

        return user
                .map(u -> {
                    if (entity.getId() == null) {
                        entity.setCreatedBy(u);
                    }
                    entity.setLastModifiedBy(u);
                    return entity;
                })
                .defaultIfEmpty(entity);
    }
}
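The PersistentEntity contract used above is not shown in the snippet; a minimal sketch consistent with the calls in the callback could be:

// Hypothetical audit contract inferred from the callback above;
// not part of the original post.
public interface PersistentEntity {
    String getId();
    void setCreatedDate(LocalDateTime createdDate);
    void setLastModifiedDate(LocalDateTime lastModifiedDate);
    void setCreatedBy(Username createdBy);
    void setLastModifiedBy(Username lastModifiedBy);
}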
Check the complete code here.
To have the createdBy attribute filled, you need to link your auditorAware bean with the @EnableMongoAuditing annotation.
In your MongoConfig class, define your bean:
@Bean(name = "auditorAware")
public AuditorAware<String> auditor() {
    ....
}
and use it in the annotation :
@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorAware")
class MongoConfig {
    ....
}

Getting a Page of DTO objects from Spring Data repository

I'm trying to use DTOs in a Spring project to decouple business and presentation, but I'm having problems retrieving data from the Spring Data repository. Here is some sample code:
public Page<UserDto> findAll(int pageIndex) {
    return userRepository.findAll(createPageable(pageIndex)); // Page<User>
}
As you can see, I'm trying to return a page of UserDto but I'm getting a page of User.
How can I do this?
You can do it like this:
public Page<UserDto> findAll(Pageable p) {
    Page<User> page = userRepository.findAll(p); // Page<User>
    return new PageImpl<UserDto>(UserConverter.convert(page.getContent()), p, page.getTotalElements());
}
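UserConverter is assumed by these snippets but not shown; a sketch with the single-object and list overloads used here might look like this (the DTO fields are illustrative):

// Hypothetical converter; the DTO constructor arguments are placeholders.
public final class UserConverter {

    public static UserDto convert(User user) {
        return new UserDto(user.getId(), user.getName());
    }

    public static List<UserDto> convert(List<User> users) {
        return users.stream()
                .map(UserConverter::convert)
                .collect(Collectors.toList());
    }
}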
Since Spring-Data 1.10 you can use the Page#map function:
public Page<UserDto> findAll(Pageable p) {
    return userRepository.findAll(p).map(UserConverter::convert);
}
See Spring Data. Here the convert method takes a Person and converts it to a PersonDto:
public PersonDto convert(Person person) {
    return new PersonDto(person.getXyz1(), person.getXyz2());
}
public Page<PersonDto> findAllPageableOrderBylastName(Pageable pageable) {
    Page<Person> personPage = personRepository.findAllByLastNameIsNotNullOrderByLastName(pageable);
    int totalElements = (int) personPage.getTotalElements();
    return new PageImpl<PersonDto>(personPage.getContent() // getContent(), not getContents()
            .stream()
            .map(person -> new PersonDto(
                    person.getId(),
                    person.getFirstName(),
                    person.getLastName(),
                    person.getNumberMobil(),
                    person.getPresentPosition()))
            .collect(Collectors.toList()), pageable, totalElements);
}
I just started programming, and this is how I solved it. It works. :)
