I am writing a reactive Spring Boot application with MySQL and R2DBC. When I call the query method findByUsername, printing the result just shows a string like "monoNext" instead of the object.
Code:
@Override
public Mono<UserDetails> findByUsername(String username) {
    log.info("get user");
    System.out.println(userRespository.findByUsername(username)); // prints "monoNext" in the console
    Mono<UserDetails> ans = userRespository.findByUsername(username)
            .switchIfEmpty(Mono.error(new RuntimeException()))
            .map(SecurityUser::new);
    return ans;
}
My repository:
@Repository
public interface UserRespository extends R2dbcRepository<User, Integer> {

    @Query("SELECT * FROM user_info WHERE username = :username ;")
    Mono<User> findByUsername(String username);
}
Does anyone have an idea about this?
This happens because you are actually printing Mono.toString(): the repository returns a Mono, not the value itself.
To print the User when it is found, add a callback to your reactive chain, for example with the doOnNext() operator:
return userRespository.findByUsername(username)
        // this callback means "when a user is found, print whatever you want here"
        .doOnNext(user -> System.out.println(user))
        .switchIfEmpty(Mono.error(new RuntimeException()))
        .map(SecurityUser::new);
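If you want to see the emitted value outside a running application, a small test with reactor-test's StepVerifier also works. This is only a minimal sketch: it assumes reactor-test is on the test classpath and that a user_info row for "alice" exists; the test class name and username are illustrative, not from the question.

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import reactor.test.StepVerifier;

@SpringBootTest
class UserRespositoryTest {

    @Autowired
    UserRespository userRespository;

    @Test
    void findByUsernameEmitsTheUser() {
        StepVerifier.create(userRespository.findByUsername("alice"))
                .expectNextCount(1)   // the Mono emits a User once it is subscribed to
                .verifyComplete();
    }
}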
Related
Hello, I am new to WebFlux. I am following a tutorial for building reactive microservices, and in my project I ran into the following problem.
I want to create a CRUD API for the product service, and the following is the create method:
@Override
public Product createProduct(Product product) {
    Optional<ProductEntity> productEntity = Optional.ofNullable(repository.findByProductId(product.getProductId()).block());
    productEntity.ifPresent(prod -> {
        throw new InvalidInputException("Duplicate key, Product Id: " + product.getProductId());
    });
    ProductEntity entity = mapper.apiToEntity(product);
    Mono<Product> newProduct = repository.save(entity)
            .log()
            .map(mapper::entityToApi);
    return newProduct.block();
}
The problem is that when I call this method from Postman I get the error
"block()/blockFirst()/blockLast() are blocking, which is not supported in thread reactor-http-nio-3", but when I use a StreamListener the call works fine. The StreamListener gets events from a RabbitMQ channel.
StreamListener
@EnableBinding(Sink.class)
public class MessageProcessor {

    private final ProductService productService;

    public MessageProcessor(ProductService productService) {
        this.productService = productService;
    }

    @StreamListener(target = Sink.INPUT)
    public void process(Event<Integer, Product> event) {
        switch (event.getEventType()) {
            case CREATE:
                Product product = event.getData();
                LOG.info("Create product with ID: {}", product.getProductId());
                productService.createProduct(product);
                break;
            default:
                String errorMessage = "Incorrect event type: " + event.getEventType() + ", expected a CREATE or DELETE event";
                LOG.warn(errorMessage);
                throw new EventProcessingException(errorMessage);
        }
    }
}
I have two questions.
Why does this work with the StreamListener but not with a simple HTTP request?
Is there a proper way in WebFlux to return the object inside the Mono, or do we always have to return a Mono?
Your create method should look more like this, and you should return a Mono<Product> from your controller rather than the object alone.
public Mono<Product> createProduct(Product product) {
    return repository.findByProductId(product.getProductId())
            .switchIfEmpty(Mono.just(mapper.apiToEntity(product)))
            .flatMap(repository::save)
            .map(mapper::entityToApi);
}
As @Thomas commented, you are breaking some of the fundamentals of reactive coding by using block(), and you lose its benefits; you should read up on it more. For example, the reactive Mongo repository you are using already returns a Mono, which has its own operators for handling the empty case without needing an Optional, as shown above. To your first question: the StreamListener call works because it runs on a plain listener thread, whereas the WebFlux request runs on a Reactor Netty event-loop thread such as reactor-http-nio-3, which is marked non-blocking, so calling block() there throws.
EDIT: map to an error if the entity already exists, otherwise save:
public Mono<Product> createProduct(Product product) {
    return repository.findByProductId(product.getProductId())
            .hasElement()
            .filter(exists -> exists)
            .flatMap(exists -> Mono.error(new Exception("my exception")))
            .then(Mono.just(mapper.apiToEntity(product)))
            .flatMap(repository::save)
            .map(mapper::entityToApi);
}
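Another common way to express "error if it exists, otherwise save" is the sketch below. It reuses the names from the question (repository, mapper, InvalidInputException) and is only an illustration of the pattern; it avoids the intermediate hasElement()/filter step but is otherwise equivalent:

public Mono<Product> createProduct(Product product) {
    return repository.findByProductId(product.getProductId())
            // an existing entity is turned into an error signal
            .flatMap(existing -> Mono.<ProductEntity>error(
                    new InvalidInputException("Duplicate key, Product Id: " + product.getProductId())))
            // no existing entity: build and save a new one (deferred so it only runs when needed)
            .switchIfEmpty(Mono.defer(() -> repository.save(mapper.apiToEntity(product))))
            .map(mapper::entityToApi);
}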
I have implemented auditing using Spring Data JPA auditing. My code looks like this:
@Configuration
@EnableJpaAuditing(auditorAwareRef = "auditorAware")
public class JpaConfiguration {

    @Bean
    @Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public AuditorAware<String> auditorAware() {
        final String currentUser;
        Authentication authentication = SecurityContextHolder.getContext().getAuthentication();
        if (null != authentication) {
            currentUser = authentication.getName();
        } else {
            currentUser = null;
        }
        return () -> Optional.ofNullable(currentUser);
    }
}
The issue I am facing is that if I log in with one user and perform some operation, it works fine. But when I log out and log in with another user, it still uses the previous user.
After debugging the code, I found that Spring is not creating an AuditorAware bean for each user; it behaves like a singleton bean. Even when I specify the scope as prototype, it still behaves like a singleton.
The AuditorAware is supposed to be a singleton. You should retrieve the current user each time AuditorAware.getCurrentAuditor is called, not just once.
Rewrite your code to something like this:
@Bean
public AuditorAware<String> auditorAware() {
    return () -> getCurrentAuthentication().map(Authentication::getName);
}

private Optional<Authentication> getCurrentAuthentication() {
    return Optional.ofNullable(SecurityContextHolder.getContext().getAuthentication());
}
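For completeness, the audited entity itself needs the auditing annotations and the JPA auditing listener, roughly like the sketch below. The entity and field names are illustrative, not from the question:

// Auditing annotations come from org.springframework.data.annotation,
// the listener from org.springframework.data.jpa.domain.support.
@Entity
@EntityListeners(AuditingEntityListener.class)
public class OrderRecord {

    @Id
    @GeneratedValue
    private Long id;

    @CreatedBy
    private String createdBy;          // filled from AuditorAware.getCurrentAuditor()

    @LastModifiedBy
    private String lastModifiedBy;

    @CreatedDate
    private Instant createdDate;

    @LastModifiedDate
    private Instant lastModifiedDate;
}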
I am trying to create a Spring Boot application where I need to fetch records from the database and make a call to a REST API for each record fetched. Instead of fetching all the records at once, I want to retrieve them in batches of, say, 10, make the REST calls for them, then fetch the next 10 and do the same, until the last record. I am using spring-data-jpa. How can I achieve that?
P.S.: It's a multi-threaded call and the DB is Amazon DynamoDB.
My code so far:
Controller:
// Multi-threaded call to REST
@GetMapping("/myApp-multithread")
public String getQuoteOnSepThread() throws InterruptedException, ExecutionException {
    System.out.println("#################################################Multi Threaded Post Call######################");
    ExecutorService executor = Executors.newFixedThreadPool(10);
    List<Future<String>> myFutureList = new ArrayList<Future<String>>();
    long startTime = System.currentTimeMillis() / 1000;

    // Here, instead of fetching and calling MyCallable for each record,
    // I want to do it in batches of 10
    Iterable<Customer> customerIterable = repo.findAll();
    List<Customer> customers = new ArrayList<Customer>();
    customerIterable.forEach(customers::add);

    for (Customer c : customers) {
        MyCallable myCallable = new MyCallable(restTemplate, c);
        Future<String> future = executor.submit(myCallable);
        myFutureList.add(future);
    }
    for (Future<String> fut : myFutureList) {
        fut.get();
    }
    executor.shutdown();

    long timeElapsed = (System.currentTimeMillis() / 1000) - startTime;
    System.out.println("->>>>>>>>>>>>>>>Time Elapsed In Multi Threaded Post Call<<<<<<<<<<<<<<<-" + timeElapsed);
    return "Success";
}
My Callable:
@Scope("prototype")
@Component
public class MyCallable implements Callable<String> {

    private RestTemplate restTemplate;
    private Customer c;

    public MyCallable(RestTemplate rt, Customer cust) {
        this.restTemplate = rt;
        this.c = cust;
    }

    @Override
    public String call() throws Exception {
        System.out.println("Customer no" + c.getId() + "On thread Number" + Thread.currentThread().getId());
        restTemplate.postForObject("http://localhost:3000/save", c, String.class);
        return "Done";
    }
}
Spring Data JPA offers Page and Slice to handle this case (see PagingAndSortingRepository and Pageable)
public interface PagingAndSortingRepository<T, ID extends Serializable>
        extends CrudRepository<T, ID> {

    Iterable<T> findAll(Sort sort);

    Page<T> findAll(Pageable pageable);
}
You can create a Pageable request as:
Pageable firstPageWithTwoElements = PageRequest.of(0, 2);
and pass it to your custom repository (which should extend PagingAndSortingRepository):
Page<T> pageResult = customRepository.findAll(firstPageWithTwoElements);
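To cover the original question of processing the table in batches of 10, you can loop over the pages until the last one. The sketch below reuses repo, executor, restTemplate, and MyCallable from the question and assumes the repository extends PagingAndSortingRepository; it is an illustration, not a drop-in implementation:

Pageable pageable = PageRequest.of(0, 10);          // batch size of 10
Page<Customer> page;
do {
    page = repo.findAll(pageable);                  // fetch one batch
    for (Customer c : page.getContent()) {
        myFutureList.add(executor.submit(new MyCallable(restTemplate, c)));
    }
    // wait on the futures here if each batch must finish before fetching the next
    pageable = page.nextPageable();
} while (page.hasNext());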
I am new to Spring AOP. I need to execute methods only if the user is authorized.
Here's my code.
@Before("some pointcut")
public HttpStatus checkUserAuthentication(String userName) {
    if (userAuthorized(userName)) {
        // go on executing method
    } else {
        return HttpStatus.FORBIDDEN;
    }
}
Is there any alternative to ProceedingJoinPoint.proceed when using JoinPoint, or can I use ProceedingJoinPoint with @Before advice? How do I proceed with executing the intercepted method if the user is authorized?
I solved this using @Around advice and changing the return type to Object, so that the result of ProceedingJoinPoint.proceed() can be returned on successful verification.
@Around("some pointcut")
public Object checkUserAuthentication(ProceedingJoinPoint pjp, String userName) throws Throwable {
    if (userAuthorized(userName)) {
        return pjp.proceed();
    } else {
        return new ResponseEntity<>(HttpStatus.FORBIDDEN);
    }
}
With @Around advice, control can be passed to the intercepted method after verification.
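To make this concrete, the "some pointcut" placeholder has to bind userName from the intercepted method's arguments. The following is only a hedged sketch: the com.example.service package and the assumption that the user name is the first parameter are illustrative.

@Aspect
@Component
public class AuthorizationAspect {

    // Assumes the intercepted methods live under com.example.service
    // and take the user name as their first parameter.
    @Around("execution(* com.example.service..*.*(..)) && args(userName,..)")
    public Object checkUserAuthentication(ProceedingJoinPoint pjp, String userName) throws Throwable {
        if (userAuthorized(userName)) {
            return pjp.proceed();                              // run the intercepted method
        }
        return new ResponseEntity<>(HttpStatus.FORBIDDEN);     // short-circuit when not authorized
    }

    private boolean userAuthorized(String userName) {
        // placeholder for the real authorization check
        return false;
    }
}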
Spring Security 5 provides a ReactiveSecurityContextHolder to fetch the SecurityContext from a reactive context, but when I try to implement AuditorAware to get auditing working automatically, it does not work. Currently I cannot find a reactive variant of AuditorAware.
@Bean
public AuditorAware<Username> auditor() {
    return () -> ReactiveSecurityContextHolder.getContext()
            .map(SecurityContext::getAuthentication)
            .log()
            .filter(a -> a != null && a.isAuthenticated())
            .map(Authentication::getPrincipal)
            .cast(UserDetails.class)
            .map(auth -> new Username(auth.getName()))
            .switchIfEmpty(Mono.empty())
            .blockOptional();
}
I have added @EnableMongoAuditing on my Boot application class.
On the Mongo document class, I added the auditing-related annotations:
@CreatedDate
private LocalDateTime createdDate;

@CreatedBy
private Username author;
When I added a post, the createdDate was filled, but the author was null:
{"id":"5a49ccdb9222971f40a4ada1","title":"my first post","content":"content of my first post","createdDate":"2018-01-01T13:53:31.234","author":null}
The complete code is here, based on Spring Boot 2.0.0.M7.
Update: Spring Boot 2.4.0-M2 / Spring Data Commons 2.4.0-M2 / Spring Data Mongo 3.1.0-M2 includes a ReactiveAuditorAware. Check this new sample; note that you must use @EnableReactiveMongoAuditing to activate it.
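With those versions, the bean from the question can be rewritten without blockOptional(). A minimal sketch, reusing the Username type from the question:

@Configuration
@EnableReactiveMongoAuditing
public class AuditConfig {

    @Bean
    public ReactiveAuditorAware<Username> reactiveAuditorAware() {
        // Returns a Mono instead of an Optional, so no blocking is needed.
        return () -> ReactiveSecurityContextHolder.getContext()
                .map(SecurityContext::getAuthentication)
                .filter(Authentication::isAuthenticated)
                .map(Authentication::getPrincipal)
                .cast(UserDetails.class)
                .map(userDetails -> new Username(userDetails.getUsername()));
    }
}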
I am posting another solution which takes the input id into account to support update operations:
@Component
@RequiredArgsConstructor
public class AuditCallback implements ReactiveBeforeConvertCallback<AuditableEntity> {

    private final ReactiveMongoTemplate mongoTemplate;

    private Mono<?> exists(Object id, Class<?> entityClass) {
        if (id == null) {
            return Mono.empty();
        }
        return mongoTemplate.findById(id, entityClass);
    }

    @Override
    public Publisher<AuditableEntity> onBeforeConvert(AuditableEntity entity, String collection) {
        var securityContext = ReactiveSecurityContextHolder.getContext();
        return securityContext
                .zipWith(exists(entity.getId(), entity.getClass()))
                .map(tuple2 -> {
                    var auditableEntity = (AuditableEntity) tuple2.getT2();
                    auditableEntity.setLastModifiedBy(tuple2.getT1().getAuthentication().getName());
                    auditableEntity.setLastModifiedDate(Instant.now());
                    return auditableEntity;
                })
                .switchIfEmpty(Mono.zip(securityContext, Mono.just(entity))
                        .map(tuple2 -> {
                            var auditableEntity = (AuditableEntity) tuple2.getT2();
                            String principal = tuple2.getT1().getAuthentication().getName();
                            Instant now = Instant.now();
                            auditableEntity.setLastModifiedBy(principal);
                            auditableEntity.setCreatedBy(principal);
                            auditableEntity.setLastModifiedDate(now);
                            auditableEntity.setCreatedDate(now);
                            return auditableEntity;
                        }));
    }
}
Deprecated: see the updated solution in the original post.
Before an official reactive AuditorAware was provided, an alternative was to implement this via the Spring Data Mongo-specific ReactiveBeforeConvertCallback.
Do not use @EnableMongoAuditing.
Implement your own ReactiveBeforeConvertCallback; here I use a PersistentEntity interface for those entities that need to be audited.
public class PersistentEntityCallback implements ReactiveBeforeConvertCallback<PersistentEntity> {

    @Override
    public Publisher<PersistentEntity> onBeforeConvert(PersistentEntity entity, String collection) {
        var user = ReactiveSecurityContextHolder.getContext()
                .map(SecurityContext::getAuthentication)
                .filter(it -> it != null && it.isAuthenticated())
                .map(Authentication::getPrincipal)
                .cast(UserDetails.class)
                .map(userDetails -> new Username(userDetails.getUsername()))
                .switchIfEmpty(Mono.empty());

        var currentTime = LocalDateTime.now();
        if (entity.getId() == null) {
            entity.setCreatedDate(currentTime);
        }
        entity.setLastModifiedDate(currentTime);

        return user
                .map(u -> {
                    if (entity.getId() == null) {
                        entity.setCreatedBy(u);
                    }
                    entity.setLastModifiedBy(u);
                    return entity;
                })
                .defaultIfEmpty(entity);
    }
}
Check the complete code here.
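Since PersistentEntityCallback is not annotated as a component in this snippet, it also has to be registered as a bean so that Spring Data MongoDB picks the callback up. A minimal sketch (the configuration class name is illustrative):

@Configuration
public class AuditCallbackConfig {

    // Spring Data MongoDB discovers ReactiveBeforeConvertCallback beans automatically.
    @Bean
    public PersistentEntityCallback persistentEntityCallback() {
        return new PersistentEntityCallback();
    }
}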
To have the createdBy attribute filled, you need to link your auditorAware bean to the @EnableMongoAuditing annotation.
In your MongoConfig class, define your bean:
@Bean(name = "auditorAware")
public AuditorAware<String> auditor() {
    ....
}
and use it in the annotation:
@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorAware")
class MongoConfig {
    ....
}
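For reference, one possible shape for that auditor body in a servlet (non-reactive) setup, resolving the user on every call as recommended earlier in this thread; this is only an illustrative sketch, and in a fully reactive application the ReactiveAuditorAware / @EnableReactiveMongoAuditing approach from the update above is preferable:

@Configuration
@EnableMongoAuditing(auditorAwareRef = "auditorAware")
class MongoConfig {

    // Illustrative sketch only: looks up the current user each time it is called.
    @Bean(name = "auditorAware")
    public AuditorAware<String> auditor() {
        return () -> Optional.ofNullable(SecurityContextHolder.getContext().getAuthentication())
                .filter(Authentication::isAuthenticated)
                .map(Authentication::getName);
    }
}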