Updating Mono object by another Mono object - spring

Dears,
I'm stuck implementing a function (basically an update operation) that takes a Mono as a parameter and returns an updated Mono, where:
the returned instance derives from a DB query;
the updated instance contains fields picked from the incoming Mono.
This is the sample code, which works when the object is provided directly rather than wrapped in a Mono:
public Mono<CompanyDto> updateById(String id, CompanyDto companyDto) {
    return getCompanyById(id)
            .map(companyEntity -> {
                companyEntity.setDescription(companyDto.getDescription());
                companyEntity.setName(companyDto.getName());
                return companyEntity;
            })
            .flatMap(companyEntity -> reactiveNeo4JTemplate.save(companyEntity))
            .map(companyEntity -> companyMapper.toDto(companyEntity));
}
The question is: how should I change the code if the function signature were
public Mono<CompanyDto> updateById(String id, Mono<CompanyDto> companyDtoMono)
PS: getCompanyById(id) returns a Mono<CompanyEntity>.
Thanks,
best
FB

There are many solutions to this problem; one of them is using Mono.zip:
public Mono<CompanyDto> updateById(String id, Mono<CompanyDto> companyDtoMono) {
    return Mono.zip(getCompanyById(id), companyDtoMono, (companyEntity, companyDto) -> {
                companyEntity.setDescription(companyDto.getDescription());
                companyEntity.setName(companyDto.getName());
                return companyEntity;
            })
            .flatMap(companyEntity -> reactiveNeo4JTemplate.save(companyEntity))
            .map(companyEntity -> companyMapper.toDto(companyEntity));
}
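Another option, if you'd rather not zip, is to start from the incoming Mono and reach the entity via flatMap. A minimal sketch under the same assumptions as the code above (getCompanyById, reactiveNeo4JTemplate, companyMapper):
public Mono<CompanyDto> updateById(String id, Mono<CompanyDto> companyDtoMono) {
    // For each emitted DTO: look up the entity, copy the fields, save, map back to a DTO.
    return companyDtoMono.flatMap(companyDto ->
            getCompanyById(id)
                    .map(companyEntity -> {
                        companyEntity.setDescription(companyDto.getDescription());
                        companyEntity.setName(companyDto.getName());
                        return companyEntity;
                    })
                    .flatMap(reactiveNeo4JTemplate::save)
                    .map(companyMapper::toDto));
}
Note that Mono.zip completes empty if either source is empty, while this variant makes the dependency on the incoming DTO explicit.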

Related

Caching parallel request in Spring Webflux Mono

We are using Spring WebFlux (Project Reactor). As part of the requirements we need to call one API from our server.
For the API call we need to cache the response, so we are using the Mono.cache operator.
It caches the response Mono<ResponseDto>, and the next time the same API call happens it is served from the cache. The following is an example implementation:
private Mono<ResponseDto> res;

public Mono<ResponseDto> getResponse() {
    if (res == null) {
        res = fetchResponse()
                .onErrorMap(Exception.class, (error) -> new CustomException())
                .cache(
                        r -> Duration.ofSeconds(r.expiresIn()), // TTL taken from the response itself
                        error -> Duration.ZERO,                 // do not cache errors
                        () -> Duration.ZERO);                   // do not cache empty completions
    }
    return res;
}
The problem is that if the server makes the same API call twice at the same time (for example via Mono.zip), the response is not yet cached and we actually call the API twice.
Is there any out-of-the-box solution to this problem? Instead of caching the response, can we cache the Mono itself, so that both requests subscribe to the same Mono and a single API call serves both?
It should also work with sequential execution. I am afraid that if we cache the Mono, then once the request is completed the subscription is over and no other process can subscribe to it.
Project Reactor provides a cache utility, CacheMono (in reactor-extra), that is non-blocking but can stampede: parallel misses each trigger the source.
Caffeine's AsyncCache is a better integration: the first lookup with key "K" results in a cache miss and returns a CompletableFuture of the API call, and a second lookup with the same key "K" gets the same CompletableFuture object.
The returned future can be converted to and from a Mono with Mono.fromFuture() and Mono#toFuture().
public Mono<ResponseData> lookupAndWrite(AsyncCache<String, ResponseData> cache, String key) {
    return Mono.defer(() ->
            Mono.fromFuture(
                    cache.get(key, (searchKey, executor) -> {
                        CompletableFuture<ResponseData> future = callAPI(searchKey).toFuture();
                        // Evict failed entries so the next lookup retries the API call.
                        return future.whenComplete((r, t) -> {
                            if (t != null) {
                                cache.synchronous().invalidate(key);
                            }
                        });
                    })));
}
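For completeness, a minimal sketch of how such an AsyncCache could be built with Caffeine (the TTL and size values here are assumptions, not from the original post):
AsyncCache<String, ResponseData> cache = Caffeine.newBuilder()
        .expireAfterWrite(Duration.ofSeconds(60)) // hypothetical TTL; align it with the API's expiresIn
        .maximumSize(1_000)
        .buildAsync();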
You can initialize the Mono in the constructor (assuming it doesn't depend on any request-time parameter). Using the cache operator will prevent multiple subscriptions to the source.
class MyService {

    private final Mono<ResponseDto> response;

    public MyService() {
        response = fetchResponse()
                .onErrorMap(Exception.class, (error) -> new CustomException())
                .cache(
                        r -> Duration.ofSeconds(r.expiresIn()),
                        error -> Duration.ZERO,
                        () -> Duration.ZERO);
    }

    public Mono<ResponseDto> getResponse() {
        return response;
    }
}
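With the Mono created once and cached, concurrent subscribers share the same in-flight request. A quick sketch of the asker's Mono.zip scenario (the myService variable and the println are hypothetical usage, not from the original post):
// fetchResponse() runs at most once; both zip branches subscribe to the same cached Mono.
Mono.zip(myService.getResponse(), myService.getResponse())
        .subscribe(tuple -> System.out.println(tuple.getT1() + " / " + tuple.getT2()));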
If there is a dependency on request-time parameters, you should consider a custom caching solution.
You could use CacheMono from io.projectreactor.addons:reactor-extra to wrap a non-reactive cache implementation like Guava Cache or a simple ConcurrentHashMap. It doesn't provide an "exactly-once" guarantee and parallel requests can result in cache misses, but in many scenarios that should not be an issue.
Here is an example with Guava Cache:
public class GlobalSettingsCache {

    private final GlobalSettingsClient globalSettingsClient;
    private final Cache<String, GlobalSettings> cache;

    public GlobalSettingsCache(GlobalSettingsClient globalSettingsClient, Duration cacheTtl) {
        this.globalSettingsClient = globalSettingsClient;
        this.cache = CacheBuilder.newBuilder()
                .expireAfterWrite(cacheTtl)
                .build();
    }

    public Mono<GlobalSettings> get(String tenant) {
        return CacheMono.lookup(key -> Mono.justOrEmpty(cache.getIfPresent(key)).map(Signal::next), tenant)
                .onCacheMissResume(() -> fetchGlobalSettings(tenant))
                .andWriteWith((key, signal) -> Mono.fromRunnable(() ->
                        Optional.ofNullable(signal.get())
                                .ifPresent(value -> cache.put(key, value))));
    }

    private Mono<GlobalSettings> fetchGlobalSettings(String tenant) {
        return globalSettingsClient.getGlobalSettings(tenant);
    }
}

Spring webflux with multiple sequential API call and convert to flux object without subscribe and block

I am working on Spring reactive and need to make multiple sequential calls to another REST API using WebClient. I am able to make the calls, but I am not able to read the responses without subscribe or block, which I can't use as that would break the reactive flow. Is there a way to merge the responses while reading them and return the result as a Flux?
Below is the piece of code where I am stuck:
private Flux<SeasonsDto> getSeasonsInfo(List<HuntsSeasonsMapping> l2, String seasonsUrl) {
    for (HuntsSeasonsMapping s : l2) {
        List<SeasonsJsonDto> list = huntsSeasonsProcessor.appendSeaosonToJson(s.getSeasonsRef());
        for (SeasonsJsonDto sjdto : list) {
            Mono<SeasonsDto> mono = new SeasonsAdapter("http://localhost:8087/").callToSeasonsAPI(sjdto.getSeasonsRef());
            // Not able to read the stream without subscribe and return it as a Flux
        }
    }
}

public Mono<SeasonsDto> callToSeasonsAPI(Long long1) {
    LOGGER.debug("Seasons API call");
    return this.webClient.get()
            .uri("hunts/seasonsInfo/" + long1)
            .header("X-GoHunt-LoggedIn-User", "a4d4b427-c716-458b-9bb5-9917b6aa30ff")
            .retrieve()
            .bodyToMono(SeasonsDto.class);
}
Please help to resolve this.
You need to combine the reactive streams using operators such as map, flatMap and concatMap.
private Flux<SeasonsDto> getSeasonsInfo(List<HuntsSeasonsMapping> l2, String seasonsUrl) {
    List<Mono<SeasonsDto>> monos = new ArrayList<>();
    for (HuntsSeasonsMapping s : l2) {
        List<SeasonsJsonDto> list = huntsSeasonsProcessor.appendSeaosonToJson(s.getSeasonsRef());
        for (SeasonsJsonDto sjdto : list) {
            // Collect the Monos instead of subscribing to them.
            monos.add(new SeasonsAdapter("http://localhost:8087/").callToSeasonsAPI(sjdto.getSeasonsRef()));
        }
    }
    // Subscribe to each Mono in order and emit the results as one Flux.
    return Flux.fromIterable(monos).concatMap(mono -> mono);
}
This can further be improved using the Stream API, which I suggest you look into, but I didn't want to change too much of your existing code.
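For illustration, a hedged sketch of what that Stream-API version might look like (same huntsSeasonsProcessor and SeasonsAdapter assumptions as above):
private Flux<SeasonsDto> getSeasonsInfo(List<HuntsSeasonsMapping> l2, String seasonsUrl) {
    SeasonsAdapter adapter = new SeasonsAdapter("http://localhost:8087/");
    // Build the list of Monos with streams instead of nested for loops.
    List<Mono<SeasonsDto>> monos = l2.stream()
            .flatMap(s -> huntsSeasonsProcessor.appendSeaosonToJson(s.getSeasonsRef()).stream())
            .map(sjdto -> adapter.callToSeasonsAPI(sjdto.getSeasonsRef()))
            .collect(Collectors.toList());
    return Flux.fromIterable(monos).concatMap(mono -> mono);
}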
I have figured out how to do this. I completely rewrote the code in a reactive style, which means all the for loops are gone. Below is the resulting code; it may help others.
public Flux<SeasonsDto> getAllSeasonDetails(String uuid) {
    return hunterRepository.findByUuidAndIsPrimaryAndDeleted(uuid, true, false)
            .next()
            .flatMapMany(h1 -> huntsMappingRepository.findByHunterIdAndDeleted(h1.getId(), false)
                    .flatMap(k -> huntsMappingRepository.findByHuntReferrenceIdAndDeleted(k.getHuntReferrenceId(), false)
                            .flatMap(l2 -> huntsSeasonsProcessor.appendSeaosonToJsonFlux(l2.getSeasonsDtl())
                                    .flatMap(fs -> seasonsAdapter.callSeasonsAPI(fs.getSeasonsRef(), h1.getId(), uuid)))));
}

Unable to return Mono<Compliance>

I'm trying to retrieve a Mono<PortCall> from the DB, filter the Compliance list inside the PortCall object based on one condition, and finally return a Compliance or a Mono<Compliance>.
Below is my MongoDB query:
@Query("{vesselCode : ?0, arrivalVoyageCode: ?1}")
Mono<PortCall> findDeadlineTimestamp(String vesselCode, String arrivalVoyageCode);
Below is the usage in the ServiceImpl to retrieve the Mono:
Mono<Compliance> cmp = portCallRepository.findDeadlineTimestamp(arrivalVoyageCode, vesselCode)
        .doOnNext(p -> p.getCompliance().stream()
                .filter(c -> c.getId().equalsIgnoreCase(compId)))
        .subscribe();
You should use Reactor's operators instead of Java 8 streams.
Since getCompliance() returns a list, flatten it with flatMapIterable (rather than map) and then apply the filter; next() then gives you the first matching element as a Mono<Compliance>:
Mono<Compliance> getCompliance() {
    return portCallRepository.findDeadlineTimestamp(arrivalVoyageCode, vesselCode)
            .flatMapIterable(PortCall::getCompliance) // Mono<PortCall> -> Flux<Compliance>
            .filter(c -> c.getId().equalsIgnoreCase(compId))
            .next();                                  // first match, as Mono<Compliance>
}
Then the caller subscribes:
getCompliance().subscribe();

How to get a Flux from a Mono flatmap?

I have the following code:
public Flux<Foo> getFoos(String xyz) {
    return getBar(xyz).flatMap(b -> Flux.empty());
}
But it results in a compilation error because getBar() returns a Mono<Bar> instead of a Flux<Bar>. How can I return a Flux from a flatMap() of a Mono value? Thanks.
Found the solution: I just had to use flatMapMany() instead of flatMap().
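A minimal sketch of the fix (barToFoos is a hypothetical helper that expands one Bar into several Foos):
public Flux<Foo> getFoos(String xyz) {
    // flatMapMany lets a Mono fan out into a Flux.
    return getBar(xyz).flatMapMany(b -> barToFoos(b));
}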

Spring Web-Flux: How to return a Flux to a web client on request?

We are working with Spring Boot 2.0.0.BUILD-SNAPSHOT and Spring WebFlux 5.0.0, and currently we can't transfer a Flux to a client on request.
Currently I am creating the Flux from an iterator:
public Flux<ItemIgnite> getAllFlux() {
    Iterator<Cache.Entry<String, ItemIgnite>> iterator = this.getAllIterator();
    return Flux.create(flux -> {
        while (iterator.hasNext()) {
            flux.next(iterator.next().getValue());
        }
    });
}
And on request I am simply doing:
@RequestMapping(value = "/all", method = RequestMethod.GET, produces = "application/json")
public Flux<ItemIgnite> getAllFlux() {
    return this.provider.getAllFlux();
}
When I now locally call localhost:8080/all, I get a 503 status code after 10 seconds. The same happens when I request /all as a client using the WebClient:
public Flux<ItemIgnite> getAllProducts() {
    WebClient webClient = WebClient.create("http://localhost:8080");
    Flux<ItemIgnite> f = webClient.get()
            .uri("/all")
            .accept(MediaType.ALL)
            .exchange()
            .flatMapMany(cr -> cr.bodyToFlux(ItemIgnite.class));
    f.subscribe(System.out::println);
    return f;
}
Nothing happens. No data is transferred.
When I do the following instead:
public Flux<List<ItemIgnite>> getAllFluxMono() {
    return Flux.just(this.getAllList());
}
and
@RequestMapping(value = "/allMono", method = RequestMethod.GET, produces = "application/json")
public Flux<List<ItemIgnite>> getAllFluxMono() {
    return this.provider.getAllFluxMono();
}
It works. I guess that's because all the data has already finished loading and is transferred to the client in one piece, as it would be without using a Flux.
What do I have to change to get the Flux streaming the data to the web client that requests it?
EDIT
I have data inside an Ignite cache, so my getAllIterator loads the data from the Ignite cache:
public Iterator<Cache.Entry<String, ItemIgnite>> getAllIterator() {
    return this.igniteCache.iterator();
}
EDIT 2
Adding flux.complete() as @Simon Baslé suggested:
public Flux<ItemIgnite> getAllFlux() {
    Iterator<Cache.Entry<String, ItemIgnite>> iterator = this.getAllIterator();
    return Flux.create(flux -> {
        while (iterator.hasNext()) {
            flux.next(iterator.next().getValue());
        }
        flux.complete(); // see here
    });
}
This solves the 503 problem in the browser, but it does not solve the problem with the WebClient: there is still no data transferred.
EDIT 3
Using publishOn with Schedulers.parallel():
public Flux<ItemIgnite> getAllFlux() {
    Iterator<Cache.Entry<String, ItemIgnite>> iterator = this.getAllIterator();
    return Flux.<ItemIgnite>create(flux -> {
        while (iterator.hasNext()) {
            flux.next(iterator.next().getValue());
        }
        flux.complete();
    }).publishOn(Schedulers.parallel());
}
This does not change the result.
Here is what the WebClient receives:
value :[Item ID: null, Product Name: null, Product Group: null]
complete
So it seems the client receives a single item (out of over 35,000) whose fields are all null, and then the stream completes.
One thing that jumps out is that you never call flux.complete() in your create.
But there's actually a factory operator tailored to turning an Iterable into a Flux, so you could simply do Flux.fromIterable(this).
Edit: in case your Iterator is hiding complexity like a DB request (or any blocking I/O), be advised this spells trouble: anything blocking in a reactive chain, if not isolated on a dedicated execution context using publishOn, has the potential to block not only the entire chain but other reactive processes as well (as threads can and will be shared by multiple reactive processes).
Neither create nor fromIterable does anything in particular to protect against blocking sources. I think you are facing that kind of issue, judging by the hang you get with the WebClient.
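A minimal sketch of that advice, assuming igniteCache is an Iterable<Cache.Entry<String, ItemIgnite>> (it uses subscribeOn rather than the publishOn mentioned above, which is the usual way to isolate a blocking source on recent Reactor versions):
public Flux<ItemIgnite> getAllFlux() {
    return Flux.fromIterable(this.igniteCache)         // lazily iterates the cache entries
            .map(entry -> entry.getValue())            // unwrap the cached value
            .subscribeOn(Schedulers.boundedElastic()); // isolate the blocking iteration
}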
The problem was the ItemIgnite object I was transferring; Flux does not seem to be able to handle it, because if I change my original code to the following:
public Flux<String> getAllFlux() {
    Iterator<Cache.Entry<String, ItemIgnite>> iterator = this.getAllIterator();
    return Flux.create(flux -> {
        while (iterator.hasNext()) {
            flux.next(iterator.next().getValue().toString());
        }
    });
}
then everything works fine, without publishOn and without flux.complete(). Maybe someone has an idea why this works.
