Gateway operator to sub-flow stops processing - spring

I've faced a problem with sub-flows in Spring Integration.
According to the documentation (1, 2) I can do something like this:
@Bean
fun calculateAndSafeFlow(): IntegrationFlow =
    integrationFlow("calculateAndSaveChannel") {
        handle(prepareDataResolver)
        gateway("calculateChannel")
        handle(calculationResultPersistor)
    }

@Bean
fun calculateFlow(): IntegrationFlow =
    integrationFlow("calculateChannel") {
        handle(calculationHandler)
    }
Basically, I need one flow for just the calculation and a second one for calculating and storing the results.
My problem is the line with the gateway() operator: processing just stops there. The calculateFlow never takes control and nothing happens.
The calculation handler always returns a result.
Maybe I missed something... Please help.

When you use a gateway(), it means a reply must come back from that call. So apparently your handle(calculationHandler) does not return anything. Show us what your calculationHandler is doing and fix it so that it returns something that can be treated as a reply; that reply will then be sent back to the replyChannel header initiated by that gateway().
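For illustration, a minimal sketch of a handler that produces a reply, assuming it is a plain bean (CalculationRequest and CalculationResult are hypothetical types, not from the question); returning a non-null value is what lets the gateway() call complete:

import org.springframework.integration.handler.GenericHandler;
import org.springframework.messaging.MessageHeaders;
import org.springframework.stereotype.Component;

@Component
public class CalculationHandler implements GenericHandler<CalculationRequest> {
    @Override
    public Object handle(CalculationRequest payload, MessageHeaders headers) {
        // Returning a non-null value produces the reply message, which is routed
        // back via the replyChannel header created by gateway("calculateChannel").
        // CalculationRequest and CalculationResult are hypothetical types.
        return new CalculationResult(payload);
    }
}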

Related

How do I return a different response in WebFlux based on whether the Flux has elements?

I know there is a function named "hasElements" on a Flux object, but it behaves a bit strangely!
Flux<RoomBO> rooms = serverRequest.bodyToMono(PageBO.class).flatMapMany(roomRepository::getRooms);
return rooms.hasElements().flatMap(aBool -> aBool ? ServerResponse.ok().body(rooms, RoomBO.class) : ServerResponse.badRequest().build());
return ServerResponse.ok().body(rooms, RoomBO.class);
The second return statement returns exactly what I need when the Flux is not empty, but the first return statement only returns an empty array, "[]" in JSON. I don't know why this happens! I use the same data in both tests; the only difference is that I call hasElements in the first case. But I need to return badRequest when the Flux is empty, and hasElements seems to make my Flux empty, even though I know it doesn't actually do that.
Well, in the end I decided to call switchIfEmpty(Mono.error()) to throw an error and then handle that specific error globally (sometimes it isn't suitable to use onErrorReturn or onErrorResume). I think this avoids collecting everything in memory when dealing with large data. But it's still not a great solution, because a global error handler can be hard to maintain. I'd expect someone to give a better answer.
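A rough sketch of that approach, reusing the handler from the question (RoomsNotFoundException and the global handler are hypothetical):

Flux<RoomBO> rooms = serverRequest.bodyToMono(PageBO.class)
        .flatMapMany(roomRepository::getRooms)
        // Signal an application-specific error instead of completing empty.
        .switchIfEmpty(Mono.error(new RoomsNotFoundException()));
return ServerResponse.ok().body(rooms, RoomBO.class);
// A global handler (e.g. a WebExceptionHandler or an @ExceptionHandler) would then
// translate RoomsNotFoundException into the badRequest response.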
In your example you are transforming the Flux into RoomBO, which is one of the reasons why you get an empty array.
If you need to return the processed list of rooms, then, I think, collectList should be your choice. https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Flux.html#collectList--
Flux<RoomBO> roomsFlux = serverRequest.bodyToMono(PageBO.class)
        .flatMapMany(roomRepository::getRooms);
return roomsFlux
        .collectList()
        .flatMap(rooms -> !rooms.isEmpty()
                ? ServerResponse.ok().bodyValue(rooms)
                : ServerResponse.badRequest().build());

Spring Webflux: efficiently using Flux and/or Mono stream multiple times (possible?)

I have the method below, where I am calling several ReactiveMongoRepositories in order to receive and process certain documents. Since I am kind of new to Webflux, I am learning as I go.
The code below doesn't feel very efficient to me, as I am opening multiple streams at the same time. This non-blocking style of writing code somehow makes it complicated to get a value from a stream and reuse it in the cascaded flatMaps further down the line.
In the example below I have to call the userRepository twice, since I want the user at the beginning and then later as well. Is there a possibility to do this more efficiently with WebFlux?
public Mono<Guideline> addGuideline(Guideline guideline, String keycloakUserId) {
    Mono<Guideline> guidelineMono = userRepository.findByKeycloakUserId(keycloakUserId)
            .flatMap(user -> {
                return teamRepository.findUserInTeams(user.get_id());
            }).zipWith(instructionRepository.findById(guideline.getInstructionId()))
            .zipWith(userRepository.findByKeycloakUserId(keycloakUserId))
            .flatMap(objects -> {
                User user = objects.getT2();
                Instruction instruction = objects.getT1().getT2();
                Team team = objects.getT1().getT1();
                if (instruction.getTeamId().equals(team.get_id())) {
                    guideline.setAddedByUser(user.get_id());
                    guideline.setTeamId(team.get_id());
                    guideline.setDateAdded(new Date());
                    guideline.setGuidelineStatus(GuidelineStatus.ACTIVE);
                    guideline.setGuidelineSteps(Arrays.asList());
                    return guidelineRepository.save(guideline);
                } else {
                    return Mono.error(new InstructionDoesntBelongOrExistException("Unable to add, since this Instruction does not belong to you or doesn't exist anymore!"));
                }
            });
    return guidelineMono;
}
I'll post my earlier comment as an answer. If anyone feels like writing the correct code for it, go ahead.
I don't have access to an IDE at the moment so I can't write an example, but you could start by fetching the Instruction from the database.
Keep that Mono<Instruction>, then fetch your User, flatMap the User, and fetch the Team from the database. Then flatMap the Team and build a Mono<Tuple2<User, Team>>.
After that you take your two Monos and use zipWith with a combinator function to build a Mono<Tuple3<User, Team, Instruction>> that you can flatMap over.
So basically: fetch one item, then fetch two items, then combine into three items. You can create tuples using the Tuples.of(...) function.
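Something along these lines might work (a rough, untested sketch reusing the names from the question; it relies on reactor.util.function.Tuples/Tuple2 and the accessors shown in the original code):

public Mono<Guideline> addGuideline(Guideline guideline, String keycloakUserId) {
    Mono<Instruction> instructionMono =
            instructionRepository.findById(guideline.getInstructionId());

    // Fetch the user once, then its team, and keep both together in a tuple.
    Mono<Tuple2<User, Team>> userAndTeam =
            userRepository.findByKeycloakUserId(keycloakUserId)
                    .flatMap(user -> teamRepository.findUserInTeams(user.get_id())
                            .map(team -> Tuples.of(user, team)));

    // Combine with the instruction so each repository is queried exactly once.
    return userAndTeam
            .zipWith(instructionMono, (ut, instruction) ->
                    Tuples.of(ut.getT1(), ut.getT2(), instruction))
            .flatMap(t -> {
                User user = t.getT1();
                Team team = t.getT2();
                Instruction instruction = t.getT3();
                if (!instruction.getTeamId().equals(team.get_id())) {
                    return Mono.error(new InstructionDoesntBelongOrExistException(
                            "Unable to add, since this Instruction does not belong to you or doesn't exist anymore!"));
                }
                guideline.setAddedByUser(user.get_id());
                guideline.setTeamId(team.get_id());
                guideline.setDateAdded(new Date());
                guideline.setGuidelineStatus(GuidelineStatus.ACTIVE);
                guideline.setGuidelineSteps(Arrays.asList());
                return guidelineRepository.save(guideline);
            });
}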

RxSwift - How to create two streams from one upstream

Background
I'm trying to observe one Int stream (actually I'm not, but it makes the argument easier) and do something with it while combining that stream with multiple other streams, say a String stream and a Double stream, like the following:
// RxSwift
let intStream = BehaviorSubject<Int>(value: 0) // subscribe to this later on
let sharedStream = intStream.share()
let mappedStream = sharedStream.map { ... }.share()
let combinedStream1 = Observable.combineLatest(sharedStream, stringStream).map { ... }
let combinedStream2 = Observable.combineLatest(sharedStream, doubleStream).map { ... }
The code above is just to demonstrate what I'm trying to do; it is part of the view model (the VM of MVVM). Only the first map (for mappedStream) runs, while the others are never called.
Question
What is wrong with the above approach, and how do I achieve what I'm trying to do?
Also, is there a better way to achieve the same effect?
Updates
I confirmed that setting the replay count to 1 makes things work. But why?
The code above all goes in the initialization phase of the view model, and the subscription happens afterwards.
Okay, I have an answer but it's a bit complex... One problem is that you are using a Subject in the view model, but I'll ignore that for now. The real problem comes from the fact that you are using hot observables inappropriately (share() makes a stream hot), and so events are getting dropped.
It might help if you put a bunch of .debug()s on this code so you can follow along. But here's the essence...
When you subscribe to mappedStream, it subscribes to the share which in turn subscribes to the sharedStream, which subscribes to the intStream. The intStream then emits the 0, and that 0 goes down the chain and shows up in the observer.
Then you subscribe to combinedStream1, which subscribes to sharedStream's share(). Since that share has already been subscribed to, the subscription stops there, and since the share has already emitted its next event, combinedStream1 never gets the .next(0) event.
Same for the combinedStream2.
Get rid of all the share()s and everything will work:
let intStream = BehaviorSubject<Int>(value: 0) // subscribe to this later on
let mappedStream = intStream.map { $0 }
let combinedStream1 = Observable.combineLatest(intStream, stringStream).map { $0 }
let combinedStream2 = Observable.combineLatest(intStream, doubleStream).map { $0 }
This way, each subscriber of intStream gets the 0 value.
The only time you want to share is if you need to share side effects. There aren’t any side effects in this code, so there’s no need to share.

RunnableGraph to wait for multiple responses from source

I am using Akka in a Play controller and performing an ask() to an actor named publish; internally the publish actor performs asks to multiple actors and passes along the sender reference. The controller needs to wait for the responses from those actors and build a list of responses.
Please find the code below, but it only waits for one response and then terminates. Please suggest a fix.
// Performs ask to publish actor
Source<Object, NotUsed> inAsk = Source.fromFuture(ask(publishActor, service.getOfferVerifyRequest(request).getPayloadData(), 1000));

final Sink<String, CompletionStage<String>> sink = Sink.head();

final Flow<Object, String, NotUsed> f3 = Flow.of(Object.class).map(elem -> {
    log.info("Data in Graph is " + elem.toString());
    return elem.toString();
});

RunnableGraph<CompletionStage<String>> result = RunnableGraph.fromGraph(
        GraphDSL.create(
                sink, (builder, out) -> {
                    final Outlet<Object> source = builder.add(inAsk).out();
                    builder
                            .from(source)
                            .via(builder.add(f3))
                            .to(out); // to() expects a SinkShape
                    return ClosedShape.getInstance();
                }
        ));

ActorMaterializer mat = ActorMaterializer.create(aSystem);
CompletionStage<String> fin = result.run(mat);

fin.toCompletableFuture().thenApply(a -> {
    log.info("Data is " + a);
    return true;
});

log.info("COMPLETED CONTROLLER ");
log.info("COMPLETED CONTROLLER ");
If you expect several responses, ask won't cut it; it is only for a single request-response, where the response ends up in a Future/CompletionStage.
There are a few different strategies for waiting for all of the answers:
One is to create an intermediate actor whose only job is to collect all answers and, once all partial responses have arrived, respond to the original requestor. That way you can still use ask to get a single aggregate response back (see the aggregator sketch after this list).
Another option would be to use Source.actorRef to get an ActorRef that you can use as the sender together with tell (and skip ask). Inside the stream you would then take elements until some criterion is met (time has passed or enough elements have been seen). You may have to add an operator that mimics the ask response timeout, to make sure the stream fails if the actor never responds.
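A very rough sketch of the first strategy, using the classic actor API (the expectedReplies count and the reply collection type are assumptions, not part of the question):

import java.util.ArrayList;
import java.util.List;
import akka.actor.AbstractActor;
import akka.actor.ActorRef;

// Collects a fixed number of replies, then sends them as a list to the original asker.
class Aggregator extends AbstractActor {
    private final ActorRef originalSender;
    private final int expectedReplies;
    private final List<Object> replies = new ArrayList<>();

    Aggregator(ActorRef originalSender, int expectedReplies) {
        this.originalSender = originalSender;
        this.expectedReplies = expectedReplies;
    }

    @Override
    public Receive createReceive() {
        return receiveBuilder()
                .matchAny(reply -> {
                    replies.add(reply);
                    if (replies.size() == expectedReplies) {
                        originalSender.tell(replies, getSelf());
                        getContext().stop(getSelf());
                    }
                })
                .build();
    }
}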
There are some other issues with the code as shared. One is creating a materializer on each request: materializers have a lifecycle and will fill up your heap over time, so you should instead have a materializer injected by Play.
With the given logic there is no need whatsoever to use the GraphDSL; that is only needed for complex streams with multiple inputs and outputs or cycles. You should be able to compose the operators using the Flow API alone (see for example https://doc.akka.io/docs/akka/current/stream/stream-flows-and-basics.html#defining-and-running-streams), as in the sketch below.
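For instance, the logic from the question might collapse into something like this (a sketch; the materializer is assumed to be injected by Play rather than created per request):

CompletionStage<String> fin =
        Source.fromFuture(ask(publishActor,
                        service.getOfferVerifyRequest(request).getPayloadData(), 1000))
                .map(elem -> {
                    log.info("Data in stream is " + elem.toString());
                    return elem.toString();
                })
                .runWith(Sink.head(), materializer); // materializer injected by Play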

How to extract the String value from Mono/Flux

I am new to Reactor programming and need some help with Mono/Flux.
I have a POJO class:
Employee.java
class Employee {
    String name;
}
I have a Mono<Employee> m being returned when hitting a service, and I need to extract the name from it as a String.
Mono<String> name = m.map(value -> value.getName());
but this again returns a Mono, not a String. I need to extract the String value from this Mono.
You should do something like this:
m.block().getName();
This solution doesn't take care of null checking.
A standard approach would be:
Employee e = m.block();
if (null != e) {
    e.getName();
}
But staying in the reactive style, you could proceed with something like this:
Mono.just(new Employee().setName("Kill"))
        .switchIfEmpty(Mono.defer(() -> Mono.just(new Employee("Bill"))))
        .block()
        .getName();
Keep in mind that blocking operations should be avoided whenever possible: they block the flow.
You should avoid block() because it blocks indefinitely until the next signal is received.
You should not think of the reactive container as something that is going to provide your program with an answer. Instead, you need to give it whatever you want to do with that answer. For example:
employeeMono.subscribe(value -> whatYouWantToDoWithName(value.getName()));
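Slightly expanded, the same idea with an error callback (handleName and log are hypothetical placeholders):

employeeMono
        .map(Employee::getName)
        .subscribe(
                name -> handleName(name),                              // onNext: use the extracted name
                error -> log.error("employee lookup failed", error));  // onError: react to failures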
