Spring Integration DSL adding mid flow transaction - spring-boot

I want to make a specific part of the flow transactional. For instance, I want to make the first two transform operations one transactional block. Here is the flow code that I use:
@Bean
public IntegrationFlow createNumberRange() {
    return IntegrationFlows.from("npEventPubSubChannel")
            .transform(...)
            .transform(...) // should be transactional together with the transform above
            .transform(...) // non-transactional
            .handle((payload, headers) -> numbRepository.saveAll(payload))
            .get();
}
I found a workaround: adding another handle and directing the flow to a transactional gateway, like this one:
.handle("transactionalBean", "transactionalMethod") // then implemented a messaging gateway that contains the transactional method
I also found mid-flow transactional support, but couldn't find an example to work from.
Is there a more elegant solution than directing to another gateway in the middle of the flow?

If you want to wrap two transformers into one transaction, you have no choice but to hide those calls behind a transactional gateway. It is fully similar to what you do in raw Java:
@Transactional
void myTransactionalMethod() {
    transform1();
    transform2();
}
I'm sure you agree that we always have to do it this way to have them both in the same transaction.
With the Spring Integration Java DSL you can do this, though:
.gateway(f -> f
                .transform(...)
                .transform(...),
        e -> e.transactional())
Do you agree that it is similar to what we have in raw Java, and not so bad from an elegance perspective?
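For completeness, here is a minimal sketch of the original flow with the first two transforms wrapped in such a gateway. The transformer references (firstTransformer and so on) are illustrative placeholders for the asker's actual logic, not anything from the original post:
@Bean
public IntegrationFlow createNumberRange() {
    return IntegrationFlows.from("npEventPubSubChannel")
            // sub-flow gateway: both transforms run inside one transaction
            .gateway(f -> f
                            .transform(firstTransformer)
                            .transform(secondTransformer),
                    e -> e.transactional())
            .transform(thirdTransformer) // outside the transaction
            .handle((payload, headers) -> numbRepository.saveAll(payload))
            .get();
}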

Related

Main processing flow programmatic approach when using Spring Integration with Project Reactor

I want to define a flow that consumes from Kafka with Reactor Kafka and writes to MongoDB, and only on success writes the IDs to Kafka. I'm using Project Reactor with the Spring Integration Java DSL, and I'd like to have a FlowBuilder class that defines my pipeline at a high level. I currently have the following direction:
public IntegrationFlow buildFlow() {
    return IntegrationFlows.from(reactiveKafkaConsumerTemplate)
            .publishSubscribeChannel(c -> c
                    .subscribe(sf -> sf
                            .handle(MongoDb.reactiveOutboundChannelAdapter())))
            .handle(writeToKafka)
            .get();
}
I've seen in the docs that there is support for a different approach that also works with Project Reactor. This approach doesn't include the use of IntegrationFlows. It looks like this:
@MessagingGateway
public static interface TestGateway {

    @Gateway(requestChannel = "promiseChannel")
    Mono<Integer> multiply(Integer value);
}

...

@ServiceActivator(inputChannel = "promiseChannel")
public Integer multiply(Integer value) {
    return value * 2;
}

...

Flux.just("1", "2", "3", "4", "5")
        .map(Integer::parseInt)
        .flatMap(this.testGateway::multiply)
        .collectList()
        .subscribe(integers -> ...);
I'd like to know which of these is the more recommended way of processing when working with these two libraries. I also wonder how I can use the reactive MongoDB adapter in the second example. I'm not sure the second approach is even possible without an IntegrationFlows wrapper.
The @MessagingGateway was designed as a high-level end-user API, to hide messaging underneath as much as possible. So the target service is free from any messaging abstraction when you develop its logic.
It is possible to use such an interface adapter from an IntegrationFlow, and you should treat it as a regular service activator, so it would look like this:
.handle("testGateway", "multiply", e -> e.async(true))
The async(true) makes this service activator subscribe to the returned Mono. If you omit it, you are on your own to subscribe to it downstream, since exactly this Mono is going to be the payload of the next message in the flow.
If you want the opposite, calling an IntegrationFlow from the Flux, like in that flatMap(), then consider using the toReactivePublisher() operator at the end of the flow definition to return a Publisher<?> and declare it as a bean. In this case it is better not to use that MongoDb.reactiveOutboundChannelAdapter(), but just a ReactiveMongoDbStoringMessageHandler, to let its returned Mono be propagated to that Publisher.
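As an illustration of that operator, here is a minimal sketch of the toReactivePublisher() pattern; the queue channel, the upper-casing transform, and the bean name are illustrative stand-ins, not the Mongo flow itself:
// Declare the flow as a Publisher<Message<String>> bean.
@Bean
public Publisher<Message<String>> storeFlowPublisher() {
    return IntegrationFlows
            .from(MessageChannels.queue("storeChannel"))
            .<String, String>transform(String::toUpperCase)
            .toReactivePublisher();
}

// Then compose the injected Publisher into your reactive pipeline:
Flux.from(storeFlowPublisher)
        .map(Message::getPayload)
        .subscribe(System.out::println);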
On the other hand, if you want to have that @MessagingGateway with the Mono return, but still call a ReactiveMongoDbStoringMessageHandler from it, then declare the handler as a bean and mark it with that @ServiceActivator.
We also have an ExpressionEvaluatingRequestHandlerAdvice to catch errors (or successes) on a particular endpoint and handle them respectively: https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-endpoints.html#expression-advice
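For reference, a minimal sketch of configuring that advice for a single endpoint; the expression, channel name, and bean name are illustrative assumptions:
@Bean
public ExpressionEvaluatingRequestHandlerAdvice mongoFailureAdvice() {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    // evaluated against the request message when the endpoint fails
    advice.setOnFailureExpressionString("payload");
    // failures are routed here instead of breaking the main flow
    advice.setFailureChannelName("mongoErrorChannel");
    advice.setTrapException(true);
    return advice;
}
It is attached per endpoint in the DSL, e.g. .handle(..., e -> e.advice(mongoFailureAdvice())).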
I think what you are looking for is something like this:
public IntegrationFlow buildFlow() {
    return IntegrationFlows.from(reactiveKafkaConsumerTemplate)
            .handle(reactiveMongoDbStoringMessageHandler, "handleMessage")
            .handle(writeToKafka)
            .get();
}
Pay attention to the .handle(reactiveMongoDbStoringMessageHandler): it is not the MongoDb.reactiveOutboundChannelAdapter(), because that one wraps the ReactiveMessageHandler into a ReactiveMessageHandlerAdapter for automatic subscription. What you need looks more like you want that Mono<Void> returned to your own control, so you can use it as an input into your writeToKafka service, subscribe there yourself, and handle success or error as you explained. The point is that with Reactive Streams we cannot provide imperative error handling. The approach is the same as with any async API usage. So we send errors to the errorChannel for Reactive Streams, too.
We probably can improve that MongoDb.reactiveOutboundChannelAdapter() with something like returnMono(true/false) to make a use case like yours available out of the box.

Putting Spring WebFlux Publisher inside Model, good or bad practice?

I'm working on a code audit of a Spring Boot application with Spring WebFlux, and the team is putting Publishers directly inside the Model and then resolving the view.
I'm wondering whether this is good or bad practice. It seems to be working, but in that case, which component is in charge of executing the Publisher?
I think it's the ViewResolver, and that should not be its job. What do you think?
Moreover, if the Publisher is not executed by the Controller, classes annotated with @ControllerAdvice, such as an ExceptionHandler, won't work if these Publishers return an error, right?
Extract from the Spring WebFlux documentation:
Spring WebFlux, unlike Spring MVC, explicitly supports reactive types in the model (for example, Mono or io.reactivex.Single). Such asynchronous model attributes can be transparently resolved (and the model updated) to their actual values at the time of @RequestMapping invocation, provided a @ModelAttribute argument is declared without a wrapper, as the following example shows:
@ModelAttribute
public void addAccount(@RequestParam String number, Model model) {
    Mono<Account> accountMono = accountRepository.findAccount(number);
    model.addAttribute("account", accountMono);
}

@PostMapping("/accounts")
public String handle(@ModelAttribute Account account, BindingResult errors) {
    // ...
}
In addition, any model attributes that have a reactive type wrapper are resolved to their actual values (and the model updated) just prior to view rendering.
https://docs.spring.io/spring-framework/docs/current/reference/html/web-reactive.html#webflux-ann-modelattrib-methods
Doesn't come as a shock to me.
Actually, it seems to be a good trade-off between complexity and efficiency when the Publisher is handling complex stuff.
It has the advantage of executing the Publisher only if and when needed.
Although it might be a problem if the ModelMap handler does not have the capacity to use it properly.
As for the exceptional cases, maybe you do not want it to be executed and just printed, thus failing faster.
As for the question about what is executing the Publisher: a specific ViewResolver can be used, as it is the component responsible for the "rendering". IMHO that is its job. I do not know whether a standard ViewResolver can detect values vs. publishers and handle those automagically, yet this seems completely doable and efficient.
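To make the quoted mechanism concrete, here is a small hand-written sketch (the repository, mapping, and view name are assumptions): the controller puts the Mono in the model, and WebFlux resolves it just before view rendering:
@GetMapping("/accounts/{number}")
public String show(@PathVariable String number, Model model) {
    // not resolved here: per the documentation quoted above, WebFlux
    // resolves the Mono to an Account just prior to view rendering
    model.addAttribute("account", accountRepository.findAccount(number));
    return "accountView";
}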

guava eventbus post after transaction/commit

I am currently playing around with Guava's EventBus in Spring, and while the general functionality has been working fine so far, I came across the following problem:
When a user wants to change data on a "Line" entity, this is handled as usual in a backend service. In this service the data is persisted via JPA first, and after that I create a "NotificationEvent" with a reference to the changed entity. Via the EventBus I send the reference of the line to all subscribers.
public void notifyUI(String lineId) {
    EventBus eventBus = getClientEventBus();
    eventBus.post(new LineNotificationEvent(lineId));
}
The EventBus itself is simply created using new EventBus() in the background.
Now, in this case my subscribers are on the frontend side, outside of the @Transactional realm. So when I change my data, post the event, and let the subscribers get all necessary updates from the database, the actual transaction is not committed yet, which makes the subscribers fetch the old data.
The only quick fix I can think of is handling it asynchronously and waiting a second or two. But is there another way to post the events using Guava AFTER the transaction has been committed?
I don't think Guava is "aware" of Spring at all, and in particular not of its @Transactional stuff.
So you need a creative solution here. One solution I can think of is to move this code to a place where you're sure that the transaction has finished.
One way to achieve that is using TransactionSynchronizationManager:
TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
    @Override
    public void afterCommit() {
        // do what you want to do after commit
        // in this case, call the notifyUI method
    }
});
Note that if the transaction fails (rolls back), this method won't be called; in that case you'll probably need the afterCompletion method instead. See the documentation.
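For example, a sketch of the afterCompletion variant, which runs on both commit and rollback and receives the outcome as a status flag:
TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronization() {
    @Override
    public void afterCompletion(int status) {
        // called on both commit and rollback; check the outcome explicitly
        if (status == TransactionSynchronization.STATUS_COMMITTED) {
            // notifyUI(lineId);
        }
    }
});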
Another possible approach is refactoring your application to something like this:
@Service
public class NonTransactionalService {

    @Autowired
    private ExistingService existing;

    public void entryPoint() {
        String lineId = existing.invokeInTransaction(...);
        // now you know for sure that the transaction has been committed
        notifyUI(lineId);
    }
}

@Service
public class ExistingService {

    @Transactional
    public String invokeInTransaction(...) {
        // do your stuff that you've done before
    }
}
One last thing I would like to mention here: Spring itself provides an events mechanism that you might use instead of Guava's.
See this tutorial for an example.
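For instance, with Spring's mechanism the after-commit behavior comes built in via @TransactionalEventListener; the listener class below is an illustrative sketch, reusing the LineNotificationEvent from the question:
import org.springframework.stereotype.Component;
import org.springframework.transaction.event.TransactionPhase;
import org.springframework.transaction.event.TransactionalEventListener;

@Component
public class LineNotificationListener {

    // invoked only after the publishing transaction has committed,
    // so subscribers will see the new data when they re-fetch it
    @TransactionalEventListener(phase = TransactionPhase.AFTER_COMMIT)
    public void onLineChanged(LineNotificationEvent event) {
        // notify the UI here instead of posting on the Guava EventBus
    }
}
The event would be published from inside the @Transactional service with ApplicationEventPublisher.publishEvent(new LineNotificationEvent(lineId)).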

Convert document objects to DTO spring reactive

I'm trying to convert a document object that is retrieved by the ReactiveCrudRepository as a Flux<Client> into a Flux<ClientDto>.
Now that I've figured out a way to do this, I'm not sure whether it is blocking or not:
public Mono<ServerResponse> findAll(final ServerRequest serverRequest) {
    final Flux<ClientDto> map = clientService.findAll()
            .map(client -> modelMapper.map(client, ClientDto.class)) /*.delayElements(Duration.ofSeconds(10))*/;
    return ServerResponse.ok()
            .contentType(MediaType.TEXT_EVENT_STREAM)
            .body(map, ClientDto.class);
}
I've tried adding the commented-out delayElements method, and it seems the elements are sent one by one, so it is non-blocking.
I think this is more of a nested question, but at its core I want to know how I can tell whether I'm doing something blocking.
Thanks in advance!
You are blocking if you explicitly call a block method, or if you are using a standard JDBC connector to connect to the database instead of a reactive one, like the reactive MongoDB support provided by Spring Data.
In the snippet you have posted there isn't any blocking, but to be totally sure you should review the code of your clientService class and its nested calls (to a repository, for example).
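To make the distinction concrete, a small sketch using the same types as in the question:
// Non-blocking: the mapping is applied as elements are emitted;
// the calling thread never waits for the data.
Flux<ClientDto> dtos = clientService.findAll()
        .map(client -> modelMapper.map(client, ClientDto.class));

// Blocking: the calling thread stops here until the first element arrives.
Client first = clientService.findAll().blockFirst();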

Cannot use ContextTransactionalCallable with TransactionProvider

I have an error which I can't solve.
I am using Spring and jOOQ.
The error occurs here:
@Transactional
public UUID create(List<User> users) {
    UUID uuid = UUID.randomUUID();
    dslContext.transaction(() -> {
        dslContext
                .insertInto(APPLE, APPLE.APPLE_ID, APPLE.TITLE)
                .values(uuid, uuid.toString())
                .execute();
        users.forEach(user -> {
            dslContext
                    .insertInto(APPLE_MEMBERS, APPLE_MEMBERS.APPLE_ID, APPLE_MEMBERS.USER_ID)
                    .values(uuid, user.getUserId())
                    .execute();
        });
    });
    return uuid;
}
Error:
org.jooq.exception.ConfigurationException: Cannot use ContextTransactionalCallable with TransactionProvider of type class org.springframework.boot.autoconfigure.jooq.SpringTransactionProvider
Maybe someone has had the same error or has an idea how to solve it?
Using out of the box features:
You have to pick one of the two approaches:
Spring's declarative transaction management through annotations
jOOQ's programmatic transaction management through its API
Out of the box, they cannot be combined. In your particular case, I don't see why you would want to do it. The nested, programmatic transaction has the exact same scope as the outer, declarative transaction. It is redundant.
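Concretely, that would mean keeping the @Transactional annotation and dropping the programmatic wrapper; a sketch of the declarative-only variant of the method from the question:
@Transactional
public UUID create(List<User> users) {
    UUID uuid = UUID.randomUUID();
    // SpringTransactionProvider manages the transaction demarcated by
    // @Transactional; no dslContext.transaction(...) wrapper is needed
    dslContext
            .insertInto(APPLE, APPLE.APPLE_ID, APPLE.TITLE)
            .values(uuid, uuid.toString())
            .execute();
    users.forEach(user -> dslContext
            .insertInto(APPLE_MEMBERS, APPLE_MEMBERS.APPLE_ID, APPLE_MEMBERS.USER_ID)
            .values(uuid, user.getUserId())
            .execute());
    return uuid;
}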
Using custom TransactionProvider implementations
You could write your own TransactionProvider that is able to communicate with Spring's transaction management and allows for embedding nested transactions in #Transactional annotated methods, but I generally advise against it. Pick one of the two approaches.
