How to detect batch submission failures from the Azure ServiceBus Spring Cloud Stream binder?

How are those leveraging the Spring Cloud Stream Azure Service Bus binder supposed to handle batch submission failures?
The Async Client library simply logs batch submission failures.
azure-sdk-for-java/sdk/servicebus/azure-messaging-servicebus/src/main/java/com/azure/messaging/servicebus/ServiceBusSenderAsyncClient.java
Line 819 in 30152b0
.doOnError(error -> LOGGER.error("Error sending batch.", error));
We encounter this issue regularly, every weekend: submissions to Service Bus fail and messages are dropped. Thanks to the logging we can alert on these failures, but we still end up dropping messages because we can't react to the failure programmatically. Is there a way to bubble this up? Can I choose to perform the submissions synchronously?
I haven't been able to find a way (yet) to configure the Spring Cloud Stream library to either switch to the synchronous client or to somehow bubble up submission failures through some other notification mechanism.
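For illustration, this is the kind of programmatic reaction we're after. When the synchronous SDK sender is used directly (outside the binder), a failure surfaces as an exception; the topic name and recovery hook below are hypothetical:

ServiceBusSenderClient sender = new ServiceBusClientBuilder()
        .connectionString(connectionString)
        .sender()
        .topicName("events")                    // hypothetical topic name
        .buildClient();

try {
    sender.sendMessage(new ServiceBusMessage("payload"));
} catch (ServiceBusException ex) {
    // the synchronous client surfaces the failure here, so we could retry or persist the message
    handleFailedSubmission(ex);                 // hypothetical recovery hook
}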
The producer here is using a rather simple setup:
spring:
  cloud:
    stream:
      default-binder: servicebus-topic

spring:
  cloud:
    azure:
      servicebus:
        connection-string: "${serviceBusConnectionString}"
Leveraging StreamBridge for submission
public boolean publishEvent(Event e, Map<String, Object> additionalHeaders) {
    MessageBuilder<Event> messageBuilder = MessageBuilder.withPayload(e);
    additionalHeaders.forEach((key, value) -> messageBuilder.setHeaderIfAbsent(key, value));
    return streamBridge.send(bindingName, messageBuilder.build());
}
Messages are submitted to the topic directly. Subscriptions handle routing via header values.
Snippet of library versions from our Gradle build
id 'org.springframework.boot' version '2.5.6'
...
implementation 'org.springframework.boot:spring-boot-starter-web'
implementation 'org.springframework.cloud:spring-cloud-config-client:3.0.5'
implementation 'org.springframework.boot:spring-boot-starter-validation:2.5.6'
implementation 'org.springframework.boot:spring-boot-starter-actuator:2.5.6'
implementation 'com.azure.spring:azure-spring-cloud-stream-binder-servicebus-topic:2.13.0'
implementation 'org.springframework.boot:spring-boot-starter-data-jpa:2.5.6'
implementation 'org.springframework.retry:spring-retry:1.3.1'
implementation 'org.springdoc:springdoc-openapi-ui:1.6.6'

Related

Start Spring Batch job through JMS with Spring Cloud Dataflow

I have an application which listens to an ActiveMQ queue and starts a batch job when it receives a message.
I'd like to use Spring Cloud Dataflow to provide a UI, but I can't find information on how to configure it.
Since it uses Spring Boot, I should be able to replicate how my application currently works (use a REST API, listen to ActiveMQ, and start the job when a message arrives), but I can't find anything on how to make it start the batch job in Cloud Dataflow.
You have a few options here.
Option 1: Launch your application as-is and manually send a message to launch the task.
Any arbitrary Spring Boot application can be launched from Dataflow (simply register it as type = "App").
Taken from https://github.com/spring-cloud/spring-cloud-dataflow/blob/main/spring-cloud-dataflow-docs/src/main/asciidoc/streams.adoc#register-a-stream-application:
Registering an application by using --type app is the same as registering a source, processor or sink. Applications of the type app can be used only in the Stream Application DSL (which uses double pipes || instead of single pipes | in the DSL) and instructs Data Flow not to configure the Spring Cloud Stream binding properties of the application. The application that is registered using --type app does not have to be a Spring Cloud Stream application. It can be any Spring Boot application. See the Stream Application DSL introduction for more about using this application type.
You would have to trigger the task launch from your code. You can use the Dataflow REST client to do this. You can get an idea of how to do that by looking at https://github.com/spring-cloud/spring-cloud-dataflow/tree/main/spring-cloud-dataflow-tasklauncher/spring-cloud-dataflow-tasklauncher-sink.
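As a rough sketch (the queue name, Dataflow server URL, and task definition name below are assumptions), the listener could launch the task through the documented POST /tasks/executions endpoint:

@JmsListener(destination = "batch-trigger-queue")              // assumed queue name
public void onMessage(String payload) {
    RestTemplate rest = new RestTemplate();
    // POST /tasks/executions?name=<task definition name> launches a registered task
    rest.postForEntity(
            "http://localhost:9393/tasks/executions?name={name}",  // assumed Dataflow server URL
            null, String.class,
            "my-batch-task");                                       // assumed task definition name
}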
Option 2: Use pre-built stream applications to model the same flow as your application.
The app you describe can be logically modeled as a Spring Cloud Stream application.
There is a JMS source (provides messages to signal the need to kickoff task/batch job)
There is a TaskLauncher sink (receives messages and kicks off the task/batch job)
This app can actually be constructed w/ little effort by using the pre-packaged applications to model this flow.
JMS Source
Dataflow Tasklauncher Sink
If you have to register these applications in the UI - they can be found at:
maven://org.springframework.cloud.stream.app:jms-source-kafka:3.1.1
maven://org.springframework.cloud:spring-cloud-dataflow-tasklauncher-sink-kafka:2.9.2
Stream definition:
jms-source | dataflow-tasklauncher-sink
The README(s) on the above source/sinks give details about the configuration options.
Option 3: Custom Spring Cloud Stream app w/ function composition
The previous option would create two separate apps. However, if you want to keep the logic in a single app, you can look into creating a custom Spring Cloud Stream app that uses function composition and leverages the pre-built, reusable Java functions that the apps in option 2 are built upon (a sketch follows the links below).
JMS Supplier
TaskLauncherFunction
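A minimal sketch of this option, assuming the pre-built jmsSupplier bean is on the classpath; the function name, the launch-request shape, and the composition property below are illustrative assumptions rather than the exact contract of the pre-built functions:

@SpringBootApplication
public class JmsToTaskLauncherApp {

    public static void main(String[] args) {
        SpringApplication.run(JmsToTaskLauncherApp.class, args);
    }

    // maps the incoming JMS text payload to a launch-request style payload for the task launcher
    @Bean
    public Function<String, Map<String, Object>> toLaunchRequest() {
        return jmsPayload -> Map.of(
                "name", "my-batch-task",                       // assumed task definition name
                "args", List.of("payload=" + jmsPayload));
    }
}

// application.properties (illustrative): spring.cloud.function.definition=jmsSupplier|toLaunchRequest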

REST API command with event driven choreography

I'm trying to design a system in an event-driven architecture style, while also trying to expose a REST API to send commands/queries. I decided to use Kafka as a message broker.
The choreography I'm trying to design is the following:
The part that is very obscure to me is how to implement event joins:
billing-service should start creating the user only when it receives the user creation event (1) and the account has been created (2)
api-gateway should return the result to the client only when both account and billing service have finished their processing (2 and 3)
I know I could use other protocols on the client side (e.g. WebSockets), but I'd prefer not to because I will need to expose this API to third parties. I could also make an async client call and poll to check whether the request has completed, but that seems very complex to manage.
What is the suggested way of implementing such an interaction?
p.s. I'm using Spring Boot and Spring Cloud Stream.
Request/reply messaging on the client side is possible with spring-cloud-stream, but it's a bit involved because it wasn't designed for that; it's intended for unidirectional stream processing.
You would be better off using spring-kafka (ReplyingKafkaTemplate) or spring-integration-kafka (Outbound Gateway) for request/reply on the client side.
On the service side you can use a @StreamListener (spring-cloud-stream) or a @KafkaListener or a spring-integration inbound-gateway.
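A rough sketch of the client side with ReplyingKafkaTemplate; the topic names, group id, and timeout are assumptions:

@Bean
public ReplyingKafkaTemplate<String, String, String> replyingTemplate(
        ProducerFactory<String, String> pf,
        ConcurrentKafkaListenerContainerFactory<String, String> containerFactory) {
    // container listening on the reply topic (assumed name)
    ConcurrentMessageListenerContainer<String, String> replies =
            containerFactory.createContainer("user-creation-replies");
    replies.getContainerProperties().setGroupId("api-gateway-replies");
    return new ReplyingKafkaTemplate<>(pf, replies);
}

// in the gateway endpoint: block until the downstream processing has replied
RequestReplyFuture<String, String, String> future =
        replyingTemplate.sendAndReceive(new ProducerRecord<>("user-creation-requests", payload));
ConsumerRecord<String, String> reply = future.get(30, TimeUnit.SECONDS);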

Spring-Kafka vs. Spring-Cloud-Stream (Kafka)

Using Kafka as a messaging system in a microservice architecture what are the benefits of using spring-kafka vs. spring-cloud-stream + spring-cloud-starter-stream-kafka ?
The Spring Cloud Stream framework supports more messaging systems and therefore has a more modular design. But what about functionality? Is there a gap between the functionality of spring-kafka and spring-cloud-stream + spring-cloud-starter-stream-kafka?
Which API is better designed?
Looking forward to reading your opinions.
Spring Cloud Stream with the Kafka binder relies on spring-kafka, so the former supports everything the latter does, but it is more heavyweight. Below are some points to help you make the choice:
If you might swap Kafka for another messaging middleware in the future, then Spring Cloud Stream should be your choice, since it hides the implementation details of Kafka.
If you want to integrate other messaging middleware with Kafka, then you should go for Spring Cloud Stream, since its selling point is making such integration easy.
If you want simplicity and cannot accept the extra overhead, then choose spring-kafka.
If you plan to migrate to a public cloud service such as AWS Kinesis or Azure Event Hubs, then use Spring Cloud Stream, which is part of the Spring Cloud family.
Use Spring Cloud Stream when you are creating a system where one channel is used for input, some processing is done, and the result is sent to one output channel. In other words, it is more of an RPC-style system, replacing, say, RESTful API calls.
If you plan to build an event-sourcing system, use spring-kafka, where you can publish and subscribe to the same stream. This is something that Spring Cloud Stream does not allow you to do easily, as it disallows the following:
public interface EventStream {

    String STREAM = "event_stream";

    @Output(EventStream.STREAM)
    MessageChannel publisher();

    @Input(EventStream.STREAM)
    SubscribableChannel stream();
}
A few things that Spring Cloud Stream helps you avoid doing are:
setting up the serializers and deserializers (see the sketch below, where the framework handles payload conversion)
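For instance, a minimal functional-style consumer where the framework handles payload conversion; OrderEvent and the function name here are assumptions:

@SpringBootApplication
public class OrderConsumerApp {

    public static void main(String[] args) {
        SpringApplication.run(OrderConsumerApp.class, args);
    }

    // bound via spring.cloud.function.definition=processOrder; no Kafka
    // serializer/deserializer setup is needed, the binder converts the JSON payload to the POJO
    @Bean
    public Consumer<OrderEvent> processOrder() {
        return event -> System.out.println("received: " + event);
    }
}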

Spring 5 Reactive WebSockets: recommended use

I've been learning a bit about Spring 5 WebFlux, reactive programming and WebSockets. I've watched Josh Long's Spring Tips: Reactive WebSockets with Spring Framework 5. The code that sends data from server to client through a WebSocket connection uses a Spring Integration IntegrationFlow that publishes to a PublishSubscribeChannel. A custom MessageHandler subscribed to that channel takes the message, converts it to an object, converts that to JSON, and emits it to the FluxSink from the callback supplied to Flux.create(), which is used to send to the WebSocket connection.
I was wondering if the use of IntegrationFlow and PublishSubscribeChannel is the recommended way to push events from a background process to the client, or if this is just more convenient in this particular example (monitoring the file system). I'd think if you have control over the background process, you could have it emit to the FluxSink directly?
I'm thinking about use cases similar to the following:
a machine learning process whose progress is monitored
updates to the state of a game world that are sent to players
chat rooms / team collaboration software
...
What I've done in the past that has worked for me is to create a Spring Component that implements WebSocketHandler:
@Component
public class ReactiveWebSocketHandler implements WebSocketHandler {
Then in the handle method, Spring injects the WebSocketSession object
@Override
public Mono<Void> handle(WebSocketSession session) {
Then create one or more Flux reactive publishers that emit messages (WebSocketMessage) for the client.
final var output = session.send(Flux.merge(flux1, flux2));
Then you can zip up the incoming and outgoing Flux objects in a Mono and then Spring will take it from there.
return Mono.zip(incomingWebsocketMsgResponse.getWebSocketMsgFlux().then(),
outputWithErrorMsgs)
.then();
Example: https://howtodoinjava.com/spring-webflux/reactive-websockets/
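Putting those fragments together, a minimal self-contained sketch could look like the following; the one-message-per-second outgoing stream is just an illustrative stand-in for your real publishers:

@Component
public class ReactiveWebSocketHandler implements WebSocketHandler {

    @Override
    public Mono<Void> handle(WebSocketSession session) {
        // outgoing: an illustrative server push, one text frame per second
        Flux<WebSocketMessage> outgoing = Flux.interval(Duration.ofSeconds(1))
                .map(i -> session.textMessage("tick " + i));

        // incoming: consume whatever the client sends
        Mono<Void> incoming = session.receive()
                .doOnNext(msg -> System.out.println("client: " + msg.getPayloadAsText()))
                .then();

        // complete when both directions terminate; the handler still has to be
        // registered with a HandlerMapping and WebSocketHandlerAdapter as usual
        return Mono.zip(incoming, session.send(outgoing)).then();
    }
}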
Since this question was asked, Spring has introduced RSocket support - you can think of it like the WebSocket STOMP support that exists in Spring MVC, but much more powerful and efficient, supporting backpressure and advanced communication patterns at the protocol level.
For the use cases you're mentioning, I'd advise using RSocket, as you'd get a powerful programming model with @MessageMapping and all the expected support in Spring (codecs for JSON and CBOR, security, etc.).
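For example, a game-state feed could be exposed as a request-stream endpoint; the route, payload types, and backing service below are assumptions:

@Controller
public class GameUpdatesController {

    // request-stream interaction: the client subscribes once and receives updates as they happen
    @MessageMapping("game.updates")
    public Flux<GameState> updates(String gameId) {
        return gameStateService.streamFor(gameId);   // hypothetical backing service
    }
}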

Spring Integration - How to add message on topic before sending back acknowledgement for SOAP web service using declarative approach

I am consuming a SOAP web service using an inbound-gateway. We need to put the message on a Kafka topic and return a synchronous acknowledgement to the requestor using the declarative style of Spring Integration. Is this possible?
public Acknowledgement process(@RequestPayload MessagePayload payload) {
    // perform validation & logic
    // need to send message to kafka topic using declarative way
    // sending synchronous ack to request originator
    return new Acknowledgement();
}
The Kafka outbound channel adapter has a sync property: setSync(true) when using Java config, sync="true" when using XML.
The calling thread (web container) will block until kafka assumes responsibility. If you use a publish-subscribe channel, make the kafka adapter the first consumer, and a service to build the Acknowledgement the second consumer (use the order property in the consumers to ensure proper ordering).
Or you can use a KafkaTemplate directly from your controller.
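A sketch of that last option, sending synchronously before returning the acknowledgement; the topic name, namespace, and timeout are assumptions:

@PayloadRoot(namespace = "http://example.com/messages", localPart = "MessagePayload")  // assumed namespace
@ResponsePayload
public Acknowledgement process(@RequestPayload MessagePayload payload) throws Exception {
    // blocks until the broker acknowledges, so a send failure surfaces before the SOAP reply is sent
    kafkaTemplate.send("message-topic", payload).get(10, TimeUnit.SECONDS);
    return new Acknowledgement();
}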
