Integration Test a Reactive Spring Cloud Stream

TL;DR: How do you test a reactive function composition using the Test Binder?
I have a Spring Cloud Stream application that uses reactive functions, and I don't know how to test it. I don't see any official docs on how to write an integration test from the input source to the output destination binder.
In my specific case, I am connecting a Spring Integration flow to the stream using a reactive Supplier and the IntegrationReactiveUtils.messageChannelToFlux() pattern. This works in a development environment: I can pull messages from RabbitMQ using the Spring Integration flow and they enter the SCSt pipeline.
My SCSt application has several functions chained together, each one reactive. They are composed as func1|func2|func3. I verified this works with a dev Rabbit (source) and Kafka (destination). The supplier-side wiring looks roughly like the sketch below.
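A rough sketch of that bridge, assuming a channel bean named fromRabbit that the Spring Integration flow writes to (all names here are placeholders, not the actual code):

@Bean
Supplier<Flux<Message<byte[]>>> rabbitSupplier(MessageChannel fromRabbit) {
    // Bridge the Spring Integration channel into a Flux the binder subscribes to.
    return () -> IntegrationReactiveUtils.messageChannelToFlux(fromRabbit);
}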
I can't figure out how to test this, and there doesn't seem to be any official documentation on testing a complete reactive stream. Right now I have test code that roughly looks like this:
@Autowired
MessageChannel inputChannel;

@Autowired
private OutputDestination output;

@Test
void myTest() {
    // omitted: prep of var 'messageToSend'
    this.inputChannel.send(messageToSend);
    var outputMessage = output.receive(5000);
    Assertions.assertNotNull(outputMessage.getPayload());
}
The problem is that output.receive(5000) returns null. I suspect a threading issue, because I am not subscribing to the Flux and waiting for completion.
I have run a debugger through the Flux functions and see the message going all the way to the end with no errors or weirdness.

I figured this out, actually: I had to specify the binding name. I had a test property spring.cloud.stream.bindings.processingStream set, which I thought created two new bindings (processingStream-in-0 and processingStream-out-0).
It turns out I had to pass the binding name in the test code, as in output.receive(5000, "processingStream"), without the -out-0 suffix. I can now receive messages from the stream.
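For reference, the only change from the failing test above is the receive call; a sketch, assuming the same autowired fields:

@Test
void myTest() {
    // omitted: prep of var 'messageToSend'
    this.inputChannel.send(messageToSend);
    // Pass the binding name, without the -out-0 suffix, so the test binder
    // resolves the correct output destination.
    var outputMessage = output.receive(5000, "processingStream");
    Assertions.assertNotNull(outputMessage.getPayload());
}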

Related

Need some guidance with Spring Integration Flow

I am new to Spring Integration and have read quite a lot of documentation and other topics here on Stack Overflow, but I am still a bit overwhelmed about how to apply the newly acquired knowledge in a Spring Boot application.
This is what should happen:
1. Receive a message from a Kafka topic, e.g. from "request-topic" (the payload is a custom Job POJO). InboundChannelAdapter?
2. Do some preparation (checkout from a git repo).
3. Process files using a batch job.
4. Commit & push to git, and update the Job object with the commit-id.
5. Publish a message to Kafka with the updated Job object, e.g. to "reply-topic". OutboundChannelAdapter?
Whether I use the DSL or plain Java configuration does not matter. My problem, after trying several variants, is that I could not achieve the desired result: for example, handlers would be called too early, or not at all, and so the reply in step 5 would not contain the updated Job.
Also, there should only be one flow running at any given time, so I guess a queue should be involved at some point, probably at step 1(?).
Where and when should I use QueueChannels, DirectChannels (or any others)? Do I need GatewayHandlers, e.g. to reply with a commit-id?
Any hints are appreciated.
Something like this:
@Bean
IntegrationFlow flow() {
    return IntegrationFlows.from(Kafka.inboundGateway(...))
            .handle(...)    // preparation (git checkout)
            .transform(...) // to JobLaunchRequest
            .handle(...)    // JobLaunchingGateway
            .handle(...)    // clean up and return the result
            .get();
}
It will only process one request at a time (with default concurrency).

Verify/Test Acknowledgment was called in Spring Boot Kafka

I've written integration tests for my Spring Boot Kafka (consumer/producer) service and everything went well. I'm committing my consumer's offsets manually after some processing.
I want to verify whether acknowledgment.acknowledge() was called in the consumer. Is it possible to verify this?
Here is the method signature in my service:
@KafkaListener(topics = {TOPIC_XXX_V1}, containerFactory = "XXXListener")
private void consumer(@Payload XXXXRequestEvent xxxxRequestEvent, Acknowledgment acknowledgment) {
    // ... do something with the database ...
    acknowledgment.acknowledge();
}
On the testing side I'm using @SpyBean for the service and a @MockBean for the database interaction. I want to verify somehow whether .acknowledge() was called in the test case. FYI: .acknowledge() is a public abstract void method.
As the Acknowledgment instance is created and injected by Spring Kafka when a message is consumed, I guess there is no way to use something like Mockito's verify() for this.
If you wrote a unit test instead, you could pass in a mocked Acknowledgment and verify that the method was invoked. However, with a unit test you can't test the actual consumption of a message (serialization, the correct message handler, etc.).
So in your case, I would try to verify that your message was acknowledged by, e.g., using Testcontainers to execute commands inside the Kafka container and ensuring that the already-acknowledged message is not returned any more.
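A sketch of that first idea, assuming a Testcontainers KafkaContainer field named kafka and a consumer group id of "my-group" (both placeholders):

// Describe the consumer group inside the container: if acknowledge() committed
// the offset, CURRENT-OFFSET equals LOG-END-OFFSET and LAG is 0.
Container.ExecResult result = kafka.execInContainer(
        "kafka-consumer-groups",
        "--bootstrap-server", "localhost:9092",
        "--describe", "--group", "my-group");
// Inspect or assert on the LAG column of the output.
System.out.println(result.getStdout());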
Another approach could be to create a Kafka client as part of your test, try to consume messages from the same topic for X seconds, and expect zero results. Awaitility might help you here.
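And a sketch of the second idea: a probe consumer that joins the same consumer group, so it resumes from the committed offset. Here consumerProps() is a hypothetical helper that sets bootstrap.servers, group.id, and the deserializers:

try (KafkaConsumer<String, String> probe = new KafkaConsumer<>(consumerProps())) {
    probe.subscribe(List.of("my-topic")); // the same topic the service listens on
    // If the record was acknowledged, its offset is committed, so the probe
    // should see nothing for the whole five-second window.
    Awaitility.await()
            .during(Duration.ofSeconds(5))
            .atMost(Duration.ofSeconds(6))
            .until(() -> probe.poll(Duration.ofMillis(200)).isEmpty());
}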

Spring KafkaListener: How to know when it's ready

I have a simple Spring Boot application which reads from Kafka and writes to Kafka. I wrote a SpringBootTest using an EmbeddedKafka to test all that.
The main problem: sometimes the test fails because it sends the Kafka message too early, so the message is already written to Kafka before the Spring application (or its KafkaListener, to be precise) is ready. Since the listener reads from the latest offset (I do not want to change any config for my test except bootstrap.servers), it will not receive all messages in that test.
Does anyone have an idea how I could know inside the test, that the KafkaListener is ready to receive messages?
The only way I can think of is waiting until /health becomes available, but I have no idea whether that implies the KafkaListener is ready at all.
Any help is greatly appreciated!
Best regards.
If you have a KafkaMessageListenerContainer instance, then it is very easy to use org.springframework.kafka.test.utils.ContainerTestUtils.waitForAssignment(Object container, int partitions).
https://docs.spring.io/spring-kafka/api/org/springframework/kafka/test/utils/ContainerTestUtils.html
e.g. calling ContainerTestUtils.waitForAssignment(container, 1); in your test setup will block until the container has been assigned one partition.
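If the container comes from a @KafkaListener rather than being built manually, you can fetch it from the registry first. A sketch, assuming the listener is declared with id = "myListener" (a hypothetical id):

@Autowired
KafkaListenerEndpointRegistry registry;

@Before
public void waitForAssignment() {
    MessageListenerContainer container = registry.getListenerContainer("myListener");
    // Blocks until the container has been assigned one partition.
    ContainerTestUtils.waitForAssignment(container, 1);
}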
So, I just read about @PostConstruct, and it turns out you can easily use it within the test as well:
@PostConstruct
public void checkApplicationReady() {
    applicationReady = true;
}
Now I added an @Before method that waits until that flag is set to true; a sketch follows.
So far this seems to work very nicely!
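The waiting @Before is then just a loop on that flag; a sketch, assuming applicationReady is a volatile boolean field on the test class:

@Before
public void waitUntilApplicationReady() throws InterruptedException {
    // Spin (with a short sleep) until the @PostConstruct above has run.
    while (!applicationReady) {
        Thread.sleep(100);
    }
}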

SpringBoot get InputStream and OutputStream from websocket

We want to integrate a third-party library (Eclipse Xtext LSP) into our Spring Boot webapp.
This library works "interactively" with the user (like a chat). The Xtext API requires an input and an output stream to work. We want to use WebSocket to let users interact with this library smoothly (sending/retrieving JSON messages).
Our problem with Spring Boot is that its WebSocket support doesn't expose input/output streams. We wrote a custom TextWebSocketHandler (subclass), but none of its methods provide access to the in/out streams.
We also tried a HandshakeInterceptor (to obtain the in/out streams after the handshake), but with no success.
Can we use the Spring Boot WebSocket API in this scenario, or should we use some lower-level (Servlet?) API?
Regards Daniel
I am not sure if this will fit your architecture or not, but I have achieved this by using Spring Boot's STOMP support and wiring it into a custom org.eclipse.lsp4j.jsonrpc.RemoteEndpoint, rather than using a lower level API.
The approach was inspired by reading through the code provided in org.eclipse.lsp4j.launch.LSPLauncher.
JSON handler
Marshalling and unmarshalling the JSON needs to be done with the API provided by the Xtext language server, rather than Jackson (which would otherwise be used by the Spring STOMP integration):
Map<String, JsonRpcMethod> supportedMethods = new LinkedHashMap<String, JsonRpcMethod>();
supportedMethods.putAll(ServiceEndpoints.getSupportedMethods(LanguageClient.class));
supportedMethods.putAll(languageServer.supportedMethods());
jsonHandler = new MessageJsonHandler(supportedMethods);
jsonHandler.setMethodProvider(remoteEndpoint);
Response / notifications
Responses and notifications are sent by a MessageConsumer which is passed to the remoteEndpoint when it is constructed. The message must be marshalled by the jsonHandler to prevent Jackson from doing it.
remoteEndpoint = new RemoteEndpoint(new MessageConsumer() {
    @Override
    public void consume(Message message) {
        simpMessagingTemplate.convertAndSendToUser("user", "/lang/message",
                jsonHandler.serialize(message));
    }
}, ServiceEndpoints.toEndpoint(languageServer));
Requests
Requests can be received with a @MessageMapping method that takes the whole @Payload as a String, to avoid Jackson unmarshalling it. You can then unmarshal it yourself and pass the message to the remoteEndpoint.
@MessageMapping("/lang/message")
public void incoming(@Payload String message) {
    remoteEndpoint.consume(jsonHandler.parseMessage(message));
}
There may be a better way to do this, and I'll watch this question with interest, but this is an approach that I have found to work.

Spring Cloud Stream - Output Messages from EventListener

I'm trying to use the @DomainEvents mechanism provided by Spring Data to publish events with Spring Cloud Stream (Spring Boot 2.0.0.M7 and Finchley.M5). I have a two-part question:
1. Why does @SendTo not work on @EventListeners?
2. Is there a better way to accomplish this?
The DomainEvent is created and sent to the @EventListener without issues. The problem is that the @SendTo mechanism doesn't seem to work: the first method below triggers, but does not forward the message. Manually building the Message and sending it, as shown in the second method, works correctly.
@Async
@TransactionalEventListener
@SendTo(Sink.Output)
StreamedEvent handleEventWithSendTo(MyEvent event) {
    // handle and create event
}

@TransactionalEventListener
void handleEvent(MyEvent event) {
    // handle and create event
    sink.output().send(MessageBuilder.withPayload(payload).build());
}
The call-out in the Spring Cloud Stream docs shows @SendTo being used on a @StreamListener, which is not quite the same thing as an @EventListener, but I thought it might work. For contrast, that pattern is sketched below.
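A sketch of that docs pattern, using the standard Sink.INPUT/Source.OUTPUT constants (toStreamedEvent is a hypothetical mapper, not from the question):

@StreamListener(Sink.INPUT)
@SendTo(Source.OUTPUT)
public StreamedEvent transform(MyEvent event) {
    // @StreamListener's infrastructure understands @SendTo and forwards
    // the return value to the output binding.
    return toStreamedEvent(event);
}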
For the second part: using @DomainEvents requires the service code persisting the entity to know about the event (either calling registerEvent directly or some method on the entity that represents the event). I was curious whether using the Spring Data callback hooks (e.g. PreUpdate, PostUpdate) would be better, or whether there is a better way altogether.
