Spring KafkaListener: How to know when it's ready

I have a simple Spring Boot application that reads from Kafka and writes to Kafka. I wrote a SpringBootTest using EmbeddedKafka to test all of that.
The main problem: sometimes the test fails because it sends the Kafka message too early, so the message is written to Kafka before the Spring application (or its KafkaListener, to be precise) is ready. Since the listener reads from the latest offset (I do not want to change any config for my test except bootstrap.servers), it will not receive all messages in that test.
Does anyone have an idea how I could know, inside the test, that the KafkaListener is ready to receive messages?
The only way I could think of is waiting until /health becomes available, but I have no idea whether that implies the KafkaListener is ready at all.
Any help is greatly appreciated!
Best regards.

If you have a KafkaMessageListenerContainer instance, then it is very easy to use org.springframework.kafka.test.utils.ContainerTestUtils.waitForAssignment(Object container, int partitions).
https://docs.spring.io/spring-kafka/api/org/springframework/kafka/test/utils/ContainerTestUtils.html
e.g. calling ContainerTestUtils.waitForAssignment(container, 1); in your test setup will block until the container has been assigned one partition.
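For example, when the listener is declared with @KafkaListener, its container can be fetched from the KafkaListenerEndpointRegistry. A minimal sketch, assuming a listener declared with id = "myListener" and a single-partition topic:

@Autowired
private KafkaListenerEndpointRegistry registry;

@Before
public void waitForAssignment() {
    // Blocks until the container behind "myListener" has been assigned its partition
    MessageListenerContainer container = registry.getListenerContainer("myListener");
    ContainerTestUtils.waitForAssignment(container, 1);
}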

So, I just read about @PostConstruct and it turns out that you can easily use this within the test as well:

@PostConstruct
public void checkApplicationReady() {
    applicationReady = true;
}

Now I added an @Before method that waits until that flag is set to true.
So far this seems to work very nicely!
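A minimal sketch of that waiting step, assuming applicationReady is a volatile boolean on the test class, set by the @PostConstruct callback above:

@Before
public void waitUntilApplicationReady() throws InterruptedException {
    // Spin until the @PostConstruct callback has flipped the flag
    while (!applicationReady) {
        Thread.sleep(100);
    }
}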


Spring Boot: start listening to messages on application start

I have a Spring Boot application that starts listening to Azure IoT Hub at application start. It is done this way:

@EventListener
public void subscribeEventMessages(ContextRefreshedEvent event) {
    client
        .receive(false) // set this to false to read only the newly available events
        .subscribe(this::hubAllEventsCallback);
}
My problem is that this uses ContextRefreshedEvent, but in fact I only want to start it once, on application start.
I also looked at other ways to start something at application start, like CommandLineRunner.
On the other hand, when implementing listeners for more standard stuff like JMS, there are specific annotations like @JmsListener, or you provide beans of specific types.
My question is: can I leverage some of these more message(subscribe)-related mechanisms to start my method?
If we don't want our @EventListener to listen on "context refresh" but only on "context start", try replacing ContextRefreshedEvent with ContextStartedEvent, its sibling class with exactly this semantic difference.
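A minimal sketch of that swap, reusing the method from the question. One caveat worth hedging: to my knowledge, ContextStartedEvent is only published when ConfigurableApplicationContext.start() is called explicitly, which a plain Spring Boot run does not do by default.

@EventListener
public void subscribeEventMessages(ContextStartedEvent event) {
    // Fires once when the context is started, not on every refresh
    client
        .receive(false)
        .subscribe(this::hubAllEventsCallback);
}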

Integration Test a Reactive Spring Cloud Stream

TLDR; How do you test a Reactive Function composition using the Test Binder?
I have a Spring Cloud Stream app that uses reactive functions, and I don't know how to test it. I don't see any official docs on how to do an integration test from the input source to the output destination binder.
In my specific case, I am connecting a Spring Integration flow using a reactive Supplier and the IntegrationReactiveUtils.messageChannelToFlux() pattern. This works in a development environment: I can pull messages from RabbitMQ using the Spring Integration flow and they enter the SCSt app.
My SCSt app has several functions chained together, each one reactive. They are composed like func1|func2|func3. I verified this works with a dev Rabbit (source) and Kafka (destination).
I can't seem to figure out how to test this, and there doesn't seem to be any official documentation on testing a complete reactive stream. Right now I have code that roughly looks like this:
@Autowired
MessageChannel inputChannel;

@Autowired
private OutputDestination output;

@Test
void myTest() {
    // omitted: prep of var 'messageToSend'
    this.inputChannel.send(messageToSend);
    var outputMessage = output.receive(5000);
    Assertions.assertNotNull(outputMessage.getPayload());
}
The error I receive is that output.receive(5000) returns null. I suspect a threading issue because I am not subscribing to the Flux and waiting for completion.
I have run a debugger in the Flux functions and see the message going all the way to the end with no errors or weirdness.
I figured this out, actually. I had to specify the binding name. I had a test property spring.cloud.stream.bindings.processingStream set, which I thought created two new bindings (processingStream-in-0 and processingStream-out-0).
It turns out I had to pass the binding name in the test code, without the -out-0 suffix, as sketched below. I can now receive messages from the stream.
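For reference, a minimal version of the fixed test (the binding name "processingStream" comes from the property mentioned above):

@Test
void myTest() {
    // omitted: prep of var 'messageToSend'
    this.inputChannel.send(messageToSend);
    // Pass the binding name, without the -out-0 suffix
    var outputMessage = output.receive(5000, "processingStream");
    Assertions.assertNotNull(outputMessage.getPayload());
}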

Need some guidance with Spring Integration Flow

I am new to Spring Integration and have read quite a lot of documentation and other topics here on StackOverflow. But I am still a bit overwhelmed about how to apply the newly acquired knowledge in a Spring Boot application.
This is what should happen:
1. receive a message from a Kafka topic, e.g. from "request-topic" (payload is a custom Job POJO). InboundChannelAdapter?
2. do some preparation (checkout from a git repo)
3. process files using a batch job
4. commit & push to git, update the Job object with the commit-id
5. publish a message to Kafka with the updated Job object, e.g. to "reply-topic". OutboundChannelAdapter?
Using the DSL or plain Java configuration does not matter. My problem, after trying several variants, is that I could not achieve the desired result: for example, handlers would be called too early, or not at all, and thus the reply in step 5 would not be updated.
Also, there should only be one flow running at any given time, so I guess a queue should be involved at some point, probably at step 1(?).
Where and when should I use QueueChannels, DirectChannel (or any other), and do I need GatewayHandlers, e.g. to reply with a commit-id?
Any hints are appreciated.
Something like this:
@Bean
IntegrationFlow flow() {
    return IntegrationFlows.from(Kafka.inboundGateway(...))
            .handle(...)      // prep
            .transform(...)   // to JobLaunchRequest
            .handle(...)      // JobLaunchingGateway
            .handle(...)      // clean up and return result
            .get();
}
It will only process one request at a time (with default concurrency).
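For illustration, a somewhat fuller sketch of the same outline. The container and template beans, and the helper methods checkOut, toJobLaunchRequest and commitAndPush, are assumptions made up for this sketch, not part of the original answer:

@Bean
IntegrationFlow jobFlow(KafkaMessageListenerContainer<String, Job> container,
                        KafkaTemplate<String, Job> replyTemplate,
                        JobLauncher jobLauncher) {
    return IntegrationFlows
            .from(Kafka.inboundGateway(container, replyTemplate))      // step 1: consume from "request-topic", reply later
            .handle(Job.class, (job, headers) -> checkOut(job))        // step 2: git checkout
            .transform(this::toJobLaunchRequest)                       // step 3: wrap into a JobLaunchRequest
            .handle(new JobLaunchingGateway(jobLauncher))              // step 3: run the batch job
            .handle(JobExecution.class,
                    (execution, headers) -> commitAndPush(execution))  // steps 4-5: commit, push, reply with commit-id
            .get();
}

Because the flow starts from an inbound gateway, the return value of the last handler becomes the reply message, and with default (single-threaded) container concurrency only one request is processed at a time.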

Verify/Test that Acknowledgment was called in Spring Boot Kafka

I've written integration tests for my Spring Boot Kafka (consumer/producer) service and everything went well. I'm committing my consumer's offsets manually after some processing.
I want to verify whether acknowledgment.acknowledge() was called in the consumer. Is it possible to verify this?
Here is the method signature of the service:
@KafkaListener(topics = {TOPIC_XXX_V1}, containerFactory = "XXXListener")
private void consumer(@Payload XXXXRequestEvent xxxxRequestEvent, Acknowledgment acknowledgment) {
    // ... do something with the database
    acknowledgment.acknowledge();
}
On the testing side I'm using @SpyBean for the service and a @MockBean for the database interaction. I want to somehow verify in the test case that .acknowledge() was called. FYI: .acknowledge() is a public abstract void method.
As the Acknowledgment instance is injected and created by Spring Kafka when a message is consumed, I guess there is no way to use something like Mockito's verify() for this.
When writing a unit test instead, you could pass a mocked Acknowledgment and then verify that the method was invoked. However, with a unit test you can't test the actual consumption of a message (serialization, correct message handler, etc.).
So in your case, I would try to verify that the message was acknowledged, e.g. by using Testcontainers to execute commands inside the Kafka container and ensuring that the already acknowledged message is not returned any more.
Another approach could be to create a Kafka client as part of your test and then try to consume messages from the same topic for X seconds, expecting zero results. Awaitility might help you here.
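A minimal sketch of that last approach, using a plain Kafka consumer as the probe. The topic name, group id and bootstrapServers are illustrative assumptions; the group id must match the service's consumer group for the committed offset to matter:

Map<String, Object> props = new HashMap<>();
props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
props.put(ConsumerConfig.GROUP_ID_CONFIG, "same-group-as-the-service");
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

try (KafkaConsumer<String, String> probe = new KafkaConsumer<>(props)) {
    probe.subscribe(List.of("topic-xxx-v1"));
    ConsumerRecords<String, String> records = probe.poll(Duration.ofSeconds(5));
    // If acknowledge() committed the offset, the processed message is not redelivered
    assertTrue(records.isEmpty());
}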

Spring Batch Step Integration Testing

I'm looking for some general opinions and advice on testing a Spring Batch step and its step execution.
My basic step reads from an API, processes the data into an entity object and then writes to a DB. I have tested the happy path, where the step completes successfully. What I now want to do is test the exception handling when data is missing at the processor stage. I could test the processor class in isolation, but I'd rather test the step as a whole to ensure the processing failure is reflected correctly at step/job level.
I've read the Spring Batch testing guidelines and, if I'm honest, I'm slightly lost in them. Is it possible to use StepScopeTestUtils.doInStepScope or to update the StepExecution to test this scenario? Ideally I'd force the reader to return faulty data before the processor kicks in.
Any advice would be greatly appreciated.
The best approach depends on the scope of your test. Reading a little between the lines here, I assume you are running a Spring integration test, setting up a Spring context and using JobLauncherTestUtils to start a job or a step.
I think the easiest way is to replace one of your beans with a mock that triggers the error scenario. Using Mockito, this can be done by adding something like this to your test configuration:

@Bean
public ReaderDataRepository dataApi() {
    return mock(ReaderDataRepository.class);
}

This bean then overrides the actual implementation. In the test setup you can then configure the mock very explicitly:
@Autowired
private ReaderDataRepository mockedRepository;

@Before
public void setUp() {
    when(mockedRepository.getData()).thenReturn(faultyData());
}
This involves very little manipulation of Spring 'magic' and very explicitly defines the error from within the test.
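To close the loop, a minimal sketch of asserting the failure at step level, assuming a JobLauncherTestUtils bean and a step named "myStep" (both names are illustrative):

@Autowired
private JobLauncherTestUtils jobLauncherTestUtils;

@Test
public void stepFailsWhenReaderReturnsFaultyData() {
    // The mocked repository feeds faulty data, so the processor should fail the step
    JobExecution jobExecution = jobLauncherTestUtils.launchStep("myStep");
    assertEquals(ExitStatus.FAILED.getExitCode(), jobExecution.getExitStatus().getExitCode());
}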
