Mocking SchemaRegistryClient in stream processor Consumer - spring-boot

I have a Reactor-based Spring Boot Kafka stream processing app that I am writing integration tests for. I am using Spring's @EmbeddedKafka broker. It works great: I have it overriding the bootstrap broker URLs that get configured on my reactive processor's consumer and publisher. What I haven't figured out yet is how to deal with the schema registry for my processor when testing.

I'm using Confluent's KafkaAvroSerializer and KafkaAvroDeserializer classes and just have the schema.registry.url field configured in my Spring app configs to get injected into the Kafka properties. I'm using Confluent's MockSchemaRegistryClient for the test producer and consumer, but what I need is a way to inject this mock client into the actual consumer and producer in my stream processor code, and I see no way to do that. It almost seems like I need something more like an embedded version of the schema registry to point them to, like the embedded broker. Our build pipeline does not support spinning up containers, otherwise I'd use Docker or Testcontainers.

Has anyone else solved this already? Any help or suggestions appreciated.

I managed to figure this out. If you use a URL that begins with mock:// for your test's SerDes, and you override the schema.registry.url property in the @SpringBootTest annotation with the same mock URL, then your processor's consumer and producer will also pick up and use this mock schema registry client, and everything just works.
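To make that concrete, here is a minimal sketch of the test setup, assuming Spring Boot's standard Kafka properties; the topic names, the mock://test-registry scope, and the exact property keys are illustrative. The important part is that the processor's schema.registry.url override and the test SerDes share the same mock:// URL, so they resolve to the same in-memory MockSchemaRegistryClient.

import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

@EmbeddedKafka(topics = { "input-topic", "output-topic" })
@SpringBootTest(properties = {
        // point the processor at the embedded broker
        "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}",
        // the mock:// scheme makes Confluent SerDes use an in-memory MockSchemaRegistryClient
        "schema.registry.url=mock://test-registry"
})
class StreamProcessorIntegrationTest {

    // Test producer configured with the same mock:// URL, so it shares the
    // in-memory schema registry scope with the processor's consumer/producer.
    private KafkaProducer<String, Object> buildTestProducer(String brokers) {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "mock://test-registry");
        return new KafkaProducer<>(props);
    }
}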

Related

Difference between usage of @RabbitListener and usage of DirectMessageListenerContainer in consuming from RabbitMQ in Spring Boot

I am trying to configure concurrent consumers in Spring to consume messages from RabbitMQ. To achieve that, I have configured consumers in two ways:
1. annotating a method with @RabbitListener(queues = "name of queue")
2. implementing the MessageListener interface and overriding onMessage(Message message)
In my case both ways work fine, but I am unable to figure out what the advantage/disadvantage is of using @RabbitListener to start a consumer over the other way.
In addition, I have configured a DirectMessageListenerContainer in my configuration and mapped it to my MessageListener implementation to achieve concurrent consumers. My question is: can we do the same mapping for a consumer implemented through @RabbitListener, and if so, how? I couldn't find any source on how a consumer started with a @RabbitListener-annotated method can be configured with a DirectMessageListenerContainer.
Any help is appreciated.
@RabbitListener is simply a higher-level abstraction. It uses the listener container underneath.
When using Spring Boot, use the ...listener.type application property to specify which type of container you want.
The default is simple.
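A minimal sketch, assuming Spring Boot's spring.rabbitmq.listener.* properties (the queue name and concurrency value are illustrative): with spring.rabbitmq.listener.type=direct (plus, e.g., spring.rabbitmq.listener.direct.consumers-per-queue=5) in application.properties, the auto-configured factory builds a DirectMessageListenerContainer for every @RabbitListener method, so no manual container wiring is needed.

import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;

// With spring.rabbitmq.listener.type=direct set, Boot's auto-configured listener
// container factory creates a DirectMessageListenerContainer for this listener;
// concurrency comes from the listener.direct.* properties, not from this class.
@Component
public class OrderListener {

    @RabbitListener(queues = "orders")
    public void onOrder(String payload) {
        // concurrent consumption is handled by the container, not by this method
        System.out.println("Received: " + payload);
    }
}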

JMS configuration for Spring Integration

I am trying to implement ActiveMQ (I just want to receive messages) with Spring Integration. I can't find any clues on how to provide Java configuration for ActiveMQ. What are the minimum required components for the job? Somewhere we have a channel and an adapter, somewhere we don't. I am unable to understand the Spring Integration concepts of adapter, channel and service activator; they all feel the same to me. I find the Integration documentation going over my head. I never had problems understanding other Spring modules (Boot, MVC, Cloud, Batch). Can someone point me in the right direction, or tell me what I am doing wrong?
You are probably missing the fact that Spring Integration is a reference implementation of the well-known Enterprise Integration Patterns. So, please consider starting from the theory and ideas; then you can come back to Spring Integration as an API for those EIP. See the respective book on the matter: https://www.enterpriseintegrationpatterns.com.
To read messages from a JMS destination you need to use a JmsMessageDrivenEndpoint with the respective ConnectionFactory injected.
There is nothing more to it than an ActiveMQConnectionFactory as a bean.
For example in tests we do like this:
new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false")
And an in-memory broker is started.
See a test class with Java DSL for some way how to configure JMS components: https://github.com/spring-projects/spring-integration/blob/master/spring-integration-jms/src/test/java/org/springframework/integration/jms/dsl/JmsTests.java
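As a rough sketch, assuming Spring Integration 5.x with ActiveMQ 5 (javax.jms), the Java DSL version of a message-driven inbound adapter plus a service activator looks roughly like this; the queue name and the logging handler are illustrative.

import javax.jms.ConnectionFactory;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jms.dsl.Jms;

@Configuration
public class JmsInboundConfig {

    // In-memory broker for tests; point this at your real broker URL in production.
    @Bean
    public ConnectionFactory connectionFactory() {
        return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
    }

    @Bean
    public IntegrationFlow jmsInboundFlow(ConnectionFactory connectionFactory) {
        return IntegrationFlows
                // inbound channel adapter: listens to the JMS destination
                .from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("myQueue"))
                // service activator: the code that processes each message
                .handle(message -> System.out.println("Received: " + message.getPayload()))
                .get();
    }
}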

Get underlying low-level Kafka consumers and Producers in Spring Cloud Stream

I have a use case where I want to get the underlying Kafka producer (KafkaTemplate) in a Spring Cloud Stream application. While navigating the code I stumbled upon KafkaProducerMessageHandler, which has a getKafkaTemplate method. However, it fails to auto-wire.
Also, if I directly auto-wire KafkaTemplate, the template is initialized with default properties and it ignores the broker in the binder key of the SCSt configuration.
How can I access the underlying KafkaTemplate or a producer/consumer in a Spring Cloud Stream app?
EDIT: Actually my SCSt app has multiple Kafka binders and I want to get the KafkaTemplate or Kafka producer corresponding to each binder. Is that possible somehow?
It's not entirely clear why you would need to do that, but you can capture the KafkaTemplates by adding a ProducerMessageHandlerCustomizer @Bean to the application context.
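For instance, something along these lines; the package of ProducerMessageHandlerCustomizer and its two-argument configure(handler, destinationName) contract are assumptions based on the Spring Cloud Stream version I have used, and the map-based lookup is just one way to keep the templates around. With multiple binders the customizer runs once per output binding, so you end up with the template behind each one.

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.cloud.stream.config.ProducerMessageHandlerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;

@Configuration
public class KafkaTemplateCaptureConfig {

    // destination name -> the KafkaTemplate the binder created for that binding
    private final Map<String, KafkaTemplate<?, ?>> templates = new ConcurrentHashMap<>();

    @Bean
    public ProducerMessageHandlerCustomizer<KafkaProducerMessageHandler<?, ?>> templateCapturingCustomizer() {
        return (handler, destinationName) ->
                templates.put(destinationName, handler.getKafkaTemplate());
    }

    public KafkaTemplate<?, ?> templateFor(String destinationName) {
        return templates.get(destinationName);
    }
}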

Spring Integration between two message brokers

I am new to Spring-Integration.
My use case is:
Listen to a RabbitMQ queue/topic, get the message, process it, and send it to another message broker (mostly it will be another RabbitMQ instance).
Expected load: 5000 messages/sec
In application.properties we can set configurations for one host.
How to use Spring Integration between two message brokers?
All the examples that I see are for one message broker. Any pointers on getting started with two message brokers and Spring Integration would be appreciated.
Regards,
Mahesh
Since you mention an application.properties, it sounds like you use Spring Boot with its auto-configuration feature. That is a very important detail in your question, because Spring Boot is opinionated about auto-configuration and you really can have only one broker connection auto-configured. If you would like to have another, similar connection in the same application, then you should forget that auto-configuration feature. You can still use the mentioned application.properties, but you have to manage the beans manually.
Since you are talking about a RabbitMQ connection, you need to exclude RabbitAutoConfiguration and manage all the required beans manually:
@SpringBootApplication(exclude = RabbitAutoConfiguration.class)
You can still use @EnableConfigurationProperties(RabbitProperties.class) on one of your @Configuration classes to be able to inject that RabbitProperties and populate the respective CachingConnectionFactory. For the second broker you can introduce your own @ConfigurationProperties or just configure everything manually, reading properties via @Value. See more about manual connection factory configuration in the Spring AMQP reference manual: https://docs.spring.io/spring-amqp/docs/2.2.1.RELEASE/reference/html/#connections
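A bare-bones sketch of that manual setup; the second.rabbitmq.* property names are made up for the example, and the first factory simply reuses Boot's RabbitProperties even though the auto-configuration is excluded.

import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.amqp.RabbitProperties;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
@EnableConfigurationProperties(RabbitProperties.class)
public class TwoBrokersConfig {

    // First broker: still driven by the standard spring.rabbitmq.* properties.
    @Bean
    @Primary
    public CachingConnectionFactory firstConnectionFactory(RabbitProperties properties) {
        CachingConnectionFactory factory =
                new CachingConnectionFactory(properties.determineHost(), properties.determinePort());
        factory.setUsername(properties.determineUsername());
        factory.setPassword(properties.determinePassword());
        return factory;
    }

    // Second broker: properties managed by hand (second.rabbitmq.* is illustrative).
    @Bean
    public CachingConnectionFactory secondConnectionFactory(
            @Value("${second.rabbitmq.host}") String host,
            @Value("${second.rabbitmq.port}") int port) {
        return new CachingConnectionFactory(host, port);
    }
}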

Spring Cloud Contract and plain Spring AMQP

We are using plain Spring AMQP in our spring boot projects.
We want to make sure that our message consumers can test against real messages and avoid testing against static test messages.
Thus our producers could generate message snippets in a test phase that can be picked up by the consumer tests, to make sure they test against the latest message version and to see whether changes in the producer break the consumer.
It seems like Spring Cloud Contract does exactly that. So is there a way to integrate spring cloud contract with spring amqp? Any hints in which direction to go would be highly appreciated.
Actually we don't support it out of the box, but you can set it up yourself. In the autogenerated tests we're using an interface to receive and send messages, so you could implement your own class that uses spring-amqp. The same goes for the consumer side (the stub runner). What you would need to do is implement and register a bean of type org.springframework.cloud.contract.verifier.messaging.MessageVerifier for both the producer and the consumer. This should work because what we're doing in the autogenerated tests is @Inject a MessageVerifier, so if you register your own bean it will work.
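For example, a rough sketch of such a bean on top of a RabbitTemplate could look like the following. The exact methods on MessageVerifier differ between Spring Cloud Contract versions, so the send/receive signatures, the naive payload serialization, and the default timeout here are assumptions rather than the library's definitive contract.

import java.util.Map;
import java.util.concurrent.TimeUnit;

import org.springframework.amqp.core.Message;
import org.springframework.amqp.core.MessageBuilder;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.cloud.contract.verifier.messaging.MessageVerifier;
import org.springframework.stereotype.Component;

// Sends and receives contract messages through Spring AMQP instead of a binder.
@Component
public class AmqpMessageVerifier implements MessageVerifier<Message> {

    private final RabbitTemplate rabbitTemplate;

    public AmqpMessageVerifier(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    @Override
    public void send(Message message, String destination) {
        // destination is treated as a routing key on the default exchange
        this.rabbitTemplate.send(destination, message);
    }

    @Override
    public <T> void send(T payload, Map<String, Object> headers, String destination) {
        // naive serialization, good enough for a sketch
        MessageBuilder builder = MessageBuilder.withBody(payload.toString().getBytes());
        headers.forEach(builder::setHeader);
        this.rabbitTemplate.send(destination, builder.build());
    }

    @Override
    public Message receive(String destination, long timeout, TimeUnit timeUnit) {
        // destination is treated as a queue name here
        return this.rabbitTemplate.receive(destination, timeUnit.toMillis(timeout));
    }

    @Override
    public Message receive(String destination) {
        return receive(destination, 5, TimeUnit.SECONDS);
    }
}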
UPDATE:
As @Mathias mentioned, AMQP support is already there in Spring Cloud Contract: https://cloud.spring.io/spring-cloud-contract/spring-cloud-contract.html#_stub_runner_spring_amqp
