Overridden onMessage of MessageListener not getting called in Spring Kafka Consumer Unit Test - spring-boot

I am writing a Kafka consumer unit test and need to mock the service used by my KafkaConsumer so I can test the consumer independently. However, the mock object of the service is never invoked; instead, Spring creates the original service bean and calls it, so my mock is never used.
KafkaConsumer:
@Slf4j
@Component
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
public class KafkaEventConsumer {

    private final MyService requestService;

    @KafkaListener(topics = "${kafka.topic:topic-name}")
    public void receive(@Payload String message) throws Exception {
        try {
            log.debug("Received message:{} ", message);
            ObjectMapper mapper = new ObjectMapper();
            ForecastRequest forecastRequest = mapper.readValue(message, ForecastRequest.class);
            JobDetail jobDetail = requestService.refreshForecasts(forecastRequest);
            if (jobDetail.getJobStatus() != JobStatus.complete) {
                log.error("Failed to Refresh Forecast for ProgramId-{}, JobId-{}, JobStatus-{}",
                        forecastRequest.getProgramId(), jobDetail.getJobId(), jobDetail.getJobStatus());
                throw new Exception("Internal Server Error");
            }
        } catch (Exception e) {
            log.error("Failed to Refresh Forecast for Forecast Request {}", message, e);
            throw e;
        }
    }
}
Kafka Consumer Test:
@RunWith(SpringRunner.class)
@ActiveProfiles("kafkatest")
@SpringBootTest(classes = ForecastEventConsumerApplication.class)
@DirtiesContext
public class KafkaEventConsumerTest {

    private static final String TOPIC = "topic-name";

    @Mock
    private MyServiceImpl myServiceMock;

    @InjectMocks
    private KafkaEventConsumer kafkaEventConsumer;

    private KafkaTemplate<String, String> template;

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @ClassRule
    public static final KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, 3, TOPIC);

    @Before
    public void setUp() throws Exception {
        kafkaEventConsumer = new KafkaEventConsumer(myServiceMock);
        // set up the Kafka producer properties
        Map<String, Object> senderProperties = KafkaTestUtils.senderProps(embeddedKafka.getBrokersAsString());
        // create a Kafka producer factory
        ProducerFactory<String, String> producerFactory = new DefaultKafkaProducerFactory<String, String>(senderProperties);
        // create a Kafka template
        template = new KafkaTemplate<>(producerFactory);
        // set the default topic to send to
        template.setDefaultTopic(TOPIC);
        // wait until the partitions are assigned
        for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
            messageListenerContainer.setupMessageListener(new MessageListener<String, String>() {
                @Override
                public void onMessage(ConsumerRecord<String, String> record) {
                    try {
                        kafkaEventConsumer.receive(record.value());
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
            ContainerTestUtils.waitForAssignment(messageListenerContainer, embeddedKafka.getPartitionsPerTopic());
        }
    }

    @AfterClass
    public static void tearDown() throws Exception {
        embeddedKafka.destroy();
    }

    @Test
    public void testReceive() throws Exception {
        String forecastRequestMessage = "{\"programId\":100011770}";
        ForecastRequest forecastRequest = ForecastRequest.builder().programId(100011770L).build();
        JobDetail jobDetail = JobDetail.builder().jobStatus(JobStatus.complete).build();
        Mockito.when(myServiceMock.refreshForecasts(Matchers.any())).thenReturn(jobDetail);
        template.sendDefault(forecastRequestMessage);
        Thread.sleep(2000L);
        // validate something
    }
}
The problem is that in the above @Test method, instead of calling the mocked version of MyService, the original MyService implementation is called. Also, while debugging I found that the overridden onMessage() is never called either. Please help me find what I am doing wrong here.

You have to stop() all the MessageListenerContainers before calling their setupMessageListener(). Then you need to start() them again so that they pick up the fresh listener:
protected void doStart() {
    ...
    Object messageListener = containerProperties.getMessageListener();
    Assert.state(messageListener != null, "A MessageListener is required");
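For example, the loop in your @Before could look like this (a sketch based on the code from the question):
for (MessageListenerContainer container : kafkaListenerEndpointRegistry.getListenerContainers()) {
    container.stop();   // stop before replacing the listener
    container.setupMessageListener(new MessageListener<String, String>() {
        @Override
        public void onMessage(ConsumerRecord<String, String> record) {
            try {
                kafkaEventConsumer.receive(record.value());   // delegate to the manually created consumer
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    });
    container.start();  // restart so the container picks up the new listener
    ContainerTestUtils.waitForAssignment(container, embeddedKafka.getPartitionsPerTopic());
}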
Anyway, it sounds like you really only want to mock the MyService that is injected into the real KafkaEventConsumer. So, consider using this instead:
@MockBean
private MyServiceImpl myServiceMock;
Then you won't need to do anything in your @Before, and there is no need for @InjectMocks.
The KafkaEmbedded broker can expose its host/port (or broker list) via the conventional Spring Boot configuration properties like this:
@BeforeClass
public static void setup() {
    System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
}
https://docs.spring.io/spring-boot/docs/2.0.0.RELEASE/reference/htmlsingle/#boot-features-testing-spring-boot-applications-mocking-beans
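Putting that together, a minimal sketch of the test could look like the following (assuming the same ForecastEventConsumerApplication, topic, and DTOs from the question; the exact KafkaEmbedded API depends on your spring-kafka version):
@RunWith(SpringRunner.class)
@ActiveProfiles("kafkatest")
@SpringBootTest(classes = ForecastEventConsumerApplication.class)
@DirtiesContext
public class KafkaEventConsumerTest {

    private static final String TOPIC = "topic-name";

    @ClassRule
    public static final KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, 3, TOPIC);

    // replaces the real MyService bean, so Spring injects this mock into the real KafkaEventConsumer
    @MockBean
    private MyServiceImpl myServiceMock;

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    private KafkaTemplate<String, String> template;

    @BeforeClass
    public static void setupBrokers() {
        // point the auto-configured consumer at the embedded broker
        System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    }

    @Before
    public void setUp() throws Exception {
        Map<String, Object> senderProps = KafkaTestUtils.senderProps(embeddedKafka.getBrokersAsString());
        template = new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(senderProps));
        template.setDefaultTopic(TOPIC);
        // wait until the real listener container gets its partitions
        for (MessageListenerContainer container : kafkaListenerEndpointRegistry.getListenerContainers()) {
            ContainerTestUtils.waitForAssignment(container, embeddedKafka.getPartitionsPerTopic());
        }
    }

    @Test
    public void testReceive() throws Exception {
        JobDetail jobDetail = JobDetail.builder().jobStatus(JobStatus.complete).build();
        Mockito.when(myServiceMock.refreshForecasts(Matchers.any())).thenReturn(jobDetail);

        template.sendDefault("{\"programId\":100011770}");

        // the listener runs on a container thread, so verify with a timeout instead of sleeping
        Mockito.verify(myServiceMock, Mockito.timeout(5000)).refreshForecasts(Matchers.any());
    }
}
Because the @KafkaListener registered by Spring now delegates to the mock, there is no need to replace the message listener in the containers at all.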

Related

Why are my MockBeans and MockRestServiceServer not returning proper responses when testing JMS Listener in Spring Boot

I am having an issue when trying to integration-test my JMS listener with Mockito and MockRestServiceServer. Even though I'm setting up the Mockito.when stubs correctly, they return null, and the MockRestServiceServer acts as if it isn't being called. If I instead test against the myService component that the JMS listener calls, the mocks and the MockRestServiceServer calls work as expected, which is puzzling. I am connecting to an embedded ActiveMQ broker for the test, and I am using Spring Boot 2.2.8.RELEASE and JDK 8.x, if that helps.
Here is the JMS Listener Class
@Component
public class MyJmsListener {

    @Autowired
    private MyService myService;

    @JmsListener(
        destination = "${jms.queue}",
        containerFactory = "myJmsListenerContainerFactory"
    )
    public void receive(Message<String> message) {
        myService.process(message);
    }
}
Here is the JMS Listener Test Class
@RunWith(SpringRunner.class)
@SpringBootTest
@ActiveProfiles("test")
public class JmsListenerTest {
    ...
    @MockBean
    private AuthorizationService authorizationService;
    ...
    @Autowired
    private MockRestServiceServer mockRestServiceServer;

    @Autowired
    private JmsTemplate listenerTestJmsTemplate;

    @Value("${jms.queue}")
    private String testDestination;
    ...
    @Test
    public void testListener() throws IOException, URISyntaxException, InterruptedException {
        //ARRANGE
        String payloadPath = "classpath:payloads/listenerPayload.json";
        String payload = new String(Files.readAllBytes(ResourceUtils.getFile(payloadPath).toPath()));
        String testAuth = "auth";
        Mockito.when(authorizationService.generateTicket(Mockito.any(Headers.class), Mockito.eq("9130353887051456")))
                .thenReturn(testAuth);
        String extPayloadPath = "classpath:payloads/revokeCancelAutoRenewRequestApi.json";
        String extPayload = new String(Files.readAllBytes(ResourceUtils.getFile(extPayloadPath).toPath()));
        mockRestServiceServer.expect(ExpectedCount.once(), MockRestRequestMatchers.requestTo(new URI("/test/v3/subscriptions/400367048/something")))
                .andExpect(MockRestRequestMatchers.content().string(extPayload))
                .andExpect(MockRestRequestMatchers.header(HttpHeaders.AUTHORIZATION, testAuth))
                .andRespond(MockRestResponseCreators.withStatus(HttpStatus.OK));
        //ACT
        listenerTestJmsTemplate.convertAndSend(testDestination, payload);
        //ASSERT
        mockRestServiceServer.verify();
        Assert.assertTrue(JmsListenerWrapperConfiguration.latch.await(5, TimeUnit.SECONDS));
    }
    ...
}
I have a JmsListenerWrapperConfiguration that lets me wrap a countdown latch around the JMS listener.
@Configuration
@Profile("test")
public class JmsListenerWrapperConfiguration {

    public static final CountDownLatch latch = new CountDownLatch(1);

    @Bean
    public JmsTemplate listenerTestJmsTemplate(ActiveMQConnectionFactory activeMQConnectionFactory) {
        JmsTemplate jmsTemplate = new JmsTemplate(activeMQConnectionFactory);
        return jmsTemplate;
    }

    /**
     * Wrap the JMS listeners with a countdown latch that allows us to unit test them.
     * @return The bean post processor that will wrap the JMS listener.
     */
    @Bean
    public static BeanPostProcessor listenerWrapper() {
        return new BeanPostProcessor() {
            @Override
            public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
                if (bean instanceof MyJmsListener) {
                    MethodInterceptor interceptor = new MethodInterceptor() {
                        @Override
                        public Object invoke(MethodInvocation invocation) throws Throwable {
                            Object result = invocation.proceed();
                            if (invocation.getMethod().getName().equals("receive")) {
                                latch.countDown();
                            }
                            return result;
                        }
                    };
                    if (AopUtils.isAopProxy(bean)) {
                        ((Advised) bean).addAdvice(interceptor);
                        return bean;
                    } else {
                        ProxyFactory proxyFactory = new ProxyFactory(bean);
                        proxyFactory.addAdvice(interceptor);
                        return proxyFactory.getProxy();
                    }
                } else {
                    return bean;
                }
            }
        };
    }
}
The MockRestServiceServer configuration is defined here.
@Configuration
@Profile("test")
public class MockRestServiceServerConfiguration {

    @Bean
    public MockRestServiceServer mockRestServiceServer(RestTemplate restTemplate) {
        MockRestServiceServerBuilder builder = MockRestServiceServer.bindTo(restTemplate);
        MockRestServiceServer server = builder.bufferContent().build();
        return server;
    }
}
The error that I see is as follows.
java.lang.AssertionError: Further request(s) expected leaving 1 unsatisfied expectation(s).
0 request(s) executed.
at org.springframework.test.web.client.AbstractRequestExpectationManager.verify(AbstractRequestExpectationManager.java:159)
at org.springframework.test.web.client.MockRestServiceServer.verify(MockRestServiceServer.java:116)
Update
I've been debugging, and of course the test runs on thread [main] while the JMS listener runs on thread [DefaultMessageListenerContainer-1]. So my question becomes: what should we do with Mockito mocks when the stubs/verifications need to be used by separate threads?
It turns out that MockRestServiceServer needs to verify only after the latch has been awaited, as shown in the code below.
Assert.assertTrue(JmsListenerWrapperConfiguration.latch.await(5, TimeUnit.SECONDS));
mockRestServiceServer.verify();

How to write JUnit test case for JMS client in Spring Boot

I have to write unit tests for my producer and consumer, which use ActiveMQ Artemis. I'm using the publish-subscribe model. How do I write the test cases for both?
The following is my producer code:
@Component
public class OrderPublisher {

    @Autowired
    JmsTemplate jmsTemplate;

    private static final Logger log = LoggerFactory.getLogger(OrderPublisher.class);

    @Value("${jsa.activemq.topic}")
    String orderTopic;

    public void sendOrderData(String orderData) {
        log.info("sending message to queue");
        jmsTemplate.convertAndSend(orderTopic, orderData);
    }
}
These are my application properties for the producer:
spring.jms.pub-sub-domain=true
jsa.activemq.topic=bt-order-queue
spring.artemis.mode=native
spring.artemis.host=localhost
spring.artemis.port=61616
spring.artemis.user=admin
spring.artemis.password=admin
This is my consumer code, which is in a separate project.
@Component
public class OrderSubscriber {

    private final RestTemplate restTemplate = new RestTemplate();
    ObjectMapper objectMapper = new ObjectMapper();
    private static final Logger log = LoggerFactory.getLogger(OrderSubscriber.class);

    @JmsListener(destination = "${jsa.activemq.topic}")
    public void receiveMessage(String order) throws JsonMappingException, JsonProcessingException {
        System.out.println("order data from queue::" + order);
    }
}
Properties file for consumer:
spring.artemis.mode=native
spring.artemis.host=localhost
spring.artemis.port=61616
spring.artemis.user=admin
spring.artemis.password=admin
spring.jms.pub-sub-domain=true
jsa.activemq.topic=bt-order-queue
This is what I'm trying for the producer:
@RunWith(SpringRunner.class)
@SpringBootTest
class ApplicationTests {

    @Autowired
    OrderPublisher publisher;

    @Rule
    public EmbeddedActiveMQResource resource = new EmbeddedActiveMQResource();
    //ActiveMQProducerResource producer = new ActiveMQProducerResource(resource.getVmURL(), "bt-order-queue");
    //ActiveMQConsumerResource consumer = new ActiveMQConsumerResource(resource.getVmURL(), "bt-order-queue");

    @Before
    public void before() {
        resource.start();
        resource.createQueue("bt-order-queue");
    }

    @Test
    void contextLoads() {
        publisher.sendOrderData("test");
        ClientMessage cons = resource.receiveMessage("bt-order-queue", 100000);
        System.out.println("cons::" + cons.toString());
    }
}
But I'm lacking information on how to write this, so I'm just trying to figure out a way.
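For the producer on its own, one option is a plain Mockito unit test that verifies the call to JmsTemplate without any broker at all. A sketch (the topic name is taken from the question's properties; the field name "orderTopic" comes from the producer class above):
@RunWith(MockitoJUnitRunner.class)
public class OrderPublisherTest {

    @Mock
    private JmsTemplate jmsTemplate;

    @InjectMocks
    private OrderPublisher publisher;

    @Test
    public void sendsOrderDataToConfiguredTopic() {
        // the topic is normally injected from ${jsa.activemq.topic}; set the field directly here
        ReflectionTestUtils.setField(publisher, "orderTopic", "bt-order-queue");

        publisher.sendOrderData("test");

        Mockito.verify(jmsTemplate).convertAndSend("bt-order-queue", "test");
    }
}
Testing the consumer end to end still needs a broker; the EmbeddedActiveMQResource from artemis-junit (as in the attempt above) or an embedded broker started by Spring Boot can fill that role, but the spring.artemis.* properties in the test profile must then point at that embedded broker rather than at localhost:61616.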

Explicitly Start Kafka Consumer In Main After method runs

I have a Spring Boot service that consumes from a Kafka topic. When I consume, I perform certain tasks on the Kafka message. Before I can perform these operations, I need to wait for the service to load some data into caches that I have set up. My issue is that if I set the Kafka consumer to auto-start, it starts consuming before the cache loads and errors out.
I am trying to explicitly start the consumer after I load the cache, but I get NullPointerExceptions.
@Configuration
public class KafkaConfig {

    @Value("${kafka.server}")
    String server;

    @Value("${kafka.port}")
    String port;

    @Value("${kafka.group.id}")
    String groupid;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, server + ":" + port);
        config.put(ConsumerConfig.GROUP_ID_CONFIG, groupid);
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // config.put("security.protocol", "SASL_PLAINTEXT");
        // config.put("sasl.kerberos.service.name", "kafka");
        return new DefaultKafkaConsumerFactory<>(config);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setAutoStartup(false);
        return factory;
    }
}
KafkaListener
@Service
public class KafkaConsumer {

    @Autowired
    AggregationService aggregationService;

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    private final CounterService counterService;

    public KafkaConsumer(CounterService counterService) {
        this.counterService = counterService;
    }

    @KafkaListener(topics = "gliTransactionTopic", group = "gliDecoupling", id = "gliKafkaListener")
    public boolean consume(String message,
            @Header(KafkaHeaders.RECEIVED_PARTITION_ID) Integer partition,
            @Header(KafkaHeaders.OFFSET) Long offset) throws ParseException {
        System.out.println("Inside kafka listener :" + message + " partition :" + partition.toString() + " offset :" + offset.toString());
        aggregationService.run();
        return true;
    }
}
Service to start and stop the listener:
@Service
public class DecouplingController {

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    public void stop() {
        MessageListenerContainer listenerContainer = kafkaListenerEndpointRegistry
                .getListenerContainer("gliKafkaListener");
        listenerContainer.stop();
    }

    public void start() {
        MessageListenerContainer listenerContainer = kafkaListenerEndpointRegistry
                .getListenerContainer("gliKafkaListener");
        listenerContainer.start();
    }
}
main method
@SpringBootApplication
public class DecouplingApplication {

    Ignite ignite;
    static IgniteCache<Long, MappingsEntity> mappingsCache;

    public static void main(String[] args) {
        SpringApplication.run(DecouplingApplication.class, args);
        Ignition.setClientMode(true);
        Ignite ignite = Ignition.ignite("ignite");
        loadCaches(ignite);
    }

    public static boolean loadCaches(Ignite ignite) {
        mappingsCache = ignite.getOrCreateCache("MappingsCache");
        mappingsCache.loadCache(null);
        System.out.println("Data Loaded");
        DecouplingController dc = new DecouplingController();
        dc.start();
        return true;
    }
}
Below is the exception
Data Loaded
Exception in thread "main" java.lang.NullPointerException
at com.ignite.spring.decoupling.controller.DecouplingController.start(DecouplingController.java:126)
at com.ignite.spring.decoupling.DecouplingApplication.loadCaches(DecouplingApplication.java:64)
at com.ignite.spring.decoupling.DecouplingApplication.main(DecouplingApplication.java:37)
Instead of manually creating an object of DecouplingController, autowire the dependency in DecouplingApplication:
@Autowired
DecouplingController decouplingController;
The ApplicationContext, which handles autowired dependencies, is not aware of the object you manually created with new. The autowired kafkaListenerEndpointRegistry is therefore unknown to (and null in) the DecouplingController object you created yourself.
It also seems that gliKafkaListener1 was not registered in some part of the ConsumerConfig/ListenerConfig.
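A sketch of how that could look, reusing the cache-loading code from the question (assuming the Ignite node named "ignite" is already running, as in the original main method):
@SpringBootApplication
public class DecouplingApplication {

    public static void main(String[] args) {
        SpringApplication.run(DecouplingApplication.class, args);
    }

    // runs after the application context is fully initialized, so the injected
    // DecouplingController already has its KafkaListenerEndpointRegistry wired in
    @Bean
    public ApplicationRunner cacheLoader(DecouplingController decouplingController) {
        return args -> {
            Ignition.setClientMode(true);
            Ignite ignite = Ignition.ignite("ignite");
            IgniteCache<Long, MappingsEntity> mappingsCache = ignite.getOrCreateCache("MappingsCache");
            mappingsCache.loadCache(null);
            System.out.println("Data Loaded");
            decouplingController.start();   // start the listener only after the cache is ready
        };
    }
}
Because autoStartup is false in the container factory, the listener stays stopped until start() is called here.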

Spring `@Autowired` field is `null` even though it works fine in other classes

My Spring @Autowired field is null even though autowiring works fine in other classes.
public class SendRunner implements Runnable {

    private String senderAddress;

    @Autowired
    private SubscriberService subscriberService;

    public SendRunner(String senderAddress) {
        this.senderAddress = senderAddress;
    }

    @Override
    public void run() {
        sendRequest();
    }

    private void sendRequest() {
        try {
            HashMap<String, String> dataMap = new HashMap<>();
            dataMap.put("subscriberId", senderAddress);
            HttpEntity<?> entity = new HttpEntity<Object>(dataMap, httpHeaders);
            Subscriber subscriber = subscriberService.getSubscriberByMsisdn(senderAddress);
        } catch (Exception e) {
            logger.error("Error occurred while trying to send api request", e);
        }
    }
}
This class is also declared as a bean in the dispatcher servlet:
<bean id="SendRunner" class="sms.dating.messenger.connector.SendRunner">
</bean>
Here I'm getting a NullPointerException for subscriberService. What could be the reason for this? Thanks in advance.
Can you please try with the below code snippet:
@Configuration
public class Someclass {

    @Autowired
    private SubscriberService subscriberService;

    Thread subscriberThread = new Thread() {
        @Override
        public void run() {
            try {
                HashMap<String, String> dataMap = new HashMap<>();
                dataMap.put("subscriberId", senderAddress);
                HttpEntity<?> entity = new HttpEntity<Object>(dataMap, httpHeaders);
                Subscriber subscriber = subscriberService.getSubscriberByMsisdn(senderAddress);
            } catch (Exception e) {
                logger.error("Error occurred while trying to send api request", e);
            }
        }
    };
}
Can you please annotate your SendRunner class with @Component or @Service and include the SendRunner package in the component scan packages?
Your bean is not in the Spring-managed context; the reasons below may apply.
The package sms.dating.messenger.connector is not covered by component scanning.
You are moving out of the Spring context by creating the object with new (see below); this way the autowired fields will not be populated.
SendRunner sendRunner = new SendRunner();
sendRunner.sendRequest();
Just check how I implemented it. Hope this helps.
@RestController
public class RestRequest {

    @Autowired
    SendRunner sendRunner;

    @RequestMapping("/api")
    public void Uri() {
        sendRunner.start();
    }
}
SendRunner class
@Service
public class SendRunner extends Thread {

    @Autowired
    private SubscriberService subscriberService;

    @Override
    public void run() {
        SendRequest();
    }

    private void SendRequest() {
        System.out.println("Object is " + subscriberService);
        String senderAddress = "address";
        subscriberService.getSubscriberByMsisdn(senderAddress);
    }
}
Below are the logs printed when I hit the REST API:
Object is com.example.demo.SubscriberService@40f33492
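If SendRunner really needs the per-message senderAddress, another sketch is to keep it a Spring-managed bean and pass that value as a method argument instead of a constructor argument (a hypothetical reworking of the class from the question):
@Service
public class SendRunner {

    @Autowired
    private SubscriberService subscriberService;

    // per-request data comes in as a parameter, so the bean can stay a stateless singleton
    public void sendRequest(String senderAddress) {
        Subscriber subscriber = subscriberService.getSubscriberByMsisdn(senderAddress);
        // ... build and send the HTTP request with the subscriber data here
    }
}
Callers then inject SendRunner and call sendRequest("...") with whatever address they have, instead of doing new SendRunner(senderAddress).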

Spring Autowired Shared Queue NullPointerException

I'm using Spring for the first time and am trying to implement a shared queue: a Kafka listener puts messages on the shared queue, and a ThreadManager will eventually do something multithreaded with the items it takes off the queue. Here is my current implementation:
The Listener:
@Component
public class Listener {

    @Autowired
    private QueueConfig queueConfig;

    private ExecutorService executorService;
    private List<Future> futuresThread1 = new ArrayList<>();

    public Listener() {
        Properties appProps = new AppProperties().get();
        this.executorService = Executors.newFixedThreadPool(Integer.parseInt(appProps.getProperty("listenerThreads")));
    }

    // TODO: how can I pass an app property into this annotation?
    @KafkaListener(id = "id0", topics = "bose.cdp.ingest.marge.boseaccount.normalized")
    public void listener(ConsumerRecord<?, ?> record) throws InterruptedException, ExecutionException {
        futuresThread1.add(executorService.submit(new Runnable() {
            @Override
            public void run() {
                try {
                    queueConfig.blockingQueue().put(record);
                    // System.out.println(queueConfig.blockingQueue().take());
                } catch (Exception e) {
                    System.out.print(e.toString());
                }
            }
        }));
    }
}
The Queue:
@Configuration
public class QueueConfig {

    private Properties appProps = new AppProperties().get();

    @Bean
    public BlockingQueue<ConsumerRecord> blockingQueue() {
        return new ArrayBlockingQueue<>(
                Integer.parseInt(appProps.getProperty("blockingQueueSize"))
        );
    }
}
The ThreadManager:
@Component
public class ThreadManager {

    @Autowired
    private QueueConfig queueConfig;

    private int threads;

    public ThreadManager() {
        Properties appProps = new AppProperties().get();
        this.threads = Integer.parseInt(appProps.getProperty("threadManagerThreads"));
    }

    public void run() throws InterruptedException {
        ExecutorService executorService = Executors.newFixedThreadPool(threads);
        try {
            while (true) {
                queueConfig.blockingQueue().take();
            }
        } catch (Exception e) {
            System.out.print(e.toString());
            executorService.shutdownNow();
            executorService.awaitTermination(1, TimeUnit.SECONDS);
        }
    }
}
Lastly, the main thread where everything is started from:
@SpringBootApplication
public class SourceAccountListenerApp {

    public static void main(String[] args) {
        SpringApplication.run(SourceAccountListenerApp.class, args);
        ThreadManager threadManager = new ThreadManager();
        try {
            threadManager.run();
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
The problem
I can tell, when running this in the debugger, that the Listener is adding things to the queue. But when the ThreadManager takes from the shared queue, the queue is null and I get an NPE. It seems like autowiring isn't connecting the queue the listener uses to the ThreadManager. Any help appreciated.
This is the problem:
ThreadManager threadManager = new ThreadManager();
Since you are creating the instance manually, you cannot use the DI provided by Spring.
One simple solution is to implement a CommandLineRunner, which will be executed after SourceAccountListenerApp is fully initialized:
@SpringBootApplication
public class SourceAccountListenerApp {

    public static void main(String[] args) {
        SpringApplication.run(SourceAccountListenerApp.class, args);
    }

    // Create the CommandLineRunner bean and inject ThreadManager
    @Bean
    CommandLineRunner runner(ThreadManager manager) {
        return args -> {
            manager.run();
        };
    }
}
You use Spring's programmatic, so-called 'JavaConfig' way of setting up Spring beans (classes annotated with @Configuration containing methods annotated with @Bean). Usually at application startup Spring calls those @Bean methods under the hood and registers the results in its application context (if the scope is singleton - the default - this happens only once!). There is no need to call those @Bean methods anywhere in your code directly... in fact you must not, otherwise you will get a separate, fresh instance that is possibly not fully configured!
Instead, you need to inject the BlockingQueue<ConsumerRecord> that you 'configured' in your QueueConfig.blockingQueue() method into your ThreadManager. Since the queue seems to be a mandatory dependency for the ThreadManager to work, I'd let Spring inject it via constructor:
@Component
public class ThreadManager {

    private int threads;

    // add instance var for queue...
    private BlockingQueue<ConsumerRecord> blockingQueue;

    // you could add the @Autowired annotation to the BlockingQueue param,
    // but I believe it's not mandatory...
    public ThreadManager(BlockingQueue<ConsumerRecord> blockingQueue) {
        Properties appProps = new AppProperties().get();
        this.threads = Integer.parseInt(appProps.getProperty("threadManagerThreads"));
        this.blockingQueue = blockingQueue;
    }

    public void run() throws InterruptedException {
        ExecutorService executorService = Executors.newFixedThreadPool(threads);
        try {
            while (true) {
                this.blockingQueue.take();
            }
        } catch (Exception e) {
            System.out.print(e.toString());
            executorService.shutdownNow();
            executorService.awaitTermination(1, TimeUnit.SECONDS);
        }
    }
}
Just to clarify one more thing: by default, the method name of a @Bean method is used by Spring to assign the bean a unique ID (method name == bean id). So your method being called blockingQueue means your BlockingQueue<ConsumerRecord> instance is registered with the id blockingQueue in the application context. The new constructor parameter is also named blockingQueue, and its type matches BlockingQueue<ConsumerRecord>. Simplified, that's one way Spring looks up and injects/wires dependencies.
