@JmsListener does not receive message asynchronously - spring-boot

I am new to the Solace PubSub+ broker.
I tried to send messages using a topic and receive them using a Spring Boot message listener:
@JmsListener(destination = "HelloWorld")
public void handle(Message message) {
    try {
        Date receiveTime = new Date();
        if (message instanceof TextMessage) {
            TextMessage tm = (TextMessage) message;
            try {
                System.out.println("Message Received at "
                        + new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS").format(receiveTime)
                        + " with message content of: " + tm.getText());
                // Busy loop that simulates slow, time-consuming processing
                for (long index = 0; index < 10000000000L; index++) {
                }
                System.out.println("..done....");
            } catch (Exception e) {
                e.printStackTrace();
            }
        } else {
            System.out.println(message.toString());
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}
The publisher sends a message every second, but the consumer receives a message, processes it, and only then receives the next one. I expect the receiver to get each message immediately when the publisher pushes it.
I connect to the Solace broker with AMQP over JMS 1 and use the org.amqphub.spring Maven dependency.
Do I need to configure anything else to receive messages asynchronously?
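For what it's worth, a minimal sketch of one common way to handle this: let the listener container run several concurrent consumers so that a slow handler does not serialize delivery. The bean below is an assumption, not something confirmed for the original setup - @JmsListener picks up the bean named jmsListenerContainerFactory by default, and the "3-10" range is only an example; the same effect can also be had via the spring.jms.listener.concurrency / spring.jms.listener.max-concurrency properties or the concurrency attribute of @JmsListener. Note that running multiple consumers gives up strict single-consumer ordering.

import javax.jms.ConnectionFactory; // jakarta.jms.ConnectionFactory on Spring Boot 3+

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;

@Configuration
public class JmsListenerConfig {

    // Hypothetical factory bean; @JmsListener uses "jmsListenerContainerFactory" by default.
    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(ConnectionFactory connectionFactory) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        // A range of concurrent consumers lets several messages be processed in parallel
        // instead of one at a time; tune the range for the actual workload.
        factory.setConcurrency("3-10");
        return factory;
    }
}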

Related

Springboot Kafka @Listener consumer pause/resume not working

I have a Spring Boot Kafka consumer & producer. The consumer is expected to read records from topic 1 one by one, process them (time consuming), write the result to another topic, and then manually commit the offset.
In order to avoid rebalancing, I have tried to call pause() and resume() on the KafkaContainer, but the consumer is always running and never responds to the pause() call; I tried it even with a while loop and had no success (unable to pause the consumer). KafkaListenerEndpointRegistry is autowired.
Springboot version = 2.6.9, spring-kafka version = 2.8.7
@KafkaListener(id = "c1", topics = "${app.topics.topic1}", containerFactory = "listenerContainerFactory1")
public void poll(ConsumerRecord<String, String> record, Acknowledgment ack) {
    log.info("Received Message by consumer of topic1: " + record.value());
    String result = process(record.value());
    producer.sendMessage(result + " topic2");
    log.info("Message sent from " + topicIn + " to " + topicOut);
    ack.acknowledge();
    log.info("Offset committed by consumer 1");
}
private String process(String value) {
    try {
        pauseConsumer();
        // Perform time intensive network IO operations
        resumeConsumer();
    } catch (InterruptedException e) {
        log.error(e.getMessage());
    }
    return value;
}

private void pauseConsumer() throws InterruptedException {
    if (registry.getListenerContainer("c1").isRunning()) {
        log.info("Attempting to pause consumer");
        Objects.requireNonNull(registry.getListenerContainer("c1")).pause();
        Thread.sleep(5000);
        log.info("kafkalistener container state - " + registry.getListenerContainer("c1").isRunning());
    }
}

private void resumeConsumer() throws InterruptedException {
    if (registry.getListenerContainer("c1").isContainerPaused() || registry.getListenerContainer("c1").isPauseRequested()) {
        log.info("Attempting to resume consumer");
        Objects.requireNonNull(registry.getListenerContainer("c1")).resume();
        Thread.sleep(5000);
        log.info("kafkalistener container state - " + registry.getListenerContainer("c1").isRunning());
    }
}
Am I missing something? Could someone please guide me with the right way of achieving the required behaviour?
You are running the process() method on the listener thread so pause/resume will not have any effect; the pause only takes place when the listener thread exits the listener method (and after it has processed all the records received by the previous poll).
The next version (2.9), due later this month, has a new property pauseImmediate, which causes the pause to take effect after the current record is processed.
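For reference, a minimal sketch of where that property would go once 2.9 is available; the bean name mirrors the listenerContainerFactory1 from the question, but the rest is an assumption rather than part of this answer.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Configuration
public class KafkaListenerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> listenerContainerFactory1(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Requires spring-kafka 2.9+: pause() takes effect after the record currently
        // being processed, rather than after the remaining records from the last poll.
        factory.getContainerProperties().setPauseImmediate(true);
        return factory;
    }
}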
You can try it like this. This works for me:
import java.time.Duration;
import java.util.Collections;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.CommitFailedException;
import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class kafkaConsumer {

    private Properties config; // standard consumer properties (bootstrap servers, group id, deserializers, ...)
    private String kafkaEvent;

    public void run(String topicName) {
        try {
            Consumer<String, String> consumer = new KafkaConsumer<>(config);
            consumer.subscribe(Collections.singleton(topicName));
            while (true) {
                try {
                    ConsumerRecords<String, String> consumerRecords = consumer.poll(Duration.ofMillis(80000));
                    for (TopicPartition partition : consumerRecords.partitions()) {
                        List<ConsumerRecord<String, String>> partitionRecords = consumerRecords.records(partition);
                        for (ConsumerRecord<String, String> record : partitionRecords) {
                            kafkaEvent = record.value();
                            consumer.pause(consumer.assignment());
                            /** Implement Your Business Logic Here **/
                            // Once your processing is done:
                            consumer.resume(consumer.assignment());
                            try {
                                consumer.commitSync();
                            } catch (CommitFailedException e) {
                            }
                        }
                    }
                } catch (Exception e) {
                    continue;
                }
            }
        } catch (Exception e) {
        }
    }
}
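A note on why pausing helps here (an addition, not part of the original answer): heartbeats come from a background thread, but max.poll.interval.ms is only reset by calling poll(), and pausing the assignment lets the loop keep calling poll() during long processing without being handed new records, which is what prevents the rebalance. For that to work, poll() still needs to be invoked periodically while the partitions are paused.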

Replay/synchronous messages memory not released for messages sent by producers to queue in SpringBoot JMS with ActiveMQ

1. Context:
A two-module/microservice application developed with Spring Boot 2.3.0 and ActiveMQ.
We also use an ActiveMQ 5.15.13 server/broker.
The broker is defined in both modules with application properties.
The broker connection pool is also defined in both modules with application properties, and the pooled-jms artifact dependency is added to both modules (with Maven):
spring.activemq.broker-url=xxx
spring.activemq.user=xxx
spring.activemq.password=xx
spring.activemq.non-blocking-redelivery=true
spring.activemq.pool.enabled=true
spring.activemq.pool.time-between-expiration-check=5s
spring.activemq.pool.max-connections=10
spring.activemq.pool.max-sessions-per-connection=10
spring.activemq.pool.idle-timeout=60s
Other JMS configurations I have set are:
spring.jms.listener.acknowledge-mode=auto
spring.jms.listener.auto-startup=true
spring.jms.listener.concurrency=5
spring.jms.listener.max-concurrency=10
spring.jms.pub-sub-domain=false
spring.jms.template.priority=100
spring.jms.template.qos-enabled=true
spring.jms.template.delivery-mode=persistent
In module 1 the JmsTemplate is used to send synchronous (replay) messages. I've opted for a proper queue instead of a temporary queue, as I understand that a temporary queue is not recommended for replays when lots of messages are sent - so that's what I did.
2. Code samples:
MODULE 1:
#Value("${app.request-video.jms.queue.name}")
private String requestVideoQueueNameAppProperty;
#Bean
public Queue requestVideoJmsQueue() {
logger.info("Initializing requestVideoJmsQueue using application property value for " +
"app.request-video.jms.queue.name=" + requestVideoQueueNameAppProperty);
return new ActiveMQQueue(requestVideoQueueNameAppProperty);
}
#Value("${app.request-video-replay.jms.queue.name}")
private String requestVideoReplayQueueNameAppProperty;
#Bean
public Queue requestVideoReplayJmsQueue() {
logger.info("Initializing requestVideoReplayJmsQueue using application property value for " +
"app.request-video-replay.jms.queue.name=" + requestVideoReplayQueueNameAppProperty);
return new ActiveMQQueue(requestVideoReplayQueueNameAppProperty);
}
#Autowired
private JmsTemplate jmsTemplate;
public Message callSendAndReceive(TextJMSMessageDTO messageDTO, Destination jmsDestination, Destination jmsReplay) {
return jmsTemplate.sendAndReceive(jmsDestination, jmsSession -> {
try {
TextMessage textMessage = jmsSession.createTextMessage();
textMessage.setText(messageDTO.getText());
textMessage.setJMSReplyTo(jmsReplay);
textMessage.setJMSCorrelationID(UUID.randomUUID().toString());
textMessage.setJMSDeliveryMode(DeliveryMode.NON_PERSISTENT);
return textMessage;
} catch (IOException e) {
logger.error("Error sending JMS message to destination: " + jmsDestination, e);
throw new JMSException("Error sending JMS message to destination: " + jmsDestination);
}
});
}
MODULE 2:
#JmsListener(destination = "${app.backend-get-request-video.jms.queue.name}")
public void onBackendGetRequestsVideoMessage(TextMessage message, Session session) throws JMSException, IOException {
logger.info("Get requests video file message consumed!");
try {
Object replayObject = handleReplayAction(message);
JMSMessageDTO messageDTO = messageDTOFactory.getJMSMessageDTO(replayObject);
Message replayMessage = messageFactory.getJMSMessage(messageDTO, session);
BytesMessage replayBytesMessage = jmsSession.createBytesMessage();
fillByteMessageFromMediaDTO(replayBytesMessage, mediaMessageDTO);
replayBytesMessage.setJMSCorrelationID(message.getJMSCorrelationID());
final MessageProducer producer = session.createProducer(message.getJMSReplyTo());
producer.send(replayBytesMessage);
JmsUtils.closeMessageProducer(producer);
} catch (JMSException | IOException e) {
logger.error("onBackendGetRequestsVideoMessage()JMSException: " + e.getMessage(), e);
throw e;
}
}
private void fillByteMessageFromMediaDTO(BytesMessage bytesMessage, MediaJMSMessageDTO mediaMessageDTO)
        throws IOException, JMSException {
    String filePath = fileStorageConfiguration.getMediaFilePath(mediaMessageDTO);
    try (FileInputStream fileInputStream = new FileInputStream(filePath)) {
        byte[] byteBuffer = new byte[1024];
        int bytes_read = 0;
        while ((bytes_read = fileInputStream.read(byteBuffer)) != -1) {
            bytesMessage.writeBytes(byteBuffer, 0, bytes_read);
        }
    } catch (JMSException e) {
        logger.error("Can not write data in JMS ByteMessage from file: " + filePath, e);
    } catch (FileNotFoundException e) {
        logger.error("Can not open stream to file: " + filePath, e);
    } catch (IOException e) {
        logger.error("Can not read data from file: " + filePath, e);
    }
}
3. The problem:
As I send many messages and receive many corresponding replays through the producer/consumer/JmsTemplate, both application modules 1 and 2 quickly fill the allocated heap memory until an out-of-memory error is thrown, but the memory leak appears only when using synchronous messages with replay as shown above.
I've debugged my code and all instances (sessions, producers, consumers, jmsTemplate, etc.) are pooled and are instances of the right classes from the pooled-jms library, so the pool should - apparently - work properly.
I've made a heap dump of the second module, and it looks like the producer's messages (ActiveMQBytesMessage) are still in memory even a long time after they have been successfully consumed by the right consumer.
I also have asynchronous messages sent in my modules, and that producer-consumer path seems to work well; the problem is present only for the synchronous/replay messages producer-consumer.
Sample heap dump files - taken after a full night of application inactivity - are as follows:
module 1: module_1_dump
module 2: module_2_dump
activemq broker/server: activemq_dump
Anyone have any idea what I'm doing wrong?!

JMS Template - How can I receive a message from one queue and send to another using JmsTemplate

public void sendSimpleMessage(String receiver, String sender) {
    try {
        Message message = jmsTemplate.receive(receiver);
        System.out.println(message.getIntProperty("OlQuestionId"));
        jmsTemplate.send(sender, new MessageCreator() {
            @Override
            public Message createMessage(Session session) throws JMSException {
                // Deliberately fail during send to simulate losing the already-received message
                throw new JMSException("Exception" + message.getIntProperty("OlQuestionId"));
            }
        });
    } catch (JmsException jmsException) {
        System.out.println(jmsException);
    } catch (JMSException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
If an exception occurs while sending the received message, the message is lost, as it has already been received.
For the JmsTemplate configuration I have:
@Bean
public JmsTemplate jmsTemplate() throws JMSException {
    JmsTemplate template = new JmsTemplate();
    template.setConnectionFactory(connectionFactory());
    //template.setSessionAcknowledgeMode(Session.CLIENT_ACKNOWLEDGE);
    template.setSessionTransacted(true);
    template.setDeliveryMode(2);
    return template;
}
Can you please tell me a way to do the receiving and sending in a single session?
Note: I have also tried Session.CLIENT_ACKNOWLEDGE while removing sessionTransacted; if there is an exception I do not acknowledge the message, but there is still message loss.
Thanks
You can use client acknowledge mode. The message will stay until you decide to make it disappear.
message.acknowledge();
See How to Give manual Acknowledge using JmsTemplate and delete message from Rabbitmq queue
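For reference, a minimal sketch of how the receive and the send could share one session, combining the manual-acknowledge idea above with a SessionCallback; it assumes the JmsTemplate is configured with CLIENT_ACKNOWLEDGE and no external transaction manager, and the timeout value is just an example.

import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;

import org.springframework.jms.core.JmsTemplate;
import org.springframework.jms.core.SessionCallback;

public class ForwardingService {

    private final JmsTemplate jmsTemplate; // assumed: CLIENT_ACKNOWLEDGE, sessionTransacted = false

    public ForwardingService(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
    }

    public void forward(String receiverQueue, String senderQueue) {
        // Consumer and producer are created from the same Session, and acknowledge()
        // is only called after the send has succeeded.
        jmsTemplate.execute((SessionCallback<Void>) session -> {
            MessageConsumer consumer = session.createConsumer(session.createQueue(receiverQueue));
            try {
                Message message = consumer.receive(5000); // example timeout
                if (message != null) {
                    MessageProducer producer = session.createProducer(session.createQueue(senderQueue));
                    try {
                        producer.send(message);
                        message.acknowledge(); // skipped if send() throws, so the message is redelivered
                    } finally {
                        producer.close();
                    }
                }
            } finally {
                consumer.close();
            }
            return null;
        }, true); // true = start the connection so receive() can deliver messages
    }
}

With a locally transacted session instead, the same shape works by calling session.commit() after the send; an exception before the commit leaves the received message on the source queue.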

ActiveMQ - is it possible to acknowledge single message in CLIENT_ACKNOWLEDGE mode

According to http://docs.oracle.com/javaee/6/api/javax/jms/Message.html#acknowledge()
A client may individually acknowledge each message as it is consumed, or it may choose to acknowledge messages as an application-defined group (which is done by calling acknowledge on the last received message of the group, thereby acknowledging all messages consumed by the session.)
How can I do it in ActiveMQ? I was unable to make it work.
Here is an example with the ActiveMQ client:
import javax.jms.Connection;
import javax.jms.JMSException;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.ActiveMQMessageConsumer;
import org.apache.activemq.ActiveMQSession;
import org.apache.activemq.command.ActiveMQTextMessage;

public class SimpleConsumer {

    public static void main(String[] args) throws JMSException {
        Connection conn = null;
        try {
            ActiveMQConnectionFactory cf = new ActiveMQConnectionFactory("tcp://localhost:61616");
            conn = cf.createConnection("consumer", "consumer");
            // INDIVIDUAL_ACKNOWLEDGE lets acknowledge() apply to a single message
            ActiveMQSession session = (ActiveMQSession) conn.createSession(false,
                    ActiveMQSession.INDIVIDUAL_ACKNOWLEDGE);
            ActiveMQMessageConsumer consumer = (ActiveMQMessageConsumer) session
                    .createConsumer(session.createQueue("QUEUE"));
            conn.start();
            ActiveMQTextMessage msg = null;
            while ((msg = (ActiveMQTextMessage) consumer.receive()) != null) {
                System.out.println("Received message is: " + msg.getText());
                msg.acknowledge();
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (conn != null) {
                try {
                    conn.close();
                } catch (Exception e) {
                }
            }
        }
    }
}
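One clarification (not part of the original answer): ActiveMQSession.INDIVIDUAL_ACKNOWLEDGE is an ActiveMQ-specific session mode rather than one of the standard JMS acknowledge modes, which is why the example casts to ActiveMQSession; with plain CLIENT_ACKNOWLEDGE, acknowledge() acknowledges everything the session has consumed so far, as the correction below points out.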
I am correcting my answer after the below comments with Matt Pavlovich.
CLIENT_ACKNOWLEDGE: With this option, a client acknowledges a message by calling the message's acknowledge method. Acknowledging a consumed message automatically acknowledges the receipt of all messages that have been delivered by its session.
So, the CLIENT_ACKNOWLEDGE option can NOT be used to send an acknowledgment to the JMS provider for a single message.
You can look at spec here:
http://download.oracle.com/otndocs/jcp/jms-2_0-fr-eval-spec/index.html

JMS Connection delivering messages sent to the queue while the connection was stopped

I am facing an issue with JMS Connection stop() and start(). A simple Java program illustrating the issue:
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.util.Random;

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageListener;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnectionFactory;

public class Test {

    static Connection producerConn = null;
    static BufferedWriter consumerLog = null;
    static BufferedWriter producerLog = null;

    public static final void main(String[] args) throws Exception {
        ConnectionFactory cf = new ActiveMQConnectionFactory("failover:(tcp://localhost:61616)");
        producerConn = cf.createConnection();
        producerLog = new BufferedWriter(new FileWriter("produced.log"));
        consumerLog = new BufferedWriter(new FileWriter("consumed.log"));

        new Thread(new Runnable() {
            public void run() {
                try {
                    producerConn.start();
                    Session session = producerConn.createSession(false, Session.AUTO_ACKNOWLEDGE);
                    Queue queue = session.createQueue("SampleQ1");
                    MessageProducer producer = session.createProducer(queue);
                    Random random = new Random();
                    byte[] messageBytes = new byte[1024];
                    for (int i = 0; i < 100; i++) {
                        random.nextBytes(messageBytes);
                        Message message = session.createObjectMessage(messageBytes);
                        producer.send(message);
                        Thread.sleep(10);
                        producerLog.write(message.getJMSMessageID());
                        producerLog.newLine();
                        producerLog.flush();
                    }
                    System.out.println("Produced 100 messages...");
                    producerLog.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }).start();
        System.out.println("Started producer...");

        new Thread(new Runnable() {
            public void run() {
                int count = 0;
                try {
                    producerConn.start();
                    Session session = producerConn.createSession(false, Session.AUTO_ACKNOWLEDGE);
                    Queue queue = session.createQueue("SampleQ1");
                    MessageConsumer consumer = session.createConsumer(queue);
                    consumer.setMessageListener(new Test().new MyListener());
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }).start();
        System.out.println("Started consumer...");
    }

    private class MyListener implements MessageListener {
        private int count = 0;

        public void onMessage(Message message) {
            try {
                message.acknowledge();
                System.out.println("count is " + count++);
                if (count == 5) {
                    producerConn.stop();
                    System.out.println("Sleeping Now for 5 seconds. . ." + System.currentTimeMillis());
                    Thread.sleep(5000);
                    producerConn.start();
                }
                System.out.println("Waking up . . ." + System.currentTimeMillis());
                consumerLog.write(message.getJMSMessageID());
                consumerLog.newLine();
                consumerLog.flush();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
}
My idea is to simulate connection stop() and start(). Therefore, in the consumer thread, after calling stop(), I have placed a sleep of 5 seconds. In the meantime, the producer thread continues its job of sending messages to the queue.
I expected the test to consume only the messages delivered before the consumer calls stop() and after it calls start() again on waking up from the sleep. But what happens is that when the consumer wakes up, it reads all the messages from the server, even those that were sent to the queue while the consumer's message reception was stopped.
Am I doing anything wrong here?
There is nothing wrong there; it's the correct behavior. In asynchronous messaging, producer and consumer are loosely coupled. A producer does not care whether a consumer is consuming messages or not. It keeps putting messages into a queue while the consumer may be down, may have stopped consuming, or may be actively consuming messages.
The connection.stop() method has no effect on the producer. It affects only the consumer: stop() pauses the delivery of messages from the JMS provider to a consumer, while start() starts/resumes message delivery.
Hope this helped.
