Is there a way for Spring DMLC (DefaultMessageListenerContainer) to consume messages on a schedule (say, every 10 minutes) using a CRON expression?
I don't want the messages to be picked up by the Spring DMLC all the time.
Let's say a message is produced and dropped off at the JMS broker; I'd like the consumer (Spring DMLC) to pick it up after some time for processing.
I am wondering if there is a way to configure Spring DMLC together with Quartz.
Why do you need a DMLC in that case? If you use Spring, a JmsTemplate might be what you are looking for.
void readOneMessageAndProcess() throws JmsException {
    Message msg = jmsTemplate.receive("SOME.QUEUE");
    // Process the message.
}
Then have Quartz, a Java Timer, or a simple public static void main(String[] args) triggered by a cron job run the method.
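As an illustration, here is a minimal sketch using Spring's @Scheduled with a cron expression in place of a full Quartz job. The queue name is an assumption, @EnableScheduling must be present somewhere in the configuration, and the JmsTemplate needs a finite receiveTimeout so receive() returns null once the queue is empty.

import javax.jms.Message;

import org.springframework.jms.core.JmsTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class ScheduledQueueDrainer {

    private final JmsTemplate jmsTemplate;

    public ScheduledQueueDrainer(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
    }

    // Runs every 10 minutes and drains whatever is currently waiting on the queue.
    @Scheduled(cron = "0 */10 * * * *")
    public void drainQueue() {
        Message msg;
        while ((msg = jmsTemplate.receive("SOME.QUEUE")) != null) {
            process(msg);
        }
    }

    private void process(Message msg) {
        // Business logic goes here.
    }
}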
@Component
@RequiredArgsConstructor
public class EventListener {

    private final EventProcessingService eventProcessingService;

    @JmsListener(destination = "inputQueue", containerFactory = "myContainerFactory")
    public void receiveMessage(Message message) {
        eventProcessingService.doSome(message).subscribe(); // returns Mono<Void>
    }
}
@Service
public class EventProcessingService {

    public Mono<Void> doSome(Message message) {
        //...
    }
}
@Configuration
@RequiredArgsConstructor
public class MqIntegration {

    private final ConnectionFactory connectionFactory;

    @Bean
    public Publisher<Message<String>> mqReactiveFlow() {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                        .destination("testQueue"))
                .channel(MessageChannels.queue())
                .toReactivePublisher();
    }
}
I have a WebFlux application which interacts with IBM MQ, and a JmsListener which listens for messages from the queue; when a message is received, EventProcessingService makes requests to other services depending on the message.
I would like to know how I can create a JmsListener that works with reactive threads using Spring Integration. In other words, I want to know whether it is possible to create an integration flow which will receive messages from the queue and call the EventProcessingService when messages arrive, so that it does not have a negative effect on the threads inside the WebFlux application.
I think we need to clean up some points in your question.
WebFlux is not a project by itself. It is a Spring Framework module for the web layer on top of a reactive server: https://docs.spring.io/spring-framework/docs/current/reference/html/web-reactive.html#spring-webflux
The @JmsListener is part of another Spring Framework module - spring-jms. It has nothing to do with the threads used by the reactive server for the WebFlux layer: https://docs.spring.io/spring-framework/docs/current/reference/html/integration.html#jms
Spring Integration is a separate project which implements EIP on top of the Spring Framework dependency injection container. It indeed has its own WebFlux module for channel adapters on top of the WebFlux API in Spring Framework: https://docs.spring.io/spring-integration/docs/current/reference/html/webflux.html#webflux. And it also has a JMS module on top of the JMS module from Spring Framework: https://docs.spring.io/spring-integration/docs/current/reference/html/jms.html#jms. However, there is nothing related to @JmsListener there, since its Jms.messageDrivenChannelAdapter() fully covers that functionality, and at a high level it does it the same way - via a MessageListenerContainer.
All of this might not be relevant to the question, but it is better to have a clear context of what you are asking so we can feel we are on the same page with you.
Now, trying to answer your concern.
As long as you don't deal with JMS from the WebFlux layer (@RequestMapping or WebFlux.inboundGateway()), you don't affect those non-blocking threads. The JMS MessageListenerContainer spawns its own threads and performs polling of the queue and message processing.
What you are explaining with your JMS configuration and service looks more like this:
@Bean
public IntegrationFlow mqReactiveFlow() {
    return IntegrationFlows
            .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                    .destination("testQueue"))
            .handle(this.eventProcessingService)
            .nullChannel();
}
There is really no reason to shift messages into a QueueChannel right after JMS, since JMS listening is already an async operation.
We need that nullChannel at the end of the flow just because your service method returns a Mono and the framework otherwise does not know what to do with it. Starting with version 5.4.3, the NullChannel is able to subscribe to the Publisher payload of a message produced to it.
You could, though, have a FluxMessageChannel in between to really simulate back-pressure for the JMS listener, but that won't make much difference for your next service.
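For illustration, a minimal sketch of that FluxMessageChannel variant, assuming the rest of the flow from the previous snippet stays the same:

@Bean
public IntegrationFlow mqReactiveFlow() {
    return IntegrationFlows
            .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                    .destination("testQueue"))
            // Reactive channel between the JMS listener and the handler.
            .channel(MessageChannels.flux())
            .handle(this.eventProcessingService)
            .nullChannel();
}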
I think you are going to have to bypass @JmsListener, as that registers an onMessage callback which, although asynchronous, isn't going to be reactive. JMS is essentially blocking, so patching a reactive layer on top is going to be just a patch.
You will need to use the Publisher that you have created to generate the back pressure. I think you are going to have to define and instantiate your own listener bean which does something along the lines of:
public Flux<String> mqReactiveListener() {
    return Flux.from(mqReactiveFlow())
            .map(Message::getPayload);
}
I've written integration tests for my Spring Boot Kafka (consumer/producer) service and everything went well. I'm committing my consumer's offsets manually after some processing.
I want to verify whether acknowledgment.acknowledge() was called in the consumer. Is it possible to verify this?
Here is my method signature of the service:
@KafkaListener(topics = {TOPIC_XXX_V1}, containerFactory = "XXXListener")
private void consumer(@Payload XXXXRequestEvent xxxxRequestEvent, Acknowledgment acknowledgment) {
    .....
    // do something with the database
    acknowledgment.acknowledge();
}
For the testing side, I'm using @SpyBean for the service and a @MockBean for the database interaction. I want to verify somehow in the test case whether .acknowledge() was called. FYI: .acknowledge() is a public abstract void method.
As the Acknowledgment instance is created and injected by Spring Kafka when a message is consumed, I guess there is no way to use something like Mockito's verify() for this.
When writing a unit test instead, you could pass a mocked Acknowledgment here and then verify that the method was invoked. However, with a unit test you can't test the actual consumption of a message (serialization, correct message handler, etc.).
So in your case, I would try to verify that your message was acknowledged by e.g. using Testcontainers to execute commands inside the Kafka container and ensuring that the already acknowledged message is not returned any more.
Another approach could be to create a Kafka client as part of your test and then try to consume messages from the same topic for X seconds and expect zero results. Awaitility might help you here.
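For the unit-test route mentioned above, a minimal sketch might look like the following. MyKafkaListener, its constructor arguments, and the XXXXRequestEvent construction are placeholders, and it assumes the listener method is visible to the test (e.g. package-private rather than private).

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.jupiter.api.Test;
import org.springframework.kafka.support.Acknowledgment;

class ConsumerUnitTest {

    @Test
    void acknowledgeIsCalledAfterProcessing() {
        // Hypothetical listener; its collaborators (e.g. the database service) would be mocked as well.
        MyKafkaListener listener = new MyKafkaListener(/* mocked dependencies */);
        Acknowledgment acknowledgment = mock(Acknowledgment.class);

        // Invoke the listener method directly with the mocked Acknowledgment.
        listener.consumer(new XXXXRequestEvent(), acknowledgment);

        verify(acknowledgment).acknowledge();
    }
}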
Is it possible to start a broker, say in memory, that can be used to execute automated test cases using Spring Integration MQTT?
I've tried achieving this with ActiveMQ (following https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-messaging.html) but somehow didn't succeed; maybe anyone has a short working example?
It's not the responsibility of Spring Integration (or Spring Boot) to provide an embedded broker for such a protocol. If there is one, we could consider implementing an auto-configuration for it, similar to what we do for embedded RDBMS, JMS and MongoDB. You really need to consult the ActiveMQ documentation.
Looks like we can do it like this in the test class:
private static BrokerService activeMQBroker;
...
@BeforeClass
public static void setup() throws Exception {
    // Embedded ActiveMQ broker exposing an MQTT connector for the tests.
    activeMQBroker = new BrokerService();
    activeMQBroker.addConnector("mqtt://localhost:1883");
    activeMQBroker.setPersistent(false);
    activeMQBroker.setUseJmx(false);
    activeMQBroker.start();
}
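Presumably the broker should also be stopped once the tests finish; a matching teardown could look like this:

@AfterClass
public static void tearDown() throws Exception {
    activeMQBroker.stop();
}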
I didn't try it, but this is exactly what I do to test against STOMP.
In my Spring Boot application I have to implement an import service. Users can submit a bunch of JSON files and the application will try to import the data from these files. Depending on the amount of data in the JSON files, a single import can take 1 or 2 hours.
I do not want to block the users during the import process, so I plan to accept the task for importing and notify the user that the data is scheduled for processing. I'll put the data into a queue and a free queue consumer on the other end will start the import process. I also need the possibility to monitor the jobs in the queue and terminate them if needed.
Right now I'm thinking of using embedded Apache ActiveMQ to introduce the message producer and consumer logic, but before that I'd like to ask - from an architecture point of view - is it a good choice for the described task, or can it be implemented with more appropriate tools, like for example plain Spring @Async and so on?
It is possible to process files concurrently with Camel like this:
from("file://incoming?maxMessagesPerPoll=1&idempotent=true&moveFailed=failed&move=processed&readLock=none")
    .threads(5)
    .process()
Take a look at http://camel.apache.org/file2.html
But I think that for your requirements it is better to use a standalone ActiveMQ, a standalone service to move files to ActiveMQ, and a standalone consumer, so that each one can be killed or restarted independently.
It is better to use ActiveMQ, as you said, and you can easily create a service to move messages to a queue with Camel like this:
CamelContext context = new DefaultCamelContext();
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory("vm://localhost?broker.persistent=true");
context.addComponent("test-jms", JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
context.addRoutes(new RouteBuilder() {
    public void configure() {
        // convertBodyTo to use TextMessage, or maybe send them as a file to the queue
        from("file://testFolderPath").convertBodyTo(String.class).to("test-jms:queue:test.queue");
    }
});
context.start();
Here are some examples:
http://www.programcreek.com/java-api-examples/index.php?api=org.apache.camel.component.jms.JmsComponent
https://skills421.wordpress.com/2014/02/08/sending-local-files-to-a-jms-queue/
https://github.com/apache/camel/blob/master/examples/camel-example-jms-file/src/main/java/org/apache/camel/example/jmstofile/CamelJmsToFileExample.java
https://github.com/apache/camel/tree/master/examples
To monitor and manage, you can use JMX with VisualVM or Hawtio: http://hawt.io/getstarted/index.html
http://camel.apache.org/camel-jmx.html
To consume, you can use a DefaultMessageListenerContainer with concurrent consumers on the queue; for this you need to change the prefetchPolicy on the ConnectionFactory of the DefaultMessageListenerContainer (see Multithreaded JMS client ActiveMQ). A sketch of such a setup is shown below.
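As a rough sketch of that setup (the broker URL, queue name, concurrency, and listener bean are assumptions, not a definitive configuration):

import javax.jms.MessageListener;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

@Configuration
public class ImportConsumerConfig {

    @Bean
    public DefaultMessageListenerContainer importListenerContainer(MessageListener importListener) {
        // Prefetch of 1 so messages are spread across the concurrent consumers
        // instead of being buffered by a single one.
        ActiveMQConnectionFactory connectionFactory =
                new ActiveMQConnectionFactory("tcp://localhost:61616?jms.prefetchPolicy.queuePrefetch=1");

        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(connectionFactory);
        container.setDestinationName("import.queue");  // assumed queue name
        container.setConcurrentConsumers(5);           // parallel import workers
        container.setMessageListener(importListener);  // your javax.jms.MessageListener bean
        return container;
    }
}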
I made a simple JMS project with two Java files, MessageSender.java and MessageConsumer.java - one for sending messages to an ActiveMQ queue and another for consuming messages from that queue. I deployed this project in Apache Tomcat. The following is the consumer code.
ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory("admin", "admin",
        "tcp://localhost:61617?jms.prefetchPolicy.queuePrefetch=1");
Connection connection = connectionFactory.createConnection();
final Session session = connection.createSession(true, Session.CLIENT_ACKNOWLEDGE);
Queue queue = session.createQueue("ThermalMap");
javax.jms.MessageConsumer consumer = session.createConsumer(queue);

// Anonymous message listener.
MessageListener listener = new MessageListener() {
    @Override
    public void onMessage(Message msg) {
        // My business code
    }
};
consumer.setMessageListener(listener);
connection.start();
Later, if I want to change the consumer code, I don't want to stop Tomcat, because if I stop Tomcat the entire JMS project stops working and clients can't send messages to the ActiveMQ queue. So I don't want to follow this way.
I am thinking that if I stop the consumers through the ActiveMQ console page, I won't need to stop Tomcat, so clients can keep sending messages normally. For this I checked the AMQ console page, but I didn't see any consumers.
Is this a correct way to do it?
If it is, how can I do this?
Can anyone suggest something?
Thanks.
Call the .close() method on your MessageConsumer.
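Assuming you keep a reference to the consumer, session and connection from the snippet above, stopping message delivery without touching Tomcat is roughly:

consumer.close();      // stops delivery to this consumer only

// If the whole pipeline for this consumer should go away, close the rest too.
session.close();
connection.close();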