Can a Spring Boot standalone application integrated with Camunda consume JMS messages from JBoss Fuse? - spring-boot

I have an activemq:queue named inQueue in my JBoss Fuse. How do I consume those enqueued JMS messages so that a process instance is triggered in my Spring Boot application integrated with Camunda? Any link to references or samples would be helpful.
Currently I am able to consume messages from ActiveMQ, but I am not sure how to consume the messages from the JBoss Fuse ActiveMQ broker.
import java.util.HashMap;

import org.apache.camel.CamelContext;
import org.apache.camel.Exchange;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.ExchangeBuilder;
import org.json.JSONObject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;

@Component
public class ActiveMQConsumer {

    @Autowired
    CamelContext camelContext;

    @Autowired
    ProducerTemplate producerTemplate;

    // Listens on "inQueue" and starts a Camunda process via the Camel camunda-bpm component
    @SuppressWarnings("unchecked")
    @JmsListener(destination = "inQueue")
    public void consumeMessage(JSONObject employeeRecord) throws Exception {
        if (employeeRecord instanceof JSONObject) {
            HashMap<String, Object> employeeRecordMap = (HashMap<String, Object>) employeeRecord.toMap();
            Exchange exchange = ExchangeBuilder.anExchange(camelContext).withBody(employeeRecordMap).build();
            HashMap<String, Object> employeeDetails = (HashMap<String, Object>) employeeRecordMap.get("employeeDetails");
            exchange.setProperty("CamundaBpmBusinessKey", employeeDetails.get("employeeADId"));
            producerTemplate.send("camunda-bpm:start?processDefinitionKey=camunda-camel-activeMQ", exchange);
        }
    }
}
application.properties
# activeMQ config
spring.activemq.broker-url=tcp://localhost:61616
spring.activemq.user=admin
spring.activemq.password=admin
Expected to consume messages from JBoss Fuse.

I would recommend using the Maven archetype io.fabric8.archetypes:spring-boot-camel-amq-archetype, version 2.2.197, described as "Spring Boot example running a Camel route connecting to ActiveMQ". It can be found in Maven Central:
http://repo1.maven.org/maven2/
This will give you a sample project that has all of the Camel and Spring dependencies plus some useful samples.
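For orientation, here is a rough sketch of the kind of route such a sample project sets up, adapted to the question above; it is not the archetype's code. The class name InQueueRoute is made up, the activemq component is assumed to be configured against the Fuse-hosted broker (for example via spring.activemq.broker-url=tcp://<fuse-host>:61616), and the camunda-bpm endpoint is copied from the question:

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class InQueueRoute extends RouteBuilder {

    @Override
    public void configure() {
        // Consume from the queue hosted on the remote (Fuse) broker and hand the
        // message to the Camunda process, as the question's @JmsListener does.
        from("activemq:queue:inQueue")
            .log("Received message from Fuse broker: ${body}")
            // The CamundaBpmBusinessKey exchange property would still need to be
            // set from the payload here, as in the question's listener code.
            .to("camunda-bpm:start?processDefinitionKey=camunda-camel-activeMQ");
    }
}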

Related

Spring boot Kafka metrics configuration

We have a Spring Boot microservice using spring-kafka. There are a couple of Kafka listeners and producers, and the Kafka-related metrics are successfully presented at the actuator endpoint. But when we add another Maven dependency with custom metrics, the Kafka-related metrics provided by Spring Boot disappear. I suspect it's related to introducing a new MeterRegistry bean, but overriding it with the same bean definition as in NoOpMeterRegistryConfiguration in the spring-boot-actuator-autoconfigure module of Spring Boot didn't help. I've also tried debugging KafkaMetricsAutoConfiguration, and it looks like all the beans are successfully instantiated. Can somebody with more experience with spring-boot-actuator-autoconfigure help me understand the default configuration for Kafka-related metrics, please?
The Spring configuration in the new Maven dependency is pretty simple; just ProducerFactory and MeterRegistry beans are defined:
    @Bean
    @Primary
    public ProducerFactory<Object, Object> producerFactory(MessageCounter messageCounter) {
        Map<String, Object> configs = this.kafkaProperties.buildProducerProperties();
        configs.put("interceptor.classes", Collections.singletonList(RecordSuccessfullySentInterceptor.class));
        configs.put("message.counter.bean", messageCounter);
        DefaultKafkaProducerFactory<Object, Object> producerFactory = new DefaultKafkaProducerFactory<>(configs);
        producerFactory.setTransactionIdPrefix(this.transactionId);
        return producerFactory;
    }

    @Bean
    public MeterRegistry meterRegistry() {
        return new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);
    }

    @Bean({"retryProducerFactory"})
    public ProducerFactory<Object, Object> retryProducerFactory(MessageCounter messageCounter) {
        Map<String, Object> configs = this.kafkaProperties.buildProducerProperties();
        configs.put("interceptor.classes", Collections.singletonList(RecordSuccessfullySentInterceptor.class));
        configs.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        configs.put("message.counter.bean", messageCounter);
        DefaultKafkaProducerFactory<Object, Object> producerFactory = new DefaultKafkaProducerFactory<>(configs);
        producerFactory.setTransactionIdPrefix(this.transactionId);
        return producerFactory;
    }
It may also be that some transitive dependency is interfering with Spring Boot's auto-configuration of the Kafka-related metrics, though.
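As a hedged sketch rather than a confirmed fix: one thing worth trying is to drop the dependency's own MeterRegistry bean entirely and rely on the registry Spring Boot auto-configures, the assumption being that replacing Boot's registry bean is what disconnects the auto-configured Kafka meters from the registry exposed at the actuator endpoint. Registry-wide tweaks can then go through a MeterRegistryCustomizer; the class name MetricsCustomizationConfig and the tag value "my-service" below are placeholders:

import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.boot.actuate.autoconfigure.metrics.MeterRegistryCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MetricsCustomizationConfig {

    // Customize the auto-configured registry instead of declaring a new MeterRegistry bean,
    // so the Kafka meters that Spring Boot binds stay visible at the actuator endpoint.
    @Bean
    public MeterRegistryCustomizer<MeterRegistry> commonTags() {
        return registry -> registry.config().commonTags("application", "my-service");
    }
}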

JMS with spring boot, sender and receiver on same package: what is its use?

I am learning JMS with Spring Boot, and it is nice to know that Spring Boot comes with an embedded ActiveMQ JMS broker.
I started from the Spring page on how to achieve this and it works like a charm. Then I went a little further and created two separate Spring Boot applications, one containing the JMS sender code and the other containing the receiver code.
When I started them, one application failed because both applications were using the same port for JMS. I fixed this by including this in one application:
@Bean
public BrokerService broker() throws Exception {
    final BrokerService broker = new BrokerService();
    broker.addConnector("tcp://localhost:61616");
    broker.addConnector("vm://localhost");
    broker.setPersistent(false);
    return broker;
}
But now the sender sends messages successfully while the receiver does nothing. I searched Stack Overflow and looked at this and this, which say:
If you want to use JMS in production, it would be much wiser to avoid using Spring Boot embedded JMS brokers and host it separately. So 3 node setup would be preferred for PROD.
So my questions are:
1. What is the purpose of putting both the JMS sender and receiver in the same application? Is there a practical example?
2. Is it really not possible to use the Spring Boot embedded JMS broker to let two separate applications communicate?
You might have sender and receiver in the same application if requests arrive in bursts and you want to save them somewhere before they are processed, in case of a server crash. You typically still wouldn't use an embedded broker for that.
Embedded brokers are usually used for testing only.
You can, however, run an embedded broker that is accessible externally; simply fire up a BrokerService as you have, but the other app needs to connect with the tcp://... address, not the vm://... one.
EDIT
App1:
@SpringBootApplication
@RestController
public class So52654109Application {

    public static void main(String[] args) {
        SpringApplication.run(So52654109Application.class, args);
    }

    @Bean
    public BrokerService broker() throws Exception {
        final BrokerService broker = new BrokerService();
        broker.addConnector("tcp://localhost:61616");
        broker.setPersistent(false);
        broker.start();
        return broker;
    }

    @Autowired
    private JmsTemplate template;

    @RequestMapping(path = "/foo/{id}")
    public String foo(@PathVariable String id) {
        template.convertAndSend("someQueue", id);
        return id + ": thank you for your request, we'll send an email to the address on file when complete";
    }
}
App2:
application.properties
spring.activemq.broker-url=tcp://localhost:61616
and
@SpringBootApplication
public class So526541091Application {

    public static void main(String[] args) {
        SpringApplication.run(So526541091Application.class, args);
    }

    @JmsListener(destination = "someQueue")
    public void process(String id) {
        System.out.println("Processing request for " + id);
    }
}
Clearly, for a simple app like this you might just run the listener in the first app.
However, since there is no persistence of messages with this configuration, you would likely use an external broker for a production app (or enable persistence).
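If you did want to keep the embedded broker and survive restarts, a minimal sketch of enabling persistence in App1's broker bean follows; the configuration class name and the data directory are placeholders, and it assumes the default file-based message store is acceptable:

import org.apache.activemq.broker.BrokerService;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PersistentBrokerConfig {

    @Bean
    public BrokerService broker() throws Exception {
        final BrokerService broker = new BrokerService();
        broker.addConnector("tcp://localhost:61616");
        broker.setPersistent(true);                       // keep queued messages across restarts
        broker.setDataDirectory("target/activemq-data");  // placeholder location for the message store
        broker.start();
        return broker;
    }
}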

Activemq web console in Spring

I am creating an embedded ActiveMQ broker in a Spring application like this:
@EnableJms
@Configuration
public class MqConfig {

    @Bean
    public BrokerService broker() throws Exception {
        BrokerService broker = new BrokerService();
        broker.setBrokerName("ETL");
        broker.addConnector("tcp://localhost:61616");
        broker.start();
        return broker;
    }
}
How can I embed the ActiveMQ Admin Web console in the same Spring application? I searched the net for 2-3 hours but found no useful answer. The only thing the ActiveMQ site mentions is this: http://activemq.apache.org/web-console.html, but it is not helpful with Spring.

XA transactions using spring and activemq

I am trying to prove that I don't need the ActiveMQ RAR deployed to JBoss EAP 6.3 in order to use XA transactions; I'd like to use just the ActiveMQ client JAR. I created a simple spring-boot project and exposed a method via a RESTful web service. The following code correctly rolls back if I have the ActiveMQ RAR deployed:
@Transactional
public void work() throws Exception {
    ConnectionFactory connectionFactory = (ConnectionFactory) this.context
            .getBean("jmsConnectionFactory");
    // Send a message
    MessageCreator messageCreator = new MessageCreator() {
        @Override
        public Message createMessage(Session session) throws JMSException {
            return session.createTextMessage("Test message!");
        }
    };
    JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory);
    System.out.println("Sending a new message.");
    jmsTemplate.send("test-destination", messageCreator);
    throw new Exception("Something bad happened!!");
}
However, when I create my own ConnectionFactory via JNDI, the code doesn't roll back and the message still gets sent.
@Transactional
public void work() throws Exception {
    ConnectionFactory connectionFactory = null;
    try {
        Context ctx = new InitialContext();
        connectionFactory = (ConnectionFactory) ctx.lookup("java:/ConnectionFactory");
    } catch (NamingException e) {
        e.printStackTrace();
        throw new RuntimeException(e);
    }
    // Send a message
    MessageCreator messageCreator = new MessageCreator() {
        @Override
        public Message createMessage(Session session) throws JMSException {
            return session.createTextMessage("Test message!");
        }
    };
    JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory);
    System.out.println("Sending a new message.");
    jmsTemplate.send("test-destination", messageCreator);
    throw new Exception("Something bad happened!!");
}
What I'd like to understand is what spring-boot does at boot time to provide XA support when the ActiveMQ RAR is deployed as a resource adapter on EAP. If I understand that, I think I should be able to just package the ActiveMQ client JAR and my database JAR in my Spring app (not spring-boot based) and still provide XA support, i.e. get Spring to manage the XA transactions by delegating to the PlatformTransactionManager.
Any help would be appreciated.
Thanks, Will
Spring Boot is all about conventions. You need to indicate to Spring Boot that you are connecting to an XA connection factory.
According to the Spring Boot documentation (32.3 Using a Java EE managed transaction manager):
Spring Boot will attempt to auto-configure JMS by looking for a ConnectionFactory at the JNDI path java:/JmsXA or java:/XAConnectionFactory
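In other words, the second snippet looks up the plain java:/ConnectionFactory, which the container does not enlist in the JTA transaction. A minimal sketch of the alternative, assuming an XA-capable, container-managed factory is bound at the standard JBoss EAP location java:/JmsXA (the same path the documentation quote above mentions); the class and bean names are placeholders:

import javax.jms.ConnectionFactory;
import javax.naming.NamingException;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jndi.JndiTemplate;

@Configuration
public class XaJmsConfig {

    // Look up the container-managed, XA-capable connection factory bound by the
    // resource adapter, instead of the non-XA java:/ConnectionFactory used above.
    @Bean
    public ConnectionFactory xaConnectionFactory() throws NamingException {
        return new JndiTemplate().lookup("java:/JmsXA", ConnectionFactory.class);
    }
}

A JmsTemplate built on that factory, combined with a JtaTransactionManager, should then roll the send back together with the rest of the transaction.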

Ensuring Spring Integration deployment's JMS listener threads are cleaned up on Tomcat undeploy

I have a simple Spring Integration application which runs on Tomcat (v7.0.x) and consumes messages off a WebSphere MQ queue. When I un-deploy the WAR from the Tomcat server, the WAR un-deploys okay, but a JMS listener thread is left running on the Tomcat server and will still consume messages off the WebSphere MQ queue. I am therefore assuming that I am not handling the JMS listener clean-up part of the application properly?
Here is the stack I am using:
Java 8
Tomcat 7.0.55
Spring Integration 4.0.4
Spring Integration Java Dsl 1.0.0.M3
In terms of my SI application's configurations, I have a JmsConfig class:
@Configuration
@ComponentScan
public class JmsConfig {

    @Autowired
    private Properties jndiProperties;

    private ConnectionFactory mqConnectionFactory() throws NamingException {
        Context ctx = new InitialContext(jndiProperties);
        try {
            MQQueueConnectionFactory connectionFactory = (MQQueueConnectionFactory)
                    ctx.lookup("jms/service/SERVICE_QCF");
            return connectionFactory;
        } finally {
            ctx.close();
        }
    }

    @Bean
    public ConnectionFactory cachingConnectionFactory() throws NamingException {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory();
        connectionFactory.setTargetConnectionFactory(mqConnectionFactory());
        connectionFactory.setSessionCacheSize(10);
        return connectionFactory;
    }
}
I have an Integration config class:
@Configuration
@EnableIntegration
public class IntegrationConfig {

    @Autowired
    private ConnectionFactory cachingConnectionFactory;

    @Bean
    public IntegrationFlow requestFlow() {
        return IntegrationFlows
                .from(Jms.inboundAdapter(cachingConnectionFactory).destination(
                        "SERVICE_QUEUE_NAME"), c -> {
                    c.poller(Pollers.fixedRate(100));
                })
                .channel("request.service.ch").get();
    }
}
Web Initialiser config class:
@Configuration
public class WebInitialiser implements WebApplicationInitializer {

    @Override
    public void onStartup(ServletContext servletContext) throws ServletException {
        AnnotationConfigWebApplicationContext rootContext =
                new AnnotationConfigWebApplicationContext();
        rootContext.register(ApplicationConfig.class, JmsConfig.class,
                IntegrationConfig.class, DatabaseConfig.class);
        servletContext.addListener(new ContextLoaderListener(rootContext));
    }
}
During the un-deploy stage I see the following in the catalina logs which may or may not be related:
SEVERE: The web application [/service-a] appears to have started a thread named [Thread-7] but has failed to stop it. This is very likely to create a memory leak.
Is there anything that I have not yet set, configured, or annotated in order to ensure that the deployment's JMS listener thread is cleaned up from Tomcat's JVM during the WAR's un-deploy stage?
Thanks in advance,
PM.
To ensure that the JMS listener threads are cleaned up when the application is un-deployed, I simply created a CachingConnectionFactory bean whose targetConnectionFactory is the MQConnectionFactory. Then, in the Spring Integration flows, I pass the cachingConnectionFactory bean to the JMS adapters instead. I've updated the configs in this post to show this. Cheers, PM.
