Persist message in ActiveMQ across server restart - jms

I am learning Spring Integration with JMS, using ActiveMQ as the message broker. I am referring to the project given here -> http://www.javaworld.com/article/2142107/spring-framework/open-source-java-projects-spring-integration.html?page=2#
But I want to know how I can persist messages in ActiveMQ. I started ActiveMQ and then sent a request using a REST client. I am calling publishService.send( message ); in a for loop 50 times, and at the receiver end I have a sleep of 10 seconds, so that 50 messages get queued and are processed at 10-second intervals.
EDIT:
Look at the screen shot below:
It says 50 messages enqueued and 5 of them have been dequeued.
In between, I stopped the ActiveMQ server after it had consumed 5 of the 50 messages, and then restarted it.
I was expecting it to show the remaining 45 in the Messages Enqueued column, but I can see 0 there (see screenshot below); they all vanished after the server restart instead of the remaining 45 messages being persisted. How can I overcome this problem?
Please have a look at the configuration below:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-jms="http://www.springframework.org/schema/integration/jms"
xmlns:oxm="http://www.springframework.org/schema/oxm"
xmlns:int-jme="http://www.springframework.org/schema/integration"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-2.5.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/jms http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd
http://www.springframework.org/schema/oxm http://www.springframework.org/schema/oxm/spring-oxm-3.0.xsd">
<!-- Component scan to find all Spring components -->
<context:component-scan base-package="com.geekcap.springintegrationexample" />
<bean class="org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter">
<property name="order" value="1" />
<property name="messageConverters">
<list>
<!-- Default converters -->
<bean class="org.springframework.http.converter.StringHttpMessageConverter"/>
<bean class="org.springframework.http.converter.FormHttpMessageConverter"/>
<bean class="org.springframework.http.converter.ByteArrayHttpMessageConverter" />
<bean class="org.springframework.http.converter.xml.SourceHttpMessageConverter"/>
<bean class="org.springframework.http.converter.BufferedImageHttpMessageConverter"/>
<bean class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter" />
</list>
</property>
</bean>
<!-- Define a channel to communicate out to a JMS Destination -->
<int:channel id="topicChannel"/>
<!-- Define the ActiveMQ connection factory -->
<bean id="connectionFactory" class="org.apache.activemq.spring.ActiveMQConnectionFactory">
<property name="brokerURL" value="tcp://localhost:61616"/>
</bean>
<!--
Define an adapter that routes topicChannel messages to the myTopic topic; the outbound-channel-adapter
automatically finds the configured connectionFactory bean (by naming convention)
-->
<int-jms:outbound-channel-adapter channel="topicChannel"
destination-name="topic.myTopic"
pub-sub-domain="true" />
<!-- Create a channel for a listener that will consume messages-->
<int:channel id="listenerChannel" />
<int-jms:message-driven-channel-adapter id="messageDrivenAdapter"
channel="getPayloadChannel"
destination-name="topic.myTopic"
pub-sub-domain="true" />
<int:service-activator input-channel="listenerChannel" ref="messageListenerImpl" method="processMessage" />
<int:channel id="getPayloadChannel" />
<int:service-activator input-channel="getPayloadChannel" output-channel="listenerChannel" ref="retrievePayloadServiceImpl" method="getPayload" />
</beans>
Also please see the code:
Controller from which I am sending the message in a for loop:
@Controller
public class MessageController
{
@Autowired
private PublishService publishService;
@RequestMapping( value = "/message", method = RequestMethod.POST )
@ResponseBody
public void postMessage( @RequestBody com.geekcap.springintegrationexample.model.Message message, HttpServletResponse response )
{
for(int i = 0; i < 50; i++){
// Publish the message
publishService.send( message );
// Set the status to 201 because we created a new message
response.setStatus( HttpStatus.CREATED.value() );
}
}
}
Consumer code to which I have applied the sleep timer:
@Service
public class MessageListenerImpl
{
private static final Logger logger = Logger.getLogger( MessageListenerImpl.class );
public void processMessage( String message )
{
try {
Thread.sleep(10000);
logger.info( "Received message: " + message );
System.out.println( "MessageListener::::::Received message: " + message );
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
Searching further, I found here that, as per the JMS specification, the default delivery mode is persistent. But in my case it does not seem to work.
Please help me to get the right configuration in place so that messages are persisted across a broker failure.

This is typically not a problem, but the way ActiveMQ was built.
You can find the following explanation in the book 'ActiveMQ in Action':
Once a message has been consumed and acknowledged by a message
consumer, it’s typically deleted from the broker’s message store.
So when you restart your server, it only shows you the messages that are in the broker's message store. In most cases you will never need to look at the processed messages.
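For the unconsumed messages that disappeared, note that the flow in question publishes to a topic (pub-sub-domain="true"), and a broker only stores topic messages on behalf of durable subscribers. As a hedged sketch (this is not the original article's code; the client id and subscription name are assumptions), the consuming side could be backed by a durable subscription using Spring's listener container:
import javax.jms.ConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;
public class DurableTopicContainerFactory {
    public static DefaultMessageListenerContainer durableContainer(ConnectionFactory connectionFactory) {
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(connectionFactory);
        container.setDestinationName("topic.myTopic");
        container.setPubSubDomain(true);                  // the destination is a topic
        container.setClientId("durableClient");           // a client id is required for durable subscriptions
        container.setSubscriptionDurable(true);           // the broker keeps messages for this subscriber while it is offline
        container.setDurableSubscriptionName("durableSub");
        return container;
    }
}
Combined with persistent delivery (the JMS default the question already cites), messages that have been published but not yet acknowledged should then survive a broker restart. The container could be referenced from the int-jms:message-driven-channel-adapter through its container attribute, if your Spring Integration version supports wiring in an external container.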
Hope this helps!
Good luck!

Related

Delete File after successful persist to MongoDB in Spring Integration

I have a Spring Integration flow that reads a csv file from a directory, splits the lines, then processes each line and extracts 2 objects from each line. These two objects are then sent to two separate int-mongodb:outbound-channel-adapters. I want to delete the incoming file after all of the lines have been processed and persisted. I have seen examples of using the Transaction Manager to do this with the inbound adapter, but nothing with the outbound adapter. Is there a way to do this?
My config looks something like this:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/file http://www.springframework.org/schema/integration/file/spring-integration-file.xsd
http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd
http://www.springframework.org/schema/integration/mongodb http://www.springframework.org/schema/integration/mongodb/spring-integration-mongodb.xsd
http://www.springframework.org/schema/data/mongo http://www.springframework.org/schema/data/mongo/spring-mongo.xsd"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-file="http://www.springframework.org/schema/integration/file"
xmlns:task="http://www.springframework.org/schema/task"
xmlns:int-mongodb="http://www.springframework.org/schema/integration/mongodb"
xmlns:mongo="http://www.springframework.org/schema/data/mongo">
<int:poller default="true" fixed-delay="50"/>
<int-file:inbound-channel-adapter id="filesInChannel"
directory="file:${file.ingest.directory}"
auto-create-directory="true">
<int:poller id="poller" fixed-rate="100">
</int:poller>
</int-file:inbound-channel-adapter>
<task:executor id="executor" pool-size="10" queue-capacity="50" />
<int:channel id="executorChannel">
<int:queue capacity="50"/>
</int:channel>
<int:splitter input-channel="filesInChannel" output-channel="executorChannel"
expression="T(org.apache.commons.io.FileUtils).lineIterator(payload)"/>
<int:service-activator id="lineParserActivator" ref="lineParser" method="parseLine"
input-channel="executorChannel" output-channel="lineChannel">
<int:poller task-executor="executor" fixed-delay="500">
</int:poller>
</int:service-activator>
<bean name="lineParser" class="com.xxx.LineParser"/>
<int:channel id="lineChannel">
<int:queue/>
</int:channel>
<int:channel id="lineMongoOutput">
<int:queue/>
</int:channel>
<int:channel id="actionMongoOutput">
<int:queue/>
</int:channel>
<int:transformer input-channel="lineChannel" output-channel="lineMongoOutput">
<bean id="lineTransformer" class="com.xxx.transformer.LineTransformer"></bean>
</int:transformer>
<int:transformer input-channel="lineChannel" output-channel="actionMongoOutput">
<bean id="actionTransformer" class="com.xxx.transformer.ActionTransformer"></bean>
</int:transformer>
<mongo:db-factory id="mongoDbFactory" dbname="${mongo.db.name}" password="${mongo.db.pass}" username="${mongo.db.user}" port="${mongo.db.port}" host="${mongo.db.host}"/>
<int-mongodb:outbound-channel-adapter id="lineMongoOutput"
collection-name="full"
mongodb-factory="mongoDbFactory" />
<int-mongodb:outbound-channel-adapter id="actionMongoOutput"
collection-name="action"
mongodb-factory="mongoDbFactory" />
</beans>
You can't really do it on the outbound adapter because you don't know when you're "done". Given you are asynchronously handing off to the downstream flow (via executors and queue channels), you can't do it on the inbound adapter either, because the poller thread will return to the adapter as soon as all the splits are sent.
Aside from that, I see some issues in your flow:
You seem to have an excessive amount of thread handoffs - you really don't need queue channels in the downstream flow because your executions are already controlled by the executor on the executorChannel poller.
It is quite unusual to make every channel a QueueChannel.
Finally, you have 2 transformers subscribed to the same channel.
Do you realize that messages sent to lineChannel will alternate between them round-robin style?
Perhaps that is your intent, given your description, but it seems a little brittle to me; I would prefer to see the different data types going to different channels.
If you avoid using queue channels, and use gateways within your service activator to send out the data to the mongo adapters, your service activator would know when it is complete and be able to remove the file at that time.
EDIT:
Here is one solution (it writes to logs rather than mongo, but you should get the idea)...
<int-file:inbound-channel-adapter directory="/tmp/foo" channel="toSplitter">
<int:poller fixed-delay="1000">
<int:transactional synchronization-factory="sf" transaction-manager="ptxMgr" />
</int:poller>
</int-file:inbound-channel-adapter>
<int:transaction-synchronization-factory id="sf">
<int:after-commit expression="payload.delete()" />
<int:after-rollback expression="payload.renameTo(new java.io.File('/tmp/bad/' + payload.name))" />
</int:transaction-synchronization-factory>
<bean id="ptxMgr" class="org.springframework.integration.transaction.PseudoTransactionManager" />
<int:splitter input-channel="toSplitter" output-channel="processChannel">
<bean class="org.springframework.integration.file.splitter.FileSplitter" />
</int:splitter>
<int:service-activator input-channel="processChannel">
<bean class="foo.Foo">
<constructor-arg ref="gate" />
</bean>
</int:service-activator>
<int:gateway id="gate" service-interface="foo.Foo$Gate">
<int:method name="toLine" request-channel="toLine" />
<int:method name="toAction" request-channel="toAction" />
</int:gateway>
<int:channel id="toLine" />
<int:logging-channel-adapter channel="toLine" expression="'LINE:' + payload" level="WARN"/>
<int:channel id="toAction" />
<int:logging-channel-adapter channel="toAction" expression="'ACTION:' + payload" level="WARN"/>
.
public class Foo {
private final Gate gateway;
public Foo(Gate gateway) {
this.gateway = gateway;
}
public void parse(String payload) {
String[] split = payload.split(",");
if (split.length != 2) {
throw new RuntimeException("Bad row size: " + split.length);
}
this.gateway.toLine(split[0]);
this.gateway.toAction(split[1]);
}
public interface Gate {
void toLine(String line);
void toAction(String action);
}
}
.
@ContextConfiguration
@RunWith(SpringJUnit4ClassRunner.class)
public class FooTests {
@Test
public void testGood() throws Exception {
File file = new File("/tmp/foo/x.txt");
FileOutputStream fos = new FileOutputStream(file);
fos.write("foo,bar".getBytes());
fos.close();
int n = 0;
while(n++ < 100 && file.exists()) {
Thread.sleep(100);
}
assertFalse(file.exists());
}
@Test
public void testBad() throws Exception {
File file = new File("/tmp/foo/y.txt");
FileOutputStream fos = new FileOutputStream(file);
fos.write("foo".getBytes());
fos.close();
int n = 0;
while(n++ < 100 && file.exists()) {
Thread.sleep(100);
}
assertFalse(file.exists());
file = new File("/tmp/bad/y.txt");
assertTrue(file.exists());
file.delete();
}
}
Add a task executor to the <poller/> to process multiple files concurrently. Add a router as needed.
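As a hedged sketch of that suggestion (the pool size and delay are illustrative), the poller on the inbound adapter can be given a task executor, reusing the task:executor element already present in the question's configuration (the task namespace must be declared as it is there):
<task:executor id="fileExecutor" pool-size="10" />
<int-file:inbound-channel-adapter directory="/tmp/foo" channel="toSplitter">
    <int:poller fixed-delay="1000" task-executor="fileExecutor">
        <int:transactional synchronization-factory="sf" transaction-manager="ptxMgr" />
    </int:poller>
</int-file:inbound-channel-adapter>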

Spring amqp not publishing message to the queue but to Exchange

I am trying to test and benchmark spring-amqp for RabbitMQ with multiple queues, so I am creating a RabbitTemplate for each queue and using it to send messages. The send is successful and I can see a message published to the exchange, but I don't see anything in the queue. I am guessing it's a very minor setting, but I can't figure it out.
This is my applicationContext.xml
<bean id="banchmarkConnectionFactory" class="org.springframework.amqp.rabbit.connection.CachingConnectionFactory">
<constructor-arg ref="benchmarkAmqpHost"/>
<property name="username" ref="benchmarkAmqpUser"/>
<property name="password" ref="benchmarkAmqpPass"/>
<property name="virtualHost" ref="benchmarkAmqpVHost"/>
<property name="channelCacheSize" value="10"/>
</bean>
<rabbit:template id="benchmarkAmqpTemplate"
connection-factory="banchmarkConnectionFactory"
exchange="my_exchange"
queue="BenchmarkQueue"
routing-key="BenchmarkQueue" />
<rabbit:admin connection-factory="banchmarkConnectionFactory"/>
<rabbit:queue name="BenchmarkQueue" auto-delete="true" durable="false" auto-declare="true"/>
This is my code which uses the benchmarkAmqpTemplate to publish to the queue.
public class publishMessage {
@Autowired
private RabbitTemplate benchmarkAmqpTemplate;
protected void publish(String payload) {
benchmarkAmqpTemplate.setQueue("BenchmarkQueue");
benchmarkAmqpTemplate.convertAndSend("my_exchange", "BenchmarkQueue", payload);
}
}
When I used the HelloWorld example it did publish a message in the queue so was wondering if I am doing something wrong.
UPDATE
I was able to solve this by adding a direct-exchange tag in my context XML. My full XML looks like this:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:rabbit="http://www.springframework.org/schema/rabbit"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd http://www.springframework.org/schema/rabbit http://www.springframework.org/schema/rabbit/spring-rabbit.xsd">
<bean id="banchmarkConnectionFactory" class="org.springframework.amqp.rabbit.connection.CachingConnectionFactory">
<constructor-arg ref="benchmarkAmqpHost"/>
<property name="username" ref="benchmarkAmqpUser"/>
<property name="password" ref="benchmarkAmqpPass"/>
<property name="virtualHost" ref="benchmarkAmqpVHost"/>
<property name="channelCacheSize" value="10"/>
</bean>
<rabbit:template id="benchmarkAmqpTemplate"
connection-factory="banchmarkConnectionFactory"
exchange="my_exchange"
queue="BenchmarkQueue"
routing-key="BenchmarkQueue" />
<rabbit:admin connection-factory="banchmarkConnectionFactory"/>
<rabbit:queue name="BenchmarkQueue" auto-delete="true" durable="false" auto-declare="true"/>
<rabbit:direct-exchange name="my_exchange">
<rabbit:bindings>
<rabbit:binding queue="BenchmarkQueue" key="BenchmarkQueue" />
</rabbit:bindings>
</rabbit:direct-exchange>
</beans>
Sorry, but it looks like you have misunderstood the AMQP protocol a bit.
The message is published to the exchange with the proper routingKey.
The publisher (RabbitTemplate) doesn't need to know about queues at all.
The queue is part of the receiver side; a consumer subscribes to the queue.
There is one more feature in between - binding. The queue is bound to the Exchange under the appropriate routingKey. One queue can be bound to several exchanges with different routing keys. By default all queues are bound to the default exchange ("") with routingKeys equal to their names.
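As a hedged illustration of both points (this is my sketch rather than part of the original answer; the names come from the question and the methods are standard Spring AMQP API), you can either send through the default exchange using the queue name as the routing key, or declare the binding in Java the same way the UPDATE does in XML:
import org.springframework.amqp.core.BindingBuilder;
import org.springframework.amqp.core.DirectExchange;
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.core.RabbitAdmin;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
public class BindingSketch {
    // Option 1: the default exchange ("") routes by queue name, so no explicit binding is needed.
    void sendViaDefaultExchange(RabbitTemplate template, String payload) {
        template.convertAndSend("", "BenchmarkQueue", payload);
    }
    // Option 2: bind BenchmarkQueue to my_exchange explicitly, mirroring the rabbit:direct-exchange XML above.
    void declareBinding(RabbitAdmin admin) {
        Queue queue = new Queue("BenchmarkQueue");
        DirectExchange exchange = new DirectExchange("my_exchange");
        admin.declareQueue(queue);
        admin.declareExchange(exchange);
        admin.declareBinding(BindingBuilder.bind(queue).to(exchange).with("BenchmarkQueue"));
    }
}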
Please refer to the RabbitMQ site for more info.

ActiveMQ, Camel, Spring, simple route not working

I am having some trouble with a simple Camel route that should be grabbing messages from an ActiveMQ topic that I have and then printing out the contents of the messages to the console through the use of log.
Right now all I have is the camel-context.xml and a Java class that produces the topic in ActiveMQ and adds a simple string message to it. I am using the ActiveMQ interface to check that the topic is being created, and it is, but my message is not being added to the topic, nor is it being routed through the Camel route. Running main I can see the output of my sysout on the console, and I see that 1 message is "enqueued" and 1 message is "dequeued" in the ActiveMQ interface. I just do not get any output to the console from the "log message" in my route.
Any help or tips would be greatly appreciated since I am new to all 3 of these technologies, and I just want to get this simple "Hello World" working.
Thank you! The two files are found below:
After further testing, I think it just has something to do with the way I am trying to log the contents of the message, because I know my Camel route is being picked up: I added a second topic and told the Camel route to route the messages to it, like the following:
to uri="activemq:topic:My.SecondTestTopic"
and I am able to see it being redirected to that topic in the ActiveMQ interface.
TestMessageProducer.java
package com.backend;
import javax.jms.*;
import org.apache.activemq.ActiveMQConnection;
import org.apache.activemq.ActiveMQConnectionFactory;
public class TestMessageProducer {
private static String url = ActiveMQConnection.DEFAULT_BROKER_URL;
public static void main(String[] args) throws JMSException {
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory(url);
Connection connection = connectionFactory.createConnection();
connection.start();
Session session = connection.createSession(false,
Session.AUTO_ACKNOWLEDGE);
Topic topic = session.createTopic("My.TestTopic");
MessageProducer producer = session.createProducer(topic);
TextMessage message = session.createTextMessage();
message.setText("THIS IS A TEST TEXT MESSAGE BEING SENT TO THE TOPIC AND HOPEFULLY BEING PICKED UP BY THE" +
"CAMEL ROUTE");
producer.send(message);
System.out.println("Sent message '" + message.getText() + "'");
connection.close();
}
}
Camel-context.xml
<?xml version="1.0" encoding="UTF-8"?>
<spring:beans xmlns:spring="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://camel.apache.org/schema/spring"
xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
xmlns:alch="http://service.alchemy.kobie.com/"
xsi:schemaLocation="http://www.springframework.org/schema/beans classpath:META-INF/spring/spring-beans.xsd
http://camel.apache.org/schema/spring classpath:META-INF/spring/camel-spring.xsd">
<!-- load properties -->
<spring:bean
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<spring:property name="locations" value="file:backend.properties" />
</spring:bean>
<spring:bean id="properties"
class="org.apache.camel.component.properties.PropertiesComponent">
<spring:property name="location" value="file:backend.properties" />
</spring:bean>
<spring:bean id="jmsConnectionFactory"
class="org.apache.activemq.ActiveMQConnectionFactory">
<spring:property name="brokerURL" value="tcp://0.0.0.0:61616?useLocalHost=true" />
</spring:bean>
<spring:bean id="pooledConnectionFactory"
class="org.apache.activemq.pool.PooledConnectionFactory">
<spring:property name="maxConnections" value="8" />
<spring:property name="maximumActive" value="500" />
<spring:property name="connectionFactory" ref="jmsConnectionFactory" />
</spring:bean>
<spring:bean id="jmsConfig"
class="org.apache.camel.component.jms.JmsConfiguration">
<spring:property name="connectionFactory" ref="pooledConnectionFactory"/>
<spring:property name="transacted" value="false"/>
<spring:property name="concurrentConsumers" value="1"/>
</spring:bean>
<spring:bean id="activemq"
class="org.apache.activemq.camel.component.ActiveMQComponent">
<spring:property name="configuration" ref="jmsConfig"/>
</spring:bean>
<!-- camel configuration -->
<camelContext xmlns="http://camel.apache.org/schema/spring">
<route>
<from uri="activemq:topic:My.TestTopic"/>
<log message="Output of message from Queue: ${in.body}"/>
<to uri="activemq:topic:My.SecondTestTopic" />
</route>
</camelContext>
</spring:beans>
Have you started the Camel application first? You are sending non-persistent messages to a topic, and if there are no active subscribers when sending, then the messages are not received by anybody. You may want to use a persistent durable topic subscription instead.
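As a hedged sketch (the clientId and durableSubscriptionName values are illustrative; both are standard options on Camel JMS/ActiveMQ endpoints), the consuming side of the route could be declared as a durable subscription so that messages published while the route is down are kept by the broker:
<from uri="activemq:topic:My.TestTopic?clientId=testClient&amp;durableSubscriptionName=testDurableSub"/>
The producer also needs to send persistent messages (the JMS default) for the broker to retain them for the offline subscriber.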
I suspect you are creating two separate instances of an ActiveMQ broker. Can you update your TestMessageProducer to use the URL tcp://localhost:61616? Also, can you use jconsole to check the topic activity in the ActiveMQ instance on both VMs?
=== UPDATE ===
I missed the bit about you verifying that the 2nd topic did receive the message, so your route is working.... Must be the logger. If you have the camel source code in your IDE, you could turn the debugger on and place a break point on
org.apache.camel.component.log.LogProducer
.process(Exchange exchange, AsyncCallback callback)
to see what happens and if it is called. Which logging package do you have configured?

Spring integration: Receiving JMS messages via adapter randomly fails

I am trying to write channel adapter for JMS queue. I'd like to send message to JMS queue and receive it on Spring Integration's channel.
When I use plain JMS (with ActiveMQ), everything works properly, so I assume the bug is somewhere in my Spring code.
So here's my Spring config file:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:jms="http://www.springframework.org/schema/integration/jms"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration
http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/jms
http://www.springframework.org/schema/integration/jms/spring-integration-jms.xsd">
<!-- jms beans -->
<bean id="jms.msgQueue" class="org.apache.activemq.command.ActiveMQQueue">
<constructor-arg value="MSG_QUEUE" />
</bean>
<bean name="jms.connectionFactory" class="org.apache.activemq.ActiveMQConnectionFactory">
<property name="brokerURL" value="tcp://localhost:61616" />
</bean>
<!-- spring integration beans -->
<int:channel id="channels.jms.allMessages">
<int:queue capacity="1000" />
</int:channel>
<jms:inbound-channel-adapter id="adapters.jms.msgAdapter"
connection-factory="jms.connectionFactory"
destination="jms.msgQueue"
channel="channels.jms.allMessages">
<int:poller fixed-rate="500" />
</jms:inbound-channel-adapter>
</beans>
Here is plain JMS sending-receiving code that works without problems:
@Test
public void testPlainJms() throws JMSException {
MessageProducer producer = session.createProducer(msgQueue);
MessageConsumer consumer = session.createConsumer(msgQueue);
// send to JMS queue
TextMessage textMessage = session.createTextMessage();
textMessage.setText("Message from JMS");
producer.send(textMessage);
connection.start();
javax.jms.Message message = consumer.receive(TIMEOUT);
assertEquals("Message from JMS", ((TextMessage) message).getText());
connection.stop();
}
And here is code using Spring's MessageChannel that usually doesn't work (it sometimes does but I am not able to determine when):
@Test
public void testInboundAdapter() throws JMSException {
MessageProducer producer = session.createProducer(msgQueue);
// send to JMS queue
TextMessage textMessage = session.createTextMessage();
textMessage.setText("Message from JMS");
producer.send(textMessage);
// receive in local channel (using inbound adapter)
Message<?> received = ((PollableChannel) msgChannel).receive(TIMEOUT);
String payload = (String) received.getPayload();
assertEquals("Message from JMS", payload);
}
I am getting a NullPointerException when receiving a message from the pollable msgChannel. Here's how I autowired beans from the Spring config into my test class:
@Autowired @Qualifier("jms.msgQueue")
Queue msgQueue;
@Autowired @Qualifier("channels.jms.allMessages")
MessageChannel msgChannel;
@Autowired
ConnectionFactory connectionFactory;
Strange; I am able to reproduce your issue. It could be a bug with the inbound-channel-adapter. I have something that consistently works, though, by changing from an inbound-channel-adapter to a message-driven-channel-adapter. Try this:
<jms:message-driven-channel-adapter id="adapters.jms.msgAdapter"
connection-factory="jms.connectionFactory"
destination="jms.msgQueue"
channel="channels.jms.allMessages" />
It fails when it times out.
Message<?> received = ((PollableChannel) msgChannel).receive(TIMEOUT);
You must check received for null.
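A minimal sketch of that check in the test above (assertNotNull comes from JUnit's org.junit.Assert; the failure message is illustrative):
Message<?> received = ((PollableChannel) msgChannel).receive(TIMEOUT);
assertNotNull("no message received within " + TIMEOUT + " ms", received);
String payload = (String) received.getPayload();
assertEquals("Message from JMS", payload);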

Use of Spring Framework for RabbitMQ is reducing the performance

I have created a producer which was using com.rabbitmq.client.ConnectionFactory and was sending 1,000,000 messages (40 bytes) in 100 seconds.
But now I want a Spring abstraction. I was unable to use com.rabbitmq.client.ConnectionFactory; instead I had to use org.springframework.amqp.rabbit.connection.SingleConnectionFactory. Using this connection factory, only 100,000 messages (40 bytes) are sent to the broker in 100 seconds.
Does anybody know from experience why the performance is reduced so much (around 90%)?
The code using "import com.rabbitmq.client.ConnectionFactory;" is:
package Multiple_queues_multiple_consumers;
import java.io.IOException;
import com.rabbitmq.client.AMQP;
import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
public class Producer {
private static Connection myConnection;
private static Channel myChannel;
public static String myQueueName;
public static void main(String[] args) throws IOException {
long startTime=0;
int count=0;
ConnectionFactory myFactory=new ConnectionFactory();
myFactory.setHost("localhost");
try {
myConnection = myFactory.newConnection();
myChannel = myConnection.createChannel();
String myExchange = "wxyzabc";
String myBody = "This is a message : message numberxxxxxx";
String myRoutingKey = "RoutingKey";
myQueueName = "new_Queue";
myChannel.exchangeDeclare(myExchange, "direct", true, false, null);
myChannel.queueDeclare(myQueueName, true, false, false, null);
myChannel.queueBind(myQueueName, myExchange, myRoutingKey);
startTime=System.currentTimeMillis();
AMQP.BasicProperties properties = new AMQP.BasicProperties();
properties.setDeliveryMode(2);
startTime=System.currentTimeMillis();
while(count++<=10000){
myChannel.basicPublish(myExchange, myRoutingKey, true, true, properties, myBody.getBytes() );
}
System.out.println(System.currentTimeMillis()-startTime);
} catch (Exception e){
System.exit(0);
}
}
}
The code using the Spring Framework is:
Producer1.java
import org.springframework.amqp.core.AmqpAdmin;
import org.springframework.amqp.core.Binding;
import org.springframework.amqp.core.DirectExchange;
import org.springframework.amqp.core.Message;
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.core.RabbitAdmin;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
public class Producer1 {
public static void main(String[] args) {
ConfigurableApplicationContext context = new ClassPathXmlApplicationContext("Producer1.xml");
AmqpAdmin amqpAdmin = context.getBean(RabbitAdmin.class);
Queue queue = new Queue("sampleQueue");
DirectExchange exchange = new DirectExchange("myExchange");
Binding binding = new Binding(queue, exchange, "");
amqpAdmin.declareQueue(queue);
amqpAdmin.declareExchange(exchange);
amqpAdmin.declareBinding(binding);
RabbitTemplate rabbitTemplate = context.getBean(RabbitTemplate.class);
String routingKey = "";
String myBody = "This is a message : message numberxxxxxx";
Message Msg = new Message(myBody.getBytes(), null);
int count=0;
long CurrTime = System.currentTimeMillis();
while(count++<=10000){
rabbitTemplate.send(routingKey, Msg);
//System.out.println("Message Sent");
}
System.out.println(System.currentTimeMillis()-CurrTime);
}
}
Producer1.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:p="http://www.springframework.org/schema/p"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<!-- Define a connectionFactory -->
<bean id="rabbitConnectionFactory" class="com.rabbitmq.client.ConnectionFactory">
<property name="host" value="localhost" />
</bean>
<bean id="connectionFactory" class="org.springframework.amqp.rabbit.connection.SingleConnectionFactory">
<constructor-arg ref="rabbitConnectionFactory"/>
</bean>
<!-- Tell the Admin bean about that connectionFactory and initialize it, create a queue and an exchange on Rabbit Broker using the RabbitTemplate provided by Spring framework-Rabbit APIs -->
<bean id="Admin" class="org.springframework.amqp.rabbit.core.RabbitAdmin">
<constructor-arg ref="connectionFactory" />
</bean>
<bean id="rabbitTemplate" class="org.springframework.amqp.rabbit.core.RabbitTemplate"
p:connectionFactory-ref="connectionFactory"
p:routingKey="myRoutingKey"
p:exchange="myExchange" />
</beans>
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:p="http://www.springframework.org/schema/p"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
<!-- Define a connectionFactory -->
<bean id="connectionFactory" class="com.rabbitmq.client.connectionFactory">
<constructor-arg value="localhost" />
<property name="username" value="guest" />
<property name="password" value="guest" />
</bean>
<bean id="Admin" class="org.springframework.amqp.rabbit.core.RabbitAdmin">
<constructor-arg ref="connectionFactory" />
</bean>
</beans>
Using this XML file, an error appears saying that org.springframework.amqp.rabbit.core.RabbitAdmin could not cast com.rabbitmq.client.ConnectionFactory for the connectionFactory bean.
The exact error is: "nested exception is java.lang.IllegalStateException: Cannot convert value of type [com.rabbitmq.client.ConnectionFactory] to required type [org.springframework.amqp.rabbit.core.RabbitTemplate]: no matching editors or conversion strategy found".
Hence I have to use this bean:
<bean id="connectionFactory"
class="org.springframework.amqp.rabbit.connection.SingleConnectionFactory">
</bean>
Are you sure that you used the same Rabbit MQ broker? Could you be using a broker on a different server, or an upgraded/downgraded version of RabbitMQ?
The other thing to look at is your jvm. Is it possible that you don't have enough memory configured and now the garbage collector is thrashing? Run top and see if the jvm's memory usage is close to the configured memory size.
Are you using an old version of RabbitMQ? Lots of Linux distros include RabbitMQ 1.7.2, which is an old version that has problems with large numbers of messages. Large is hard to define because it depends on your RAM, but RabbitMQ does not like to use more than 40% of RAM because it needs to copy a persistence transaction log in order to process it and clean it for log rollover. This can cause RabbitMQ to crash, and, of course, processing the huge logs will slow it down. RabbitMQ 2.4.1 handles the persister logs much better, in smaller chunks, and it also has much, much faster message routing code.
This still sounds to me like a Java problem: either Spring is just a pig and is terribly inefficient, or you have not given your JVM enough RAM to avoid frequent GC runs. What setting are you using for -Xmx?
