Which one to use RabbitTemplate or AmqpTemplate? - spring

I have the following program written in Spring Boot which is working fine. However, I am not sure whether I should be using RabbitTemplate or AmqpTemplate. Some of the online examples/tutorials use RabbitTemplate while others use AmqpTemplate.
Please guide as to what is the best practice and which one should be used.
@SpringBootApplication
public class BasicApplication {

    private static RabbitTemplate rabbitTemplate;
    private static final String QUEUE_NAME = "helloworld.q";

    // this declaration of the Queue is required for RabbitMQ; not required in ActiveMQ
    @Bean
    public Queue queue() {
        return new Queue(QUEUE_NAME, false);
    }

    public static void main(String[] args) {
        try (ConfigurableApplicationContext ctx = SpringApplication.run(BasicApplication.class, args)) {
            rabbitTemplate = ctx.getBean(RabbitTemplate.class);
            rabbitTemplate.convertAndSend(QUEUE_NAME, "Hello World !");
        }
    }
}

In most cases, for Spring beans, I would advise using the interface, in case Spring creates a JDK proxy for any reason. That would be unusual for the RabbitTemplate, so it doesn't really matter which you use.
In some cases, you might need methods on the RabbitTemplate that don't appear on the interface, which would be another case where you would need to use the concrete type.
In general, though, best practice is for user code to use interfaces so you don't have hard dependencies on implementations.
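As an illustration of coding to the interface, here is a minimal sketch (the class and method names are hypothetical, not from the question): declare the dependency as AmqpTemplate and let Spring inject the auto-configured RabbitTemplate behind it.

@Component
public class GreetingSender {

    // Interface type; at runtime the injected bean is Boot's auto-configured RabbitTemplate
    private final AmqpTemplate amqpTemplate;

    public GreetingSender(AmqpTemplate amqpTemplate) {
        this.amqpTemplate = amqpTemplate;
    }

    public void send(String queue, String payload) {
        // convertAndSend(routingKey, message) is declared on AmqpTemplate itself
        amqpTemplate.convertAndSend(queue, payload);
    }
}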

AmqpTemplate is an interface. RabbitTemplate is an implementation of the AmqpTemplate interface. You should use RabbitTemplate.

Related

Spring Kafka Requirements for Supporting Multiple Consumers

As one would expect, it's common to want different consumers deserializing in different ways off topics in Kafka. There is a known problem with Spring Boot auto-configuration: as soon as additional factories are defined, Spring Kafka or the auto-configuration complains that it can no longer find a suitable consumer factory. Some have pointed out that one solution is to include a ConsumerFactory of type (Object, Object) in the config, but no one has shown the source code for this or clarified whether it needs to be named in any particular way, or whether simply adding this factory to the config removes the need to turn off auto-configuration. All of that remains very unclear.
If you are not familiar with this issue please read https://github.com/spring-projects/spring-boot/issues/19221
There it was simply stated: OK, define the ConsumerFactory and add it somewhere in your config. Can someone be a bit more precise about this, please?
Show exactly how to define the ConsumerFactory so that Spring Boot auto-configuration will not complain.
Explain whether turning off auto-configuration is needed or not.
Explain whether the ConsumerFactory needs to be named in any special way or not.
The simplest solution is to stick with Boot's auto-configuration and override the deserializer on the @KafkaListener itself...
@SpringBootApplication
public class So63108344Application {

    public static void main(String[] args) {
        SpringApplication.run(So63108344Application.class, args);
    }

    @KafkaListener(id = "so63108344-1", topics = "so63108344-1")
    public void listen1(String in) {
        System.out.println(in);
    }

    @KafkaListener(id = "so63108344-2", topics = "so63108344-2", properties =
            ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG +
                    "=org.apache.kafka.common.serialization.ByteArrayDeserializer")
    public void listen2(byte[] in) {
        System.out.println(in);
    }

    @Bean
    public NewTopic topic1() {
        return TopicBuilder.name("so63108344-1").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic topic2() {
        return TopicBuilder.name("so63108344-2").partitions(1).replicas(1).build();
    }
}
For more advanced container customization (or if you don't want to pollute the @KafkaListener), you can use a ContainerCustomizer...
@Component
class Customizer {

    public Customizer(ConcurrentKafkaListenerContainerFactory<?, ?> factory) {
        factory.setContainerCustomizer(container -> {
            if (container.getGroupId().equals("so63108344-2")) {
                container.getContainerProperties().setAckMode(AckMode.MANUAL_IMMEDIATE);
                container.getContainerProperties().getKafkaConsumerProperties()
                        .setProperty(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            }
        });
    }
}
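If you do want to declare a consumer factory yourself while keeping Boot's auto-configured kafkaListenerContainerFactory, a hedged sketch follows. It assumes Spring Boot 2.x, where the auto-configured container factory looks the factory up by the generic type ConsumerFactory<Object, Object>; the bean name itself should not matter for that lookup.

@Configuration
public class KafkaConsumerConfig {

    // Declared with <Object, Object> generics so Boot's auto-configured
    // kafkaListenerContainerFactory can still resolve it (assumption: Boot 2.x behavior).
    @Bean
    public ConsumerFactory<Object, Object> kafkaConsumerFactory(KafkaProperties properties) {
        return new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties());
    }
}

With a factory like this in place, turning off the auto-configuration should not be necessary.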

Bean injection for spring integration message handler

I am fairly new to Spring and Spring Integration. What I'm trying to do: publish MQTT messages using Spring Integration.
Here is the code:
@Configuration
@IntegrationComponentScan
@Service
public class MQTTPublishAdapter {

    private MqttConfiguration mqttConfiguration;

    public MQTTPublishAdapter(MqttConfiguration mqttConfiguration) {
        this.mqttConfiguration = mqttConfiguration;
    }

    @Bean
    public MessageChannel mqttOutboundChannel() {
        return new PublishSubscribeChannel();
    }

    @Bean
    public MqttPahoClientFactory mqttClientFactory() {
        DefaultMqttPahoClientFactory factory = new DefaultMqttPahoClientFactory();
        //... set factory details
        return factory;
    }

    @Bean
    @ServiceActivator(inputChannel = "mqttOutboundChannel")
    public MQTTCustomMessageHandler mqttOutbound() {
        String clientId = UUID.randomUUID().toString();
        MQTTCustomMessageHandler messageHandler =
                new MQTTCustomMessageHandler(clientId, mqttClientFactory());
        //... set messageHandler details
        return messageHandler;
    }
}

// I extend this only because the publish method is protected and I want to
// send messages to different topics
public class MQTTCustomMessageHandler extends MqttPahoMessageHandler {

    // default constructors

    public void sendMessage(String topic, String message) {
        MqttMessage mqttMessage = new MqttMessage();
        mqttMessage.setPayload(message.getBytes());
        try {
            super.publish(topic, mqttMessage, null);
        } catch (Exception e) {
            log.error("Failure to publish message on topic " + topic, e.getMessage());
        }
    }
}
This is the class where I am trying to inject the handler:
@Service
public class MQTTMessagePublisher {

    private MQTTCustomMessageHandler mqttCustomMessageHandler;

    public MQTTMessagePublisher(@Lazy MQTTCustomMessageHandler mqttCustomMessageHandler) {
        this.mqttCustomMessageHandler = mqttCustomMessageHandler;
    }

    public void publishMessage(String topic, String message) {
        mqttCustomMessageHandler.sendMessage(topic, message);
    }
}
So my question is about how I should inject the bean I am trying to use, because if I remove the @Lazy annotation it says "Requested bean is currently in creation: Is there an unresolvable circular reference?". As far as I can tell I do not have any circular dependencies, since in the bean I only set some strings, so I'm guessing that I don't really understand how this should work.
Very sorry about the formatting, it's one of my first questions around here.
Edit:
If I remove
@ServiceActivator(inputChannel = "mqttOutboundChannel")
and add
messageHandler.setChannelResolver((name) -> mqttOutboundChannel());
it works. I'm still unclear why the code crashes.
You show a lot of custom code, but not all of it.
It's really hard to answer a question that consists only of custom code. It would be great to share as much info as possible; for example, an external project on GitHub that lets us play with it and reproduce the problem would be fully helpful and would save some time.
Nevertheless, I wonder what your MQTTCustomMessageHandler is. I guess it is not a MessageHandler implementation, in which case the @ServiceActivator annotation is not going to work properly, since it is really applied to the mqttOutbound() bean method, not to whatever you expect. Either move this annotation to your sendMessage() method in the MQTTCustomMessageHandler, or make the class a MessageHandler.
On the other hand, it is not clear why you need that @ServiceActivator annotation at all, since you call that method manually from the MQTTMessagePublisher.
Also, it is not clear why you have so much custom code when the Framework provides out-of-the-box channel adapter implementations for you (see the sketch after the links below).
Too many questions about your code for a single answer...
See more info in the reference manual:
https://docs.spring.io/spring-integration/docs/current/reference/html/#annotations
https://docs.spring.io/spring-integration/docs/current/reference/html/#mqtt
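For reference, here is a minimal sketch of the out-of-the-box approach mentioned above: keep the standard MqttPahoMessageHandler as the @ServiceActivator bean and publish through a @MessagingGateway instead of calling a custom handler directly. The gateway interface below is an illustrative assumption, not code from the question.

@MessagingGateway(defaultRequestChannel = "mqttOutboundChannel")
public interface MqttGateway {

    // The MqttHeaders.TOPIC header lets each call target a different topic,
    // so no handler subclass is needed for multi-topic publishing.
    void publish(@Header(MqttHeaders.TOPIC) String topic, String payload);
}

Injecting this gateway into MQTTMessagePublisher would also remove the need for the @Lazy workaround, since no bean then depends on the MessageHandler directly.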

Spring integration LoggingHandler logs all messages to Error

I created a Spring Boot application that sends messages through a PublishSubscribeChannel. This channel is "autowired" as the SubscribableChannel interface.
I am only subscribing one MessageHandler to this channel, a KafkaProducerMessageHandler.
My problem is that one additional MessageHandler is subscribed, and it is a LoggingHandler. It is instantiated with ERROR level, so I see every message logged as an error.
I want to know why and where this LoggingHandler is wired (instantiated) and why it is subscribed to the channel - I want to disable it.
(I debugged around a bit, but it was not really helpful: the LoggingHandler is instantiated and subscribed after the KafkaHandler, and I see the chain EventdrivenConsumer.doStart() <-- ConsumerEndpointFactoryBean.initializeEndpoint() <-- ... down to reflective calls.)
EDIT
As suggested in the comments, here is some code (I can't share the whole project). My problem is that the code can't explain the behaviour: the LoggingHandler is being subscribed to my PublishSubscribeChannel for some unknown reason, and it is instantiated with error as its level for some unknown reason.
The class that subscribes the KafkaHandler:
@Component
public class EventRelay {

    @Autowired
    private EventRelay(SubscribableChannel eventBus, @Qualifier(KafkaProducerConfig.KAFKA_PRODUCER) MessageHandler kafka) {
        eventBus.subscribe(kafka);
    }
}
The class that sends events implements a proprietary interface with many callback methods:
public class PropEvents implements PropClass.IEvents {

    private SubscribableChannel eventBus;
    private final ObjectMapper om;
    private final String userId;

    public PropEvents(SubscribableChannel eventBus, ObjectMapper om, String userId) {
        this.eventBus = eventBus;
        this.om = om;
        this.userId = userId;
    }

    @Override
    public void onLogin() {
        eventBus.send(new OnLoginMessage(...));
    }

    // many other onXYZ methods
}
Here is the Factory that produces instances of PropEvents:
@Configuration
public class EventHandlerFactory {

    private final ObjectMapper om;
    private final SubscribableChannel eventBus;

    @Autowired
    public EventHandlerFactory(ObjectMapper om, SubscribableChannel eventBus) {
        this.om = checkNotNull(om);
        this.eventBus = checkNotNull(eventBus);
    }

    @Bean
    @Scope(SCOPE_PROTOTYPE)
    public IEvents getEvantHandler(String userId) {
        if (Strings.isNullOrEmpty(userId)) {
            throw new IllegalArgumentException("user id must be set.");
        }
        return new PropEvents(eventBus, om, userId);
    }
}
I appreciate any help with debugging or with tooling (e.g. the Eclipse Spring tools do not show any hint of a LoggingHandler bean) to find where and why a LoggingHandler is instantiated and subscribed to my autowired channel.
My current workaround is to disable logging for the LoggingHandler.
My question at a glance:
Why does Spring instantiate a LoggingHandler with error level and subscribe it to my SubscribableChannel (provided by a PublishSubscribeChannel), and how do I disable this?
When you @Autowired a SubscribableChannel, there has to already be one such bean in the application context. That may be a bit confusing and misleading, but Spring Integration provides a PublishSubscribeChannel for the global errorChannel: https://docs.spring.io/spring-integration/docs/5.0.2.RELEASE/reference/html/messaging-channels-section.html#channel-special-channels
This channel has a LoggingHandler that logs at ERROR level as its default subscriber.
I don't think it is a good idea to base your logic on the errorChannel.
You should consider declaring your own MessageChannel bean and injecting it with a particular @Qualifier.
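A minimal sketch of that suggestion (the bean name eventBus is illustrative, not from the question):

@Configuration
public class ChannelConfig {

    // A dedicated pub-sub channel for application events, separate from the global errorChannel
    @Bean
    public SubscribableChannel eventBus() {
        return new PublishSubscribeChannel();
    }
}

The subscriber then injects it explicitly, e.g. EventRelay(@Qualifier("eventBus") SubscribableChannel eventBus, ...), so the errorChannel and its default LoggingHandler are no longer involved.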

Set ConnectionFactory for Camel JMS Producer: camel-jms Vs camel-sjms

Ciao, my basic requirement is to have a route where I can send a message and have it put on a JMS queue. The Camel context runs in a Java EE 6 container, namely JBoss AS 7.1.1, so JMS is provided by the HornetQ that ships with it; I start the context via a bootstrap singleton, but I don't use camel-cdi. So far I've been using the camel-jms component, but now I'm looking to migrate to camel-sjms if possible, because it is Spring-free.
My question is: what is the proper way to configure the ConnectionFactory for camel-sjms in this Java EE scenario, please?
With camel-jms I could put this in the endpoint URI, as simple as .to("jms:myQueue?connectionFactory=#ConnectionFactory"). With camel-sjms it instead seems that I need to create an instance of the SjmsComponent myself, set the connectionFactory on it, and register this instance with the Camel context before starting it.
I have the code below for the camel-jms vs camel-sjms case, and I would like to know whether I "migrated" the setting of the ConnectionFactory correctly. Thanks.
For camel-jms this was done as:
@Singleton
@Startup
public class CamelBootstrap {

    private CamelContext camelContext;
    private ProducerTemplate producerTemplate;

    public CamelContext getCamelContext() {
        return camelContext;
    }

    public ProducerTemplate getProducerTemplate() {
        return producerTemplate;
    }

    @PostConstruct
    public void init() throws Exception {
        camelContext = new DefaultCamelContext();
        camelContext.addRoutes(new MyCamelRoutes());
        camelContext.start();
        producerTemplate = camelContext.createProducerTemplate();
    }
}
Nothing special, and in the MyCamelRoutes I could do route configuration using:
.to("jms:myQueue?connectionFactory=#ConnectionFactory")
For camel-sjms now I have to modify the bootstrap singleton with:
@Singleton
@Startup
public class CamelBootstrap {

    @Resource(mappedName = "java:/ConnectionFactory")
    private ConnectionFactory connectionFactory;

    private CamelContext camelContext;
    private ProducerTemplate producerTemplate;

    public CamelContext getCamelContext() {
        return camelContext;
    }

    public ProducerTemplate getProducerTemplate() {
        return producerTemplate;
    }

    @PostConstruct
    public void init() throws Exception {
        camelContext = new DefaultCamelContext();
        SjmsComponent sjms = new SjmsComponent();
        sjms.setConnectionFactory(connectionFactory);
        camelContext.addComponent("sjms", sjms);
        camelContext.addRoutes(new MyCamelRoutes());
        camelContext.start();
        producerTemplate = camelContext.createProducerTemplate();
    }
}
Please notice the @Resource for the connectionFactory: it is passed as a reference to the SjmsComponent instance, which in turn is registered with the camelContext. Then in MyCamelRoutes I can use sjms in the route configuration:
.to("sjms:myQueue")
The code seems to work correctly in both scenarios, but as I understand it, the configuration of the ConnectionFactory is quite susceptible to performance issues if not done correctly, so I prefer to ask whether I migrated to camel-sjms correctly for my Java EE scenario. Thanks again.
Performance issues are likely to happen if you don't do caching/pooling of JMS resources. Caching is typically configured by wrapping the ConnectionFactory in some caching ConnectionFactory library, or by handing control over to the application server.
Camel SJMS includes built-in pooling. However, if you have a container-managed resource to handle JMS connections, you should probably consider using it. SJMS has some facilities to deal with that: a ConnectionResource instead of a ConnectionFactory.
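As a rough sketch of that last point, assuming camel-sjms 2.x, where ConnectionResource is an interface with borrowConnection()/returnConnection() methods and SjmsComponent exposes setConnectionResource(...), the container-managed factory could be adapted roughly like this (the class below is an illustration, not verified against your Camel version):

public class ContainerConnectionResource implements ConnectionResource {

    // Container-managed factory, e.g. the one injected via @Resource in the bootstrap singleton
    private final ConnectionFactory connectionFactory;

    public ContainerConnectionResource(ConnectionFactory connectionFactory) {
        this.connectionFactory = connectionFactory;
    }

    @Override
    public Connection borrowConnection() throws Exception {
        // Delegate connection management/pooling to the application server
        Connection connection = connectionFactory.createConnection();
        connection.start();
        return connection;
    }

    @Override
    public void returnConnection(Connection connection) throws Exception {
        connection.close();
    }
}

It would then be registered with sjms.setConnectionResource(new ContainerConnectionResource(connectionFactory)) instead of sjms.setConnectionFactory(connectionFactory).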

Declaration of exchanges and queues in Spring AMQP

I'm using RabbitMQ and trying to refactor my current native Java implementation to use the Spring AMQP abstraction.
Declaration of exchanges, queues and their bindings with the Spring library is done via the AmqpAdmin interface, but I'm not sure when this sort of configuration should happen.
I have a web application that uses Rabbit to produce messages. And another app that consumes these messages. Shocker :)
But where should the declaration of the exchanges/queues take place?
Do I deploy the AmqpAdmin with the web applications and do exchange/queue administration within the constructors of producers and consumers?
Declaration of these things is a one-off; the broker doesn't need to be told about them again, so any such code would be a no-op on subsequent executions.
Do I create a separate application for administration of the broker?
What is the current thinking or best practices here?
It would appear that very few people are using Spring AMQP's M1 release, so I will answer my own question with what I've done.
In the producer's constructor I declare the exchange, then set the exchange on the RabbitTemplate. I also set the routing key on the RabbitTemplate to the queue name; that isn't required, but it is the route I would be using.
@Service("userService")
public class UserService {

    private final RabbitTemplate rabbitTemplate;

    @Autowired
    public UserService(final RabbitAdmin rabbitAdmin,
                       final Exchange exchange,
                       final Queue queue,
                       @Qualifier("appRabbitTemplate") final RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
        rabbitAdmin.declareExchange(exchange);
        rabbitTemplate.setExchange(exchange.getName());
        rabbitTemplate.setRoutingKey(queue.getName());
    }

    public void createAccount(final UserAccount userAccount) {
        rabbitTemplate.convertAndSend("Hello message sent at " + new DateTime());
    }
}
In the consumer's constructor I declare the queue and create the binding.
public class Consumer implements ChannelAwareMessageListener<Message> {

    public Consumer(final RabbitAdmin rabbitAdmin, final Exchange exchange, final Queue queue) {
        rabbitAdmin.declareQueue(queue);
        rabbitAdmin.declareBinding(BindingBuilder.from(queue).to((DirectExchange) exchange).withQueueName());
    }

    @Override
    public void onMessage(Message message, Channel channel) throws Exception {
        System.out.println(new String(message.getBody()));
        channel.basicAck(message.getMessageProperties().getDeliveryTag(), true);
    }
}
Although the constructors may be run many times, RabbitMQ only declares the exchange, queue and bindings once.
If you need the whole source for this little example project, ask, and I'll put it up somewhere for you.
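As a side note for later Spring AMQP versions (an assumption relative to the M1 release discussed above, not part of the original answer): Queue, Exchange and Binding can simply be declared as beans, and a RabbitAdmin in the same context declares them on the broker when the connection is first opened, so no explicit declare calls in constructors are needed. A minimal sketch with hypothetical names:

@Configuration
public class AmqpDeclarations {

    @Bean
    public Queue userQueue() {
        return new Queue("user.q", true);
    }

    @Bean
    public DirectExchange userExchange() {
        return new DirectExchange("user.exchange");
    }

    @Bean
    public Binding userBinding(Queue userQueue, DirectExchange userExchange) {
        return BindingBuilder.bind(userQueue).to(userExchange).with("user.q");
    }

    @Bean
    public RabbitAdmin rabbitAdmin(ConnectionFactory connectionFactory) {
        // RabbitAdmin automatically declares the Queue, Exchange and Binding beans above
        return new RabbitAdmin(connectionFactory);
    }
}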
