Spring Integration - Kafka outbound adapter not taking topic value exposed as Spring bean

I have successfully integrated the Kafka outbound channel adapter with a fixed topic name. Now I want to make the topic name configurable and hence want to expose it via application properties.
application.properties contains the following entry:
kafkaTopic:testNewTopic
My configuration class looks like below:
@Configuration
@Component
public class KafkaConfig {

    @Value("${kafkaTopic}")
    private String kafkaTopicName;

    @Bean
    public String getTopic() {
        return kafkaTopicName;
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); //this.brokerAddress);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        // set more properties
        return new DefaultKafkaProducerFactory<>(props);
    }
}
And in my si-config.xml I have used the following (e.g. topic="getTopic"):
<int-kafka:outbound-channel-adapter
    id="kafkaOutboundChannelAdapter" kafka-template="kafkaTemplate"
    auto-startup="true" sync="true" channel="inputToKafka" topic="getTopic">
</int-kafka:outbound-channel-adapter>
However, the configuration is unable to pick up the topic name when it is exposed via a bean, although it works fine when I hard-code the value of the topic name.
Can someone please suggest what I am doing wrong here?
Does the topic attribute of the Kafka outbound channel adapter accept a value referenced as a bean?
How do I externalize it, given that every application using my utility will supply a different Kafka topic name?

The topic attribute expects a plain String value.
However, it supports property placeholder resolution:
topic="${kafkaTopic}"
and also SpEL evaluation, which can reference the aforementioned bean:
topic="#{getTopic}"
This works simply because the XML parser configuration allows expression evaluation for that attribute.
Note, however, that the KafkaTemplate you inject into the <int-kafka:outbound-channel-adapter> has a defaultTopic property, so you may not need to configure the topic in the XML at all.
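For example, a minimal sketch of wiring the externalized topic as the template's default (reusing the kafkaTopicName field from the configuration above):

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    KafkaTemplate<String, String> template = new KafkaTemplate<>(producerFactory());
    // records sent without an explicit topic fall back to this default
    template.setDefaultTopic(kafkaTopicName);
    return template;
}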
One more option available to you is Spring Integration annotation configuration, where you can define a @ServiceActivator on a KafkaProducerMessageHandler @Bean:

@ServiceActivator(inputChannel = "inputToKafka")
@Bean
public KafkaProducerMessageHandler<String, String> kafkaOutboundChannelAdapter() {
    KafkaProducerMessageHandler<String, String> adapter = new KafkaProducerMessageHandler<>(kafkaTemplate());
    adapter.setSync(true);
    adapter.setTopicExpression(new LiteralExpression(this.kafkaTopicName));
    return adapter;
}

Related

SpEL KafkaListener: how can I inject a custom deserializer through properties?

I am using Spring.
I have an ObjectMapper configured for the entire project, and I use it to set up a Kafka deserializer.
I need this custom Kafka deserializer to be used in a KafkaListener.
I'm configuring the KafkaListener via auto-configuration, not via a @Configuration class.
@Component
@RequiredArgsConstructor
public class CustomMessageDeserializer implements Deserializer<MyMessage> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public MyMessage deserialize(String topic, byte[] data) {
        return objectMapper.readValue(data, MyMessage.class);
    }
}
If I do it like this:

@KafkaListener(
    topics = {"${topics.invite-user-topic}"},
    properties = {"value.deserializer=com.service.deserializer.CustomMessageDeserializer"}
)
public void receiveInviteUserMessages(MyMessage myMessage) {}

I receive KafkaException: Could not find a public no-argument constructor.
But with a public no-argument constructor in the CustomMessageDeserializer class, I get an NPE because the ObjectMapper is null: Kafka creates and uses a new instance of the class, not the Spring component.
@KafkaListener supports SpEL expressions, and I think this problem can be solved using SpEL.
Do you have any idea how to inject the Spring bean CustomMessageDeserializer with SpEL?
There are no easy ways to do it with SpEL.
Analysis
To get started, see the JavaDoc for @KafkaListener#properties:

/**
 * ...
 * SpEL expressions must resolve to a String ...
 */
The value of value.deserializer is used to instantiate the specified deserializer class. Let's follow the call chain:
1. You specify this value in the @KafkaListener annotation, and you are probably not creating a ConsumerFactory bean yourself, so Spring creates that bean itself - see KafkaAutoConfiguration#kafkaConsumerFactory.
2. Next is the creation of the returned object new DefaultKafkaConsumerFactory(...) as a ConsumerFactory<?,?>, using the constructor that defaults the key/value deserializer suppliers to () -> null.
3. This factory is used to create the Kafka consumer (the entry point is the constructor KafkaMessageListenerContainer#ListenerConsumer, then KafkaMessageListenerContainer.this.consumerFactory.createConsumer...).
4. In the KafkaConsumer constructor, the valueDeserializer object is created, because it is null (for the default factory of point 2 above):

if (valueDeserializer == null) {
    this.valueDeserializer = config.getConfiguredInstance(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, Deserializer.class);
}

5. The implementation of config.getConfiguredInstance instantiates your deserializer class via its parameterless constructor, using reflection and your String class name "com.service.deserializer.CustomMessageDeserializer".
Solutions
To use value.deserializer with your customized ObjectMapper, you must create the ConsumerFactory bean yourself and call its setValueDeserializer(...) method. This is also mentioned in the second "Important" callout of the JSON mapping-types section of the Spring for Apache Kafka documentation.
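A minimal sketch of that first option, assuming the CustomMessageDeserializer bean from the question (the generic types and the explicit StringDeserializer key deserializer are illustrative):

@Bean
public ConsumerFactory<String, MyMessage> consumerFactory(KafkaProperties properties,
        CustomMessageDeserializer valueDeserializer) {
    Map<String, Object> configs = properties.buildConsumerProperties();
    DefaultKafkaConsumerFactory<String, MyMessage> factory =
            new DefaultKafkaConsumerFactory<>(configs);
    // pass the Spring-managed deserializer instance instead of a class name,
    // so Kafka never tries to reflectively instantiate it
    factory.setValueDeserializer(valueDeserializer);
    factory.setKeyDeserializer(new StringDeserializer());
    return factory;
}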
If you don't want to create a ConsumerFactory bean, and your deserializer has no complicated logic (you only have return objectMapper.readValue(data, MyMessage.class);), then register a DefaultKafkaConsumerFactoryCustomizer:
@Bean
// inject your custom ObjectMapper
public DefaultKafkaConsumerFactoryCustomizer customizeJsonDeserializer(ObjectMapper objectMapper) {
    return consumerFactory ->
            consumerFactory.setValueDeserializerSupplier(() ->
                    new org.springframework.kafka.support.serializer.JsonDeserializer<>(objectMapper));
}
In this case, you don't need your own CustomMessageDeserializer class (you can remove it), and Spring will automatically parse the message into your MyMessage.
The @KafkaListener annotation should then no longer contain the property properties = {"value.deserializer=com.my.kafka_test.component.CustomMessageDeserializer"}. This DefaultKafkaConsumerFactoryCustomizer bean is automatically used to configure the default ConsumerFactory<?, ?> (see the implementation of the KafkaAutoConfiguration#kafkaConsumerFactory method).
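With the customizer in place, the listener from the question reduces to a sketch like this:

@KafkaListener(topics = "${topics.invite-user-topic}")
public void receiveInviteUserMessages(MyMessage myMessage) {
    // the payload was already deserialized by the JsonDeserializer built from your ObjectMapper
}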
Here is how it works for me:
@KafkaListener(topics = "${solr.kafka.topic}", containerFactory = "batchFactory")
public void listen(List<SolrInputDocument> docs,
        @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers,
        Acknowledgment ack) throws IOException {...}
And then I have two beans defined in my configuration:
#Profile("!test")
#Bean
#Autowired
public ConsumerFactory<String, SolrInputDocument> consumerFactory(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
DefaultKafkaConsumerFactory<String, SolrInputDocument> result = new DefaultKafkaConsumerFactory<>(props);
String validatedKeyDeserializerName = KafkaMessageType.valueOf(keyDeserializerName).toString();
ZiDeserializer<SolrInputDocument> deserializer = ZiDeserializerFactory.getInstance(validatedKeyDeserializerName);
result.setValueDeserializer(deserializer);
return result;
}
#Profile("!test")
#Bean
#Autowired
public ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> batchFactory(ConsumerFactory<String, SolrInputDocument> consumerFactory) {
ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory);
factory.setBatchListener(true);
factory.setConcurrency(2);
ExponentialBackOffWithMaxRetries backoff = new ExponentialBackOffWithMaxRetries(10);
backoff.setMultiplier(3); // Default is 1.5 but this seems more reasonable
factory.setCommonErrorHandler(new DefaultErrorHandler(null, backoff));
// Needed for manual commits
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
return factory;
}
Note that ZiDeserializer<SolrInputDocument> is my own interface, and ZiDeserializerFactory.getInstance(validatedKeyDeserializerName) returns my custom implementation of it. ZiDeserializer extends org.apache.kafka.common.serialization.Deserializer. This works for me.

Spring Boot Kafka metrics configuration

We have a Spring Boot microservice using spring-kafka. There are a couple of Kafka listeners and producers, and the Kafka-related metrics are successfully presented at the actuator endpoint. But when we add another Maven dependency with custom metrics, the Kafka-related metrics provided by Spring Boot disappear. I suspect it's related to that dependency introducing a new MeterRegistry bean, but overriding it with the same bean definition as in NoOpMeterRegistryConfiguration (in the spring-boot-actuator-autoconfigure module of Spring Boot) didn't help. I've also tried debugging KafkaMetricsAutoConfiguration, and it looks like all the beans are successfully instantiated. Can somebody with more experience with spring-boot-actuator-autoconfigure help me understand the default configuration for Kafka-related metrics, please?
The Spring configuration in the new Maven dependency is pretty simple; just ProducerFactory and MeterRegistry beans are defined:
@Bean
@Primary
public ProducerFactory<Object, Object> producerFactory(MessageCounter messageCounter) {
    Map<String, Object> configs = this.kafkaProperties.buildProducerProperties();
    configs.put("interceptor.classes", Collections.singletonList(RecordSuccessfullySentInterceptor.class));
    configs.put("message.counter.bean", messageCounter);
    DefaultKafkaProducerFactory<Object, Object> producerFactory = new DefaultKafkaProducerFactory<>(configs);
    producerFactory.setTransactionIdPrefix(this.transactionId);
    return producerFactory;
}

@Bean
public MeterRegistry meterRegistry() {
    return new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);
}

@Bean({"retryProducerFactory"})
public ProducerFactory<Object, Object> retryProducerFactory(MessageCounter messageCounter) {
    Map<String, Object> configs = this.kafkaProperties.buildProducerProperties();
    configs.put("interceptor.classes", Collections.singletonList(RecordSuccessfullySentInterceptor.class));
    configs.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    configs.put("message.counter.bean", messageCounter);
    DefaultKafkaProducerFactory<Object, Object> producerFactory = new DefaultKafkaProducerFactory<>(configs);
    producerFactory.setTransactionIdPrefix(this.transactionId);
    return producerFactory;
}
It may be that some transitive dependency is interfering with Spring Boot's auto-configuration for Kafka-related metrics, though.
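One thing worth checking, as a hedged guess: Boot's KafkaMetricsAutoConfiguration contributes a DefaultKafkaProducerFactoryCustomizer that attaches a MicrometerProducerListener to the auto-configured ProducerFactory only; a hand-built ProducerFactory bean like the ones above bypasses that customizer, so the client metrics never reach the registry. A minimal sketch of re-attaching the listener yourself (assuming Spring Kafka 2.5+ for addListener; bean names follow the question's code):

@Bean
@Primary
public ProducerFactory<Object, Object> producerFactory(MessageCounter messageCounter,
        MeterRegistry meterRegistry) {
    Map<String, Object> configs = this.kafkaProperties.buildProducerProperties();
    configs.put("interceptor.classes", Collections.singletonList(RecordSuccessfullySentInterceptor.class));
    configs.put("message.counter.bean", messageCounter);
    DefaultKafkaProducerFactory<Object, Object> producerFactory = new DefaultKafkaProducerFactory<>(configs);
    producerFactory.setTransactionIdPrefix(this.transactionId);
    // re-attach the Micrometer listener that the auto-configured factory would have had
    producerFactory.addListener(new MicrometerProducerListener<>(meterRegistry));
    return producerFactory;
}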

Java JobRunr when using Spring Boot Redis Starter

How do I create and use the Redis connection that spring-boot-starter-data-redis creates? There doesn't seem to be a bean for RedisClient created by the default auto-configuration, so I'm not sure of the best way to do this.
The documentation does state that in this case you need to create the StorageProvider yourself, which is fine, but can you reuse what Spring Boot has already created? I believe this would need to be a pooled connection, which you would also need to enable through Spring Boot.
RedisTemplate offers a high-level abstraction for Redis interactions:
https://docs.spring.io/spring-data/data-redis/docs/current/reference/html/#redis:template
Redis auto-configuration:
@AutoConfiguration
@ConditionalOnClass({RedisOperations.class})
@EnableConfigurationProperties({RedisProperties.class})
@Import({LettuceConnectionConfiguration.class, JedisConnectionConfiguration.class})
public class RedisAutoConfiguration {

    public RedisAutoConfiguration() {
    }

    @Bean
    @ConditionalOnMissingBean(name = {"redisTemplate"})
    @ConditionalOnSingleCandidate(RedisConnectionFactory.class)
    public RedisTemplate<Object, Object> redisTemplate(RedisConnectionFactory redisConnectionFactory) {
        RedisTemplate<Object, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(redisConnectionFactory);
        return template;
    }

    @Bean
    @ConditionalOnMissingBean
    @ConditionalOnSingleCandidate(RedisConnectionFactory.class)
    public StringRedisTemplate stringRedisTemplate(RedisConnectionFactory redisConnectionFactory) {
        return new StringRedisTemplate(redisConnectionFactory);
    }
}
Here you can find the corresponding configuration properties (including the connection pool's default configuration).
Simple implementation example:
https://www.baeldung.com/spring-data-redis-tutorial
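As a minimal usage sketch (class and key names are illustrative), the auto-configured StringRedisTemplate can simply be injected:

@Component
public class TokenStore {

    private final StringRedisTemplate redis;

    public TokenStore(StringRedisTemplate redis) {
        // StringRedisTemplate is auto-configured on top of the same
        // RedisConnectionFactory that spring-boot-starter-data-redis creates
        this.redis = redis;
    }

    public void save(String key, String value) {
        redis.opsForValue().set(key, value);
    }

    public String load(String key) {
        return redis.opsForValue().get(key);
    }
}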

Spring JMS HornetQ user is null

I am trying to connect to a remote HornetQ broker in a Spring Boot/Spring JMS application and set up a @JmsListener.
The HornetQ ConnectionFactory is fetched from the JNDI registry that the HornetQ instance hosts. Everything works fine as long as HornetQ security is turned off, but when it is turned on I get this error:
WARN o.s.j.l.DefaultMessageListenerContainer : Setup of JMS message listener invoker failed for destination 'jms/MI/Notification/Queue' - trying to recover. Cause: User: null doesn't have permission='CONSUME' on address jms.queue.MI/Notification/Queue
I ran a debug session and figured out that the ConnectionFactory instance being returned is a HornetQXAConnectionFactory, but its user and password fields are not set, which I believe is why the user is null. I verified that the user principal and credentials are set in the JNDI properties, but somehow they are not being passed on to the ConnectionFactory instance. Any help on how I can get this setup working would be greatly appreciated.
This is my JMS-related config:
@Configuration
@EnableJms
public class JmsConfig {

    @Bean
    public JmsListenerContainerFactory<?> jmsListenerContainerFactory(ConnectionFactory connectionFactory,
            DefaultJmsListenerContainerFactoryConfigurer configurer) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        configurer.configure(factory, connectionFactory);
        factory.setDestinationResolver(destinationResolver());
        return factory;
    }

    @Bean // Serialize message content to JSON
    public MessageConverter jacksonJmsMessageConverter() {
        MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
        converter.setTargetType(MessageType.BYTES);
        converter.setTypeIdPropertyName("_type");
        return converter;
    }

    @Value("${jms.jndi.provider.url}")
    private String jndiProviderURL;

    @Value("${jms.jndi.principal}")
    private String jndiPrincipal;

    @Value("${jms.jndi.credentials}")
    private String jndiCredential;

    @Bean
    public JndiTemplate jndiTemplate() {
        Properties env = new Properties();
        env.put("java.naming.factory.initial", "org.jnp.interfaces.NamingContextFactory");
        env.put("java.naming.provider.url", jndiProviderURL);
        env.put("java.naming.security.principal", jndiPrincipal);
        env.put("java.naming.security.credentials", jndiCredential);
        return new JndiTemplate(env);
    }

    @Bean
    public DestinationResolver destinationResolver() {
        JndiDestinationResolver destinationResolver = new JndiDestinationResolver();
        destinationResolver.setJndiTemplate(jndiTemplate());
        return destinationResolver;
    }

    @Value("${jms.connectionfactory.jndiname}")
    private String connectionFactoryJNDIName;

    @Bean
    public JndiObjectFactoryBean connectionFactoryFactory() {
        JndiObjectFactoryBean jndiObjectFactoryBean = new JndiObjectFactoryBean();
        jndiObjectFactoryBean.setJndiTemplate(jndiTemplate());
        jndiObjectFactoryBean.setJndiName(connectionFactoryJNDIName);
        jndiObjectFactoryBean.setResourceRef(true);
        jndiObjectFactoryBean.setProxyInterface(ConnectionFactory.class);
        return jndiObjectFactoryBean;
    }

    @Bean
    public ConnectionFactory connectionFactory(JndiObjectFactoryBean connectionFactoryFactory) {
        return (ConnectionFactory) connectionFactoryFactory.getObject();
    }
}
JNDI and JMS are 100% independent as they are completely different specifications implemented in potentially completely different ways. Therefore the credentials you use for your JNDI lookup do not apply to your JMS resources. You need to explicitly set the username and password credentials on your JMS connection. This is easy using the JMS API directly (e.g. via javax.jms.ConnectionFactory#createConnection(String username, String password)). Since you're using Spring you could use something like this:
@Bean
public ConnectionFactory connectionFactory(JndiObjectFactoryBean connectionFactoryFactory) {
    UserCredentialsConnectionFactoryAdapter cf = new UserCredentialsConnectionFactoryAdapter();
    cf.setTargetConnectionFactory((ConnectionFactory) connectionFactoryFactory.getObject());
    cf.setUsername("yourJmsUsername");
    cf.setPassword("yourJmsPassword");
    return cf;
}
Also, for what it's worth, the HornetQ code-base was donated to the Apache ActiveMQ project three and a half years ago now and it lives on as the Apache ActiveMQ Artemis broker. There's been 22 releases since then with numerous new features and bug fixes. I strongly recommend you migrate if at all possible.
Wrap the connection factory in a UserCredentialsConnectionFactoryAdapter.
/**
 * An adapter for a target JMS {@link javax.jms.ConnectionFactory}, applying the
 * given user credentials to every standard {@code createConnection()} call,
 * that is, implicitly invoking {@code createConnection(username, password)}
 * on the target. All other methods simply delegate to the corresponding methods
 * of the target ConnectionFactory.
 * ...
 */

Proper ultimate way to migrate JMS event listening to Spring Integration with Spring Boot

I have a JmsConfig configuration class that handles JMS events from a topic in the following way:
It defines a @Bean ConnectionFactory, containing an ActiveMQ implementation
It defines a @Bean JmsListenerContainerFactory, instantiating a DefaultJmsListenerContainerFactory and passing it through Boot's DefaultJmsListenerContainerFactoryConfigurer
It defines a @Bean MessageConverter, containing a MappingJackson2MessageConverter and setting a custom ObjectMapper
I use the @JmsListener annotation, pointing at my factory, on a method of my service. This is the only use I have for the topic: subscription alone.
Now I want to move to Spring Integration. After reading a lot, and given that I need neither bidirectional use (discarding gateways) nor a polling mechanism (discarding @InboundChannelAdapter), I am going for a message-driven-channel-adapter, in traditional XML configuration wording. I found that the Java idiom should be accomplished by means of the new Spring Integration Java DSL, and thus I am looking for the proper snippet.
It seems JmsMessageDrivenChannelAdapter is the proper equivalent, and I found a way:
IntegrationFlows.from(Jms.messageDrivenChannelAdapter(...))
But the problem is that this only accepts an ActiveMQ ConnectionFactory or an AbstractMessageListenerContainer, but not my Boot pre-configured JmsListenerContainerFactory!
How should this be implemented properly?
JmsListenerContainerFactory is specific to @JmsListener; it's a higher-level abstraction used to configure a DefaultMessageListenerContainer. Boot does not provide an auto-configuration option for a raw DefaultMessageListenerContainer; you have to wire it up yourself. But you can still use the Boot properties...
@Bean
public IntegrationFlow flow(ConnectionFactory connectionFactory,
        JmsProperties properties) {
    return IntegrationFlows.from(Jms.messageDrivenChannelAdapter(container(connectionFactory, properties)))
            ...
            .get();
}

private DefaultMessageListenerContainer container(ConnectionFactory connectionFactory,
        JmsProperties properties) {
    DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
    container.setConcurrentConsumers(properties.getListener().getConcurrency());
    container.setMaxConcurrentConsumers(properties.getListener().getMaxConcurrency());
    ...
    return container;
}
There is an even better approach; I am surprised Gary did not mention it.
There's an out-of-the-box builder called Jms.container(...).
@Bean
public IntegrationFlow jmsMyServiceMsgInboundFlow(ConnectionFactory connectionFactory,
        MessageConverter jmsMessageConverter, MyService myService, JmsProperties jmsProperties,
        @Value("${mycompany.jms.destination.my-topic}") String topicDestination) {
    JmsProperties.Listener jmsInProps = jmsProperties.getListener();
    return IntegrationFlows.from(
            Jms.messageDrivenChannelAdapter(
                    Jms.container(connectionFactory, topicDestination)
                            .pubSubDomain(false)
                            .sessionAcknowledgeMode(jmsInProps.getAcknowledgeMode().getMode())
                            .maxMessagesPerTask(1)
                            .errorHandler(e -> e.printStackTrace())
                            .cacheLevel(0)
                            .concurrency(jmsInProps.formatConcurrency())
                            .taskExecutor(Executors.newCachedThreadPool())
                            .get())
                    .extractPayload(true)
                    .jmsMessageConverter(jmsMessageConverter)
                    .destination(topicDestination)
                    .autoStartup(true)
                    //.errorChannel("NOPE")
            )
            .log(LoggingHandler.Level.DEBUG)
            .handle(myService, "myMethod", e -> e.async(true).advice(retryAdvice()))
            .get();
}
