Spring Kafka bean configuration: why invoke the bean method instead of autowiring?

I have seen this pattern in tutorials and GitHub projects for Spring Kafka bean declarations,
and I don't understand why bean methods are invoked directly instead of autowired.
For example, in https://www.baeldung.com/spring-kafka, section 4:
@Configuration
public class KafkaProducerConfig {

    // In the full Baeldung example this field is injected from configuration
    // (the exact property key is assumed here)
    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
Why invoke the producerFactory() method directly?
Wouldn't it be better to declare it like this?
@Configuration
public class KafkaProducerConfig {

    // same assumed @Value-injected field as above
    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}
It seems that in the first version two instances of DefaultKafkaProducerFactory would be created (one by the container, one by the direct call) instead of just one.
What am I missing?

No, two instances of that object are not created, because the @Configuration class is proxied exactly for cases like this: the direct producerFactory() call is intercepted and returns the shared singleton.
That said, I agree the injected variant is the better, more modern approach, especially when you pursue a performance gain on startup (it lets you switch off the CGLIB proxy).
See the @Configuration.proxyBeanMethods JavaDocs:
/**
 * Specify whether {@code @Bean} methods should get proxied in order to enforce
 * bean lifecycle behavior, e.g. to return shared singleton bean instances even
 * in case of direct {@code @Bean} method calls in user code. This feature
 * requires method interception, implemented through a runtime-generated CGLIB
 * subclass which comes with limitations such as the configuration class and
 * its methods not being allowed to declare {@code final}.
 * <p>The default is {@code true}, allowing for 'inter-bean references' via direct
 * method calls within the configuration class as well as for external calls to
 * this configuration's {@code @Bean} methods, e.g. from another configuration class.
 * If this is not needed since each of this particular configuration's {@code @Bean}
 * methods is self-contained and designed as a plain factory method for container use,
 * switch this flag to {@code false} in order to avoid CGLIB subclass processing.
 * <p>Turning off bean method interception effectively processes {@code @Bean}
 * methods individually like when declared on non-{@code @Configuration} classes,
 * a.k.a. "@Bean Lite Mode" (see {@link Bean @Bean's javadoc}). It is therefore
 * behaviorally equivalent to removing the {@code @Configuration} stereotype.
 * @since 5.2
 */
boolean proxyBeanMethods() default true;
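To illustrate the startup optimization mentioned above, here is a minimal sketch (my own, not from the Baeldung article; the property key is assumed) combining the injected variant with proxyBeanMethods = false, so no CGLIB subclass is generated at all:

@Configuration(proxyBeanMethods = false)
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory(
            @Value("${spring.kafka.bootstrap-servers}") String bootstrapAddress) {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    // The factory arrives as a method parameter, so no inter-bean method call
    // needs interception; the container wires in the singleton.
    @Bean
    public KafkaTemplate<String, String> kafkaTemplate(ProducerFactory<String, String> producerFactory) {
        return new KafkaTemplate<>(producerFactory);
    }
}

Note that with proxyBeanMethods = false a direct producerFactory() call really would create a second instance, so all inter-bean references must go through parameter injection.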

Related

SpEL KafkaListener. How can I inject a custom deserializer through properties?

I am using Spring.
I have an ObjectMapper configured for the entire project, and I use it to set up a Kafka deserializer.
I need this custom Kafka deserializer to be used in a @KafkaListener.
I'm configuring the @KafkaListener via autoconfiguration, not via a @Configuration class.
@Component
@RequiredArgsConstructor
public class CustomMessageDeserializer implements Deserializer<MyMessage> {

    private final ObjectMapper objectMapper;

    @SneakyThrows
    @Override
    public MyMessage deserialize(String topic, byte[] data) {
        return objectMapper.readValue(data, MyMessage.class);
    }
}
If I do it like this:
@KafkaListener(
    topics = {"${topics.invite-user-topic}"},
    properties = {"value.deserializer=com.service.deserializer.CustomMessageDeserializer"}
)
public void receiveInviteUserMessages(MyMessage myMessage) {}
I receive KafkaException: Could not find a public no-argument constructor.
But with a public no-argument constructor in the CustomMessageDeserializer class I get an NPE, because objectMapper is null: Kafka creates and uses a new instance, not the Spring component.
@KafkaListener supports SpEL expressions, and I think this problem can be solved using SpEL.
Do you have any idea how to inject the Spring bean CustomMessageDeserializer with SpEL?
There is no easy way to do it with SpEL.
Analysis
To get started, see the JavaDoc for @KafkaListener#properties:
/**
*
* SpEL expressions must resolve to a String ...
*/
The value of value.deserializer is used to instantiate the specified deserializer class. Let's follow the call chain:
1) You specify this value in the @KafkaListener annotation, and you are probably not creating a ConsumerFactory bean yourself, so Spring Boot creates it for you - see KafkaAutoConfiguration#kafkaConsumerFactory.
2) That method returns new DefaultKafkaConsumerFactory(...) as ConsumerFactory<?,?>, using the constructor whose default deserializer suppliers are keyDeserializer/valueDeserializer = () -> null.
3) This factory is used to create the Kafka consumer (the entry point is the KafkaMessageListenerContainer.ListenerConsumer constructor, which calls KafkaMessageListenerContainer.this.consumerFactory.createConsumer...).
4) In the KafkaConsumer constructor, the valueDeserializer object is created, because the supplier returned null (for the default factory of point 2 above):
if (valueDeserializer == null) {
    this.valueDeserializer = config.getConfiguredInstance(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, Deserializer.class);
    // ...
}
The implementation of config.getConfiguredInstance instantiates your deserializer class via its parameterless constructor, using reflection on the String class name "com.service.deserializer.CustomMessageDeserializer".
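Conceptually, that reflective instantiation boils down to something like the following (a simplified sketch, not the actual Kafka source), which explains both errors above:

// Fails with "Could not find a public no-argument constructor" if none exists
Deserializer<?> deserializer = (Deserializer<?>) Class
        .forName("com.service.deserializer.CustomMessageDeserializer")
        .getDeclaredConstructor()
        .newInstance();
// Even with such a constructor, this instance is created outside Spring,
// so the objectMapper field is never injected - hence the NPE.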
Solutions
To use value.deserializer with your customized ObjectMapper, you must create the ConsumerFactory bean yourself and call its setValueDeserializer(...) method, as shown in the sketch below. This is also mentioned in the second Important callout of the JSON.Mapping_Types documentation.
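A minimal sketch of that bean (assuming Spring Boot's KafkaProperties supplies the base consumer config, and reusing the CustomMessageDeserializer bean from the question):

@Bean
public ConsumerFactory<String, MyMessage> consumerFactory(KafkaProperties properties,
        CustomMessageDeserializer valueDeserializer) {
    DefaultKafkaConsumerFactory<String, MyMessage> factory =
            new DefaultKafkaConsumerFactory<>(properties.buildConsumerProperties());
    // Hand the Spring-managed bean to the factory, so Kafka never instantiates
    // the deserializer reflectively and the ObjectMapper stays injected
    factory.setValueDeserializer(valueDeserializer);
    return factory;
}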
If you don't want to create a ConsumerFactory bean, and your deserializer has no complicated logic (yours only has return objectMapper.readValue(data, MyMessage.class);), then register a DefaultKafkaConsumerFactoryCustomizer instead:
@Bean
// inject your custom objectMapper
public DefaultKafkaConsumerFactoryCustomizer customizeJsonDeserializer(ObjectMapper objectMapper) {
    return consumerFactory ->
            consumerFactory.setValueDeserializerSupplier(() ->
                    new org.springframework.kafka.support.serializer.JsonDeserializer<>(objectMapper));
}
In this case, you don't need your own CustomMessageDeserializer class (remove it); Spring will automatically parse the message into your MyMessage.
The @KafkaListener annotation should then no longer contain the property properties = {"value.deserializer=com.my.kafka_test.component.CustomMessageDeserializer"}. The DefaultKafkaConsumerFactoryCustomizer bean will automatically be used to configure the default ConsumerFactory<?, ?> (see the implementation of the KafkaAutoConfiguration#kafkaConsumerFactory method).
Here's how it works for me:
@KafkaListener(topics = "${solr.kafka.topic}", containerFactory = "batchFactory")
public void listen(List<SolrInputDocument> docs,
        @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers,
        Acknowledgment ack) throws IOException {...}
And then I have two beans defined in my configuration:
#Profile("!test")
#Bean
#Autowired
public ConsumerFactory<String, SolrInputDocument> consumerFactory(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
DefaultKafkaConsumerFactory<String, SolrInputDocument> result = new DefaultKafkaConsumerFactory<>(props);
String validatedKeyDeserializerName = KafkaMessageType.valueOf(keyDeserializerName).toString();
ZiDeserializer<SolrInputDocument> deserializer = ZiDeserializerFactory.getInstance(validatedKeyDeserializerName);
result.setValueDeserializer(deserializer);
return result;
}
#Profile("!test")
#Bean
#Autowired
public ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> batchFactory(ConsumerFactory<String, SolrInputDocument> consumerFactory) {
ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory);
factory.setBatchListener(true);
factory.setConcurrency(2);
ExponentialBackOffWithMaxRetries backoff = new ExponentialBackOffWithMaxRetries(10);
backoff.setMultiplier(3); // Default is 1.5 but this seems more reasonable
factory.setCommonErrorHandler(new DefaultErrorHandler(null, backoff));
// Needed for manual commits
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
return factory;
}
Note that ZiDeserializer<SolrInputDocument> is my own interface, and ZiDeserializerFactory.getInstance(validatedKeyDeserializerName) returns my custom implementation of ZiDeserializer, which extends org.apache.kafka.common.serialization.Deserializer. This works for me.

Class not able to access bean managed by Spring

I have a Spring configuration file where I define beans, but somehow one of these beans is not accessible from a class in the same package, even though the same beans are accessible from a controller class annotated with @Controller. I thought maybe that class was not managed by Spring, but that's not the case.
1) Configuration class
@Bean
public FooConsumer fooConsumer() {
    return new FooConsumer();
}

@Bean
public Map<String, ProxyConsumer> appProxyConsumerMap() {
    Map<String, ProxyConsumer> proxyConsumer = new HashMap<String, ProxyConsumer>();
    proxyConsumer.put(FOO_APP, fooConsumer());
    return proxyConsumer;
}

@Bean
public FooEventConsumer fooEventConsumer() {
    return new FooEventConsumer();
}

@Bean
public Map<String, FooConsumer> fooConsumerMap() {
    Map<String, FooConsumer> fooEventConsumer = new HashMap<String, FooConsumer>();
    fooEventConsumer.put(FOO_EVENT, fooEventConsumer());
    return fooEventConsumer; // return was missing in the original snippet
}
2) Controller class
@Resource
@Qualifier("appProxyConsumerMap")
Map<String, ProxyConsumer> appProxyConsumerMap;

// proxyApp comes as a path variable
ProxyConsumer consumer = appProxyConsumerMap.get(proxyApp);
// invoke consumer
boolean consumed = consumer.consumeEvent(eventRequest);
// here consumer is my FooConsumer instance; up to this point everything works fine
3) Now the FooConsumer class tries to access the map bean named fooConsumerMap to decide which event consumer to call, but the map comes back null.
@Resource
@Qualifier("fooConsumerMap")
Map<String, FooConsumer> fooConsumerMap;

FooEventConsumer consumer = fooConsumerMap.get(eventType);
// Here fooConsumerMap is null in this class, though it is populated in the controller class. Please advise.
In your configuration file, construct your FooConsumer bean with the fooConsumerMap bean declared in the same configuration.
You can autowire other beans into a configuration file, but to pull together beans within the same file you pass them as constructor arguments.
Note that if you call a @Bean-annotated method multiple times, you will (perhaps surprisingly) always get the same instance, even though the method body constructs a new one.
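For example, a sketch (assuming FooConsumer is given a constructor that accepts the map):

@Bean
public FooConsumer fooConsumer() {
    // fooConsumerMap() is intercepted by the @Configuration proxy, so this
    // passes the singleton map bean rather than building a fresh map
    return new FooConsumer(fooConsumerMap());
}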
Check documentation at https://docs.spring.io/spring-javaconfig/docs/1.0.0.m3/reference/html/creating-bean-definitions.html

#KafkaListener separate filtering logic for each listener

I need to define a custom filtering strategy for each listener produced by the listener factory.
Currently, I'm using RecordFilterStrategy to do that:
@Bean
ConcurrentKafkaListenerContainerFactory<String, GenericRecord> kafkaListenerContainerFactoryProject() {
    ConcurrentKafkaListenerContainerFactory<String, GenericRecord> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setRecordFilterStrategy(new RecordFilterStrategy<String, GenericRecord>() {
        @Override
        public boolean filter(ConsumerRecord<String, GenericRecord> consumerRecord) {
            return true;
        }
    });
    return factory;
}
But such filtering applies to all listeners produced by this factory. What I need is something like this, with different logic per listener:
@Component
@SendTo("out")
@KafkaListener(topics = "incoming")
public class TestListener {

    @Filter
    public boolean filter() {
        return true;
    }

    @KafkaHandler
    public TestObject listener(TestObject testObject) {
        log.debug("Received Message: " + testObject);
        return testObject;
    }
}
Does spring-kafka have some tools to do that, or do I need to write such logic on my own?
Thanks in advance!
No, you don't need to write it yourself. What you need is a set of ConcurrentKafkaListenerContainerFactory beans, each with its own RecordFilterStrategy. Then each @KafkaListener just specifies which factory it is based on:
/**
 * The bean name of the {@link org.springframework.kafka.config.KafkaListenerContainerFactory}
 * to use to create the message listener container responsible to serve this endpoint.
 * <p>If not specified, the default container factory is used, if any.
 * @return the container factory bean name.
 */
String containerFactory() default "";
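A minimal sketch of that approach (factory names, topics, and filter conditions are made up here; the ConsumerFactory is whatever bean you already have):

@Bean
ConcurrentKafkaListenerContainerFactory<String, GenericRecord> incomingFilterFactory(
        ConsumerFactory<String, GenericRecord> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, GenericRecord> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // Returning true from the strategy discards the record before the listener sees it
    factory.setRecordFilterStrategy(record -> record.key() == null);
    return factory;
}

@Bean
ConcurrentKafkaListenerContainerFactory<String, GenericRecord> auditFilterFactory(
        ConsumerFactory<String, GenericRecord> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, GenericRecord> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    factory.setRecordFilterStrategy(record -> !"audit".equals(record.key()));
    return factory;
}

Each listener then picks its factory, and with it its filtering logic:

@KafkaListener(topics = "incoming", containerFactory = "incomingFilterFactory")
public void onIncoming(GenericRecord record) { /* ... */ }

@KafkaListener(topics = "audit", containerFactory = "auditFilterFactory")
public void onAudit(GenericRecord record) { /* ... */ }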

Spring Integration: connection to multiple MQ servers by config

I have a Spring Boot 5 application, and I have it running against one IBM MQ server.
Now we want it to connect to three or more MQ servers. My intention is to just add XY connection infos to the environment and then get XY MQConnectionFactory beans and all the other beans needed for processing.
At the moment this is what I have:
@Bean
@Qualifier(value = "MQConnection")
public MQConnectionFactory getIbmConnectionFactory() throws JMSException {
    MQConnectionFactory factory = new MQConnectionFactory();
    // setting all the parameters here
    return factory;
}
But this is quite static. Is there an elegant way of doing this?
I stumbled upon IntegrationFlow. Is this a possible solution?
Thanks for all your tips!
KR
Solution
Based on Artem Bilan's response, I built this class:
@Configuration
public class ConnectionWithIntegrationFlowMulti {

    protected static final Logger LOG = Logger.create();

    @Value("${mq.queue.jms.sources.queue.queue-manager}")
    private String queueManager;

    // Assumed to be injected like queueManager; the original snippet referenced
    // queueName without declaring it
    @Value("${mq.queue.jms.sources.queue.queue-name}")
    private String queueName;

    @Autowired
    private ConnectionConfig connectionConfig;

    @Autowired
    private SSLSocketFactory sslSocketFactory;

    @Autowired
    private IntegrationFlowContext flowContext;

    @Bean
    public MessageChannel queureader() {
        return new DirectChannel();
    }

    @PostConstruct
    public void processBeanDefinitionRegistry() throws BeansException {
        Assert.notEmpty(connectionConfig.getTab().getLocations(), "At least one CCDT file location must be provided.");
        for (String tabLocation : connectionConfig.getTab().getLocations()) {
            try {
                IntegrationFlowRegistration theFlow = this.flowContext.registration(createFlow(tabLocation)).register();
                LOG.info("Registered bean flow for %s with id = %s", queueManager, theFlow.getId());
            } catch (JMSException e) {
                LOG.error(e);
            }
        }
    }

    public IntegrationFlow createFlow(String tabLocation) throws JMSException {
        LOG.info("creating ibmInbound");
        return IntegrationFlows.from(Jms.messageDrivenChannelAdapter(getConnection(tabLocation)).destination(createDestinationBean()))
                .handle(m -> LOG.info("received payload: " + m.getPayload().toString()))
                .get();
    }

    public MQConnectionFactory getConnection(String tabLocation) throws JMSException {
        MQConnectionFactory factory = new MQConnectionFactory();
        // doing stuff
        return factory;
    }

    @Bean
    public MQQueue createDestinationBean() {
        LOG.info("creating destination bean");
        MQQueue queue = new MQQueue();
        try {
            queue.setBaseQueueManagerName(queueManager);
            queue.setBaseQueueName(queueName);
        } catch (Exception e) {
            LOG.error(e, "destination bean: Error for integration flow");
        }
        return queue;
    }
}
With Spring Integration you can create IntegrationFlow instances dynamically at runtime. For that purpose there is an IntegrationFlowContext with its registration() API. The returned IntegrationFlowRegistrationBuilder has a callback like:
/**
 * Add an object which will be registered as an {@link IntegrationFlow} dependant bean in the
 * application context. Usually it is some support component, which needs an application context.
 * For example dynamically created connection factories or header mappers for AMQP, JMS, TCP etc.
 * @param bean an additional arbitrary bean to register into the application context.
 * @return the current builder instance
 */
IntegrationFlowRegistrationBuilder addBean(Object bean);
So, your MQConnectionFactory instances can be populated alongside the rest of the flow, used as references in the particular JMS components, and registered as beans, too.
See more info in docs: https://docs.spring.io/spring-integration/docs/5.2.3.RELEASE/reference/html/dsl.html#java-dsl-runtime-flows
If you are fine with creating them statically, you can create the beans as you are now (each having a unique qualifier), but you can access them all dynamically in your services / components by declaring an @Autowired List<MQConnectionFactory> field or an @Autowired Map<String, MQConnectionFactory> field. Spring will automatically populate the field with all of the beans of type MQConnectionFactory.
In the Map variant, the String key will be the bean name / qualifier value.
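A minimal sketch of that lookup (service and method names are illustrative):

@Service
public class MqSenderService {

    // Keys are the bean names, e.g. "getIbmConnectionFactory"
    private final Map<String, MQConnectionFactory> connectionFactories;

    public MqSenderService(Map<String, MQConnectionFactory> connectionFactories) {
        this.connectionFactories = connectionFactories;
    }

    public MQConnectionFactory forServer(String beanName) {
        return this.connectionFactories.get(beanName);
    }
}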
If you also want to create the beans dynamically based on some properties, it gets a little more complicated: you will need to look into instantiating beans at runtime.
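One way to do that is a BeanDefinitionRegistryPostProcessor (a sketch under the assumption that the server list comes from configuration; resolve it from the Environment in real code):

@Component
public class MqFactoryRegistrar implements BeanDefinitionRegistryPostProcessor {

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) {
        for (String server : List.of("serverA", "serverB")) { // assumed; read from config in real code
            GenericBeanDefinition definition = new GenericBeanDefinition();
            definition.setBeanClass(MQConnectionFactory.class);
            definition.setInstanceSupplier(() -> createFactory(server));
            registry.registerBeanDefinition(server + "ConnectionFactory", definition);
        }
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) {
        // nothing to do here
    }

    private MQConnectionFactory createFactory(String server) {
        MQConnectionFactory factory = new MQConnectionFactory();
        // set host/port/channel for this server
        return factory;
    }
}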

Spring Integration - Kafka outbound adapter not taking topic value exposed as Spring bean

I have successfully integrated the Kafka outbound channel adapter with a fixed topic name. Now I want to make the topic name configurable and hence want to expose it via application properties.
application.properties contains the following entry:
kafkaTopic:testNewTopic
My configuration class looks like this:
@Configuration
@Component
public class KafkaConfig {

    @Value("${kafkaTopic}")
    private String kafkaTopicName;

    @Bean
    public String getTopic() {
        return kafkaTopicName;
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // this.brokerAddress
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer");
        // set more properties
        return new DefaultKafkaProducerFactory<>(props);
    }
}
and in my si-config.xml I have used the following (e.g. topic="getTopic"):
<int-kafka:outbound-channel-adapter
    id="kafkaOutboundChannelAdapter" kafka-template="kafkaTemplate"
    auto-startup="true" sync="true" channel="inputToKafka" topic="getTopic">
</int-kafka:outbound-channel-adapter>
However, the configuration is unable to pick up the topic name when it is exposed via a bean, though it works fine when I hard-code the value of the topic name.
Can someone please suggest what I am doing wrong here?
Does the topic attribute within the Kafka outbound channel adapter accept a value referring to a bean?
How do I externalize it, given that every application using my utility will supply a different Kafka topic name?
The topic attribute expects a plain string value.
However, it supports property placeholder resolution:
topic="${kafkaTopic}"
and also SpEL evaluation referencing the aforementioned bean:
topic="#{getTopic}"
simply because that is allowed by the XML parser configuration.
However, note that the KafkaTemplate you inject into the <int-kafka:outbound-channel-adapter> has a defaultTopic property, so you may not need to worry about the topic in the XML at all.
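A sketch of that alternative (setDefaultTopic is an existing KafkaTemplate setter; the rest mirrors the question's config):

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    KafkaTemplate<String, String> template = new KafkaTemplate<>(producerFactory());
    // Messages that carry no explicit topic end up on this topic,
    // so the XML adapter no longer needs a topic attribute
    template.setDefaultTopic(kafkaTopicName);
    return template;
}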
And one more option available to you is Spring Integration annotation configuration, where you can define a @ServiceActivator for a KafkaProducerMessageHandler @Bean:
@ServiceActivator(inputChannel = "inputToKafka")
@Bean
public KafkaProducerMessageHandler<String, String> kafkaOutboundChannelAdapter() {
    KafkaProducerMessageHandler<String, String> adapter = new KafkaProducerMessageHandler<>(kafkaTemplate());
    adapter.setSync(true);
    adapter.setTopicExpression(new LiteralExpression(this.kafkaTopicName));
    return adapter;
}
