Spring Cloud Stream Binder Kafka - spring

I'm trying to use Spring Cloud Stream with a Kafka binder to consume messages from a topic.
Previously I used annotations to create the consumer. Now I have to use the functional approach, because the annotation is no longer available.
These are the dependencies I used:
implementation("org.springframework.cloud:spring-cloud-stream-binder-kafka:4.0.0")
implementation("org.springframework.cloud:spring-cloud-function-kotlin:4.0.0")
This is the application.yaml
spring:
  cloud:
    function:
      definition: consumeMessage
    stream:
      kafka:
        binder:
          brokers: localhost:9092
          autoAddPartitions: true
      bindings:
        consumeMessage-in-0:
          destination: message
          group: message-group
I tried to use a bean for the consumer itself, but no messages are received:
import java.util.function.Consumer

@Service
class MessageListener {
    @Bean
    fun consumeMessage(): Consumer<String> = Consumer { payload ->
        println(payload)
    }
}
I'm using Spring Boot 3 as project base.
It is not possible to receive any message via the listener. Does anybody know how to solve the problem?

This was only an issue with the integration test. When running unit/integration tests, an internal mock broker is used to perform the tests.
If you want to run the test against a "real" broker, you can use the @Import(KafkaBinderConfiguration::class) annotation.
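Conceptually, the functional binding model just looks up the bean named in spring.cloud.function.definition and invokes it for each record arriving on the destination bound to `<name>-in-0`. A rough plain-Java sketch of that dispatch, to illustrate the idea (this is not Spring Cloud Stream's actual code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Minimal sketch of what the binder does: look up the function named in
// spring.cloud.function.definition and call it for each incoming payload.
public class FunctionalBindingSketch {
    private final Map<String, Consumer<String>> functions = new HashMap<>();

    // Spring does this automatically for every Consumer bean it finds.
    public void register(String name, Consumer<String> fn) {
        functions.put(name, fn);
    }

    // Simulates a record arriving on the destination bound to <name>-in-0.
    public void dispatch(String name, String payload) {
        functions.get(name).accept(payload);
    }

    public static void main(String[] args) {
        FunctionalBindingSketch binder = new FunctionalBindingSketch();
        StringBuilder seen = new StringBuilder();
        binder.register("consumeMessage", seen::append);
        binder.dispatch("consumeMessage", "hello");
        System.out.println(seen); // prints "hello"
    }
}
```

If the function name in the YAML and the bean method name do not match, nothing is ever dispatched to the consumer, which is one common cause of "no messages received".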

Related

Configuring custom Kafka Consumer Deserializer in application.properties file. [spring boot]

I want to consume Avro messages and deserialize them without using the Confluent schema registry. I have the schema locally. So, for that, I followed https://medium.com/@mailshine/apache-avro-quick-example-in-kafka-7b2909396c02 for the Consumer part only. Now, I want to configure this deserializer in the application.properties file (the Spring Boot way).
I tried adding
spring.kafka.consumer.value-deserializer=com.example.AvroDeserializer
But this results in error saying "Could not find a public no-argument constructor for com.example.AvroDeserializer".
Is there any way to call the constructor with arguments from the application.properties configuration?
Or
Do I need to configure this in code instead of properties?
Thanks in advance!!
You can do it using properties, but you have to do it via the configure method (with a no-arg constructor) - this is because Kafka instantiates the deserializer, not Spring.
See the org.springframework.kafka.support.serializer.JsonDeserializer source code for an example.
Otherwise, you can just inject your Deserializer into the consumer factory...
@Bean
public MyDeserializer<Foo> myDeserializer(DefaultKafkaConsumerFactory<String, Foo> factory) {
    MyDeserializer<Foo> deser = new MyDeserializer<>(...);
    factory.setValueDeserializer(deser);
    return deser;
}

How to disable kafka connection from spring boot test?

I am using spring kafka to consume message from kafka topic, so I have a kafka consumer configuration class:
@Configuration
class KafkaConfiguration {
    // kafka consumer configurations
}
I have some JUnit tests which load the Spring context with MockMvc to test my API. I don't want to test the Kafka messaging features, so how can I stop Kafka from consuming messages only for the JUnit tests? They keep failing because I don't have a Kafka server in my local and CI environments.
Spring profiles are not a very good option, because I would have to write code like:
@Configuration
@Profile("!unit-test")
class KafkaConfiguration {
    // kafka configuration
}
which means I would end up with production code written only for testing purposes, which is not very clean. Is there another way to disable Kafka for tests?
Add the @ConditionalOnProperty annotation to the configuration class.
Example:
@ConditionalOnProperty(value = "kafka.enable", havingValue = "true", matchIfMissing = true)
Then add the property kafka.enable=false to the test application.properties file.
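The matchIfMissing semantics can be illustrated in plain Java (a simplified sketch, not Spring's actual condition evaluation): the configuration is enabled when the property equals the expected value or when the property is absent.

```java
import java.util.Map;

// Sketch of @ConditionalOnProperty(value = "kafka.enable",
// havingValue = "true", matchIfMissing = true): the bean is created when
// the property equals "true" OR when the property is not set at all.
public class ConditionalOnPropertySketch {

    public static boolean matches(Map<String, String> props) {
        String value = props.get("kafka.enable");
        return value == null || value.equals("true");
    }

    public static void main(String[] args) {
        System.out.println(matches(Map.of()));                        // true (matchIfMissing)
        System.out.println(matches(Map.of("kafka.enable", "true")));  // true
        System.out.println(matches(Map.of("kafka.enable", "false"))); // false
    }
}
```

Because of matchIfMissing = true, production needs no extra property at all; only the test properties file has to set kafka.enable=false.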
Spring profiles have exactly this purpose. They are all about enabling/disabling parts of your application for the various scenarios under which it runs (staging, production, unit tests).
The only other option would be to make Kafka available during the test by using TestContainers for example.

Spring cloud stream kafka binder to create consumer with on demand configuration

I am using Spring boot 1.5.9.RELEASE and Spring cloud Edgware.RELEASE across the Microservices.
I've bound a consumer using the @EnableBinding annotation. The annotation does the rest of the work for me to consume events.
Some requirements came up to configure the topic name and some other configuration properties manually, for which I want to override some of the consumer properties defined in application.properties at application boot time.
Is there any direct way to do such?
You can use an initialization bean; it can do the work:
@SpringBootApplication
public class SpringDataDemoApplication {

    @Bean
    InitializingBean populateDatabase() {
        return () -> {
            // doWhatYouWantHere...
        };
    }
}

MaprStream with spring integration Kafka Producer issue

I am using mapr-stream with Spring Integration and trying to create a publisher to send messages to maprstream topics. I am using the below jar versions, per the compatibility matrix mentioned here.
Spring-integration-kafka - 2.0.1.RELEASE
Spring-Kafka - 1.0.3.RELEASE
Kafka-clients - 0.9.0.0-mapr-1607
As mentioned in the Spring Integration Kafka documentation, I should be able to set the 'sync' property on KafkaProducerMessageHandler if I am using the spring-integration-kafka-2.0.1 jar,
but I am getting schema validation issues saying 'sync' is not expected in KafkaProducerMessageHandler.
Could someone please help me on this?
XML Namespace support for sync was not added until 2.1.
With 2.0.x, you have to set the property on the KafkaProducerMessageHandler bean programmatically.
EDIT
@Autowired
private KafkaProducerMessageHandler handler;

@PostConstruct
public void init() {
    this.handler.setSync(true);
}

how to use ApplicationEventPublisher in spring integration with annotation?

I am new to Spring Integration and I have to do some event-based processing. Can anyone tell me how to use ApplicationEventPublisher? A sample would be very helpful.
For publishing Spring application events, Spring Integration provides the ApplicationEventPublishingMessageHandler component. It is a one-way, send-only producer and should be configured together with the @ServiceActivator annotation:
@ServiceActivator(inputChannel = "sendEventChannel")
@Bean
public MessageHandler eventProducer() {
    return new ApplicationEventPublishingMessageHandler();
}
Also see http://docs.spring.io/spring-integration/reference/html/applicationevent.html.
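Underneath, event publishing is a simple publish/subscribe contract: listeners register a callback, and the publisher hands every event to each registered listener. A plain-Java sketch of that idea (Spring's real ApplicationEventPublisher adds event-type matching, listener ordering, and optional async dispatch on top):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Simplified publish/subscribe core behind ApplicationEventPublisher.
public class EventPublisherSketch {
    private final List<Consumer<Object>> listeners = new ArrayList<>();

    // Equivalent of registering an @EventListener method.
    public void addListener(Consumer<Object> listener) {
        listeners.add(listener);
    }

    // Equivalent of ApplicationEventPublisher.publishEvent(event):
    // every registered listener receives the event.
    public void publishEvent(Object event) {
        for (Consumer<Object> listener : listeners) {
            listener.accept(event);
        }
    }
}
```

In the Spring Integration setup above, ApplicationEventPublishingMessageHandler plays the publisher role: each message arriving on sendEventChannel is converted to an application event and handed to the context's listeners.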
