MapR Streams with Spring Integration Kafka producer issue

I am using MapR Streams with Spring Integration and trying to create a publisher to send messages to MapR Streams topics. I am using the jar versions below, per the compatibility matrix mentioned here.
Spring-integration-kafka - 2.0.1.RELEASE
Spring-Kafka - 1.0.3.RELEASE
Kafka-clients - 0.9.0.0-mapr-1607
As mentioned in the Spring Integration Kafka documentation, I should be able to set the 'sync' property on KafkaProducerMessageHandler if I am using the spring-integration-kafka 2.0.1 jar,
but I am getting schema validation errors saying that 'sync' is not expected on KafkaProducerMessageHandler.
Could someone please help me with this?

XML Namespace support for sync was not added until 2.1.
With 2.0.x, you have to set the property on the KafkaProducerMessageHandler bean programmatically.
EDIT
@Autowired
private KafkaProducerMessageHandler handler;

@PostConstruct
public void init() {
    this.handler.setSync(true);
}
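If you define the adapter with Java configuration instead of the XML namespace, the handler can be built and configured in one place. A minimal sketch, assuming an existing KafkaTemplate bean; the channel name "toKafka" and the topic "my-topic" are illustrative, not from the question:

@Bean
@ServiceActivator(inputChannel = "toKafka") // channel name is illustrative
public KafkaProducerMessageHandler<String, String> kafkaOutbound(KafkaTemplate<String, String> template) {
    KafkaProducerMessageHandler<String, String> handler = new KafkaProducerMessageHandler<>(template);
    handler.setTopicExpression(new LiteralExpression("my-topic")); // hypothetical topic
    handler.setSync(true); // the property the 2.0.x XML namespace does not expose
    return handler;
}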

Related

Configuring custom Kafka Consumer Deserializer in application.properties file. [spring boot]

I want to consume Avro messages and deserialize them without using the Confluent Schema Registry. I have the schema locally, so I followed this https://medium.com/@mailshine/apache-avro-quick-example-in-kafka-7b2909396c02 for the consumer part only. Now I want to configure this deserializer in the application.properties file (the Spring Boot way).
I tried adding
spring.kafka.consumer.value-deserializer=com.example.AvroDeserializer
But this results in an error saying "Could not find a public no-argument constructor for com.example.AvroDeserializer".
Is there any way to call a constructor with arguments from the application.properties configuration?
Or
Do I need to configure this in code instead of properties?
Thanks in advance!!
You can do it using properties, but you have to do it via the configure method (with a no-arg constructor) - this is because Kafka instantiates the deserializer, not Spring.
See the org.springframework.kafka.support.serializer.JsonDeserializer source code for an example.
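For illustration, here is a minimal sketch of that approach. It assumes a locally available schema file and GenericRecord payloads; the property key "avro.schema.path" is made up for the example, not a Kafka or Spring property:

import java.io.File;
import java.io.IOException;
import java.util.Map;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

public class AvroDeserializer implements Deserializer<GenericRecord> {

    private Schema schema;

    public AvroDeserializer() {
        // public no-arg constructor required, because Kafka instantiates the deserializer
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        // the consumer properties arrive here, so the schema location can be passed as a property
        String schemaPath = (String) configs.get("avro.schema.path"); // hypothetical property name
        try {
            this.schema = new Schema.Parser().parse(new File(schemaPath));
        }
        catch (IOException e) {
            throw new IllegalStateException("Could not load Avro schema from " + schemaPath, e);
        }
    }

    @Override
    public GenericRecord deserialize(String topic, byte[] data) {
        try {
            DatumReader<GenericRecord> reader = new GenericDatumReader<>(this.schema);
            return reader.read(null, DecoderFactory.get().binaryDecoder(data, null));
        }
        catch (IOException e) {
            throw new SerializationException("Failed to deserialize Avro payload", e);
        }
    }

    @Override
    public void close() {
    }
}

The schema location can then be supplied the Spring Boot way, e.g. spring.kafka.consumer.properties.avro.schema.path=/path/to/schema.avsc (path and key are examples), alongside the existing spring.kafka.consumer.value-deserializer entry.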
Otherwise, you can just inject your Deserializer into the consumer factory...
@Bean
public MyDeserializer<Foo> myDeserializer(DefaultKafkaConsumerFactory<String, Foo> factory) {
    MyDeserializer<Foo> deser = new MyDeserializer<>(...); // constructor args as needed
    factory.setValueDeserializer(deser);
    return deser;
}

Spring boot actuator auditevents with custom ReactiveAuthenticationManager

I have set up my own ReactiveAuthenticationManager:
public class CustomReactiveAuthenticationManager implements ReactiveAuthenticationManager
and then in SecurityWebFilterChain:
.authenticationManager(this.authenticationManager)
However, after this setup I'm not getting anything in the actuator auditevents endpoint:
{"events":[]}
What do I need to change to get audit events even when I use a custom ReactiveAuthenticationManager?
This isn't a problem with your custom AuthenticationManager. It is a limitation of Spring Security. At the time of writing, events are not published when using reactive Spring Security. An enhancement that will remove the limitation is being tracked in this Spring Security issue.
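Until that enhancement lands, one possible workaround (a sketch, not the framework's built-in behaviour) is to publish the authentication events yourself from the custom manager; Boot's AuthenticationAuditListener turns them into audit events, provided an AuditEventRepository bean is present. The doAuthenticate method below is a placeholder for your existing logic:

import org.springframework.context.ApplicationEventPublisher;
import org.springframework.security.authentication.BadCredentialsException;
import org.springframework.security.authentication.ReactiveAuthenticationManager;
import org.springframework.security.authentication.event.AuthenticationFailureBadCredentialsEvent;
import org.springframework.security.authentication.event.AuthenticationSuccessEvent;
import org.springframework.security.core.Authentication;
import reactor.core.publisher.Mono;

public class CustomReactiveAuthenticationManager implements ReactiveAuthenticationManager {

    private final ApplicationEventPublisher publisher;

    public CustomReactiveAuthenticationManager(ApplicationEventPublisher publisher) {
        this.publisher = publisher;
    }

    @Override
    public Mono<Authentication> authenticate(Authentication authentication) {
        return doAuthenticate(authentication)
                // publish the events the servlet stack would normally publish for you
                .doOnNext(auth -> this.publisher.publishEvent(new AuthenticationSuccessEvent(auth)))
                .doOnError(BadCredentialsException.class, ex -> this.publisher.publishEvent(
                        new AuthenticationFailureBadCredentialsEvent(authentication, ex)));
    }

    private Mono<Authentication> doAuthenticate(Authentication authentication) {
        // placeholder for the existing authentication logic
        return Mono.just(authentication);
    }
}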

Spring cloud stream kafka binder to create consumer with on demand configuration

I am using Spring boot 1.5.9.RELEASE and Spring cloud Edgware.RELEASE across the Microservices.
I've bound a consumer using the @EnableBinding annotation. The annotation does the rest of the work for me to consume events.
A requirement came up to configure the topic name and some other configuration properties manually, so I want to override some of the consumer properties defined in application.properties at application boot time.
Is there any direct way to do that?
You can use an initialization bean; it can do the work:
@SpringBootApplication
public class SpringDataDemoApplication {

    @Bean
    InitializingBean populateDatabase() {
        return () -> {
            // doWhatYouWantHere...
        };
    }
}
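If the goal is specifically to override binding properties such as the destination/topic name before the binder starts, another option (a sketch, not part of the answer above; the binding name "input" and the lookup logic are assumptions) is an EnvironmentPostProcessor, registered in META-INF/spring.factories under org.springframework.boot.env.EnvironmentPostProcessor:

import java.util.HashMap;
import java.util.Map;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.env.EnvironmentPostProcessor;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.MapPropertySource;

public class StreamBindingEnvironmentPostProcessor implements EnvironmentPostProcessor {

    @Override
    public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) {
        Map<String, Object> overrides = new HashMap<>();
        // resolve the topic name however you need to (system property, lookup, etc.)
        overrides.put("spring.cloud.stream.bindings.input.destination", resolveTopicName());
        environment.getPropertySources().addFirst(new MapPropertySource("streamOverrides", overrides));
    }

    private String resolveTopicName() {
        return System.getProperty("app.topic", "default-topic"); // hypothetical resolution
    }
}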

@Rollback(true) not working in Spring Boot 1.3.x

I have updated my pom from spring-boot-starter-parent 1.2.5.RELEASE to 1.3.2.RELEASE.
The problem is that everything stays the same, but all the tests using @Rollback(true) stopped working after the migration.
@Transactional
@Rollback(true)
@Test
public void testRollBack() {
    dao.saveToDb();
    throw new RuntimeException();
}
Configuration:
@Bean
@Primary
public PlatformTransactionManager txManager() {
    return new DataSourceTransactionManager(dataSource());
}
It worked perfectly with the same configuration and code; the only change is the Spring Boot version. I cannot see in the logs that a transaction is being created, as it is supposed to be.
Does anyone have a clue? Maybe a way to debug and understand what the problem is?
Thanks
TransactionTestExecutionListener has changed quite a lot between Spring Framework 4.1 (used by Spring Boot 1.2) and Spring Framework 4.2 (used by Spring Boot 1.3). It sounds like there's been a change in behaviour which I suspect probably wasn't intentional.
To fix your problem without renaming one of your beans, you need to tell the test framework which transaction manager to use. The easiest way to do that is via the @Transactional annotation:
@Transactional("txManager")
@Rollback(true)
@Test
public void testRollBack() {
    dao.saveToDb();
    throw new RuntimeException();
}
I have put Spring on debug.
There is a problem/bug in the test framework, or I don't understand the usage correctly.
I checked the Spring code and saw this:
bf.getBean(DEFAULT_TRANSACTION_MANAGER_NAME, PlatformTransactionManager.class);
This happens when we have several transaction managers: instead of getting the bean marked with the @Primary annotation, Spring tries to get the transaction manager named "transactionManager".
The solution is just to give the bean that name. I tried to open an issue against the spring-test project but don't know where; if anyone knows how, please advise.
Thanks
EDIT: So the solution is either what I wrote above, or to refer to the transaction manager by name (@Transactional("myManager")) on the test method.
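For completeness, the bean-renaming fix amounts to giving the existing bean the default name the test framework looks up; a sketch based on the configuration shown above:

@Bean(name = "transactionManager") // the name the default lookup expects
@Primary
public PlatformTransactionManager txManager() {
    // renaming the method itself to transactionManager() would have the same effect
    return new DataSourceTransactionManager(dataSource());
}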

Spring 4 REST RequestMappingHandlerAdapter not saving configured MessageConverters

I am having a problem configuring the RequestMappingHandlerAdapter in a Spring 4.1.4 RESTful web service. When I configure the RequestMappingHandlerAdapter's message converters, it does not use the message converters that I've configured. I put breakpoints in the RequestMappingHandlerAdapter.setMessageConverters(List<HttpMessageConverter<?>> messageConverters) method, and on application startup I see this method being called three times. The first two times it is called with the pre-configured message converters, one of which is Jaxb2RootElementHttpMessageConverter. The third time it is called with the message converters I configured manually via the application-context.xml bean configuration. At that point I thought I had successfully replaced the message converters with my own configuration, but that is not so, because when I invoke my RESTful web service, Spring calls the Jaxb2RootElementHttpMessageConverter instead of the MarshallingHttpMessageConverter that I configured manually in application-context.xml.
So I need to know how to do one of the following:
Tell Jaxb2RootElementHttpMessageConverter to use my configured JAXB2Marshaller, which is set up to work with JAXBIntroductions,
Unregister the Jaxb2RootElementHttpMessageConverter in Spring 4.1.4,
Tell Spring 4.1.4 to use MarshallingHttpMessageConverter instead of Jaxb2RootElementHttpMessageConverter when it sees XML data,
Create my own custom version of Jaxb2RootElementHttpMessageConverter so I can give it the correct JAXB2 marshaller, or
Get the RequestMappingHandlerAdapter to use only the configuration that I give it.
Any help with any of the five options above would be greatly appreciated.
Thank you.
Tonté
I too faced the same issue.
You have to remove <mvc:annotation-driven /> from the context file.
It overrides the converters even if you specify a list of converters.
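The question uses XML configuration, but a Java-config sketch of the same idea may help (class and package names are illustrative; the plain Jaxb2Marshaller stands in for the JAXBIntroductions-aware marshaller from the question): converters registered through configureMessageConverters() replace the defaults, so Jaxb2RootElementHttpMessageConverter is never registered and XML is handled by the MarshallingHttpMessageConverter.

import java.util.List;

import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.converter.json.MappingJackson2HttpMessageConverter;
import org.springframework.http.converter.xml.MarshallingHttpMessageConverter;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

@Configuration
@EnableWebMvc
public class WebConfig extends WebMvcConfigurerAdapter {

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        // adding converters here suppresses the defaults, including Jaxb2RootElementHttpMessageConverter
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setPackagesToScan("com.example.model"); // hypothetical package
        converters.add(new MarshallingHttpMessageConverter(marshaller));
        converters.add(new MappingJackson2HttpMessageConverter()); // keep JSON support if needed
    }
}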
