Spring Cloud Stream Kafka binder to create consumer with on-demand configuration - spring-boot

I am using Spring Boot 1.5.9.RELEASE and Spring Cloud Edgware.RELEASE across the microservices.
I've bound a consumer using the @EnableBinding annotation. The annotation does the rest of the work for me to consume events.
Some requirements came up to configure the topic name and some other configuration properties manually, for which I want to override some of the consumer properties defined in application.properties at application boot time.
Is there any direct way to do that?

You can use an initialization bean; it can do the work:

@SpringBootApplication
public class SpringDataDemoApplication {

    @Bean
    InitializingBean populateDatabase() {
        return () -> {
            // doWhatYouWantHere...
        };
    }
}

Related

Mix spring-integration and spring scheduler

We are mixing spring-integration and the scheduling capabilities of spring-boot using:

@SpringBootApplication
@EnableIntegration
@IntegrationComponentScan
@EnableConfigurationProperties
@EnableScheduling
public class MyApplication {
    ...
}
@EnableScheduling creates a bean named "taskScheduler", which is then used by spring-integration:

public abstract class IntegrationContextUtils {
    public static final String TASK_SCHEDULER_BEAN_NAME = "taskScheduler";
    ...
}

private void registerTaskScheduler() {
    if (!this.beanFactory.containsBean(IntegrationContextUtils.TASK_SCHEDULER_BEAN_NAME)) {
        ...
        this.registry.registerBeanDefinition(IntegrationContextUtils.TASK_SCHEDULER_BEAN_NAME, scheduler);
    }
}
The problem is that the default poolSize for spring-integration is 10 (a value we need, since we otherwise encounter starvation), while the default for spring-boot is 1 (which we also need, to avoid concurrency in our scheduled processes).
Questions:
Is it normal behavior for spring-integration to share its task scheduler bean with the spring-boot scheduling capabilities?
Is there a way to specify a dedicated task scheduler for spring-integration, whether scheduling in Boot is enabled or not?
Thanks for your answers.
The behaviour and logic are correct, and the expectation of convention-over-configuration from the Spring Boot perspective is also correct. What you are missing is that @EnableScheduling is not a Spring Boot feature but a Spring Framework native one: https://docs.spring.io/spring-framework/docs/current/reference/html/integration.html#scheduling. Spring Boot just gives us some extra freedom in configuring beans on the matter, so we normally rely on its auto-configuration.
If auto-configuration doesn't fit your requirements, you can always provide your own configuration and override whatever is necessary.
Looking at @EnableScheduling, its @Scheduled hooks and the corresponding TaskSchedulingAutoConfiguration in Spring Boot, it is not so easy to override whatever you want while keeping Spring Integration happy at the same time. So we should go in a somewhat opposite direction and really override a scheduler for the Spring Integration endpoints: every single place where you use a poller, you also need to configure a custom scheduler instead of the auto-configured one.
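One way to keep the two pools apart (a sketch, not taken from the answer itself; the pool sizes and thread-name prefixes are illustrative) is to register the "taskScheduler" bean that Spring Integration looks up by name, and point @Scheduled tasks at their own scheduler via SchedulingConfigurer:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.context.IntegrationContextUtils;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.SchedulingConfigurer;
import org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler;
import org.springframework.scheduling.config.ScheduledTaskRegistrar;

@Configuration
@EnableScheduling
public class SchedulerConfig implements SchedulingConfigurer {

    // Bean named "taskScheduler": Spring Integration resolves its scheduler
    // by this name, so it gets the 10-thread pool it expects.
    @Bean(name = IntegrationContextUtils.TASK_SCHEDULER_BEAN_NAME)
    public ThreadPoolTaskScheduler integrationTaskScheduler() {
        ThreadPoolTaskScheduler scheduler = new ThreadPoolTaskScheduler();
        scheduler.setPoolSize(10);
        scheduler.setThreadNamePrefix("int-scheduler-");
        return scheduler;
    }

    // @Scheduled methods get their own single-threaded scheduler instead,
    // preserving the no-concurrency behaviour of the Boot default.
    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        ThreadPoolTaskScheduler bootScheduler = new ThreadPoolTaskScheduler();
        bootScheduler.setPoolSize(1);
        bootScheduler.setThreadNamePrefix("boot-scheduler-");
        bootScheduler.initialize();
        taskRegistrar.setTaskScheduler(bootScheduler);
    }
}
```

The design trade-off: the shared bean name stays in place for Spring Integration, while the scheduled-tasks registrar is the one that gets redirected, so neither side starves the other.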

Spring Boot Kafka Multiple Consumers with different properties configuration using application.yml/properties

I have seen examples where we have a Java configuration class, define multiple listener container factories, and pass the required containerFactory to @KafkaListener. But I am exploring whether there are any ways to achieve the same using Spring Boot's Kafka auto-configuration via application.yml/properties.
No; Boot will only auto-configure one set of infrastructure; if you need multiple, you need to define them as beans.
However, with recent versions (since 2.3.4), you can add a listener container customizer to the factory so you can customize each listener container, even though they are created by the same factory; some properties can also be overridden on the @KafkaListener annotation itself.
Example:
@Component
class Customizer {

    public Customizer(ConcurrentKafkaListenerContainerFactory<?, ?> factory) {
        factory.setContainerCustomizer(container -> {
            if (container.getContainerProperties().getGroupId().equals("slowGroup")) {
                container.getContainerProperties().setIdleBetweenPolls(60_000);
            }
        });
    }
}
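The per-annotation overrides mentioned above can look like this (a sketch; the topic, group id, and overridden property value are illustrative):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
class SlowListener {

    // "properties" entries override the Boot consumer defaults
    // for this listener only.
    @KafkaListener(id = "slowGroup", topics = "slow.topic",
            properties = "max.poll.interval.ms:300000")
    public void listen(String payload) {
        // process the record
    }
}
```

The `id` doubles as the consumer group here, which is what the customizer above matches on via `getGroupId()`.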

"httptrace" endpoint of Spring Boot Actuator doesn't exist anymore with Spring Boot 2.2.0

With Spring Boot 2.2.0 the "httptrace" Actuator endpoint doesn't exist anymore. How can I get this functionality back?
The functionality has been removed by default in Spring Boot 2.2.0.
As a workaround, add this configuration to the Spring environment:
management.endpoints.web.exposure.include: httptrace
and provide an HttpTraceRepository bean like this:
@Configuration
// @Profile("actuator-endpoints")
// if you want: register the bean only if the profile is set
public class HttpTraceActuatorConfiguration {

    @Bean
    public HttpTraceRepository httpTraceRepository() {
        return new InMemoryHttpTraceRepository();
    }
}
http://localhost:8080/actuator/httptrace works again.
You need to enable httptrace by setting the following application properties; by default it is disabled:
management.trace.http.enabled: true
management.endpoints.web.exposure.include: httptrace
It also requires an HttpTraceRepository bean. You can use your own custom implementation or InMemoryHttpTraceRepository.
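The in-memory repository's retention can also be tuned; a sketch, assuming you replace the plain bean definition (the capacity value is illustrative):

```java
import org.springframework.boot.actuate.trace.http.HttpTraceRepository;
import org.springframework.boot.actuate.trace.http.InMemoryHttpTraceRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HttpTraceConfig {

    // InMemoryHttpTraceRepository keeps only the most recent traces in memory;
    // the default capacity is 100, lowered here to limit heap usage.
    @Bean
    public HttpTraceRepository httpTraceRepository() {
        InMemoryHttpTraceRepository repository = new InMemoryHttpTraceRepository();
        repository.setCapacity(50);
        return repository;
    }
}
```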

What Is the Correct Way To Use AbstractReactiveWebInitializer

I've got a Spring WebFlux application running successfully as a standalone spring boot application.
I am attempting to run the same application in a Tomcat container and, following the documentation, I've created a class that extends AbstractReactiveWebInitializer. The class requires that I implement a method getConfigClasses that returns classes normally annotated with @Configuration. If the working Spring Boot app started with a class called ApplicationInitializer, then the resulting implementations would look like this:
@SpringBootApplication(scanBasePackages = "my.pkg")
@EnableDiscoveryClient
@EnableCaching
public class ApplicationInitializer {

    public static void main(String... args) {
        SpringApplication.run(ApplicationInitializer.class, args);
    }
}
and
public class ServletInitializer extends AbstractReactiveWebInitializer {

    @Override
    protected Class<?>[] getConfigClasses() {
        return new Class[] {ApplicationInitializer.class};
    }
}
When deployed, the only thing that starts is ApplicationInitializer, none of the autoconfigured Spring Boot classes (Cloud Config, DataSource, etc) ever kick off.
The documentation states this is the class I need to implement; I just expected the remainder of the Spring environment to "just work".
How should I be using this class to deploy a Reactive WebFlux Spring Boot application to a Tomcat container ?
Edit:
After some additional research, I've narrowed it down to likely just Cloud Config. During bean post-processing on startup, the ConfigurationPropertiesBindingPostProcessor should be enriched with additional property sources (from Cloud Config), but it appears to see only the default Spring properties, with no additional sources.
The missing properties cause downstream beans to fail.
Spring Boot does not support WAR packaging for Spring WebFlux applications.
The documentation you're referring to is the Spring Framework doc; Spring Framework does support that use case, but without Spring Boot.
You can extend SpringBootServletInitializer and add a reactive servlet in the onStartup method.
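A common form of that approach overrides configure(...) on SpringBootServletInitializer rather than onStartup(...), letting Boot wire its auto-configuration inside the container; a sketch, assuming the ApplicationInitializer class from the question:

```java
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.web.servlet.support.SpringBootServletInitializer;

public class WarInitializer extends SpringBootServletInitializer {

    // Tomcat invokes this via the Servlet 3.0 SPI; Boot then runs its normal
    // auto-configuration (Cloud Config, DataSource, ...) inside the container.
    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
        return builder.sources(ApplicationInitializer.class);
    }
}
```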

how to use ApplicationEventPublisher in spring integration with annotation?

I am new to Spring Integration and I have to do some event-based processing. Can anyone tell me how to use ApplicationEventPublisher? A sample would be very helpful.
For publishing Spring application events, Spring Integration provides the ApplicationEventPublishingMessageHandler component. This is a one-way, just-send producer and should be configured together with the @ServiceActivator annotation:
@ServiceActivator(inputChannel = "sendEventChannel")
@Bean
public MessageHandler eventProducer() {
    return new ApplicationEventPublishingMessageHandler();
}
Also see http://docs.spring.io/spring-integration/reference/html/applicationevent.html.
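To round out the sample, a sketch of the sending side and a listener for the published event (the gateway, channel name, and payload type are illustrative):

```java
import org.springframework.context.event.EventListener;
import org.springframework.integration.annotation.Gateway;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.integration.event.core.MessagingEvent;
import org.springframework.stereotype.Component;

@MessagingGateway
public interface EventGateway {

    // Messages sent here flow into the service activator's input channel.
    @Gateway(requestChannel = "sendEventChannel")
    void publish(String payload);
}

@Component
class EventConsumer {

    // By default the handler wraps the whole Message in a MessagingEvent.
    @EventListener
    public void onEvent(MessagingEvent event) {
        System.out.println("Received: " + event.getMessage().getPayload());
    }
}
```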
