How to make a Spring Cloud Stream listener wait to process messages until the application is fully initialized on startup? - spring-boot

With a Spring Cloud Stream Kafka app, how can we ensure that the stream listener waits to process messages until some dependent tasks (e.g. reference data population) are done? The app below fails to process messages because they are delivered too early. How can we guarantee this kind of ordering within a Spring Boot app?
@Service
public class ApplicationStartupService implements ApplicationRunner {
    private final FooReferenceDataService fooReferenceDataService;

    @Override
    public void run(ApplicationArguments args) throws Exception {
        fooReferenceDataService.loadData();
    }
}
@EnableBinding(MyBinding.class)
public class MyFooStreamProcessor {
    @Autowired FooService fooService;

    @StreamListener("my-input")
    public void process(KStream<String, Foo> input) {
        input.foreach((k, v) -> {
            // !!! this fails to save:
            // messages are delivered before the foo reference data has been loaded into the database
            fooService.save(v);
        });
    }
}
spring-cloud-stream: 2.1.0.RELEASE
spring-boot: 2.1.2.RELEASE
I found that this is not available in Spring Cloud Stream as of May 15, 2018:
Kafka - Delay binding until complex service initialisation has completed
Is there a plan/timeline for when this will be supported?

In the meantime, I achieved what I wanted by using @Order and ApplicationRunner. It's messy, but it works: basically, the stream listener waits until the other work is done.
@Service
@Order(1)
public class ApplicationStartupService implements ApplicationRunner {
    private final FooReferenceDataService fooReferenceDataService;

    @Override
    public void run(ApplicationArguments args) throws Exception {
        fooReferenceDataService.loadData();
    }
}
@EnableBinding(MyBinding.class)
@Order(2)
public class MyFooStreamProcessor implements ApplicationRunner {
    @Autowired FooService fooService;
    private final AtomicBoolean ready = new AtomicBoolean(false);

    @StreamListener("my-input")
    public void process(KStream<String, Foo> input) {
        input.foreach((k, v) -> {
            while (!ready.get()) {
                try {
                    log.info("sleeping until other dependent components finish initialization");
                    Thread.sleep(10000);
                } catch (InterruptedException e) {
                    log.info("woke up");
                }
            }
            fooService.save(v);
        });
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        ready.set(true);
    }
}
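A slightly tidier variant of the same workaround replaces the polling loop with a CountDownLatch, so each message blocks once until the startup work finishes instead of sleeping in ten-second intervals. This is my own sketch, not from the original post:
@EnableBinding(MyBinding.class)
@Order(2)
public class MyFooStreamProcessor implements ApplicationRunner {
    @Autowired FooService fooService;
    // released exactly once, after the reference data is loaded
    private final CountDownLatch ready = new CountDownLatch(1);

    @StreamListener("my-input")
    public void process(KStream<String, Foo> input) {
        input.foreach((k, v) -> {
            try {
                ready.await(); // blocks until run() below has fired
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
            fooService.save(v);
        });
    }

    @Override
    public void run(ApplicationArguments args) {
        ready.countDown(); // runner ordering guarantees reference data was loaded first
    }
}
Note that newer spring-cloud-stream releases can keep a binding stopped at startup (via a consumer-level autoStartup binding property) and start it later, for example through the actuator bindings endpoint; if upgrading from the 2.1.x line is an option, that is the cleaner route.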


@Transactional issue in Spring Boot with Kafka and Mongo Integration

I have the following Kafka consumer:
@KafkaListener(topics = "#{'${bpi.kafka.topic.topicname}'}",
        groupId = "#{'${bpi.kafka.group-id}'}",
        properties = {"auto.offset.reset:${bpi.kafka.consumer.auto-offset-reset}"})
public void consumeOverdueEvents(Event event) {
    myinterface.handleEvent(event);
}
My service looks like the following:
@Override
@Transactional(value = "mongoTransactionManager")
public void handleEvent(Event event) {
    eventProducer.publishEvent(event.consolidateNewEvent(event));
    eventDataGateway.saveEvent(event);
}
/*
@Component
@RequiredArgsConstructor
public class KafkaEventProducer implements .. {
    private final KafkaTemplate<String, Event> kafkaTemplate;

    @Value("${bpi.kafka.topic.second_topic_name}")
    private String topic;

    @Override
    public void publishEvent(Event2 event) {
        kafkaTemplate.send(topic, "", Event2.create(event));
    }
}
*/
/*
@Component
@RequiredArgsConstructor
public class EventAdapter implements EventDataGateway {
    private final MyRepository repository;

    @Override
    public void saveEvent(Event event) {
        repository.save(..);
    }
}
*/
To test the @Transactional behavior, I purposely dropped the Mongo database. When I receive one new event, it is not saved, but I get 10 published events.
PS: The retries are due to the transactional behavior, but the intended behavior is to not publish anything if the database operation fails.
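A common way to get that behavior is to decouple the publish from the transaction with @TransactionalEventListener, so the Kafka send only happens after the Mongo transaction commits. A minimal sketch under that assumption (EventSavedEvent and the topic name are made up for illustration):
import org.springframework.transaction.event.TransactionPhase;
import org.springframework.transaction.event.TransactionalEventListener;

@Component
@RequiredArgsConstructor
public class AfterCommitEventPublisher {
    private final KafkaTemplate<String, Event> kafkaTemplate;

    // invoked only if the surrounding transaction committed; on rollback nothing is sent
    @TransactionalEventListener(phase = TransactionPhase.AFTER_COMMIT)
    public void on(EventSavedEvent saved) {
        kafkaTemplate.send("second-topic", saved.getEvent());
    }
}
Inside handleEvent() you would then raise EventSavedEvent through ApplicationEventPublisher instead of calling the Kafka producer directly, and keep eventDataGateway.saveEvent(event) as the transactional work.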

Log ApplicationEventPublisher.publishEvent() calls

I've got a Spring Boot 2.2 application which publishes and consumes Spring application events across different packages. Now I want to log every time an event is published via ApplicationEventPublisher.publishEvent().
One solution could be to write my own event publisher like:
public class LoggableApplicationEventPublisher implements ApplicationEventPublisher {
    private final ApplicationEventPublisher eventPublisher;
    private final Logger logger;

    public LoggableApplicationEventPublisher(ApplicationEventPublisher eventPublisher, Logger logger) {
        this.eventPublisher = eventPublisher;
        this.logger = logger;
    }

    @Override
    public void publishEvent(Object event) {
        eventPublisher.publishEvent(event);
        logger.info("--> Emitting {}", event);
    }
}
Another solution could be to use aspect-oriented programming and write an aspect that is triggered every time publishEvent() is called:
@Aspect
@Component
public class EventPublishAspect {
    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    @Pointcut("execution(* org.springframework.context.ApplicationEventPublisher.*(..))")
    public void logPublishEvent() {
    }

    @After("logPublishEvent()")
    public void log(JoinPoint point) {
        Object[] lArgs = point.getArgs();
        LOG.info("Triggered {}", lArgs[0]);
    }
}
I've set everything up correctly (dependencies as well), and this example works for other pointcuts (such as a call to a specific method of one of my services).
However, the aspect is not triggered for the declared pointcut on the ApplicationEventPublisher interface. Do you know why not? It seems that at runtime Spring Boot injects AbstractApplicationContext, which is what actually implements this interface.
Spring AOP only advises regular Spring beans, and the application context that implements ApplicationEventPublisher is not one of them, so the pointcut never matches. Here is a solution that does not require aspects (and may have a faster startup time):
@Primary
@Bean
DelegatingApplicationEventPublisher applicationEventPublisher(ApplicationContext applicationContext) {
    return new DelegatingApplicationEventPublisher(applicationContext);
}
@Slf4j
@RequiredArgsConstructor
public class DelegatingApplicationEventPublisher implements ApplicationEventPublisher {
    private final ApplicationContext context;

    @Override
    public void publishEvent(ApplicationEvent event) {
        logEvent(event);
        context.publishEvent(event);
    }

    @Override
    public void publishEvent(Object event) {
        logEvent(event);
        context.publishEvent(event);
    }

    private void logEvent(Object event) {
        if (event instanceof PayloadApplicationEvent payloadApplicationEvent) {
            log.debug(markers("eventName", payloadApplicationEvent.getPayload().getClass(), "event", payloadApplicationEvent.getPayload()), "publishing...");
        } else {
            log.debug(markers("eventName", event.getClass(), "event", event), "publishing ...");
        }
    }
}
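With the @Primary bean in place, any component that injects ApplicationEventPublisher transparently receives the logging delegate; a quick illustration (OrderService and OrderPlacedEvent are hypothetical):
@Service
@RequiredArgsConstructor
public class OrderService {
    private final ApplicationEventPublisher publisher; // resolves to the @Primary delegate

    public void placeOrder(Order order) {
        publisher.publishEvent(new OrderPlacedEvent(order)); // logged, then delegated to the context
    }
}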

Different ways to run custom code before the application starts

Could you describe the different ways to run custom code before the application starts, for data initialization or something else?
(like ApplicationListener, CommandLineRunner, etc.)
What is the difference between them? In which cases is each of them the better choice?
I want to know not just one way to do it, but to understand when and why to use each one.
Here is a rather old question with too many options: Running code after Spring Boot starts
If this is the wrong place to ask, please point me to the right one.
The options I know:
CommandLineRunner - receives the command-line arguments as raw Strings
@Slf4j
@Component
public class DemoCommandLineRunner implements CommandLineRunner {
    @Override
    public void run(String... args) {
        log.info("[CommandLineRunner] Args: " + Arrays.toString(args));
    }
}
ApplicationRunner - receives the command-line arguments parsed into option names and values
@Slf4j
@Component
public class DemoApplicationRunner implements ApplicationRunner {
    @Override
    public void run(ApplicationArguments args) {
        log.info("[ApplicationRunner] Args: ");
        nonOptionArgs(args);
        optionArgs(args);
    }

    private void nonOptionArgs(ApplicationArguments args) {
        args.getNonOptionArgs().forEach(log::info);
    }

    private void optionArgs(ApplicationArguments args) {
        args.getOptionNames().stream()
                .map(args::getOptionValues)
                .map(Objects::toString)
                .forEach(log::info);
    }
}
ApplicationListener - a listener for different events (one class per event)
@Slf4j
@Component
public class DemoApplicationListener implements ApplicationListener<ApplicationEvent> {
    @Override
    public void onApplicationEvent(ApplicationEvent event) {
        logEvent(event);
    }

    private void logEvent(ApplicationEvent event) {
        log.info("[DemoApplicationListener] Event: " + event);
    }
}
@EventListener - a listener for different events (several events in one bean)
@Slf4j
@Component
public class DemoEventApplicationListener {
    @EventListener
    public void handleContextRefreshedEvent(ContextRefreshedEvent event) {
        logEvent(event);
    }

    @EventListener
    public void handleApplicationReadyEvent(ApplicationReadyEvent event) {
        logEvent(event);
    }

    private void logEvent(ApplicationEvent event) {
        log.info("[DemoEventApplicationListener] Event: " + event);
    }
}
SmartLifecycle - hooks into the bean lifecycle (start/stop)
@Slf4j
@Component
public class DemoSmartLifecycle implements SmartLifecycle {
    private boolean isRunning;

    @Override
    public void start() {
        isRunning = true;
        log.info("[DemoSmartLifecycle]: Start");
    }

    @Override
    public void stop() {
        isRunning = false;
        log.info("[DemoSmartLifecycle]: Stop");
    }

    @Override
    public boolean isRunning() {
        return isRunning;
    }
}
SmartInitializingSingleton - triggered at the end of the singleton pre-instantiation phase
@Slf4j
@Component
public class DemoSmartInitializingSingleton implements SmartInitializingSingleton {
    @Override
    public void afterSingletonsInstantiated() {
        log.info("[SmartInitializingSingleton] afterSingletonsInstantiated");
    }
}
Github repo: https://github.com/venkaDaria/demo-bootstrap-spring
If you need to run some code "once the SpringApplication has started", you should use ApplicationRunner or CommandLineRunner - they work the same way.
ApplicationListener, or @EventListener with ApplicationReadyEvent, does the same as well.
See my example.
Which option you choose is up to you. If the relative order of several runners matters, see the sketch below.
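One practical detail worth adding: Spring Boot sorts all ApplicationRunner and CommandLineRunner beans together, so @Order (or the Ordered interface) controls their relative sequence even across the two runner types. A minimal sketch:
@Slf4j
@Component
@Order(1)
public class FirstRunner implements ApplicationRunner {
    @Override
    public void run(ApplicationArguments args) {
        log.info("runs first");
    }
}

@Slf4j
@Component
@Order(2)
public class SecondRunner implements CommandLineRunner {
    @Override
    public void run(String... args) {
        log.info("runs second, despite being a different runner type");
    }
}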

Spring cloud stream kafka - A subscribable channel has no output

I have an application which does a lot of data processing (in the order of ~1.3 million at a time) which happens in bursts. The application consumes data from a kafka topic.
I'm using version 2.0.1 of spring-cloud-stream-starter-kafka to consume the data.
My code is as follows:
Listener:
@Service
public class ListenerService {
    @Autowired
    private Application2<Foo> application;

    @StreamListener(FooStreams.INPUT)
    public void subscribe(@Payload Foo foo) {
        application.sync(foo);
    }
}
Streams:
public interface FooStreams {
    String INPUT = "Foo";

    @Input(value = INPUT)
    SubscribableChannel subscribe();
}
In the main application, I've bound the stream to kafka like this:
@SpringBootApplication
@EnableBinding({FooStreams.class})
public class Application {
    private static final Logger logger = LoggerFactory.getLogger(Application.class);

    public static void main(String[] args) {
        try {
            SpringApplication.run(Application.class, args);
        } catch (Exception e) {
            logger.error("Application failed to start", e);
        }
    }
}
Is there something I'm missing? The issue is that memory utilization spikes during data processing and doesn't come back down after processing is done.

Spring boot graceful shutdown mid-transaction

I'm working on a Spring Boot service that performs sensitive payment processing, and I would like to ensure that any shutdown of the app happens without interrupting these transactions. I'm curious how best to approach this in Spring Boot.
I read about adding shutdown hooks to Spring Boot, and I was thinking of using a CountDownLatch in the class to check whether the thread has completed processing - something like this:
@Service
public class PaymentService {
    // start at zero so shutdown() doesn't NPE or block before any payment has run
    private CountDownLatch countDownLatch = new CountDownLatch(0);

    private void resetLatch() {
        this.countDownLatch = new CountDownLatch(1);
    }

    public void processPayment() {
        this.resetLatch();
        // do multi-step processing
        this.countDownLatch.countDown();
    }

    public void shutdown() {
        try {
            // blocks until the latch is released
            this.countDownLatch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
// ---
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        // init app and get context
        ConfigurableApplicationContext context = SpringApplication.run(Application.class, args);
        // retrieve the bean needing special shutdown care
        PaymentService paymentService = context.getBean(PaymentService.class);
        Runtime.getRuntime().addShutdownHook(new Thread(paymentService::shutdown));
    }
}
Constructive feedback is greatly appreciated - thanks.
I ended up using the @PreDestroy annotation on the shutdown method:
@Service
public class PaymentService {
    // start at zero so shutdown() doesn't NPE or block before any payment has run
    private CountDownLatch countDownLatch = new CountDownLatch(0);

    private synchronized void beginTransaction() {
        this.countDownLatch = new CountDownLatch(1);
    }

    private synchronized void endTransaction() {
        this.countDownLatch.countDown();
    }

    public void processPayment() {
        try {
            this.beginTransaction();
            // - - - -
            // do multi-step processing
            // - - - -
        } finally {
            this.endTransaction();
        }
    }

    @PreDestroy
    public void shutdown() {
        try {
            // blocks until any in-flight payment finishes
            this.countDownLatch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
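As a side note, Spring Boot 2.3+ ships built-in graceful shutdown for the embedded web server, which complements the latch approach for in-flight HTTP requests; it is enabled purely through configuration (the timeout value here is illustrative):
# application.properties
server.shutdown=graceful
spring.lifecycle.timeout-per-shutdown-phase=30s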
