Spring boot graceful shutdown mid-transaction - spring-boot

I'm working on a Spring Boot service that performs sensitive payment processing, and I would like to ensure that any shutdown of the app happens without interrupting these transactions. I'm curious how best to approach this in Spring Boot.
I read about adding shutdown hooks to Spring Boot, and I was thinking of using a CountDownLatch in the class to check whether the thread has completed processing - something like this:
@Service
public class PaymentService {

    private CountDownLatch countDownLatch;

    private void resetLatch() {
        this.countDownLatch = new CountDownLatch(1);
    }

    public void processPayment() {
        this.resetLatch();
        // do multi-step processing
        this.countDownLatch.countDown();
    }

    public void shutdown() {
        // blocks until the latch has been counted down
        try {
            this.countDownLatch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
// ---
@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        // init app and get context
        ConfigurableApplicationContext context = SpringApplication.run(Application.class, args);
        // retrieve bean needing special shutdown care
        PaymentService paymentService = context.getBean(PaymentService.class);
        Runtime.getRuntime().addShutdownHook(new Thread(paymentService::shutdown));
    }
}
Constructive feedback is greatly appreciated - thanks.

I ended up using the @PreDestroy annotation on the shutdown method:
@Service
public class PaymentService {

    // start at zero so shutdown() does not block (or throw NPE) when no payment is in flight
    private CountDownLatch countDownLatch = new CountDownLatch(0);

    private synchronized void beginTransaction() {
        this.countDownLatch = new CountDownLatch(1);
    }

    private synchronized void endTransaction() {
        this.countDownLatch.countDown();
    }

    public void processPayment() {
        try {
            this.beginTransaction();
            // - - - -
            // do multi-step processing
            // - - - -
        } finally {
            this.endTransaction();
        }
    }

    @PreDestroy
    public void shutdown() {
        // blocks until any in-flight payment has finished
        try {
            this.countDownLatch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
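With this approach, the main class no longer needs the manual Runtime shutdown hook from the question: Spring Boot registers its own JVM shutdown hook by default, which closes the application context and runs @PreDestroy callbacks as part of that close. A minimal sketch of the corresponding main class:

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        // Spring Boot's default shutdown hook closes the context on JVM shutdown,
        // which in turn invokes the @PreDestroy method on PaymentService.
        SpringApplication.run(Application.class, args);
    }
}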

Related

Log ApplicationEventPublisher.publishEvent() calls

I've got a Spring Boot 2.2 application which publishes and consumes Spring application events in different packages. Now I want to log every time an event is published by ApplicationEventPublisher.publishEvent().
One solution could be to write my own event publisher like:
public class LoggableApplicationEventPublisher implements ApplicationEventPublisher {

    private final ApplicationEventPublisher eventPublisher;
    private final Logger logger;

    public LoggableApplicationEventPublisher(ApplicationEventPublisher eventPublisher, Logger logger) {
        this.eventPublisher = eventPublisher;
        this.logger = logger;
    }

    @Override
    public void publishEvent(ApplicationEvent event) {
        eventPublisher.publishEvent(event);
        logger.info("--> Emitting {}", event);
    }

    // ApplicationEventPublisher also declares publishEvent(Object); delegate it as well
    @Override
    public void publishEvent(Object event) {
        eventPublisher.publishEvent(event);
        logger.info("--> Emitting {}", event);
    }
}
Another solution could be to use aspect-oriented programming and write an aspect that is triggered every time publishEvent() is called:
@Aspect
@Component
public class EventPublishAspect {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    @Pointcut("execution(* org.springframework.context.ApplicationEventPublisher.*(..))")
    public void logPublishEvent() {
    }

    @After("logPublishEvent()")
    public void log(JoinPoint point) {
        Object[] lArgs = point.getArgs();
        LOG.info("Triggered {}", lArgs[0]);
    }
}
I've set everything up correctly (dependencies as well) and this example works for other pointcuts (like a call to a specific method of my services).
However, this aspect does not work with the declared pointcut for the ApplicationEventPublisher interface. Do you know why not? It seems that Spring Boot injects an AbstractApplicationContext at runtime, which actually implements this interface.
Most likely the aspect never fires because Spring AOP only advises Spring-managed beans through proxies, and the ApplicationEventPublisher you call is the application context itself, which is not a proxied bean, so the execution pointcut never matches. Here is a solution that does not require aspects (and may have a faster startup time):
@Primary
@Bean
DelegatingApplicationEventPublisher applicationEventPublisher(ApplicationContext applicationContext) {
    return new DelegatingApplicationEventPublisher(applicationContext);
}
@Slf4j
@RequiredArgsConstructor
public class DelegatingApplicationEventPublisher implements ApplicationEventPublisher {

    private final ApplicationContext context;

    @Override
    public void publishEvent(ApplicationEvent event) {
        logEvent(event);
        context.publishEvent(event);
    }

    @Override
    public void publishEvent(Object event) {
        logEvent(event);
        context.publishEvent(event);
    }

    private void logEvent(Object event) {
        if (event instanceof PayloadApplicationEvent payloadApplicationEvent) {
            log.debug(markers("eventName", payloadApplicationEvent.getPayload().getClass(), "event", payloadApplicationEvent.getPayload()), "publishing...");
        } else {
            log.debug(markers("eventName", event.getClass(), "event", event), "publishing ...");
        }
    }
}

How to ensure Spring Cloud Stream Listener to wait to process messages until Application is fully initialized on Start?

With a Spring Cloud Stream Kafka app, how can we ensure that the stream listener waits to process messages until some dependency tasks (e.g. reference data population) are done? The app below fails to process messages because they are delivered too early, before the reference data is loaded. How can we guarantee this kind of ordering within a Spring Boot app?
@Service
public class ApplicationStartupService implements ApplicationRunner {

    private final FooReferenceDataService fooReferenceDataService;

    public ApplicationStartupService(FooReferenceDataService fooReferenceDataService) {
        this.fooReferenceDataService = fooReferenceDataService;
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        fooReferenceDataService.loadData();
    }
}
@EnableBinding(MyBinding.class)
public class MyFooStreamProcessor {

    @Autowired FooService fooService;

    @StreamListener("my-input")
    public void process(KStream<String, Foo> input) {
        input.foreach((k, v) -> {
            // !!! this fails to save:
            // messages are delivered too early, before foo reference data got loaded into the database
            fooService.save(v);
        });
    }
}
spring-cloud-stream: 2.1.0.RELEASE
spring-boot: 2.1.2.RELEASE
I found that this is not available in Spring Cloud Stream as of May 15, 2018:
Kafka - Delay binding until complex service initialisation has completed
Is there a plan/timeline for when this will be supported?
In the meantime, I achieved what I wanted by using @Order and ApplicationRunner. It's messy but works. Basically, the stream listener will wait until the other work is done.
@Service
@Order(1)
public class ApplicationStartupService implements ApplicationRunner {

    private final FooReferenceDataService fooReferenceDataService;

    public ApplicationStartupService(FooReferenceDataService fooReferenceDataService) {
        this.fooReferenceDataService = fooReferenceDataService;
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        fooReferenceDataService.loadData();
    }
}
@Slf4j
@EnableBinding(MyBinding.class)
@Order(2)
public class MyFooStreamProcessor implements ApplicationRunner {

    @Autowired FooService fooService;

    private final AtomicBoolean ready = new AtomicBoolean(false);

    @StreamListener("my-input")
    public void process(KStream<String, Foo> input) {
        input.foreach((k, v) -> {
            while (!ready.get()) {
                try {
                    log.info("sleeping for other dependent components to finish initialization");
                    Thread.sleep(10000);
                } catch (InterruptedException e) {
                    log.info("woke up");
                }
            }
            fooService.save(v);
        });
    }

    @Override
    public void run(ApplicationArguments args) throws Exception {
        ready.set(true);
    }
}

Different ways to run custom code before the application starts

Could you describe different ways to run custom code before the application starts for data initialization or something else?
(like ApplicationListener, CommandLineRunner etc.)
What is the difference between them? In which cases is it better to use each one?
I want to know not just one way to do this, but to understand when and what I should use.
Here is a fairly old question with many options for doing this: Running code after Spring Boot starts
If this is the wrong place to ask, please point me to the right one.
The options I know:
CommandLineRunner - receives command-line arguments as Strings
@Slf4j
@Component
public class DemoCommandLineRunner implements CommandLineRunner {

    @Override
    public void run(String... args) {
        log.info("[CommandLineRunner] Args: " + Arrays.toString(args));
    }
}
ApplicationRunner - receives parsed command-line arguments with names
@Slf4j
@Component
public class DemoApplicationRunner implements ApplicationRunner {

    @Override
    public void run(ApplicationArguments args) {
        log.info("[ApplicationRunner] Args: ");
        nonOptionArgs(args);
        optionArgs(args);
    }

    private void nonOptionArgs(ApplicationArguments args) {
        args.getNonOptionArgs().forEach(log::info);
    }

    private void optionArgs(ApplicationArguments args) {
        args.getOptionNames().stream()
                .map(args::getOptionValues)
                .map(Objects::toString)
                .forEach(log::info);
    }
}
ApplicationListener - a listener for different events (one class per event type)
@Slf4j
@Component
public class DemoApplicationListener implements ApplicationListener<ApplicationEvent> {

    @Override
    public void onApplicationEvent(ApplicationEvent event) {
        logEvent(event);
    }

    private void logEvent(ApplicationEvent event) {
        log.info("[DemoApplicationListener] Event: " + event);
    }
}
@EventListener - a listener for different events (several events in one bean)
@Slf4j
@Component
public class DemoEventApplicationListener {

    @EventListener
    public void handleContextRefreshedEvent(ContextRefreshedEvent event) {
        logEvent(event);
    }

    @EventListener
    public void handleApplicationReadyEvent(ApplicationReadyEvent event) {
        logEvent(event);
    }

    private void logEvent(ApplicationEvent event) {
        log.info("[DemoEventApplicationListener] Event: " + event);
    }
}
SmartLifecycle - configure bean lifecycle
@Slf4j
@Component
public class DemoSmartLifecycle implements SmartLifecycle {

    private boolean isRunning;

    @Override
    public void start() {
        isRunning = true;
        log.info("[DemoSmartLifecycle]: Start");
    }

    @Override
    public void stop() {
        isRunning = false;
        log.info("[DemoSmartLifecycle]: Stop");
    }

    @Override
    public boolean isRunning() {
        return isRunning;
    }
}
SmartInitializingSingleton - triggered at the end of the singleton pre-instantiation phase
@Slf4j
@Component
public class DemoSmartInitializingSingleton implements SmartInitializingSingleton {

    @Override
    public void afterSingletonsInstantiated() {
        log.info("[SmartInitializingSingleton] afterSingletonsInstantiated");
    }
}
Github repo: https://github.com/venkaDaria/demo-bootstrap-spring
If you need to run some code "once the SpringApplication has started" you should use ApplicationRunner or CommandLineRunner - they work the same way.
ApplicationListener, or @EventListener with ApplicationReadyEvent, does the same as well.
See my example.
The option you choose is up to you.
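For example, a minimal sketch (the DataInitializer class name is just illustrative) of running data-initialization code once the application has fully started, using @EventListener on ApplicationReadyEvent:

@Slf4j
@Component
public class DataInitializer {

    @EventListener(ApplicationReadyEvent.class)
    public void initData() {
        // runs once the application is started and ready to service requests
        log.info("[DataInitializer] loading initial data");
    }
}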

Axon Register Tracking Processor with distributed query model

I have implemented a CQRS+ES application using Axon and Spring Boot. I use separate query-model and command-model applications, and RabbitMQ to publish events from the command model. That works correctly, but the tracking processor implementation does not work in my application.
This is my query model
@SpringBootApplication
public class SeatQueryPart1Application {

    public static void main(String[] args) {
        SpringApplication.run(SeatQueryPart1Application.class, args);
    }

    @Bean
    public SpringAMQPMessageSource statisticsQueue(Serializer serializer) {
        return new SpringAMQPMessageSource(new DefaultAMQPMessageConverter(serializer)) {
            @RabbitListener(exclusive = false, bindings = @QueueBinding(
                    value = @Queue,
                    exchange = @Exchange(value = "ExchangeTypesTests.FanoutExchange", type = ExchangeTypes.FANOUT),
                    key = "orderRoutingKey"))
            @Override
            public void onMessage(Message arg0, Channel arg1) throws Exception {
                super.onMessage(arg0, arg1);
            }
        };
    }

    @Autowired
    public void conf(EventHandlingConfiguration configuration) {
        configuration.registerTrackingProcessor("statistics");
    }
}
This is the event handler class:
@ProcessingGroup("statistics")
@Component
public class EventLoggingHandler {

    private SeatReservationRepository seatReservationRepo;

    public EventLoggingHandler(final SeatReservationRepository e) {
        this.seatReservationRepo = e;
    }

    @EventHandler
    protected void on(SeatResurvationCreateEvent event) {
        Timestamp timestamp = new Timestamp(System.currentTimeMillis());
        Seat seat = new Seat(event.getId(), event.getSeatId(), event.getDate(), timestamp, true);
        seatReservationRepo.save(seat);
    }
}
This is the yml configuration:
axon:
  eventhandling:
    processors:
      statistics.source: statisticsQueue
How can I do this correctly? (Can anyone suggest a tutorial or code sample?)
The SpringAMQPMessageSource is a SubscribableMessageSource. This means you cannot use a tracking event processor to process its messages; it is only compatible with a subscribing event processor.
Removing configuration.registerTrackingProcessor("statistics"); and leaving it to the default (subscribing) should do the trick.
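For reference, a sketch of the query-side configuration from the question with that registration removed, so the "statistics" processing group falls back to the default subscribing processor fed by the statisticsQueue source declared in the yml above:

@SpringBootApplication
public class SeatQueryPart1Application {

    public static void main(String[] args) {
        SpringApplication.run(SeatQueryPart1Application.class, args);
    }

    // The AMQP queue stays the event source for the "statistics" group, wired via
    // axon.eventhandling.processors.statistics.source: statisticsQueue
    @Bean
    public SpringAMQPMessageSource statisticsQueue(Serializer serializer) {
        return new SpringAMQPMessageSource(new DefaultAMQPMessageConverter(serializer)) {
            @RabbitListener(exclusive = false, bindings = @QueueBinding(
                    value = @Queue,
                    exchange = @Exchange(value = "ExchangeTypesTests.FanoutExchange", type = ExchangeTypes.FANOUT),
                    key = "orderRoutingKey"))
            @Override
            public void onMessage(Message message, Channel channel) throws Exception {
                super.onMessage(message, channel);
            }
        };
    }

    // No registerTrackingProcessor("statistics") call: the group now uses the default
    // subscribing event processor, which is compatible with a SubscribableMessageSource.
}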

Should the Spring Boot shutdown endpoint shut down the entire JVM process, or just the application context?

I am getting a 200 response from Spring Boot's shutdown endpoint, and I am seeing that the application context shuts down as expected, but then the JVM process itself remains alive forever. Is this the expected behavior of the shutdown endpoint, or is it expected that the process itself would also terminate gracefully?
In http://docs.spring.io/spring-boot/docs/current/reference/html/production-ready-endpoints.html, it says that the shutdown endpoint "allows the application to be gracefully shutdown (not enabled by default)".
Thanks Stéphane, I found what was preventing the JVM process from terminating after hitting the /shutdown endpoint. There was a ScheduledExecutor in one of my dependencies that was not being shut down with the application context, and it was preventing the JVM process from shutting down (even after the application context was closed). I wrote a simple example to show how to reproduce the behavior, and another example showing how to resolve it.
This example will NOT terminate the JVM process when you hit /shutdown endpoint:
@SpringBootApplication
public class AppSpringConfiguration {

    public static void main(String[] args) {
        SpringApplication.run(AppSpringConfiguration.class);
    }

    @Bean
    public ClassWithExecutor ce() {
        return new ClassWithExecutor();
    }

    @PostConstruct
    public void startScheduledTask() {
        ce().startScheduledTask();
    }

    @RestController
    public static class BusinessLogicController {

        @RequestMapping(value = "/hi")
        public String businessLogic() {
            return "hi";
        }
    }

    public static class ClassWithExecutor {

        ScheduledExecutorService es;

        ClassWithExecutor() {
            this.es = Executors.newSingleThreadScheduledExecutor();
        }

        public void startScheduledTask() {
            es.scheduleAtFixedRate(new Runnable() {
                @Override
                public void run() {
                    System.out.println("Printing this every 3 seconds");
                }
            }, 0, 3, TimeUnit.SECONDS);
        }
    }
}
By adding a ContextClosedEvent listener that shuts down the ScheduledExecutor when the application context is closing, the JVM process now terminates after hitting the /shutdown endpoint:
@SpringBootApplication
public class AppSpringConfiguration {

    public static void main(String[] args) {
        SpringApplication.run(AppSpringConfiguration.class);
    }

    @Bean
    public ClassWithExecutor ce() {
        return new ClassWithExecutor();
    }

    @Bean
    ShutdownAction sa() {
        return new ShutdownAction(ce());
    }

    @PostConstruct
    public void startScheduledTask() {
        ce().startScheduledTask();
    }

    @RestController
    public static class BusinessLogicController {

        @RequestMapping(value = "/hi")
        public String businessLogic() {
            return "hi";
        }
    }

    public static class ShutdownAction implements ApplicationListener<ContextClosedEvent> {

        private ClassWithExecutor classWithExecutor;

        ShutdownAction(ClassWithExecutor classWithExecutor) {
            this.classWithExecutor = classWithExecutor;
        }

        @Override
        public void onApplicationEvent(ContextClosedEvent event) {
            classWithExecutor.shutdown();
        }
    }

    public static class ClassWithExecutor {

        ScheduledExecutorService es;

        ClassWithExecutor() {
            this.es = Executors.newSingleThreadScheduledExecutor();
        }

        public void startScheduledTask() {
            es.scheduleAtFixedRate(new Runnable() {
                @Override
                public void run() {
                    System.out.println("Printing this every 3 seconds");
                }
            }, 0, 3, TimeUnit.SECONDS);
        }

        public void shutdown() {
            es.shutdownNow();
        }
    }
}
You have something besides your Spring Boot application that prevents the JVM from exiting. If you don't, and you have a sample project that demonstrates the problem, then please create an issue and we'll have a look.
Instead of using the shutdown endpoint, you can use the spring-boot-maven-plugin, which as of 1.3 has start and stop goals to be used in typical integration test scenarios.
If you have a scheduled executor running, you should specify a destroy method:
@Bean(destroyMethod = "shutdown")
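For example, a minimal sketch of applying this to the ClassWithExecutor bean from the examples above (it has a shutdown() method in the second example), which would make the separate ShutdownAction listener unnecessary:

@Bean(destroyMethod = "shutdown")
public ClassWithExecutor ce() {
    // Spring calls ClassWithExecutor.shutdown() when the context closes,
    // which shuts down the ScheduledExecutorService and lets the JVM exit.
    return new ClassWithExecutor();
}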
