Spring ExitCodeEvent using @EventListener not working

I am trying to catch the exit code event when my Spring Boot app shuts down. I have the following bean:
@Configuration
public class DestroyListenerConfig {

    @Bean
    DemoListener demoListenerBean() {
        return new DemoListener();
    }

    private static class DemoListener {
        @EventListener
        public void exitEvent(ExitCodeEvent event) {
            System.out.println("Exit code: " + event.getExitCode());
        }
    }
}
The bean is registering properly, but when I kill the application, the exitEvent() method is never invoked (the System.out never prints, and when run in debug mode from the IDE, execution never enters the method).
Am I leaving something out? My impression was that this is all that is needed. Thanks.

ExitCodeEvent is published from org.springframework.boot.SpringApplication#exit, so you need to call SpringApplication.exit manually, as below.
@Autowired
ApplicationContext applicationContext;

@GetMapping("/shutdown")
void shutdown() {
    SpringApplication.exit(applicationContext, () -> 100);
}
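SpringApplication.exit returns the resolved exit code, and the usual pattern is to hand that value to System.exit if the JVM should actually terminate with it; a minimal sketch building on the endpoint above:
int exitCode = SpringApplication.exit(applicationContext, () -> 100); // publishes ExitCodeEvent(100) and closes the context
System.exit(exitCode); // terminates the JVM with that code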
If you want to listen to a bean destroy event, you can use @PreDestroy as follows.
Note that if you have multiple beans created for this class, you will get multiple triggers.
// Put this into a @Component (or @RestController, @Controller, @Service, etc.) class
@javax.annotation.PreDestroy
public void destroy() {
    System.out.println("Triggered - @PreDestroy.");
}
If your app is a webapp and you want to listen to a shutdown event (ideally the contextDestroyed event), you can register a MyServletContextListener via a ServletListenerRegistrationBean:
@Bean
ServletListenerRegistrationBean<ServletContextListener> servletListener() {
    ServletListenerRegistrationBean<ServletContextListener> srb = new ServletListenerRegistrationBean<>();
    srb.setListener(new MyServletContextListener());
    return srb;
}

class MyServletContextListener implements ServletContextListener {
    @Override
    public void contextDestroyed(ServletContextEvent event) {
        System.out.println("Callback triggered - ContextListener.");
    }
}
Ref:
https://github.com/spring-projects/spring-boot/blob/master/spring-boot-project/spring-boot/src/main/java/org/springframework/boot/SpringApplication.java#L1332
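As an aside, on a graceful shutdown (SIGTERM or Ctrl-C, as opposed to a hard kill), Spring publishes ContextClosedEvent while the context closes, so a listener in the same style as the question does fire without calling SpringApplication.exit. A minimal sketch:
@Component
class ShutdownLogger {
    @EventListener
    public void onContextClosed(ContextClosedEvent event) {
        // runs during a graceful shutdown, before singleton beans are destroyed
        System.out.println("Context closed at " + event.getTimestamp());
    }
}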

Related

Spring Cloud Stream: how to use @Transactional with the new Consumer<> functional programming model

I have a @StreamListener which I would like to replace using the new functional model and Consumer<>. Unfortunately, I don't know how to carry @Transactional over to the new model:
@Transactional
@StreamListener(PaymentChannels.PENDING_PAYMENTS_INPUT)
public void executePayments(PendingPaymentEvent event) throws Exception {
    paymentsService.triggerInvoicePayment(event.getInvoiceId());
}
I have tried a few things; sample code below. For testing I added logging of the messages to a different queue, then throw an exception to trigger a rollback. Unfortunately, the messages end up queued anyway: they do not appear until the method completes (I checked this with breakpoints), but they are published even though the exception was thrown. It seems the transaction was committed automatically despite the error.
@Transactional
@RequiredArgsConstructor
@Component
public class functionalPayment implements Consumer<PendingPaymentEvent> {

    private final PaymentsService paymentsService;
    private final StreamBridge streamBridge;

    @Override
    public void accept(PendingPaymentEvent event) {
        paymentsService.triggerInvoicePayment(event.getInvoiceId());
        streamBridge.send("log-out-0", event);
        throw new RuntimeException("Test exception to rollback message from log-out-0");
    }
}
Configuration:
spring.cloud.stream.rabbit.bindings.functionalPayment-in-0.consumer.queue-name-group-only=true
spring.cloud.stream.rabbit.bindings.functionalPayment-in-0.consumer.declare-exchange=true
spring.cloud.stream.rabbit.bindings.functionalPayment-in-0.consumer.bind-queue=true
spring.cloud.stream.rabbit.bindings.functionalPayment-in-0.consumer.transacted=true
spring.cloud.stream.source=log
spring.cloud.stream.bindings.log-out-0.content-type=application/json
spring.cloud.stream.bindings.log-out-0.destination=log_a
spring.cloud.stream.bindings.log-out-0.group=log_a
spring.cloud.stream.rabbit.bindings.log-out-0.producer.declare-exchange=true
spring.cloud.stream.rabbit.bindings.log-out-0.producer.bind-queue=true
spring.cloud.stream.rabbit.bindings.log-out-0.producer.queue-name-group-only=true
spring.cloud.stream.rabbit.bindings.log-out-0.producer.binding-routing-key=log
spring.cloud.stream.rabbit.bindings.log-out-0.producer.transacted=true
spring.cloud.stream.rabbit.bindings.log-out-0.producer.exchange-type=direct
spring.cloud.stream.rabbit.bindings.log-out-0.producer.routing-key-expression='log'
Have you tried something along the lines of
@Transactional
public class ExecutePaymentConsumer implements Consumer<PendingPaymentEvent> {
    public void accept(PendingPaymentEvent event) {
        paymentsService.triggerInvoicePayment(event.getInvoiceId());
    }
}
. . .
@Bean
public ExecutePaymentConsumer executePayments() {
    return new ExecutePaymentConsumer();
}
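If you go this route, note that the functional binding names follow the @Bean method name rather than the class name, so the binder properties shown in the question would move from functionalPayment-in-0 to executePayments-in-0. A sketch of the assumed adjustment (each functionalPayment-in-0 property is renamed accordingly; the definition property is only required when more than one function bean is present):
spring.cloud.function.definition=executePayments
spring.cloud.stream.rabbit.bindings.executePayments-in-0.consumer.transacted=true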

Spring Boot with javax Event

I am trying to get Spring Boot working with CDI events.
I have the following class, which fires the event.
@Component
@UIScope
public class Login extends LoginOverlay
{
    @Autowired
    private UserInfo userInfo;

    @Inject
    private Event<UpdateCWViewEvent> cwevent;

    @PostConstruct
    public void init()
    {
        addLoginListener(new ComponentEventListener<LoginEvent>()
        {
            @Override
            public void onComponentEvent(LoginEvent event)
            {
                userInfo.login(event.getUsername(), event.getPassword());
                if (userInfo.isLoggedIn())
                {
                    setButtonLabel();
                    close();
                    cwevent.fire(new UpdateCWViewEvent());
                }
            }
        });
    }
}
And in another class, the following method:
public void update(@Observes(notifyObserver = Reception.IF_EXISTS) UpdateCWViewEvent event)
{
    // do something
}
Now I have the following problem: I need an implementation of javax.enterprise.event.Event. I tried to use Weld and its standard EventImpl. Then I tried to write a Spring configuration class to tell my application that there is an implementation of my event:
@Configuration
public class Config
{
    @Bean
    public Event<UpdateCWViewEvent> cwEvent()
    {
        // return EventImpl.of(injectionPoint, beanManagerImpl);
    }
}
I don't know what to do with the injectionPoint and beanManagerImpl. Has anybody had the same problem and solved it? Or does anybody have an alternative for firing simple CDI events in a Spring Boot application?
Thank you very much and stay healthy!
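For comparison, a minimal sketch of the Spring-native way to get the same publish/observe behaviour without any CDI implementation, using ApplicationEventPublisher and @EventListener (the class names here are only illustrative):
@Component
public class CWViewUpdatePublisher {

    private final ApplicationEventPublisher publisher;

    public CWViewUpdatePublisher(ApplicationEventPublisher publisher) {
        this.publisher = publisher;
    }

    public void fireUpdate() {
        // since Spring 4.2 any plain object can be published as an event
        publisher.publishEvent(new UpdateCWViewEvent());
    }
}

@Component
public class CWViewUpdateListener {

    @EventListener
    public void update(UpdateCWViewEvent event) {
        // do something
    }
}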

Spring Cloud Stream - First Kafka messages get error "Dispatcher has no subscribers"

My app successfully sends Kafka messages, but only after Kafka is initialized. Before that, I get the error "Dispatcher has no subscribers". How do I wait until the subscribers have finished being registered for the channels?
Here's a trace of the order of events (timing in seconds.ms):
17.165 SenderClass created
17.816 initialization class, @PostConstruct starts PollingTask
24.781 PollingTask sends first Kafka message
24.816 First error: "Dispatcher has no subscribers"
25.778 Registering MessageChannel my-channel
still seeing Dispatcher errors
27.067 Channel 'my-channel' has 1 subscriber
No more errors after this, messages send fine
I'm not sure how to approach this. Wild guesses have included:
Place sending code in @PostConstruct
Add @AutoConfigureBefore(BindingServiceConfiguration.class) to Sender
Add @AutoConfigureAfter(BindingServiceConfiguration.class) to SenderClass
Add @AutoConfigureBefore(BindingServiceConfiguration.class) to Main
Place @DependsOn({"EnableBindingClass"}) on Task
Place @DependsOn({"ApplicationLifeCycle"}) on SchedulerClass, where ApplicationLifeCycle is a class that does nothing but implements SmartLifecycle with getPhase() returning MAX_INT
Making sure @ComponentScan is on for the whole package (a suggestion from other SO threads)
Various combinations of the above
Created a new app, made it as simple as I could:
public interface Source {
    @Output(channelName)
    MessageChannel outboundChannel();
}

@EnableBinding(Source.class)
@Component
public class Sender {

    @Autowired
    private Source source;

    public boolean send(SomeObject object) {
        return source.outboundChannel().send(MessageBuilder.withPayload(object).build());
    }
}

@Service
public class Scheduler {

    @Autowired
    Sender sender;

    @Autowired
    ThreadPoolTaskScheduler taskScheduler;

    @PostConstruct
    public void initialize() {
        // schedule the first run; each run reschedules itself
        taskScheduler.schedule(new PollingTask(), Instant.now().plusMillis(1_000L));
    }

    private class PollingTask implements Runnable {
        @Override
        public void run() {
            List<SomeObject> objects = getDummyData();
            for (SomeObject object : objects) {
                sender.send(object);
            }
            Instant nextTime = Instant.now().plusMillis(1_000L);
            try {
                taskScheduler.schedule(new PollingTask(), nextTime);
            } catch (Exception e) {
                logger.error(e);
            }
        }
    }
}
Edit to add solution:
It works now! In the scheduler that starts the message-sending tasks, I switched from starting things in @PostConstruct to SmartLifecycle::start().
@Service
public class Scheduler implements SmartLifecycle {

    @Autowired
    Sender sender;

    @Autowired
    ThreadPoolTaskScheduler taskScheduler;

    private volatile boolean running;

    @Override
    public void start() {
        running = true;
        // schedule the first run; each run reschedules itself
        taskScheduler.schedule(new PollingTask(), Instant.now().plusMillis(1_000L));
    }

    @Override
    public void stop() {
        running = false;
    }

    @Override
    public boolean isRunning() {
        return running;
    }

    private class PollingTask implements Runnable {
        @Override
        public void run() {
            List<SomeObject> objects = getDummyData();
            for (SomeObject object : objects) {
                sender.send(object);
            }
            Instant nextTime = Instant.now().plusMillis(1_000L);
            try {
                taskScheduler.schedule(new PollingTask(), nextTime);
            } catch (Exception e) {
                logger.error(e);
            }
        }
    }
}
@PostConstruct is too early to send messages; the context is still being built. Implement SmartLifecycle, put the bean in a high phase (Integer.MAX_VALUE), and do the sends in start().
Or do the sends in an ApplicationRunner.
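A minimal sketch of that variant, assuming the scheduling call is extracted into a startPolling() method on the Scheduler (a hypothetical name); ApplicationRunner beans run only after the context has been fully refreshed, so the channel already has its subscriber:
@Bean
public ApplicationRunner pollingStarter(Scheduler scheduler) {
    // invoked once the application has started, after binding registration
    return args -> scheduler.startPolling();
}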
I faced a similar problem with WebFlux + the Spring Cloud Stream functional style (Spring Cloud Function is the preferred way as of 2022).
My hypothesis after a lot of debugging was that the beans were not created in the right order: the consumer bean was probably not yet registered with spring-cloud-stream's dispatchers when Kafka message processing started, similar to what @gary mentioned.
So I added @Order(1) to my consumer beans, hoping they would be created before the dispatcher registrations start:
@Bean
@Order(1)
public Function<Flux<Message<Pojo>>, Mono<Void>> pojoConsumer() {
This seems to fix my issue for now.

Spring PostConstruct of a container

How can I run some code inside a Spring container after all beans have been loaded? I know I can use @PostConstruct for a single bean, but I would like to run that piece of code after all the @PostConstruct callbacks have been called.
Is this possible?
---UPDATE---
I tried to follow the ApplicationListener route; this is the implementation:
@Component
public class PostContructListener implements ApplicationListener<ContextRefreshedEvent> {

    private static Logger log = LoggerFactory.getLogger(PostContructListener.class);

    public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
        Collection<Initializable> inits = contextRefreshedEvent.getApplicationContext()
                .getBeansOfType(Initializable.class).values();
        for (Initializable initializable : inits) {
            try {
                log.debug("Initialization {}", initializable.getClass().getSimpleName());
                initializable.init();
            } catch (Exception e) {
                log.error("Error initializing {}", initializable.getClass().getSimpleName(), e);
            }
        }
    }
}
Applying "Initializable" interface to all services I got what I needed, how every this way I broke all autowires, I cannot understand why but seems to be connected to the new "Initializable" interface:
java.lang.IllegalArgumentException: Can not set com.service.MyService field com.controller.RestMultiController.myService to com.sun.proxy.$Proxy41
I think you need this.
public class SpringListener implements ApplicationListener<ContextRefreshedEvent> {

    public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
        // do things here
    }
}
http://docs.spring.io/spring/docs/current/spring-framework-reference/htmlsingle/#context-functionality-events
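Since Spring 4.2 the same hook can also be written with an annotation instead of implementing the interface; a minimal sketch:
@Component
public class StartupFinishedListener {

    @EventListener
    public void onContextRefreshed(ContextRefreshedEvent event) {
        // invoked after all singletons have been instantiated and their
        // @PostConstruct methods have completed
    }
}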

EclipseLink + JPA Guice Persist and Redeployments

I have an infrastructure based on EclipseLink + JPA + Guice Persist.
Whenever I redeploy the application I get caching problems with cached entities and have to reboot the server (Oracle WebLogic 11g). This problem is discussed in this post: https://bugs.eclipse.org/bugs/show_bug.cgi?id=326552 but maybe it is not a bug?
I managed to solve the problem as follows.
Originally I had centralized everything in a Guice module:
1. Create the JpaPersistModule.
2. Bind an initializer class that invokes persistenceService.start().
public class MyGuiceModule implements Module {

    @Override
    public void configure(final Binder binder) {
        Properties props = _dbConnectionPropertiesForPool();
        JpaPersistModule jpaModule = new JpaPersistModule(persistenceUnit);
        jpaModule.properties(props);
        binder.install(jpaModule);
        binder.bind(JPAInitializer.class).asEagerSingleton();
    }
}

public class JPAInitializer {

    @Inject
    public JPAInitializer(final PersistService service) {
        service.start();
    }
}
Everything works fine, but as I said, after a redeploy the cached instances remain.
How I solved it:
I changed the JPAInitializer class:
public static class JPAInitializer {

    private static PersistService _persistenceService = null;

    @Inject
    public JPAInitializer(final PersistService service) {
        _persistenceService = service;
        _persistenceService.start();
    }

    public static void stop() {
        _persistenceService.stop();
    }
}
I created a stop() method that stops the service, but I have been forced to keep the injected PersistService in a static variable.
From the Guice listener/filter that is the entry point, I invoke stop() when the application is undeployed (contextDestroyed):
public void contextDestroyed(ServletContextEvent servletContextEvent) {
    JPAInitializer.stop();
}
Now when I redeploy there is no cache issue and no need to restart the server.
It works this way, but I do not know if it is right to keep a static PersistService instance, so I am trying to find another way to invoke the stop.
Any suggestions?
Found a solution.
Create an interface to handle the Guice PersistService:
interface MyPersistenceServiceHandler {
    public void start();
    public void stop();
}
This is bound in the main DB Guice module:
binder.bind(MyPersistenceServiceHandler.class)
      .to(JPAPersistenceServiceControl.class)
      .in(Singleton.class);

static class JPAPersistenceServiceControl
        implements MyPersistenceServiceHandler {

    private final PersistService _service;

    @Inject
    public JPAPersistenceServiceControl(final PersistService service) {
        _service = service;
    }

    @Override
    public void start() {
        if (_service == null) throw new IllegalStateException("NO persistence service available!");
        _service.start();
    }

    @Override
    public void stop() {
        if (_service == null) throw new IllegalStateException("NO persistence service available!");
        _service.stop();
    }
}
Get the instance in the REST endpoint / Guice filter through the Guice Injector:
jpaServiceHandler = _myGuiceInjector.getInstance(MyPersistenceServiceHandler.class);
Start the service on contextInitialized: jpaServiceHandler.start();
Stop the service on contextDestroyed: jpaServiceHandler.stop();
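A minimal sketch of that wiring, assuming the web app's listener extends guice-servlet's GuiceServletContextListener (the listener class name is illustrative):
public class MyGuiceServletConfig extends GuiceServletContextListener {

    private Injector injector;

    @Override
    protected Injector getInjector() {
        injector = Guice.createInjector(new MyGuiceModule());
        return injector;
    }

    @Override
    public void contextInitialized(ServletContextEvent event) {
        super.contextInitialized(event); // calls getInjector() and publishes the injector
        injector.getInstance(MyPersistenceServiceHandler.class).start();
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        injector.getInstance(MyPersistenceServiceHandler.class).stop();
        super.contextDestroyed(event);
    }
}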
