Relaying JDA events to Spring Event subsystem hangs whole Spring app - spring

I'm trying to make a command framework for a Discord bot using Spring Boot with Kotlin. I got it working perfectly with Javacord, but recently I decided to switch to JDA and I've hit a snag. The command framework also relays all Discord events to the Spring event system. What I'm currently doing is implementing JDA's generic event listener (https://ci.dv8tion.net/job/JDA/javadoc/net/dv8tion/jda/core/hooks/EventListener.html) and dispatching everything to Spring through an autowired ApplicationEventPublisher. However, the Spring app seems to hang on that call; after debugging with some breakpoints, it appears to get stuck on this particular line (https://github.com/spring-projects/spring-framework/blob/master/spring-context/src/main/java/org/springframework/context/event/AbstractApplicationEventMulticaster.java#L190). Any idea why this is happening? I've seen this issue (https://github.com/spring-projects/spring-framework/issues/20904) but I'm not sure what to do...
SpringGenericEventPublisher.kt
@Component
class SpringGenericEventPublisher : EventListener {

    @Autowired
    private lateinit var context: ApplicationEventPublisher

    override fun onEvent(event: Event) = context.publishEvent(event)
}
BotConfiguration.kt (where I have the bean that builds the JDA instance)
@Bean
fun bot(config: BotProperties): Bot = JDABuilder() // [Bot] is a typealias for [JDA]
    //.setCallbackPool(Executors.newSingleThreadExecutor()) // I've tried this because of the issue I linked above, but I got the same result
    .setToken(config.token)
    .addEventListener(springGenericEventPublisher)
    .build()
Then I have a simple listener just to test
@Component
class FooComponent {

    @EventListener(Event::class)
    fun onFoo(event: Event) {
        println("Reached `onFoo`")
    }
}
Any ideas?
Thanks in advance
PS: I should add that I'm also using Spring Data Redis and Spring Data MongoDB, and that both start up successfully before this point; the command registry, which should start afterwards, just doesn't. The JDA instance logs in perfectly: if I just print something to the screen in SpringGenericEventPublisher instead of publishing the event, it succeeds.
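The symptoms are consistent with a deadlock between JDA's event thread and the context refresh: publishEvent needs the multicaster's listener-retrieval lock, which the main thread may still hold while creating singletons (and the JDA bean's build() logs in and starts firing events before the refresh finishes). One common workaround, sketched here as a framework-free illustration (the class and event names are made up for the example), is to buffer incoming events and only relay them once the application signals it is ready, e.g. from an ApplicationReadyEvent listener:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.function.Consumer;

// Buffers events published before the application is "ready" and
// flushes them afterwards -- the same idea as waiting for Spring's
// ApplicationReadyEvent before relaying JDA events to the publisher.
class BufferingPublisher<E> {
    private final Queue<E> buffer = new ConcurrentLinkedQueue<>();
    private final Consumer<E> delegate;
    private volatile boolean ready = false;

    BufferingPublisher(Consumer<E> delegate) { this.delegate = delegate; }

    void publish(E event) {
        if (ready) { delegate.accept(event); } else { buffer.add(event); }
    }

    void markReady() { // would be called from an ApplicationReadyEvent listener
        ready = true;
        E e;
        while ((e = buffer.poll()) != null) { delegate.accept(e); }
    }
}

public class BufferingPublisherDemo {
    public static void main(String[] args) {
        List<String> delivered = new ArrayList<>();
        BufferingPublisher<String> p = new BufferingPublisher<>(delivered::add);
        p.publish("GuildReadyEvent");         // arrives while the context is still refreshing
        System.out.println(delivered.size()); // nothing delivered yet
        p.markReady();                        // context is up; flush the backlog
        p.publish("MessageReceivedEvent");    // later events go straight through
        System.out.println(delivered);
    }
}
```

This is only a sketch; a production version would need to handle the small race between markReady() and a concurrent publish(), but it shows the shape of deferring delivery until the context is fully started.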

Related

Micrometer with Prometheus Pushgateway not sending the recent metrics

I have a Spring Boot application based on CommandLineRunner: when it starts, it does some calculations, sends the metrics to Prometheus, and shuts down.
I am using Prometheus Push Gateway with Micrometer, mainly based on this tutorial:
https://luramarchanjo.tech/2020/01/05/spring-boot-2.2-and-prometheus-pushgateway-with-micrometer.html
This works fine if I leave the application running; however, with my particular Spring Boot application, it loses the metrics sent just before the shutdown.
I had a similar issue with CloudWatch; however, it was clear from the Registry implementation, CloudWatchMeterRegistry, that it starts a thread in the background and uses the property cloudwatch.step to schedule the dispatch of the collected metrics. I am struggling to see how PrometheusMeterRegistry works and why it is not sending the metrics before the application shuts down.
I have tried adding meterRegistry.close(); just before the shutdown, but it still has the same issue!
After some investigation resolved this by calling the shutdown() method on PrometheusPushGatewayManager.
@SpringBootApplication
public class SpringBootConsoleApplication implements CommandLineRunner {

    @Autowired
    PrometheusPushGatewayManager prometheusPushGatewayManager;

    @PreDestroy
    public void onExit() {
        System.out.println("Exiting..");
        prometheusPushGatewayManager.shutdown();
    }
    ...
And add following in the application.properties:
management.metrics.export.prometheus.pushgateway.shutdown-operation=PUSH

Integration Test a Reactive Spring Cloud Stream

TLDR; How do you test a Reactive Function composition using the Test Binder?
I have a Spring Cloud Stream that uses Reactive Functions and I don't know how to test it. I don't see any official docs on how to do an Integration Test from input source to output destination binder.
In my specific case, I am connecting a Spring Integration flow using a Reactive Supplier and the IntegrationReactiveUtils.messageChannelToFlux() pattern. This works in a development environment - I can pull messages from RabbitMQ using the Spring Integration Flow and they enter the SCSt.
My SCSt has several functions chained together, each one reactive. They are composed like func1|func2|func3. I verified this works with a dev Rabbit (source) and Kafka (destination).
I can't seem to figure out how to test this, and there doesn't seem to be any official documentation on testing a complete reactive stream. Right now I have code that roughly looks like this:
@Autowired
MessageChannel inputChannel;

@Autowired
private OutputDestination output;

@Test
void myTest() {
    // omitted: prep of var 'messageToSend'
    this.inputChannel.send(messageToSend);
    var outputMessage = output.receive(5000);
    Assertions.assertNotNull(outputMessage.getPayload());
}
The error I receive is that output.receive(5000) returns null. I suspect a threading issue because I am not subscribing to the Flux and waiting for completion.
I have run a debugger in the Flux functions and see the message going all the way to the end with no errors or weirdness.
I figured this out actually. I had to specify the binder name. I had a test property spring.cloud.stream.bindings.processingStream set, which I thought made 2 new bindings (processingStream-in-0 and processingStream-out-0).
It turns out I had to set the binding name in the test code like output.receive(5000, "processingStream"), without the -out-0 suffix. I can now receive messages from the stream.

is it possible to create a queue listener using web flux spring integration?

@Component
@RequiredArgsConstructor
public class EventListener {

    private final EventProcessingService eventProcessingService;

    @JmsListener(destination = "inputQueue", containerFactory = "myContainerFactory")
    public void receiveMessage(Message message) {
        eventProcessingService.doSome(message).subscribe(); // returns Mono<Void>
    }
}

@Service
public class EventProcessingService {
    public Mono<Void> doSome(Message message) {
        //...
    }
}
@Configuration
@RequiredArgsConstructor
public class MqIntegration {

    private final ConnectionFactory connectionFactory;

    @Bean
    public Publisher<Message<String>> mqReactiveFlow() {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                        .destination("testQueue"))
                .channel(MessageChannels.queue())
                .toReactivePublisher();
    }
}
I have a WebFlux application that interacts with IBM MQ, and a JmsListener which listens for messages from the queue; when a message is received, EventProcessingService makes requests to other services depending on the message.
I would like to know how I can create a JmsListener that works with reactive threads using Spring Integration. In other words, I want to know whether it is possible to create an integration flow which will receive messages from the queue and call the EventProcessingService when messages are received, so that it does not have a negative effect on the threads inside the WebFlux application.
I think we need to clean up some points in your question.
WebFlux is not a project by itself. It is the Spring Framework module for the web on top of a reactive server: https://docs.spring.io/spring-framework/docs/current/reference/html/web-reactive.html#spring-webflux
The @JmsListener is part of another Spring Framework module, spring-jms, and has nothing to do with the threads used by the reactive server for the WebFlux layer. https://docs.spring.io/spring-framework/docs/current/reference/html/integration.html#jms
Spring Integration is a separate project which implements EIP on top of the Spring Framework dependency injection container. It does have its own WebFlux module, for channel adapters on top of the WebFlux API in Spring Framework: https://docs.spring.io/spring-integration/docs/current/reference/html/webflux.html#webflux. And it also has a JMS module on top of the JMS module from Spring Framework: https://docs.spring.io/spring-integration/docs/current/reference/html/jms.html#jms. However, there is nothing related to @JmsListener there, since its Jms.messageDrivenChannelAdapter() fully covers that functionality and, at a high level, does it the same way - via a MessageListenerContainer.
All of this might not be relevant to the question, but it is better to have a clear context of what you are asking, so we can be sure we are on the same page.
Now trying to answer to your concern.
As long as you don't deal with JMS from the WebFlux layer (@RequestMapping or WebFlux.inboundGateway()), you don't affect those non-blocking threads. The JMS MessageListenerContainer spawns its own threads and performs the polling from the queue and the message processing.
What you are explaining with your JMS configuration and service looks more like this:
@Bean
public IntegrationFlow mqReactiveFlow() {
    return IntegrationFlows
            .from(Jms.messageDrivenChannelAdapter(this.connectionFactory)
                    .destination("testQueue"))
            .handle(this.eventProcessingService)
            .nullChannel();
}
There is really no reason to shift messages into a QueueChannel right after JMS, since JMS listening is already an async operation.
We need that nullChannel at the end of the flow just because your service method returns a Mono and the framework otherwise would not know what to do with it. Starting with version 5.4.3, the NullChannel is able to subscribe to the Publisher payload of a message produced to it.
You could, though, put a FluxMessageChannel in between to really simulate back-pressure for the JMS listener, but that won't make much difference for your next service.
I think you are going to have to bypass @JmsListener, as that registers an onMessage callback which, although asynchronous, isn't going to be reactive. JMS is essentially blocking, so patching a reactive layer on top is going to be just that - a patch.
You will need to use the Publisher that you have created to generate the back pressure. I think you are going to have to define and instantiate your own listener bean, which does something along the lines of:
public Flux<String> mqReactiveListener() {
    return Flux.from(mqReactiveFlow())
            .map(Message::getPayload);
}
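The callback-to-Publisher hand-off the answers describe can be illustrated without Spring or Reactor, using only the JDK's java.util.concurrent.Flow types. Everything below is an illustrative sketch (the class and method names are made up); in the real flow, Jms.messageDrivenChannelAdapter() plus toReactivePublisher() does this bridging for you, and Reactor's Flux.from() plays the role of consume() here:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.SubmissionPublisher;

// Bridges a callback-style listener (like a JMS onMessage) to a
// reactive Publisher that downstream code can subscribe to.
public class JmsToReactiveBridge {
    private final SubmissionPublisher<String> publisher = new SubmissionPublisher<>();

    // Called from the listener thread; hands the message to subscribers
    public void onMessage(String payload) { publisher.submit(payload); }

    public SubmissionPublisher<String> publisher() { return publisher; }

    public void close() { publisher.close(); }

    public static void main(String[] args) throws Exception {
        JmsToReactiveBridge bridge = new JmsToReactiveBridge();
        List<String> received = new ArrayList<>();
        // consume() subscribes and returns a future completed on close()
        var done = bridge.publisher().consume(received::add);
        bridge.onMessage("event-1");
        bridge.onMessage("event-2");
        bridge.close();
        done.get(); // wait for all deliveries to finish
        System.out.println(received);
    }
}
```

SubmissionPublisher buffers per subscriber and honours the subscriber's requests, which is exactly the back-pressure property the answer is after; submit() blocks when a subscriber's buffer is full.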

Spring-Boot on AppEngine Standard Detect Shutdown

Using Spring Boot 2.0.0.RELEASE on Google App Engine Standard, with autoscaling, but trying to get some control over the creation/destruction of instances. I need to be able to do some cleanup and would like to log those events.
Methods in Spring Boot like @PreDestroy or listening for ContextClosedEvent don't seem to work on GAE.
According to the documentation, it should be possible to detect shutdown of an instance by adding a shutdown hook.
Documentation LifecycleManager.ShutdownHook.
Have tried to put it in several places without success.
Example as a #Bean:
@Bean
public LifecycleManager lifecycleManager() {
    LifecycleManager lifecycle_manager = LifecycleManager.getInstance();
    lifecycle_manager.setShutdownHook(new ShutdownHook() {
        public void shutdown() {
            LifecycleManager.getInstance().interruptAllRequests();
            log.error("Shutdown " + getClassSimpleName() + ".");
        }
    });
    log.warn("Created shutdown hook.");
    return lifecycle_manager;
}
The shutdown hook is properly installed, but doesn't get fired when the instance goes down.
As you can read in the Google Issue Tracker:
Shutdown hooks only work for manual scaled instances on the standard runtime (...).
The shutdown hooks simply don't work on automatic and basic scaling.

Spring Cloud Stream - Output Messages from EventListener

I'm trying to utilize the #DomainEvents mechanism provided by Spring Data to publish Events with Spring Cloud Stream (Spring Boot 2.0.0.M7 and Finchley.M5). I have a two-part question.
Why does @SendTo not work on EventListeners?
Is there a better way to accomplish this?
The DomainEvent is being created and sent to the EventListener without issues. The problem is that the SendTo mechanism didn't seem to be working. The first method below would trigger, but not forward the message. Manually building the Message and sending it as shown in the second method works correctly.
@Async
@TransactionalEventListener
@SendTo(Sink.OUTPUT)
StreamedEvent handleEventWithSendTo(MyEvent event) {
    // handle and create event
}

@TransactionalEventListener
void handleEvent(MyEvent event) {
    // handle and create event
    sink.output().send(MessageBuilder.withPayload(payload).build());
}
The call-out in the Spring Cloud Stream docs shows SendTo being used on a StreamListener, which is not quite the same thing as an EventListener, but I thought it might work.
For the second part, using DomainEvents requires the service code persisting the entity to know about the event (to either call registerEvent directly or call some method on the entity which represents the event). I was curious whether using the Spring Data callback hooks (e.g. @PreUpdate, @PostUpdate) would be better, or if there is a better way altogether.
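For context on the registerEvent mechanism being discussed, the collect-then-clear pattern behind Spring Data's @DomainEvents support (what AbstractAggregateRoot gives you) can be re-created in plain Java as a sketch. The class and event names here are illustrative, not Spring Data's actual implementation; in real code, the repository's save() triggers publication of the @DomainEvents method's result and then invokes the @AfterDomainEventPublication callback:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Minimal re-creation of the collect-then-clear pattern behind
// Spring Data's @DomainEvents / AbstractAggregateRoot (illustrative only).
abstract class AggregateRoot {
    private final List<Object> domainEvents = new ArrayList<>();

    // Entities call this from business methods, as the question describes
    protected void registerEvent(Object event) { domainEvents.add(event); }

    // In Spring Data, the @DomainEvents-annotated method exposes these on save()
    public List<Object> domainEvents() { return Collections.unmodifiableList(domainEvents); }

    // ...and the @AfterDomainEventPublication-annotated method clears them
    public void clearDomainEvents() { domainEvents.clear(); }
}

class Order extends AggregateRoot {
    void complete() { registerEvent("OrderCompleted"); } // a hypothetical event
}

public class DomainEventsSketch {
    public static void main(String[] args) {
        Order order = new Order();
        order.complete();
        System.out.println(order.domainEvents());        // what a repository would publish
        order.clearDomainEvents();                       // done after publication
        System.out.println(order.domainEvents().size());
    }
}
```

This also shows why the service code has to "know about the event": something on the entity must call registerEvent before save(), which is the coupling the question is trying to avoid with persistence callbacks.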
