Write out Spring Boot metrics to Stdout or aggregator? - spring-boot

I have a Java application written on top of Spring Boot and I can see the metrics being generated through the /metrics management API. I would like to filter the metrics that are being generated (based on metric prefix) and either print them to stdout OR send the selected metrics to a 3rd party aggregator (not the ones referenced here).
I tried the code suggested by this answer, but it didn't result in any metrics being written to stdout. This is what I added to my Application.java class:
@Bean
@ServiceActivator(inputChannel = "metricsChannel")
public MessageHandler metricsHandler() {
    return System.out::println;
}
What is the best way to intercept the metrics on a preconfigured cadence so I can process and write them to stdout or publish them to an aggregator?
Thanks.

Looks like it's a bug in Spring Boot: https://github.com/spring-projects/spring-boot/issues/5517.
We have to declare something like this ourselves as a workaround:
@Bean
public MessageChannel metricsChannel() {
    return new DirectChannel();
}

@Bean
@ExportMetricWriter
public MessageChannelMetricWriter messageChannelMetricWriter() {
    return new MessageChannelMetricWriter(metricsChannel());
}

@Bean
@ServiceActivator(inputChannel = "metricsChannel")
public MessageHandler metricsHandler() {
    return System.out::println;
}

Related

Micrometer timer metrics not displayed in CloudWatch when @Timed is used

I am trying to send my API application metrics to AWS CloudWatch using the Micrometer registry's timer. I have successfully managed to send custom metrics by implementing them manually so far. But I want to send the metrics using the @Timed annotation, and it doesn't work. Here are the things I have tried so far:
#Timed("test")
#RestController
public class TempHandler{
#GetMapping(value="/path")
#Timed(value="ping")
public ResponseEntity ping(#RequestHeader Map<String,String> headers){
return applyHeaders(Response.status(200).entity("ok"));
}
}
I have created a @Configuration class as well to return a TimedAspect object.
@Configuration
public class TimedConfiguration {

    @Bean
    public TimedAspect timedAspect(MeterRegistry registry) {
        return new TimedAspect(registry);
    }
}
I have included the Spring AOP dependency in my build.gradle as well. Please help me figure out what I am doing wrong. Thanks!

Spring Integration Connecting a Gateway to a Service Activator

I've created a Gateway and a polling notificationChannel which the Gateway uses to route messages. I want a service activator to poll from the channel and do its thing. But I can't seem to grasp a few things about Spring Integration.
In this case, would we need an IntegrationFlow bean? Wouldn't calling the gateway method just send the message through the channel, and wouldn't the service activator just poll automatically when there is a new message?
Configuration class:
@EnableIntegration
@Configuration
@IntegrationComponentScan
class IntegrationConfiguration {

    @Bean
    fun notificationChannel(): MessageChannel {
        return MessageChannels.queue().get()
    }

    @Bean
    fun integrationFlow(): IntegrationFlow {
        TODO()
    }
}
Gateway:
@MessagingGateway(defaultRequestChannel = "notificationChannel")
@Component
interface NotificationGateway {
    fun sendNotification(bytes: ByteArray)
}
Service:
@Service
class NotificationService {

    @ServiceActivator(inputChannel = "notificationChannel")
    fun sendNotification(bytes: ByteArray) {
        TODO()
    }
}
I am new to Spring Integration and having a rough time since I can't find documentation that is understandable at my level of knowledge, especially on the Spring Integration DSL.
My main problem might be that I do not understand the use of the IntegrationFlow bean.
For a simple use case like yours you indeed don't need an IntegrationFlow. The plain @ServiceActivator you have now is fully enough to process messages from the notificationChannel. The only thing you need is a @Poller on that @ServiceActivator configuration, since your notificationChannel is a PollableChannel, not a subscribable one.
See Reference Manual for more info: https://docs.spring.io/spring-integration/docs/current/reference/html/#configuration-using-poller-annotation
Also pay attention to the paragraph in the beginning of the doc: https://docs.spring.io/spring-integration/docs/current/reference/html/#programming-considerations
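For example (a minimal sketch in Java, though the question's service is written in Kotlin; the one-second fixedDelay is just an illustrative value):

import org.springframework.integration.annotation.Poller;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.stereotype.Service;

@Service
public class NotificationService {

    // The poller is required because a QueueChannel is pollable, not subscribable.
    @ServiceActivator(inputChannel = "notificationChannel",
            poller = @Poller(fixedDelay = "1000"))
    public void sendNotification(byte[] bytes) {
        // handle the notification payload here
    }
}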

Report Metrics from Kafka to Actuator

I'm trying to get some metrics (client lag, ...) from Kafka and provide them for consumption by Prometheus.
My approach would be to write a simple Spring Boot application which exposes the metrics for Prometheus. I understand that Kafka provides metrics to all its consumers via the MetricsReporter interface.
So I implemented a class which should do exactly that:
public class MonitoringIntegration implements MetricsReporter {

    @Override
    public void init(List<KafkaMetric> list) {
        System.out.println("init");
        for (KafkaMetric kafkaMetric : list) {
            System.out.println(kafkaMetric.metricName());
            System.out.println(kafkaMetric.metricValue());
        }
    }

    @Override
    public void metricChange(KafkaMetric kafkaMetric) {
        System.out.println("Metric Change");
        System.out.println(kafkaMetric.metricName());
        System.out.println(kafkaMetric.metricValue());
    }

    @Override
    public void metricRemoval(KafkaMetric kafkaMetric) {
        System.out.println("Removal");
        System.out.println(kafkaMetric.metricName());
        System.out.println(kafkaMetric.metricValue());
    }

    @Override
    public void close() {
        System.out.println("close");
    }

    @Override
    public void configure(Map<String, ?> map) {
        System.out.println("Configuring");
        System.out.println(map);
    }
}
I registered this class with a bean:
@Configuration
public class MetricConfiguration {

    @Bean
    public ProducerFactory<?, ?> kafkaProducerFactory(KafkaProperties properties) {
        Map<String, Object> producerProperties = properties.buildProducerProperties();
        producerProperties.put(CommonClientConfigs.METRIC_REPORTER_CLASSES_CONFIG,
                MonitoringIntegration.class.getName());
        return new DefaultKafkaProducerFactory<>(producerProperties);
    }

    @Bean
    public ConsumerFactory<?, ?> kafkaConsumerFactory(KafkaProperties properties) {
        Map<String, Object> consumerProperties = properties.buildConsumerProperties();
        consumerProperties.put(CommonClientConfigs.METRIC_REPORTER_CLASSES_CONFIG,
                MonitoringIntegration.class.getName());
        return new DefaultKafkaConsumerFactory<>(consumerProperties);
    }
}
When I start the application, some metrics are printed to the console, but they all have default values (0.0, infinity, ...) and they are only reported once, right after the application starts.
Why am I not getting the metrics? What did I do wrong?
Cheers,
Fabian
Spring Kafka already exposes the Kafka metrics as JMX metrics. You don't need to push the metrics to Prometheus yourself. The Prometheus server will automatically read from your application's "/prometheus" endpoint. Enable Spring Actuator with Prometheus in your Spring project and configure the Prometheus server to scrape it.
Here is a great example using Spring Boot - https://www.callicoder.com/spring-boot-actuator-metrics-monitoring-dashboard-prometheus-grafana/
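For reference, "enable Spring Actuator with Prometheus" on Boot 2.x usually comes down to adding the Micrometer Prometheus registry and exposing the endpoint; a sketch only (standard coordinates and property name, adjust to your build; on Boot 2.x the scrape path is /actuator/prometheus rather than /prometheus):

build.gradle:
    implementation 'org.springframework.boot:spring-boot-starter-actuator'
    implementation 'io.micrometer:micrometer-registry-prometheus'

application.properties:
    management.endpoints.web.exposure.include=prometheus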
MetricsReporter is not used to "report" metric values as they change. Check the docs. (For some reason I can't find the latest API.)
https://archive.apache.org/dist/kafka/0.8.2-beta/java-doc/org/apache/kafka/common/metrics/MetricsReporter.html
A plugin interface to allow things to listen as new metrics are created so they can be reported.
The metricChange() method is only called when a metric is changed. That is why you see the first few outputs during application startup: the metrics were just created.
Consumer metrics support is only available in Spring Boot 2.1+.
Auto-configuration Support For New Metrics
Metrics coverage has been improved to include:
Hibernate metrics
Spring Framework’s WebClient
Kafka consumer metrics
Log4j2 metrics
Jetty server thread pool metrics
Server-side Jersey HTTP request metrics
https://github.com/spring-projects/spring-boot/wiki/Spring-Boot-2.1-Release-Notes#auto-configuration-support-for-new-metrics
I recommend upgrading to a newer version. But if you really need to stay on an earlier Spring Boot version, you can check my Kafka metrics Micrometer implementation at:
https://github.com/luiztoscano/spring-boot-kmetrics

How to specify taskExecutor in publishSubscribe()

How to "translate" the following XML configuration to the equivalent Spring integration java-dsl?
<int:publish-subscribe-channel id="channel" task-executor="myex">
</int:publish-subscribe-channel>
<task:executor id="myex" pool-size="10"></task:executor>
I've read the DSL Reference Guide, but still can't figure it out.
The MessageChannels chapter points to the MessageChannels factory. So the <publish-subscribe-channel> XML config translates to Java config like this:
@Bean
public MessageChannel channel() {
    return MessageChannels.publishSubscribe(myExecutor()).get();
}
Although you can achieve the same with plain Java config:
@Bean
public MessageChannel channel() {
    return new PublishSubscribeChannel(myExecutor());
}

Spring Integration - @Filter discardChannel and/or throwExceptionOnRejection being ignored?

I have a Java DSL based Spring Integration (spring-integration-java-dsl:1.0.1.RELEASE) flow which puts messages through a Filter to filter out certain messages. The Filter component works fine in terms of filtering out unwanted messages.
Now, I would like to set a discardChannel="discard.ch", but when I set the discard channel, the filtered-out messages never seem to actually go to the specified discardChannel. Any ideas why this might be?
My @Filter annotated class/method:
@Component
public class MessageFilter {

    @Filter(discardChannel = "discard.ch")
    public boolean filter(String payload) {
        // force all messages to be discarded to test discardChannel
        return false;
    }
}
My Integration Flow class:
@Configuration
@EnableIntegration
public class IntegrationConfig {

    @Autowired
    private MessageFilter messageFilter;

    @Bean(name = "discard.ch")
    public DirectChannel discardCh() {
        return new DirectChannel();
    }

    @Bean
    public IntegrationFlow inFlow() {
        return IntegrationFlows
                .from(Jms.messageDriverChannelAdapter(mlc))
                .filter("@messageFilter.filter('payload')")
                ...
                .get();
    }

    @Bean
    public IntegrationFlow discardFlow() {
        return IntegrationFlows
                .from("discard.ch")
                ...
                .get();
    }
}
I have turned Spring debug logging on and I can't see where the discarded messages are actually going. It is as though the discardChannel I have set on the @Filter is not being picked up at all. Any ideas why this might be?
The @Filter annotation attributes only apply when using annotation-based configuration.
When using the DSL, the annotation is not relevant; you need to configure the .filter within the DSL itself...
.filter("#messageFilter.filter('payload')", e -> e.discardChannel(discardCh())
