Spring Boot Actuator 2 metrics from a custom function

I am in the spring-boot-actuator world now...
How can I add my own metrics coming from a custom function in my @Service class?
I would expect to have something like
meterRegistry.registerNewGauge(
"animals_count",
"cats",
animalCounterService::countCatsFromDatabase
);
Currently I can only find simple metrics like
meterRegistry.counter("animals_count").increment();
but that doesn't help much when I have to aggregate things like database entries. I need something more flexible.
I also found something like MeterBinder.bindTo, but that didn't work. No error, and nothing shows up in the metrics.
I have been searching for months without any success.
Thanks

I'm assuming you're using Micrometer for metrics, right?
If so, you can create a gauge and bind it to any object that provides a method returning a double, like this:
@Service
public class MyService {
    ...
    public double calculateValueForGauge() {...}
}
MyService service = ... // get from Spring
MeterRegistry registry = ... // get from Spring
// here is how you can create a gauge and bind it to an arbitrary method of your service
Gauge.builder("some.name.goes.here", service, s -> s.calculateValueForGauge())
    .register(registry);
For example, you can place the gauge registration code in a listener that is called once the application context has started:
@EventListener
public void onApplicationStarted(ApplicationReadyEvent event) {
    // register gauges here
}
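Putting the two together, a minimal sketch could look like this (AnimalCounterService and countCatsFromDatabase are the names from the question and are assumed to exist and return a numeric value):
@Component
public class AnimalMetricsRegistrar {

    private final MeterRegistry meterRegistry;
    private final AnimalCounterService animalCounterService;

    public AnimalMetricsRegistrar(MeterRegistry meterRegistry, AnimalCounterService animalCounterService) {
        this.meterRegistry = meterRegistry;
        this.animalCounterService = animalCounterService;
    }

    @EventListener
    public void onApplicationStarted(ApplicationReadyEvent event) {
        // The gauge re-evaluates countCatsFromDatabase() each time the registry is scraped
        Gauge.builder("animals_count", animalCounterService, AnimalCounterService::countCatsFromDatabase)
             .tag("type", "cats")
             .register(meterRegistry);
    }
}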

Related

Spring Cloud Stream 3 RabbitMQ consumer not working

I'm able to make Spring + RabbitMQ work the non-functional way (prior to 2.0?), but I'm trying to use the functional pattern since the previous one is deprecated.
I've been following this doc: https://docs.spring.io/spring-cloud-stream/docs/3.1.0/reference/html/spring-cloud-stream.html#_binding_and_binding_names
The queue (consumer) is not being created in Rabbit with the new method. I can see the connection being created but without any consumer.
I have the following in my application.properties:
spring.cloud.stream.function.bindings.approved-in-0=approved
spring.cloud.stream.bindings.approved.destination=myTopic.exchange
spring.cloud.stream.bindings.approved.group=myGroup.approved
spring.cloud.stream.bindings.approved.consumer.back-off-initial-interval=2000
spring.cloud.stream.rabbit.bindings.approved.consumer.queueNameGroupOnly=true
spring.cloud.stream.rabbit.bindings.approved.consumer.bindingRoutingKey=myRoutingKey
which is replacing:
spring.cloud.stream.bindings.approved.destination=myTopic.exchange
spring.cloud.stream.bindings.approved.group=myGroup.approved
spring.cloud.stream.bindings.approved.consumer.back-off-initial-interval=2000
spring.cloud.stream.rabbit.bindings.approved.consumer.queueNameGroupOnly=true
spring.cloud.stream.rabbit.bindings.approved.consumer.bindingRoutingKey=myRoutingKey
And the new class
@Slf4j
@Service
public class ApprovedReceiver {

    @Bean
    public Consumer<String> approved() {
        // I also saw that it's recommended to not use Consumer, but use Function instead
        // https://docs.spring.io/spring-cloud-stream/docs/3.1.0/reference/html/spring-cloud-stream.html#_consumer_reactive
        return value -> log.info("value: {}", value);
    }
}
which is replacing
// BindableApprovedChannel.class
@Configuration
public interface BindableApprovedChannel {

    @Input("approved")
    SubscribableChannel getApproved();
}

// ApprovedReceiver.class
@Service
@EnableBinding(BindableApprovedChannel.class)
public class ApprovedReceiver {

    @StreamListener("approved")
    public void handleMessage(String payload) {
        log.info("value: {}", payload);
    }
}
Thanks!
If you have multiple beans of type Function, Supplier or Consumer (which could be declared by third party libraries), the framework does not know which one to bind to.
Try setting the spring.cloud.function.definition property to approved.
https://docs.spring.io/spring-cloud-stream/docs/3.1.3/reference/html/spring-cloud-stream.html#spring_cloud_function
In the event you only have a single bean of type java.util.function.[Supplier/Function/Consumer], you can skip the spring.cloud.function.definition property, since such a functional bean will be auto-discovered. However, it is considered best practice to use the property to avoid any confusion. Sometimes this auto-discovery can get in the way, since a single bean of type java.util.function.[Supplier/Function/Consumer] could be there for purposes other than handling messages, yet, being the only one, it is auto-discovered and auto-bound. For these rare scenarios you can disable auto-discovery by providing the spring.cloud.stream.function.autodetect property with the value set to false.
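A minimal application.properties sketch of that fix, reusing the bean name from the question:
# Bind the "approved" Consumer bean explicitly so the framework does not have to guess
spring.cloud.function.definition=approved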
Gary's answer is correct. If adding the definition property alone doesn't resolve the issue I would recommend sharing what you're doing for your supplier.
This is also a very helpful general discussion for transitioning from imperative to functional with links to repos with more in depth examples: EnableBinding is deprecated in Spring Cloud Stream 3.x

Primary/secondary datasource failover in Spring MVC

I have a Java web application developed on the Spring framework which uses MyBatis. I see that the datasource is defined in beans.xml. Now I want to add a secondary datasource as a backup. For example, if the application is not able to connect to the DB and gets some error, or if the server is down, then it should be able to connect to a different datasource. Is there a configuration in Spring to do this, or will we have to manually code this in the application?
I have seen primary and secondary annotations in Spring Boot but nothing in Spring. I could achieve this in my code where the connection is created/retrieved, by connecting to the secondary datasource if the connection to the primary datasource fails or times out. But I wanted to know if this can be achieved by making changes just in the Spring configuration.
Let me clarify things one by one:
Spring Boot has a @Primary annotation, but there is no @Secondary annotation.
The purpose of the @Primary annotation is not what you have described. Spring does not automatically switch data sources in any way. @Primary merely tells Spring which data source to use in case we don't specify one in any transaction. For more detail on this: https://www.baeldung.com/spring-data-jpa-multiple-databases
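As an illustration only (bean names and URLs below are placeholders, not taken from the question), @Primary on one of two DataSource beans simply marks the default candidate for injection:
@Configuration
public class DataSourceConfig {

    @Bean
    @Primary
    public DataSource primaryDataSource() {
        // Injected wherever a DataSource is required without an explicit @Qualifier
        return new DriverManagerDataSource("jdbc:postgresql://primary-host/mydb");
    }

    @Bean
    public DataSource secondaryDataSource() {
        // Only used when requested explicitly, e.g. @Qualifier("secondaryDataSource")
        return new DriverManagerDataSource("jdbc:postgresql://secondary-host/mydb");
    }
}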
Now, how do we actually switch data sources when one goes down?
Most people don't manage this kind of high availability in code. People usually prefer to run 2 master database instances in active-passive mode, kept in sync. For auto-failover, something like keepalived can be used. This is also a highly subjective and contentious topic, and there are a lot of things to consider here, like whether we can afford replication lag, whether there are slaves running for each master (because then we have to switch slaves too, as the old master's slaves would now become out of sync), etc. If you have databases spread across regions, this becomes even more difficult (read: awesome) and requires yet more engineering, planning, and design.
Now, since the question specifically mentions using application code for this, there is one thing you can do. I don't advise using it in production though. EVER. You can create an AspectJ advice around all your primary transactional methods using your own custom annotation. Let's call this annotation @SmartTransactional for our demo.
Sample code (untested):
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface SmartTransactional {}

public class SomeServiceImpl implements SomeService {

    @SmartTransactional
    @Transactional("primaryTransactionManager")
    public boolean someMethod() {
        //call a common method here for code reusability or create an abstract class
    }
}

public class SomeServiceSecondaryTransactionImpl implements SomeService {

    @Transactional("secondaryTransactionManager")
    public boolean usingTransactionManager2() {
        //call a common method here for code reusability or create an abstract class
    }
}

@Component
@Aspect
public class SmartTransactionalAspect {

    @Autowired
    private ApplicationContext context;

    @Pointcut("@annotation(...SmartTransactional)")
    public void smartTransactionalAnnotationPointcut() {
    }

    @Around("smartTransactionalAnnotationPointcut()")
    public Object methodsAnnotatedWithSmartTransactional(final ProceedingJoinPoint joinPoint) throws Throwable {
        Method method = getMethodFromTarget(joinPoint);
        Object result = joinPoint.proceed();
        boolean failure = Boolean.TRUE; // check if result is failure
        if (failure) {
            String secondaryTransactionManagerBeanName = ""; // get class name from joinPoint and append 'SecondaryTransactionImpl' instead of 'Impl' in the class name
            Object bean = context.getBean(secondaryTransactionManagerBeanName);
            result = bean.getClass().getMethod(method.getName()).invoke(bean);
        }
        return result;
    }
}

Configuring Spring MockMvc to use custom argument resolver before built-in ones

I have a straightforward test case. I have a controller which has a parameter of a type Spring doesn't support by default, so I wrote a custom resolver.
I create the mock mvc instance I'm using like so:
mvc = MockMvcBuilders.standaloneSetup(controller).setCustomArgumentResolvers(new GoogleOAuthUserResolver()).build();
However, Spring is also registering almost 30 other argument resolvers, one of which is general enough that it is getting used to resolve the argument before mine. How can I set or sort the resolvers so that mine is invoked first?
This worked for me without reflection:
@RequiredArgsConstructor
@Configuration
public class CustomerNumberArgumentResolverRegistration {

    private final RequestMappingHandlerAdapter requestMappingHandlerAdapter;

    @PostConstruct
    public void prioritizeCustomArgumentResolver() {
        final List<HandlerMethodArgumentResolver> argumentResolvers = new ArrayList<>(Objects.requireNonNull(requestMappingHandlerAdapter.getArgumentResolvers()));
        argumentResolvers.add(0, new CustomerNumberArgumentResolver());
        requestMappingHandlerAdapter.setArgumentResolvers(argumentResolvers);
    }
}
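Note that this reorders the argument resolvers of the application context's RequestMappingHandlerAdapter, so in a test it takes effect when MockMvc is built from the context rather than via standaloneSetup. A rough sketch under that assumption (Spring Boot test setup assumed; the endpoint path is a placeholder):
@SpringBootTest
class ArgumentResolverOrderTest {

    @Autowired
    private WebApplicationContext context;

    @Test
    void customResolverIsConsultedFirst() throws Exception {
        // webAppContextSetup reuses the adapter customized above, unlike standaloneSetup
        MockMvc mvc = MockMvcBuilders.webAppContextSetup(context).build();
        mvc.perform(MockMvcRequestBuilders.get("/customers/42"))
           .andExpect(MockMvcResultMatchers.status().isOk());
    }
}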
The issue was that the People class from the Google OAuth library I am using extends Map, and the mock servlet API provides no way to manipulate the order in which the handlers are registered.
I ended up using reflection to reach into the mock's guts and remove the offending handler.

How to get all self injected Beans of a special type?

I would like to build a Spring application, where new components can be added easily and without much configuration. For example: You have different kinds of documents. These documents should be able to get exported into different fileformats.
To make this functionality easy to maintain, it should (basically) work the following way:
Someone programs the file format exporter
He/she writes a component that checks whether the file format exporter is licensed (based on Spring Conditions). If it is licensed, a specialized bean is injected into the application context.
The "whole rest" works dynamically based on the injected beans. Nothing needs to be touched in order to display it on the GUI, etc.
I pictured it the following way:
@Component
public class ExcelExporter implements Condition {

    @PostConstruct
    public void init() {
        excelExporter();
    }

    @Bean
    public Exporter excelExporter() {
        Exporter exporter = new ExcelExporter();
        return exporter;
    }

    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return true;
    }
}
In order to work with those exporters (display them, etc.) I need to get all of them. I tried this:
Map<String, Exporter> exporter =BeanFactoryUtils.beansOfTypeIncludingAncestors(appContext, Exporter.class, true, true);
Unfortunately this does not work (0 beans returned). I am fairly new to this; would anyone mind telling me how this is properly done in Spring? Maybe there is a better solution to my problem than my approach?
You can get all instances of a given type of bean in a Map effortlessly, since it's a built-in Spring feature.
Simply autowire your map, and all those beans will be injected, using the bean ID as the key.
@Autowired
Map<String, Exporter> exportersMap;
If you need something more sophisticated, such as a specific Map implementation or a custom key, consider defining your own ExporterMap, as follows:
@Component
class ExporterMap implements Map {

    @Autowired
    private Set<Exporter> availableExporters;

    //your stuff here, including init if required with @PostConstruct
}
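A short usage sketch of the autowired map, assuming the Exporter interface from the question (the service name and keySet usage are illustrative only):
@Service
public class ExportService {

    private final Map<String, Exporter> exporters;

    // Spring fills the map with every Exporter bean, keyed by bean name (e.g. "excelExporter")
    public ExportService(Map<String, Exporter> exporters) {
        this.exporters = exporters;
    }

    public Set<String> availableFormats() {
        return exporters.keySet();
    }
}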

Metrics for my API powered by Jersey

I am trying to instrument my Jersey web service with Metrics:
http://metrics.codahale.com/manual/jersey/
I don't understand how to use this library.
Do I need to add something in my web.xml file?
Thanks
To instrument your Jersey web service, you must add the metrics-jersey module to your application. It contains a @Provider implementation class (make sure Jersey finds it) that allows you to instrument Jersey resource methods annotated with @Timed, @Metered and @ExceptionMetered.
By default, Metrics reports through JMX, so you can use JConsole to validate your instrumentation.
As Alex wrote, there are other reporting options, but they require additional configuration or code (calling the enable method on the Reporter).
For example, you can fetch reports in JSON over HTTP, or have your web service send reports to a monitoring server such as Graphite.
As far as I can see, you just need to include the Metrics lib on the build path. On web service methods you just use the @Timed annotation.
To see the reports, you must enable the reporting style you like - see the reporters documentation.
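For example, a minimal sketch of enabling one reporter with the Metrics 3.x builder API (console output once a minute; registry stands for whichever MetricRegistry your application uses):
ConsoleReporter reporter = ConsoleReporter.forRegistry(registry)
        .convertRatesTo(TimeUnit.SECONDS)
        .convertDurationsTo(TimeUnit.MILLISECONDS)
        .build();
reporter.start(1, TimeUnit.MINUTES);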
Drop your linen and start your grin'n. I got this working!
Hook up the servlet. You need a generic spot to make and store the metrics. Build one of these for both MetricRegistry and HealthCheckRegistry:
public class MetricsServletContextListener extends MetricsServlet.ContextListener {

    public static final MetricRegistry METRIC_REGISTRY = new MetricRegistry();

    @Override
    protected MetricRegistry getMetricRegistry() {
        return METRIC_REGISTRY;
    }
}
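The HealthCheckRegistry counterpart is not shown in the original answer; a sketch could look like this (the class name is illustrative, and its static registry is what the healthChecks variable in the next snippet would refer to):
public class HealthChecksServletContextListener extends HealthCheckServlet.ContextListener {

    public static final HealthCheckRegistry HEALTH_CHECK_REGISTRY = new HealthCheckRegistry();

    @Override
    protected HealthCheckRegistry getHealthCheckRegistry() {
        return HEALTH_CHECK_REGISTRY;
    }
}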
Set the servlet context with the data in some startup area:
sc.getServletContext().setAttribute(
"com.codahale.metrics.servlets.HealthCheckServlet.registry",
healthChecks
);
sc.getServletContext().setAttribute(
"com.codahale.metrics.servlets.MetricsServlet.registry",
MetricsServletContextListener.METRIC_REGISTRY
);
The URL is: http://blah/blah/metrics/metrics?pretty=true
Create one of these guys. This hooks up the metrics to Jersey:
@Provider
public class TmaticInstrumentedResourceMethodDispatchAdapterWrapper implements ResourceMethodDispatchAdapter {

    private InstrumentedResourceMethodDispatchAdapter adapter = null;

    public TmaticInstrumentedResourceMethodDispatchAdapterWrapper() {
        adapter = new InstrumentedResourceMethodDispatchAdapter(MetricsServletContextListener.METRIC_REGISTRY);
    }

    @Override
    public ResourceMethodDispatchProvider adapt(ResourceMethodDispatchProvider provider) {
        return adapter.adapt(provider);
    }
}
Tell Jersey about it. Since it uses the @Provider annotation, it must be in a package that Jersey scans. I had to add mine to web.xml here, but you might not have to:
<init-param>
<param-name>com.sun.jersey.config.property.packages</param-name>
<param-value>blah.endpoint,blah.utils</param-value>
</init-param>
And add the @Timed annotation to your Jersey endpoint.
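An illustrative endpoint (the path, class and method names are placeholders, not from the original answer):
@Path("/animals")
public class AnimalResource {

    @GET
    @Timed // metrics-jersey records invocation rate and duration for this method
    @Produces(MediaType.APPLICATION_JSON)
    public String listAnimals() {
        return "[]";
    }
}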
