Datasource Metrics for Postgres DB are not available at /q/metrics endpoint [QUARKUS] - quarkus

Quarkus App - REST service to fetch data from a Postgres DB
Quarkus version: 2.9.0.Final
Postgres extension - quarkus-reactive-pg-client
Micrometer extension - quarkus-micrometer-registry-prometheus
After adding the above extensions, the Postgres DB metrics are not available at the /q/metrics endpoint.
How can I get the Postgres datasource metrics while using the reactive pg driver?

The support for metrics in the reactive client is not ready yet. It is expected to become available in Quarkus 2.16.
Once it is done, you need to enable the DB metrics by setting this property:
quarkus.datasource.metrics.enabled=true
As explained here: https://quarkus.io/guides/datasource#datasource-metrics
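In the meantime, a quick way to see which meters are actually registered is to list them from the Micrometer registry. Below is a small diagnostic sketch (not part of the original answer), assuming RESTEasy with CDI injection of the MeterRegistry; the /meters path and class name are made up for illustration.

import java.util.stream.Collectors;

import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

import io.micrometer.core.instrument.MeterRegistry;

@Path("/meters")
public class MeterListResource {

    @Inject
    MeterRegistry registry;

    // Returns one registered meter name per line, so you can check whether any
    // datasource/pool meters appear once the reactive-client metrics support
    // lands and quarkus.datasource.metrics.enabled=true is set.
    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String meterNames() {
        return registry.getMeters().stream()
                .map(meter -> meter.getId().getName())
                .distinct()
                .sorted()
                .collect(Collectors.joining("\n"));
    }
}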

Related

Can we use Couchbase as the message store for Spring Integration

We are implementing a Spring Integration aggregator and using the JDBC message store. In our environment we have Couchbase and Oracle DB. I don't want to use Oracle DB; can we use Couchbase as the message store?
If yes, can you please suggest the approach?
There is a MessageStore strategy that you would need to implement and then configure as any other message store in your Spring Integration components that require it. Here is a bit more info.
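As a rough sketch of the "configure as any other message store" part, the Java DSL configuration below shows where such a store plugs into an aggregator. SimpleMessageStore is only a stand-in for the Couchbase-backed MessageGroupStore you would implement, and the channel names are placeholders.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.store.MessageGroupStore;
import org.springframework.integration.store.SimpleMessageStore;

@Configuration
public class AggregatorConfig {

    // Stand-in store: replace with your Couchbase-backed implementation of
    // MessageGroupStore (the aggregator needs group semantics, not just MessageStore).
    @Bean
    public MessageGroupStore messageGroupStore() {
        return new SimpleMessageStore();
    }

    // The aggregator is pointed at the store exactly the same way it would be
    // pointed at the JDBC store; "inputChannel"/"outputChannel" are placeholders.
    @Bean
    public IntegrationFlow aggregatingFlow(MessageGroupStore messageGroupStore) {
        return IntegrationFlows.from("inputChannel")
                .aggregate(aggregator -> aggregator.messageStore(messageGroupStore))
                .channel("outputChannel")
                .get();
    }
}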

Spring Cloud Connector Plan Information

I am using Spring Cloud Connectors to bind to databases. Is there any way to get the plan of the bound service? When I extend an AbstractCloudConfig and do
cloud().getSingletonServiceInfosByType(PostgresqlServiceInfo.class)...
I will have information on the URL and how to connect to Postgres. PostgresqlServiceInfo and the others do not carry along the plan data. How can I extend the service info in order to read this information from VCAP_SERVICES?
Thanks
By design, the ServiceInfo classes in Spring Cloud Connectors carry just enough information to create the connection beans necessary for an app to consume the service resources. Connectors was designed to be platform-neutral, and fields like plan, label, and tags that are available on Cloud Foundry are not captured because they might not be available on other platforms (e.g. Heroku).
To add the plan information to a ServiceInfo, you'd need to write your own ServiceInfo class that includes a field for the value, then write a CloudFoundryServiceInfoCreator to populate the value from the VCAP_SERVICES data that the framework provides as a Map. See the project documentation for more information on creating such an extension.
Another (likely easier) option is to use the newer java-cfenv project instead of Spring Cloud Connectors. java-cfenv supports Cloud Foundry only, and gives access to the full set of information in VCAP_SERVICES. See the project documentation for an example of how you can use this library.
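For example, a minimal java-cfenv sketch, assuming the java-cfenv dependency is on the classpath and that "my-postgres" stands in for the name of your bound service instance:

import io.pivotal.cfenv.core.CfEnv;
import io.pivotal.cfenv.core.CfService;

public class PlanLookup {

    public static void main(String[] args) {
        // Parses VCAP_SERVICES from the environment (Cloud Foundry only).
        CfEnv cfEnv = new CfEnv();

        // "my-postgres" is a placeholder; use the name of your bound service instance.
        CfService service = cfEnv.findServiceByName("my-postgres");

        System.out.println("plan  : " + service.getPlan());
        System.out.println("label : " + service.getLabel());
        System.out.println("tags  : " + service.getTags());
    }
}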

Apache Kafka Connect With Springboot

I'm trying to find examples of Kafka Connect with Spring Boot. It looks like there is no Spring Boot integration for Kafka Connect. Can someone point me in the right direction to be able to listen to changes on a MySQL DB?
Kafka Connect doesn't really need Spring Boot because there is nothing for you to code for it, and it really works best when run in distributed mode, as a cluster, not embedded within other (single-instance) applications. I suppose if you did want to do it, then you could copy relevant portions of the source code, but that of course isn't using Spring Boot, and you'd have to wire it all up yourself.
The framework itself consists of a few core Java dependencies that have already been written (Debezium or the Confluent JDBC connector, for your MySQL example) and two config files: one for Kafka Connect to know the bootstrap servers, serializers, etc., and another for the actual MySQL connector. So, if you want to use Kafka Connect, run it by itself, then just write the consumer in the Spring app.
The alternatives to Kafka Connect itself would be to use Apache Camel within a Spring application (Spring Integration) or Spring Cloud Data Flow and interface with those Kafka "components" (which aren't using the Connect API, AFAIK).
Another option, specific to listening to MySQL, is to use the Debezium Engine within your code.
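To illustrate the "run Kafka Connect by itself, then just write the consumer in the Spring app" suggestion, here is a minimal spring-kafka consumer sketch; the topic name, group id, and the assumption of string (e.g. JSON) values are placeholders for whatever your connector actually produces.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;

@SpringBootApplication
public class ChangeEventConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ChangeEventConsumerApplication.class, args);
    }

    // "mysql-server.inventory.customers" and "change-consumer" are placeholders;
    // Debezium and the JDBC source connector each have their own topic naming.
    @KafkaListener(topics = "mysql-server.inventory.customers", groupId = "change-consumer")
    public void onChangeEvent(String value) {
        // The connector's change event arrives as a plain string (JSON by default);
        // deserialize and handle it as needed.
        System.out.println("Received change event: " + value);
    }
}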

Spring Boot (Spring Data JPA) - configure PostgreSQL Read Replicas

What is the simplest way to configure read replicas with Spring Boot and Spring Data JPA? I have searched a lot and cannot find a solution.
AWS RDS Aurora PostgreSQL gives 2 endpoints:
master (write)
replicas (read)
I want to configure my application to use these endpoints.
Why do you need to configure the read replica with Spring Boot? I am guessing that you just need to connect to the read replica, right?
Have you looked at the AWS SDK for Java?
You need to look at a class there named DescribeDBClusterEndpointsResult.
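One commonly used approach, not covered in the answer above, is to register both endpoints as data sources and route between them with Spring's AbstractRoutingDataSource, sending read-only transactions to the reader endpoint. The sketch below uses placeholder Aurora endpoints and credentials, and assumes read paths are marked with @Transactional(readOnly = true).

import java.util.HashMap;
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
import org.springframework.transaction.support.TransactionSynchronizationManager;

@Configuration
public class ReplicaRoutingConfig {

    // Routes to the reader endpoint inside @Transactional(readOnly = true)
    // methods and to the writer endpoint otherwise.
    static class ReadWriteRoutingDataSource extends AbstractRoutingDataSource {
        @Override
        protected Object determineCurrentLookupKey() {
            return TransactionSynchronizationManager.isCurrentTransactionReadOnly()
                    ? "reader" : "writer";
        }
    }

    @Bean
    @Primary
    public DataSource dataSource() {
        // The URLs and credentials below are placeholders for the Aurora
        // cluster (writer) and cluster-ro (reader) endpoints.
        DataSource writer = DataSourceBuilder.create()
                .url("jdbc:postgresql://my-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com:5432/mydb")
                .username("app").password("secret").build();
        DataSource reader = DataSourceBuilder.create()
                .url("jdbc:postgresql://my-cluster.cluster-ro-xxxx.us-east-1.rds.amazonaws.com:5432/mydb")
                .username("app").password("secret").build();

        ReadWriteRoutingDataSource routing = new ReadWriteRoutingDataSource();
        Map<Object, Object> targets = new HashMap<>();
        targets.put("writer", writer);
        targets.put("reader", reader);
        routing.setTargetDataSources(targets);
        routing.setDefaultTargetDataSource(writer);
        // Not a Spring bean itself, so resolve the target map manually.
        routing.afterPropertiesSet();

        // Defer borrowing the physical connection until a statement runs, so the
        // transaction's read-only flag is already known when routing happens.
        return new LazyConnectionDataSourceProxy(routing);
    }
}

With this in place, Spring Data JPA repositories use the replica endpoint for read-only transactional methods and the writer endpoint for everything else.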

how to use redismetricrepository in spring boot

I am working with the Spring Boot Actuator and can see the metrics of my application, but I want to store these metrics in a DB. The Spring docs mention that RedisMetricRepository provides an option for storing metrics in a Redis DB, but I don't know how to use RedisMetricRepository to do that. Kindly help me out.
You can just create a @Bean of type RedisMetricRepository. I suspect that will just store the metrics in Redis immediately. I prefer to buffer in memory and export to Redis periodically. Here's a sample using @Scheduled to export to Redis every 5s: https://github.com/scratches/aggregator/blob/master/generator/src/main/java/demo/GeneratorApplication.java#L61
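For reference, a minimal configuration sketch, assuming a Spring Boot 1.x application with the Actuator and Spring Data Redis on the classpath; the @ExportMetricWriter annotation (Boot 1.3+) enables the buffered, periodic export mentioned above rather than writing every update immediately.

import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.metrics.repository.redis.RedisMetricRepository;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisConnectionFactory;

@Configuration
public class RedisMetricsConfig {

    // Stores actuator metrics in Redis. On Spring Boot 1.3+, @ExportMetricWriter
    // makes the framework copy the in-memory metrics to this repository on a
    // schedule (spring.metrics.export.* properties) rather than on every update.
    @Bean
    @ExportMetricWriter
    public RedisMetricRepository redisMetricRepository(RedisConnectionFactory connectionFactory) {
        return new RedisMetricRepository(connectionFactory);
    }
}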
