Avoid sending zero values to InfluxDB - spring-boot

I'm using Spring Boot Actuator + Micrometer to send values to InfluxDB, so I have the following Maven dependencies:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-influx</artifactId>
</dependency>
I have a method that starts a timer and another that stops the timer sample (Micrometer):
public Timer.Sample startTimer() {
    return Timer.start(registry);
}

public void stopTimer(Class<?> clazz, Timer.Sample sample) {
    // "modulo" is a tag value defined elsewhere in this class
    sample.stop(registry.timer("timer-dev",
            Arrays.asList(Tag.of("modulo", modulo), Tag.of("class", clazz.getName()))));
}
This works very well: InfluxDB receives the value sent from the Spring Boot application and it shows up in Grafana.
The problem: after "timer-dev" is sent to InfluxDB for the first time, Spring Boot keeps sending the value "0" continually. I would like to avoid sending zero values; "timer-dev" should only be sent when its value is greater than zero, i.e. only when this method is called, not on every publish interval.
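From what I understand, InfluxMeterRegistry is a step-based registry, so every registered meter is published on each step interval whether or not it was updated, which is where the zeros come from. One workaround I'm considering is to drop timers that were idle for a whole step, so they are only re-registered (and re-published) by the next stopTimer() call. This is a rough sketch only, assuming @EnableScheduling is active and reusing the same MeterRegistry; the class and method names are mine:
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Timer;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class IdleTimerCleaner {

    private final MeterRegistry registry;

    public IdleTimerCleaner(MeterRegistry registry) {
        this.registry = registry;
    }

    // Runs roughly once per Influx step interval (requires @EnableScheduling).
    @Scheduled(fixedDelay = 60_000)
    public void dropIdleTimers() {
        for (Timer timer : registry.find("timer-dev").timers()) {
            if (timer.count() == 0) {   // nothing recorded in the last completed step
                registry.remove(timer); // re-created automatically by the next stopTimer() call
            }
        }
    }
}
I'm not sure this is safe, though, since removing a meter right after it was updated could drop the last recorded value.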
Any tips?

Related

Access token not propagated in Spring Boot 2.7

We use Spring Boot with an OIDC integration to provide authentication and authorization to our users.
Our application acts as a Client in the OIDC code flow, calling downstream Resource Servers over HTTP to serve user requests.
To have the Client pass the access token of the authenticated user to Resource servers we use the ServletOAuth2AuthorizedClientExchangeFilterFunction and apply that to the org.springframework.web.reactive.function.client.WebClient handling downstream requests.
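(For reference, a rough sketch of that wiring; the class and bean names are illustrative rather than our exact code, and it assumes an OAuth2AuthorizedClientManager bean is available:)
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.oauth2.client.OAuth2AuthorizedClientManager;
import org.springframework.security.oauth2.client.web.reactive.function.client.ServletOAuth2AuthorizedClientExchangeFilterFunction;
import org.springframework.web.reactive.function.client.WebClient;

@Configuration
public class DownstreamWebClientConfig {

    // Attaches the current user's access token to requests made with this WebClient.
    @Bean
    WebClient downstreamWebClient(OAuth2AuthorizedClientManager authorizedClientManager) {
        ServletOAuth2AuthorizedClientExchangeFilterFunction oauth2 =
                new ServletOAuth2AuthorizedClientExchangeFilterFunction(authorizedClientManager);
        oauth2.setDefaultOAuth2AuthorizedClient(true);
        return WebClient.builder()
                .apply(oauth2.oauth2Configuration())
                .build();
    }
}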
We recently upgraded from Spring Boot 2.6.7 to 2.7.3 and discovered that the Authorization header containing the access token is no longer added to outgoing requests if those requests are scheduled on a thread other than the one serving the original request:
public class MyController {

    public Mono<ProductSearchResult> searchByName(SearchProductsQuery query) {
        List<String> sortOrder = new ArrayList<>();
        System.out.println("--OUTER: " + Thread.currentThread().getName());
        return resourceServere.searchByName(query)
                .doOnNext(searchResponse -> sortOrder.addAll(searchResponse.getIds()))
                .flatMap(searchResponse -> Mono.zip(
                        getSomething(searchResponse.getIds()),
                        getSomethingElse(searchResponse.getIds()),
                        Mono.just(searchResponse.pagination())))
                .map(tuple3 -> SearchResultMapper.map(tuple3.getT1(), tuple3.getT2(), tuple3.getT3()));
    }

    private Mono<List<String>> getSomething(List<String> ids) {
        System.out.println("--INNER: " + Thread.currentThread().getName());
        if (ids.isEmpty()) {
            return Mono.just(new ArrayList<>());
        }
        return otherResourceServerClient.getStuff(ids);
    }
}
Which prints
--OUTER: http-nio-8080-exec-6
--INNER: reactor-http-nio-3
Debugging ServletOAuth2AuthorizedClientExchangeFilterFunction, we discovered that the request on the http-nio thread has the Authentication:
[screenshot: debugger stopped in ServletOAuth2AuthorizedClientExchangeFilterFunction, Authentication present]
whereas the request on the reactor-http thread does not:
[screenshot: debugger stopped in ServletOAuth2AuthorizedClientExchangeFilterFunction, Authentication missing]
I can certainly provide more information about our setup; I'm just a bit uncertain what would be relevant.
For starters though, we depend on both spring-boot-starter-webflux and spring-boot-starter-web, as well as on spring-boot-starter-security:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-security</artifactId>
</dependency>
I'm curious to hear whether anyone has experienced the same issue.

Spring Boot, Sleuth, OTEL, and Honeycomb

I have a scenario where I have Spring Boot integrated with OTEL and shipping to Honeycomb.io. I am trying to add an environment tag to each trace. I have created a class:
@Component
public class EnvironmentSpanProcessor implements SpanProcessor {

    @Value("${ENVIRONMENT}")
    private String environment;

    Queue<SpanData> spans = new LinkedBlockingQueue<>(50);

    @Override
    public void onStart(Context context, ReadWriteSpan readWriteSpan) {
        readWriteSpan.setAttribute("env", environment);
    }

    @Override
    public boolean isStartRequired() {
        // must return true, otherwise onStart() above is never invoked
        return true;
    }

    @Override
    public void onEnd(ReadableSpan readableSpan) {
        this.spans.add(readableSpan.toSpanData());
    }

    @Override
    public boolean isEndRequired() {
        return true;
    }
}
I have set breakpoints in this class, and they are never hit on startup, even though the bean can be seen in Actuator. I have also put breakpoints on:
SdkTracerProvider otelTracerProvider(SpanLimits spanLimits, ObjectProvider<List<SpanProcessor>> spanProcessors,
        SpanExporterCustomizer spanExporterCustomizer, ObjectProvider<List<SpanExporter>> spanExporters,
        Sampler sampler, Resource resource, SpanProcessorProvider spanProcessorProvider) {
    SdkTracerProviderBuilder sdkTracerProviderBuilder = SdkTracerProvider.builder().setResource(resource)
            .setSampler(sampler).setSpanLimits(spanLimits);
    List<SpanProcessor> processors = spanProcessors.getIfAvailable(ArrayList::new);
    processors.addAll(spanExporters.getIfAvailable(ArrayList::new).stream()
            .map(e -> spanProcessorProvider.toSpanProcessor(spanExporterCustomizer.customize(e)))
            .collect(Collectors.toList()));
    processors.forEach(sdkTracerProviderBuilder::addSpanProcessor);
    return sdkTracerProviderBuilder.build();
}
in OtelAutoConfiguration, and I am not seeing them fire on startup either.
The relevant section of my pom.xml is:
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-sleuth</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-sleuth-brave</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-sleuth-otel-autoconfigure</artifactId>
</dependency>
<dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-exporter-otlp</artifactId>
</dependency>
<dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-extension-trace-propagators</artifactId>
</dependency>
<dependency>
    <groupId>io.grpc</groupId>
    <artifactId>grpc-netty-shaded</artifactId>
    <version>1.47.0</version>
</dependency>
And my configuration from application.yaml
sleuth:
  enabled: true
  web:
    additional-skip-pattern: /readiness|/liveness
    client.skip-pattern: /readiness
  sampler:
    probability: 1.0
    rate: 100
  propagation:
    type: OT_TRACER
  otel:
    config:
      trace-id-ratio-based: 1.0
    log.exporter.enabled: true
    exporter:
      otlp:
        endpoint: https://api.honeycomb.io
        headers:
          x-honeycomb-team: ${TELEMETRY_API_KEY}
          x-honeycomb-dataset: app-telemetry
      sleuth-span-filter:
        enabled: true
    resource:
      enabled: true
I am getting traces, so the system itself appears to be working; however, I cannot get my env tag added.
Preemptive thank you to @marcingrzejszczak for the help so far on my gist: https://gist.github.com/fpmoles/b880ccfdef2d2138169ed398e87ec396
I'm unsure why your span processor is not being picked up by Spring and added to the list of processors registered with the tracer provider.
An alternative way to set process-consistent values, like environment, would be to set it as a resource attribute. This is more desirable because it's set once and delivered once per batch of spans sent to the configured backend (e.g. Honeycomb), whereas a span processor adds the same attribute to every span.
This can be done in a few different ways:
If using AutoConfigure, you can set it via a system property or environment variable (see the sketch after this list).
Set it directly on the resource during your otelTracerProvider method:
resource.setAttribute("environment", "${environment}");
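Since the SDK's Resource is immutable, "setting it directly" in practice means merging an extra attribute into the Resource that is injected into otelTracerProvider. A rough sketch under that assumption (the helper class and method names are mine):
import io.opentelemetry.api.common.AttributeKey;
import io.opentelemetry.api.common.Attributes;
import io.opentelemetry.sdk.resources.Resource;

final class ResourceAttributes {

    // Returns a copy of the given Resource with an "environment" attribute added;
    // pass the result to .setResource(...) when building the SdkTracerProvider.
    static Resource withEnvironment(Resource resource, String environment) {
        return resource.merge(
                Resource.create(Attributes.of(AttributeKey.stringKey("environment"), environment)));
    }
}
With the autoconfigure route, the usual equivalent is an environment variable such as OTEL_RESOURCE_ATTRIBUTES=environment=production, though whether that applies here depends on how Sleuth wires the SDK.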
FYI, Honeycomb has OTel Java SDK and agent distros that help simplify sending data, reduce the required configuration, and set sensible defaults.

Micrometer StackdriverMeterRegistry only publishes custom metrics to GCP Monitoring, not automatic instrumentation metrics?

I have the example from https://quarkus.io/guides/micrometer (micrometer quickstart directory) running, which uses Quarkus and Micrometer together. The example uses Prometheus as the MeterRegistry, but I changed it to use the StackdriverMeterRegistry in the hope that the same automatic instrumentation that shows up in Prometheus would show up in Google Cloud Monitoring.
However, only the custom metrics I made appear in Google Cloud Monitoring, not the automatic instrumentation provided by Micrometer.
I am unsure whether this is an issue with the Micrometer StackdriverMeterRegistry library itself or whether I am doing something wrong. Any guidance is appreciated.
Code changes:
// Update the constructor to create the gauge
ExampleResource(MeterRegistry registry) {
    /* Code for micrometer */
    StackdriverConfig stackdriverConfig = new StackdriverConfig() {
        @Override
        public String projectId() {
            return "projectId";
        }

        @Override
        public String get(String key) {
            return null;
        }
    };
    // Note: this builds a standalone StackdriverMeterRegistry, separate from the
    // Quarkus-managed registry that receives the automatic instrumentation.
    this.registry = StackdriverMeterRegistry.builder(stackdriverConfig).build();
    registry.config().commonTags("application", "projectId");
    registry.gaugeCollectionSize("example.list.size", Tags.empty(), list);
}
Added to pom.xml
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-stackdriver</artifactId>
</dependency>
After tinkering and speaking directly with the Micrometer team, I found the issue. The documentation is a bit confusing, but I had imported the Stackdriver extension incorrectly: the default registry used by the Quarkus project was getting all the automatic instrumentation, while the Stackdriver one was not. So the Stackdriver registry needed to become the default registry.
I have uploaded a basic example of using Quarkus, Stackdriver and Micrometer together, based on the basic example found on the Micrometer Quarkus documentation page:
https://github.com/jayleenli/quarkus-micrometer-stackdriver-quickstart
The changes:
Add to pom.xml
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-micrometer</artifactId>
</dependency>
<dependency>
    <groupId>io.quarkiverse.micrometer.registry</groupId>
    <artifactId>quarkus-micrometer-registry-stackdriver</artifactId>
    <version>2.2.2</version>
</dependency>
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-core</artifactId>
    <version>1.7.3</version>
</dependency>
Then add some Quarkus properties; I used application.properties, but there are other ways to do this.
application.properties
quarkus.micrometer.export.stackdriver.enabled=true
quarkus.micrometer.export.stackdriver.default-registry=true
quarkus.micrometer.export.stackdriver.project-id=fake-id
quarkus.micrometer.export.stackdriver.publish=true
quarkus.micrometer.export.stackdriver.resource-type=global
quarkus.micrometer.export.stackdriver.step=1m
In the main class:
@Path("/")
public class ExampleResource {

    @ConfigProperty(name = "quarkus.micrometer.export.stackdriver.enabled")
    boolean enabled;

    @ConfigProperty(name = "quarkus.micrometer.export.stackdriver.default-registry")
    boolean export;

    @ConfigProperty(name = "quarkus.micrometer.export.stackdriver.project-id")
    String projectId;

    @ConfigProperty(name = "quarkus.micrometer.export.stackdriver.publish")
    boolean publish;

    @ConfigProperty(name = "quarkus.micrometer.export.stackdriver.resource-type")
    String resourceType;

    @ConfigProperty(name = "quarkus.micrometer.export.stackdriver.step")
    String step;
    // (rest of the resource class omitted)

Configure HTTPS in Spring Boot Apache Camel REST API with keystore having multiple certs using camel-jetty component

I am trying to configure HTTPS in my Apache Camel Spring Boot REST application (Apache Camel 3.11.1, Spring Boot 2.5.3) with a keystore that contains multiple certificates.
Problem:
Application run failed
org.apache.camel.RuntimeCamelException: java.lang.IllegalStateException: KeyStores with multiple certificates are not supported on the base class org.eclipse.jetty.util.ssl.SslContextFactory. (Use org.eclipse.jetty.util.ssl.SslContextFactory$Server or org.eclipse.jetty.util.ssl.SslContextFactory$Client instead)
at org.apache.camel.RuntimeCamelException.wrapRuntimeCamelException(RuntimeCamelException.java:51) ~[camel-api-3.11.1.jar:3.11.1]
Project setup:
pom.xml (dependencies only, to show that I am not using spring-boot-starter-web):
..
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-jetty-starter</artifactId>
    </dependency>
    ..
    .. <!-- all other required dependencies are in place -->
    ..
</dependencies>
..
application.properties
#camel.component.jetty.keystore=keystore-with-one-certificate.jks # WORKS
camel.component.jetty.keystore=keystore-with-multiple-certificates.jks # DOESN'T WORK
camel.component.jetty.ssl-key-password=password
camel.component.jetty.ssl-password=password
Rest Route:
restConfiguration()
        .component("jetty")
        .scheme("https")
        .port("8080");

rest()
        .path("/api")
        .get("/{name}")
        ..
        ..
        .to("direct:x");
I looked at the answers in the posts below, but I am still not able to resolve the exception:
https://stackoverflow.com/a/60598953/6363894,
https://stackoverflow.com/a/55499113/6363894
I know the exception clearly states to use org.eclipse.jetty.util.ssl.SslContextFactory$Server, but I don't understand how/where to use the SslContextFactory.Server object:
SslContextFactory.Server sslContextFactory = new SslContextFactory.Server();
sslContextFactory.setKeyStoreResource(findKeyStorePath());
sslContextFactory.setKeyStorePassword("password");
sslContextFactory.setKeyManagerPassword("password");
sslContextFactory.setNeedClientAuth(true);
I've also created a bean for sslContextParameters and added it to restConfiguration as below. This time the application starts successfully, but the SSL handshake fails when I test it.
restConfiguration()
        .component("jetty")
        .endpointProperty("sslContextParameters", "#sslContextParameters")
        .scheme("https")
        .port("8080");

@Bean(name = "sslContextParameters")
public SSLContextParameters setSSLContextParameters() {
    KeyStoreParameters ksp = new KeyStoreParameters();
    ksp.setResource("keystore-with-multiple-certificates.jks");
    ksp.setPassword("password");

    KeyManagersParameters kmp = new KeyManagersParameters();
    kmp.setKeyStore(ksp);
    kmp.setKeyPassword("password");

    SSLContextServerParameters scsp = new SSLContextServerParameters();
    // REQUIRE means the TLS handshake will fail unless the caller presents a trusted client certificate
    scsp.setClientAuthentication("REQUIRE");

    SSLContextParameters scp = new SSLContextParameters();
    scp.setServerParameters(scsp);
    scp.setKeyManagers(kmp);
    return scp;
}
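For what it's worth, the same SSLContextParameters bean can also be handed to the Jetty component itself instead of per endpoint. A rough sketch of that wiring (not a verified fix for the multiple-certificates error, and the configuration class name is mine):
import org.apache.camel.CamelContext;
import org.apache.camel.component.jetty.JettyHttpComponent;
import org.apache.camel.spring.boot.CamelContextConfiguration;
import org.apache.camel.support.jsse.SSLContextParameters;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JettySslConfiguration {

    // Sets the SSLContextParameters on the Jetty component before Camel starts,
    // so every jetty endpoint uses it, rather than wiring it per endpoint.
    @Bean
    CamelContextConfiguration jettySslContextConfiguration(SSLContextParameters sslContextParameters) {
        return new CamelContextConfiguration() {
            @Override
            public void beforeApplicationStart(CamelContext camelContext) {
                JettyHttpComponent jetty = camelContext.getComponent("jetty", JettyHttpComponent.class);
                jetty.setSslContextParameters(sslContextParameters);
            }

            @Override
            public void afterApplicationStart(CamelContext camelContext) {
                // nothing to do
            }
        };
    }
}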
Any help on how to configure the SslContextFactory.Server object with restConfiguration(), or any other way to achieve this, would be appreciated. I'll update the post if any more details are required.

Object distortion while passing from REST service to Spring app

I've got a strange problem and I hope you can help me solve it.
I am trying to pass a list of objects, where each object contains a LocalDate field (Joda-Time library), from a test service to my controller.
This is the method from my service. It returns a list of objects; look at the dates printed out in the loop.
@RequestMapping("/getListaRecept")
@ResponseBody
public ListaRecept sendAnswer() {
    ListaRecept listaReceptFiltered = prescriptionCreator.createListaRecept();
    for (Recepta r : listaReceptFiltered.getListaRecept()) {
        System.out.println(r.toString());
    }
    return listaReceptFiltered;
}
The dates are correct:
Recepta{id=3, nazwa='nurofen', status=NOT_REALIZED, date=2017-07-27}
Recepta{id=1, nazwa='ibuprom', status=ANNULED, date=2014-12-25}
Recepta{id=2, nazwa='apap', status=REALIZED, date=2016-08-18}
Now I'm invoking this method from my Spring Boot app using RestTemplate, and then the received list is printed out:
private final RestTemplate restTemplate;

public SgrService2(RestTemplateBuilder restTemplateBuilder) {
    this.restTemplate = restTemplateBuilder.build();
    this.restTemplate.getMessageConverters()
            .add(0, new StringHttpMessageConverter(Charset.forName("UTF-16")));
}

public ListaRecept getList() {
    // Note: this calls the remote service twice (once for logging, once for the return value)
    for (Recepta r : this.restTemplate.getForObject("http://localhost:8090/getListaRecept",
            ListaRecept.class).getListaRecept()) {
        System.out.println(r.toString());
    }
    return this.restTemplate.getForObject("http://localhost:8090/getListaRecept",
            ListaRecept.class);
}
As you can see, all dates were replaced with the current date :/
Recepta{id=3, nazwa='nurofen', status=NOT_REALIZED, date=2017-09-30}
Recepta{id=1, nazwa='ibuprom', status=ANNULED, date=2017-09-30}
Recepta{id=2, nazwa='apap', status=REALIZED, date=2017-09-30}
I have no idea what is going on...
Here are the pom dependencies:
<dependency>
    <groupId>joda-time</groupId>
    <artifactId>joda-time</artifactId>
    <version>2.9.9</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
    <version>2.9.0</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.9.1</version>
</dependency>
Thank you in advance for your help
It seems to me that you are using the wrong Jackson module: jsr310 is for the Java 8 date types. Try the artifact jackson-datatype-joda instead and register the JodaModule.
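A minimal sketch of that change, assuming Spring Boot's auto-configured ObjectMapper is used on both sides (Boot registers any Jackson Module bean it finds); the class name here is illustrative:
import com.fasterxml.jackson.datatype.joda.JodaModule;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JacksonJodaConfig {

    // Spring Boot picks up Module beans and registers them with the auto-configured
    // ObjectMapper, so Joda-Time's LocalDate is (de)serialized as a date instead of as a bean.
    @Bean
    public JodaModule jodaModule() {
        return new JodaModule();
    }
}
The corresponding dependency would be com.fasterxml.jackson.datatype:jackson-datatype-joda, in place of (or alongside) jackson-datatype-jsr310.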
