How to configure OpenTelemetry using dependencies in Spring Boot microservices? - spring-boot

Add the OpenTelemetry jar as a dependency in all the Docker files
Enable Prometheus for all Spring microservices
Enable/disable OpenTelemetry
I wanted to add the opentelemetry-javaagent as a dependency in every microservice of my Spring Boot cloud project. In the documentation they use the agent jar files and show the metrics in SigNoz, but I wanted to use a dependency for OpenTelemetry instead of the agent.
The problem is that the dependency resolves, but it is not working: no metrics are projected in the SigNoz dashboard.

If you don't want to use the agent, you may use the direct dependency in your application code.
https://mvnrepository.com/artifact/io.opentelemetry/opentelemetry-api
<dependency>
    <groupId>io.opentelemetry</groupId>
    <artifactId>opentelemetry-api</artifactId>
</dependency>
And configure the SDK -
Resource resource = Resource.getDefault()
        .merge(Resource.create(Attributes.of(ResourceAttributes.SERVICE_NAME, "logical-service-name")));

SdkTracerProvider sdkTracerProvider = SdkTracerProvider.builder()
        .addSpanProcessor(BatchSpanProcessor.builder(OtlpGrpcSpanExporter.builder().build()).build())
        .setResource(resource)
        .build();

SdkMeterProvider sdkMeterProvider = SdkMeterProvider.builder()
        .registerMetricReader(PeriodicMetricReader.builder(OtlpGrpcMetricExporter.builder().build()).build())
        .setResource(resource)
        .build();

OpenTelemetry openTelemetry = OpenTelemetrySdk.builder()
        .setTracerProvider(sdkTracerProvider)
        .setMeterProvider(sdkMeterProvider)
        .setPropagators(ContextPropagators.create(W3CTraceContextPropagator.getInstance()))
        .buildAndRegisterGlobal();
Then acquire an instance of the tracer -
import io.opentelemetry.api.trace.Tracer;
//...
Tracer tracer =
        openTelemetry.getTracer("instrumentation-library-name", "1.0.0");
And create spans -
Span span = tracer.spanBuilder("my span").startSpan();
// Make the span the current span
try (Scope ss = span.makeCurrent()) {
    // In this scope, the span is the current/active span
} finally {
    span.end();
}
Taken from here -
https://opentelemetry.io/docs/instrumentation/java/manual/
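One caveat: opentelemetry-api alone does not pull in the SDK classes used above (SdkTracerProvider, OtlpGrpcSpanExporter, PeriodicMetricReader, ResourceAttributes). A sketch of the additional Maven dependencies, using the OpenTelemetry BOM for version alignment - the versions shown are examples, and the semconv artifact ID and version suffix have changed between releases, so check Maven Central for current ones:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>io.opentelemetry</groupId>
            <artifactId>opentelemetry-bom</artifactId>
            <version>1.22.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

<dependencies>
    <dependency>
        <groupId>io.opentelemetry</groupId>
        <artifactId>opentelemetry-sdk</artifactId>
    </dependency>
    <dependency>
        <groupId>io.opentelemetry</groupId>
        <artifactId>opentelemetry-exporter-otlp</artifactId>
    </dependency>
    <dependency>
        <!-- provides ResourceAttributes; published as an -alpha artifact at this point -->
        <groupId>io.opentelemetry</groupId>
        <artifactId>opentelemetry-semconv</artifactId>
        <version>1.22.0-alpha</version>
    </dependency>
</dependencies>
```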

Related

How do I export metrics using OpenTelemetry and Spring Cloud Sleuth

My team is trying to get Spring Cloud Sleuth to work with the OpenTelemetry API. We observe that spans, attributes (tags), and events are exported just fine to the OTel collector.
The metrics we add are not exported with the spans (or separately), which would be our expectation.
We have the following dependencies in our project:
implementation platform('org.springframework.cloud:spring-cloud-sleuth-otel-dependencies:1.1.1')
implementation('io.opentelemetry:opentelemetry-api')
implementation('org.springframework.cloud:spring-cloud-sleuth-api')
...
implementation 'org.springframework.cloud:spring-cloud-sleuth-otel-autoconfigure'
implementation('org.springframework.cloud:spring-cloud-starter-sleuth') {
    exclude group: 'org.springframework.cloud', module: 'spring-cloud-sleuth-brave'
}
implementation 'io.opentelemetry:opentelemetry-exporter-otlp:1.22.0'
The code that adds metrics is as follows:
@Autowired
private OpenTelemetry openTelemetry;
//...
// Make two attempts - one at each API. The first just uses a no-op meter provider, unfortunately.
DoubleCounter doubleCounter = GlobalOpenTelemetry.getMeter("io.opentelemetry.example.metrics")
        .counterBuilder("calculated_used_space")
        .setDescription("Counts disk space used by file extension.")
        .setUnit("MB")
        .ofDoubles()
        .build();
doubleCounter.add(2.0);
// This uses an SDK meter provider, which should export metrics with the span.
DoubleCounter build = openTelemetry.getMeter("com.jysk.some-app.some-metric")
        .counterBuilder("some-counter")
        .setDescription("some-description")
        .setUnit("pcs.")
        .ofDoubles()
        .build();
build.add(4.0);
Any insight into how we can get Spring Cloud Sleuth to export metrics to a configured collector would be appreciated. We are using the OpenTelemetry Collector.

Jaeger log warning messages saying no sender factories available

I am trying to set up Jaeger to collect traces from a spring boot application. When my app starts up, I am getting this warning message
warn io.jaegertracing.internal.senders.SenderResolver - No sender factories available. Using NoopSender, meaning that data will not be sent anywhere!
I use this method to get the jaeger tracer
@Bean
Tracer jaegerTracer(@Value(defaulTraceName) String service) {
    SamplerConfiguration samplerConfig = SamplerConfiguration.fromEnv().withType("const").withParam(1);
    ReporterConfiguration reporterConfig = ReporterConfiguration.fromEnv().withLogSpans(true);
    Configuration config = new Configuration(service).withSampler(samplerConfig).withReporter(reporterConfig);
    return config.getTracer();
}
I have manually instrumented the code, but no traces show up in the jaeger UI. I have been stuck on this problem for a few days now and would appreciate any help given!
In my pom file, I have dependencies on jaeger-core and opentracing-api
Solved by adding a dependency on jaeger-thrift in the pom file.
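For reference, that dependency looks like this (the version is an example; pick a current release):

```xml
<dependency>
    <groupId>io.jaegertracing</groupId>
    <artifactId>jaeger-thrift</artifactId>
    <version>1.8.1</version>
</dependency>
```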

Expose metrics from spring application to prometheus without using spring-boot actuator

I have been trying to collect Micrometer metrics in a non-Spring-Boot application and expose them to Prometheus. I have added the following dependency and the test method below. I would like to know how to proceed and expose the collected metrics to Prometheus from my non-Spring-Boot (traditional Spring) application.
<dependency>
    <groupId>io.micrometer</groupId>
    <artifactId>micrometer-registry-prometheus</artifactId>
    <version>1.2.0</version>
</dependency>
public String testMetrics() {
    PrometheusMeterRegistry registry = new PrometheusMeterRegistry(PrometheusConfig.DEFAULT);
    // Note: tags must be passed as key/value pairs
    registry.counter("ws.metric.collection", "type", "tktdoc.metrics");
    String metricsInfo = registry.scrape();
    return metricsInfo;
}
You basically have to expose an HTTP endpoint and configure Prometheus to scrape it; the endpoint supplies the data for the scrapes.
An example showing how to add the HTTP endpoint by starting up an HTTP server (your application may already be using one) is here.
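A sketch of that pattern using only the JDK's built-in HttpServer - here a Supplier<String> stands in for PrometheusMeterRegistry::scrape so the example is self-contained; the port, path, and sample metric line are made up for illustration:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.nio.charset.StandardCharsets;
import java.util.function.Supplier;

public class MetricsEndpoint {
    // Serve the supplier's text at /metrics in Prometheus exposition format.
    static HttpServer start(int port, Supplier<String> scrape) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/metrics", exchange -> {
            byte[] body = scrape.get().getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "text/plain; version=0.0.4");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws Exception {
        // In the real application the supplier would be registry::scrape
        // on a PrometheusMeterRegistry; a fixed metric line stands in here.
        HttpServer server = start(18080, () -> "demo_counter_total 1.0\n");
        try (InputStream in = new URI("http://localhost:18080/metrics").toURL().openStream()) {
            System.out.print(new String(in.readAllBytes(), StandardCharsets.UTF_8));
        } finally {
            server.stop(0);
        }
    }
}
```

In Prometheus, the scrape target is then configured with the host and port of this endpoint.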

Spring Boot 2 integrate Brave MySQL-Integration into Zipkin

I am trying to integrate the Brave MySql Instrumentation into my Spring Boot 2.x service to automatically let its interceptor enrich my traces with spans concerning MySql-Queries.
The current Gradle dependencies are the following:
compile 'io.zipkin.zipkin2:zipkin:2.4.5'
compile('io.zipkin.reporter2:zipkin-sender-okhttp3:2.3.1')
compile('io.zipkin.brave:brave-instrumentation-mysql:4.14.3')
compile('org.springframework.cloud:spring-cloud-starter-zipkin:2.0.0.M5')
I already configured Sleuth successfully to send traces concerning HTTP-Request to my Zipkin-Server and now I wanted to add some spans for each MySql-Query the service does.
The TracingConfiguration is this:
@Configuration
public class TracingConfiguration {
    /** Configuration for how to send spans to Zipkin */
    @Bean
    Sender sender() {
        return OkHttpSender.create("https://myzipkinserver.com/api/v2/spans");
    }

    /** Configuration for how to buffer spans into messages for Zipkin */
    @Bean
    AsyncReporter<Span> spanReporter() {
        return AsyncReporter.create(sender());
    }

    @Bean
    Tracing tracing(Reporter<Span> spanListener) {
        return Tracing.newBuilder()
                .spanReporter(spanReporter())
                .build();
    }
}
The query interceptor works properly, but my problem now is that the spans are not added to the existing trace; each is added to a new one.
I guess it's because of the creation of a new sender/reporter in the configuration, but I have not been able to reuse the existing one created by the Spring Boot autoconfiguration.
That would also remove the need to redundantly define the Zipkin URL (it is already defined for Zipkin in my application.yml).
I already tried autowiring the Zipkin reporter into my bean, but all I got is a SpanReporter - and the Brave Tracing builder requires a Reporter<Span>.
Do you have any advice on how to properly wire things up?
Please use the latest snapshots. Sleuth in the latest snapshots uses Brave internally, so the integration will be extremely simple.
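For the MySQL side specifically, brave-instrumentation-mysql is enabled on the JDBC connection string rather than through a bean - per that module's README, something along these lines (the host, database, and service name here are made up; zipkinServiceName is optional):

```properties
spring.datasource.url=jdbc:mysql://localhost:3306/mydb?statementInterceptors=brave.mysql.TracingStatementInterceptor&zipkinServiceName=mydb-service
```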

Spring Boot app that uses Groovy for JDBC

I have a Spring Boot application that exposes REST APIs. I am considering offloading all SQL reads to Groovy SQL. The DataSource is configured in Spring, and I would like Groovy to use this DataSource. Btw, there will be multiple DataSource objects (connecting to different databases).
What is the best approach for this? I want the best of both worlds (DI and Groovy simplicity). The app has to be in Spring (for project reasons) and will be deployed on a WebLogic server utilizing WebLogic-defined data sources.
My idea is to call a Groovy method as shown below from the REST controller method of Spring Boot, with the DataSource injected at class level:
@Autowired
@Qualifier("someDataSource")
private DataSource myDs;

// GET /student/{id}
Student getStudentDetails(int id) {
    // pass the DS to Groovy so that Groovy SQL can use it for executing queries
    Student student = GroovySqlUtil.findStudentById(id, myDs);
    return student;
}
Is there a better way? Can Groovy directly handle data sources (multiple DataSources)? In that case I won't initialize the DataSource in the Spring configuration.
There's no requirement for transactions, JPA, etc. It's pure SQL read operations.
In fact, Groovy, Java, Gradle, and Spring Boot can be mixed without any issues. I created a new REST service in a Groovy class and utilized groovy.sql.Sql. Everything works fine.
The Gradle build file just needs the configuration below:
sourceSets {
    main {
        groovy {
            // override the default locations, rather than adding additional ones
            srcDirs = ['src/main/groovy', 'src/main/java']
        }
        java {
            srcDirs = [] // don't compile Java code twice
        }
    }
}
And also include the Groovy package in the component scan of the main Spring configuration class.
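A minimal sketch of what the Groovy side can look like - the GroovySqlUtil class, the student table, and the id column are hypothetical, matching the controller snippet above; groovy.sql.Sql simply wraps whatever DataSource Spring injects:

```groovy
import groovy.sql.Sql
import javax.sql.DataSource

class GroovySqlUtil {
    // groovy.sql.Sql can wrap a Spring-managed DataSource directly,
    // so multiple databases just mean passing different DataSource beans.
    static Map findStudentById(int id, DataSource ds) {
        def sql = new Sql(ds)
        try {
            // firstRow returns a GroovyRowResult (map-like) or null;
            // named parameters are bound from the map argument
            return sql.firstRow('SELECT * FROM student WHERE id = :id', [id: id])
        } finally {
            sql.close()
        }
    }
}
```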
