Spring Cloud Dataflow - Stream deployment stuck in "Deploying" - spring-boot

My custom stream is stuck in "Deploying", yet the stream is actually working: messages are received by the sink. The status in SCDF, however, remains "Deploying".
Per the question "Spring Cloud Dataflow Custom App stuck in Deploying state", Sabby Anandan said that SCDF checks /health and /info. But per the post "Spring boot actuator /health is not working", the URL should be /actuator/health. This is consistent with my code as well: http://localhost:1234/health does not work, but http://localhost:1234/actuator/health gives me {"status":"UP"}.
Is this a bug in SCDF? Should it check /actuator/health instead?
Can you please help? If this is a bug, is there a workaround?
Below are the version details:
-SCDF - spring-cloud-dataflow-server-2.5.1.BUILD-20200518.143034-16
-Skipper - spring-cloud-skipper-server-2.4.1.BUILD-20200518.094106-12
-Spring Boot - 2.3.0
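A possible workaround, assuming the issue really is just the probe path (this is a sketch, not a confirmed fix): Boot 2.x moved the actuator endpoints under /actuator, but the base path is configurable, so you can map the endpoints back to the root where SCDF probes them:

spring:
  # application.yml of the stream app
management:
  endpoints:
    web:
      base-path: /   # serve actuator endpoints at the root again, so /health and /info resolve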

Related

Google Cloud Trace doesn't correlate logs from Spring Boot applications using Sleuth

I've recently upgraded my applications to Spring Boot 2.4.2 and Cloud 2020.0.0, changing the Spring Cloud GCP dependencies following the migration guide: https://googlecloudplatform.github.io/spring-cloud-gcp/2.0.0/reference/html/index.html#migration-guide-from-spring-cloud-gcp-1-x-to-2-x
Within the applications everything seems to be working fine. I can see calls between the microservices propagating the trace-id headers:
gateway:
2021-01-24 20:18:36.471 DEBUG [gateway,0bc6b9664e6604e2,eb9f834718fe33c9] 1 ---
service1:
2021-01-24 20:18:36.700 DEBUG [service1,0bc6b9664e6604e2,570653ac93add270,true]
In the Google Cloud Trace console I can see that the trace id (0bc6b9664e6604e2) was captured (prefixed with 16 extra 0's) and that it shows both microservices (the first call corresponds to gateway and the third call corresponds to service1).
However, notice the message "No logs found for this trace".
The Trace Logs View link also complains about it:
If I open the link, it just searches by timestamp, not using the correlating trace id.
The funny thing is that if I look for a log statement directly in the GCP Logging view, the trace id is there:
I can then run a GCP Logging query to find all the logs correctly:
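For illustration (PROJECT_ID is a placeholder), a Cloud Logging query on the LogEntry trace field of this shape matches all the correlated entries, using the zero-padded trace id:

trace="projects/PROJECT_ID/traces/00000000000000000bc6b9664e6604e2"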
Apparently everything seems to be OK. Could you tell me why GCP Trace isn't able to correlate with GCP Logging?
For the logs to appear correlated with tracing, you need to add the Stackdriver logging dependency (I think it only works if your logging implementation is Logback).
Check https://cloud.spring.io/spring-cloud-gcp/reference/html/#integration-with-logging
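A minimal sketch of what that looks like, assuming the 2.x coordinates (verify the exact resource path against the linked docs): add com.google.cloud:spring-cloud-gcp-starter-logging to the build, then in src/main/resources/logback-spring.xml:

<configuration>
  <!-- Appender shipped by spring-cloud-gcp-starter-logging; it writes JSON log
       entries carrying the trace/span fields Cloud Logging uses for correlation -->
  <include resource="com/google/cloud/spring/logging/logback-appender.xml"/>
  <root level="INFO">
    <appender-ref ref="STACKDRIVER"/>
  </root>
</configuration>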

Micrometer KafkaConsumerMetrics present when running locally but not when deployed

When I run locally I can see that kafka.consumer.* metrics are being collected, while when I deploy my service those metrics are not present.
I use Kafka version 1.11.0, Java 11 and Spring Boot 2.2.
How can I determine what is missing?
In case anyone else has this issue: I had to explicitly add
spring.jmx.enabled=true
It is needed because the Kafka client publishes its metrics to JMX, and Micrometer reads them from there. JMX is disabled by default starting from Spring Boot 2.2.
It worked locally because IDEA added the spring.jmx.enabled=true flag under the covers.
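A hedged sketch of the equivalent YAML (the endpoint-exposure part is my addition, not from the original answer, so you can spot-check the registered meters):

spring:
  jmx:
    enabled: true   # the Kafka client registers its metrics as JMX MBeans; Micrometer's binder reads them from there
management:
  endpoints:
    web:
      exposure:
        include: health,metrics   # expose /actuator/metrics to verify the meters are registered

With this deployed, /actuator/metrics should list the kafka.consumer.* meter names if the binder picked them up.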

Spring Boot Micro Service Tracing Options

I have the requirements below; is there an open-source library that will cover all of them?
1. We are building a distributed microservice architecture with Spring Boot, which includes more than 100 microservices.
2. A single transaction can involve a lot of inter-microservice communication.
3. We want to trace every microservice call, and the trace should provide the following information:
a. Transaction ID/Trace ID
b. Back-end transaction status: HTTP status for REST, and likewise for SOAP.
c. Time taken for that call.
d. Request and response payloads.
Currently we achieve this using an in-house tracing framework. Is there an open-source project that will handle all this without any coding from the developer? I know we have a few options with Spring Cloud, such as Zipkin and Sleuth; do these handle the above requirements?
My project has similar requirements to yours. IMHO, Spring Cloud Sleuth + Zipkin work well in my case.
For inter-microservice communication we use Kafka, and Spring Cloud Sleuth + Zipkin have no problem tracing all the calls, from REST -> Kafka -> more Kafka -> REST.
To enable Kafka tracing, simply add:
spring:
  sleuth:
    propagation-keys: some-key
    sampler:
      probability: 1
    messaging:
      kafka:
        enabled: true
We are also using Azure Application Insights for centralized logging, which is well integrated with Spring Cloud.
Hope the above gives you some confidence in using Sleuth + Zipkin.
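For completeness, a minimal sketch of the span-reporting side (my assumption of a typical setup, with the spring-cloud-starter-sleuth and spring-cloud-starter-zipkin dependencies and a Zipkin server on its default port):

spring:
  zipkin:
    base-url: http://localhost:9411   # where the Zipkin collector listens
    sender:
      type: web   # ship spans over HTTP; 'kafka' would reuse the existing broker instead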

Spring cloud bus - rabbitmq unavailability marks the instance DOWN

I use Spring Cloud Config Bus (RabbitMQ) in my microservice; the only reason I use RabbitMQ at all is Spring Cloud Bus. I have two questions below.
While experimenting, I found that Spring expects RabbitMQ to be up and running during application start, which is contrary to what Spring Cloud evangelizes (circuit breakers...). To be fair, even service discovery is not expected to be up and running before starting an application. Is there a sensible reason behind this?
Say I start my application when RabbitMQ is up and running, and for some reason RabbitMQ then goes down. All I should lose is the ability to work with RabbitMQ; instead, the /health endpoint responds with DOWN for my microservice, and any Eureka instance listening to heartbeats from my microservice also marks the instance as down. What is the reason for this?
To my knowledge, this is against the circuit-breaker pattern that Spring Cloud has evangelized.
I personally feel that Spring Cloud Bus is not an important enough feature to mark an application as down...
Are there any alternatives to tell my Spring Boot microservice that the connection to RabbitMQ is not a critical service?
Thanks in advance!
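One possible workaround, assuming Boot's standard health-indicator switches apply here (a sketch, not an accepted answer from the thread): disable the RabbitMQ health indicator, so broker downtime no longer flips the aggregate /health status and Eureka keeps seeing the instance as UP:

management:
  health:
    rabbit:
      enabled: false   # RabbitMQ connectivity no longer contributes to the aggregate /health status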

How to monitor streaming apps Inside SCDF?

I am a novice with Spring Cloud Data Flow and Spring Cloud Stream applications.
Currently my project diagram looks like following :
I route a POST request from an outside client through the Zuul API gateway to a microservice called Composite. Composite creates a stream using a REST POST and deploys it onto the Spring Cloud Data Flow Server. As far as I know, the microservices mongodb and file run as co-existing JVM processes. If my client has to know the status of the stream and the status of the processed data, how should the Composite microservice interact with the Spring Cloud Data Flow Server? Currently, when I make the POST call to deploy the stream, I don't even get the status from the SCDF Server. Does SCDF expose any hooks to look at the individual apps? Also, how can I change the flow at runtime to create a dynamic mesh?
Currently I am using the local Spring Cloud Data Flow Server for development.
Runtime platform is local
The Local runtime is recommended only for development purposes; if you're preparing for production, please make sure to choose a platform variant (e.g., cf, k8s, yarn, ...) that comes with the non-functional requirements to support reliable and durable execution of all the applications running in the streaming pipeline.
As far as I know the microservices mongodb and file run as co-existing JVM processes.
If your stream definition is file | mongodb, you'd have 2 different JVMs even when using the Local runtime. They're independent Boot applications.
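For illustration, a sketch of how such a stream is created from the SCDF shell (the stream name is hypothetical; file and mongodb are the registered source and sink apps):

stream create --name file-to-mongo --definition "file | mongodb" --deploy

Each app in the pipe becomes its own Boot process, wired to the next one through the messaging middleware.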
How should Composite Microservice interact with Spring Cloud Data Flow Server?
It's not clear what you mean by "Composite" here. All the microservice applications in SCDF communicate via messaging middleware such as Kafka or RabbitMQ. SCDF provides the orchestration capability to run such applications on various runtime platforms.
Currently when I make POST call to deploy the stream I dont even get the status from SCDF Server
You can use SCDF's REST APIs to query the current status of the apps, and this is platform agnostic. You can view the list of supported APIs by hitting the root URL; there's a gap in the docs that we will fix. The following APIs could be useful for status checks.
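For example (assuming the default local Data Flow server port 9393; the root URL gives the authoritative list):

GET http://localhost:9393/streams/definitions   (stream definitions, each carrying its current status)
GET http://localhost:9393/runtime/apps          (runtime state of every deployed app instance)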
Does SCDF expose any hooks to look at the individual apps?
Once the apps are deployed on a runtime platform, you can take advantage of Boot's actuator endpoints to explore more details such as trace, metrics, health, and env at each application level. See Boot's actuator endpoints for more details. For instance, if your mongodb app is running locally on port 23000, you can check granular metrics for this application at: http://localhost:23000/metrics
[As an FYI: future SCDF releases will include integration of Spring Boot + Spring Cloud Sleuth metrics and a visual representation of the same.]
Also how can I change the flow #runtime to create a dynamic mesh?
If you're referring to editing a running streaming pipeline with additions/deletions, we are currently exploring a design approach to support this functionality.
