Where are the logs for a Spring Cloud Data Flow stream? - spring-boot

I am using Spring Cloud Data Flow. I have launched three Spring Boot apps as a stream, namely a source, a processor, and a sink.
How can I see the logs of these Spring Boot apps? Thanks.

Apparently you can query http://localhost:9393/runtime/apps and inspect the stdout entry in each app's attributes to find where its logs are written.
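For example, assuming a local Data Flow server on the default port, you can pull the attributes with curl. The JSON excerpt below is a hypothetical sample payload used to illustrate the shape of the response, not real server output:

```shell
# In a real deployment you would query the server directly:
#   curl http://localhost:9393/runtime/apps
# A (hypothetical) excerpt of the JSON returned for one app instance:
RESPONSE='{"instanceId":"ticktock.log-0","attributes":{"stdout":"/tmp/spring-cloud-deployer/ticktock.log/stdout_0.log"}}'

# Extract the stdout attribute, which points at the app's log file
echo "$RESPONSE" | grep -o '"stdout":"[^"]*"'
```

You can then tail the file at that path on the machine where the deployer launched the app.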

Related

Can Spring Cloud Stream work with Spring Cloud Kubernetes?

I haven't seen any example of combining the two, even though it makes sense that they would work together (both being subprojects of Spring Cloud). I want to use Spring Cloud Stream (reactive processing) for reading from Kafka and writing to MongoDB with the reactive Mongo driver. Can I use Spring Cloud Config and Spring Cloud Kubernetes for a remote git configuration server, even though the application is event-driven and not a request-based API?
It's pretty unclear how to plug the Kubernetes and Config projects into Spring Cloud Stream, and in general it's unclear to me whether all of the other Spring Cloud projects can work with Spring Cloud Stream reactively. For instance, I also couldn't find a reference for using Spring Cloud Sleuth and Zipkin with Spring Cloud Stream. Can someone make this clearer for me and reference code examples if they exist?

How to stream app logs to a cloud platform

Hi, I am looking into how to stream application logs from a Spring Boot application to a cloud platform. To elaborate: suppose I have a Spring app deployed as a PCF application, for which I need to send the logs generated by the application to an external cloud platform.
You can configure appenders for logging frameworks such as Logback or Log4j2 to stream logs to external cloud platforms like AWS CloudWatch with ease.
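As a minimal sketch of the appender approach, Logback ships with a SyslogAppender that can forward every log event to a remote collector; `logs.example.com` below is a placeholder for your platform's ingestion endpoint (for CloudWatch specifically, community-maintained Logback appenders exist and are configured the same way):

```xml
<!-- logback-spring.xml: a minimal sketch; logs.example.com is a
     placeholder for your cloud platform's log collector endpoint -->
<configuration>
  <appender name="REMOTE" class="ch.qos.logback.classic.net.SyslogAppender">
    <syslogHost>logs.example.com</syslogHost>
    <port>514</port>
    <facility>USER</facility>
    <suffixPattern>[%thread] %logger %msg</suffixPattern>
  </appender>
  <root level="INFO">
    <appender-ref ref="REMOTE"/>
  </root>
</configuration>
```

Spring Boot picks up `logback-spring.xml` from the classpath automatically, so no further wiring is needed in the application code.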

Spring cloud data flow over Google Pub/Sub

I have worked with Spring Cloud Data Flow with RabbitMQ and Kafka, but I would like to know whether it is possible to run SCDF with Google Pub/Sub.
I don't want to create a stream (a new Spring Cloud Stream app) with a source or sink to GCP; I want Google Pub/Sub underneath the Spring Cloud Data Flow server, used as the intermediate messaging broker.
Any suggestions?
Yes, you can use Spring Cloud Data Flow with GCP Pub/Sub.
Through the use of one of the many Spring Cloud Stream binders, many different messaging middleware products can be used. The following popular platforms are among the list supported by Spring Cloud Data Flow:
1. Kafka Streams
2. Amazon Kinesis
3. Google Pub/Sub
4. Solace PubSub+
5. Azure Event Hubs
You can find more information, such as the current list of Spring Cloud Stream binders, in the Spring Cloud Stream documentation; add the chosen binder library as a dependency to the application.
Spring Cloud Stream supports a Google Pub/Sub binder implementation (partner maintained).
Here you can find a related SO question: Spring dataflow and GCP Pub Sub.
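As a sketch, swapping the broker comes down to putting the Pub/Sub binder on each stream app's classpath instead of the Kafka or RabbitMQ one. The Maven coordinates below are from the Spring Cloud GCP project; note that the artifact moved from the `org.springframework.cloud` group to `com.google.cloud` in later releases, so check the version matching your Spring Cloud release train:

```xml
<!-- Pub/Sub binder for Spring Cloud Stream (Spring Cloud GCP project);
     group id varies by release, so verify against your release train -->
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>spring-cloud-gcp-pubsub-stream-binder</artifactId>
</dependency>
```

With the binder on the classpath, the SCDF server itself needs no broker-specific changes; it simply deploys apps that bind to Pub/Sub topics instead of Kafka topics or Rabbit exchanges.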

Using Spring Cloud Data Flow with Azure Service Bus

I am trying to find some examples of a Spring Cloud Data Flow and Azure Service Bus setup.
I have found https://github.com/microsoft/spring-cloud-azure/tree/master/spring-cloud-azure-stream-binder/spring-cloud-azure-servicebus-topic-stream-binder, but it is still a release candidate and I do not see any examples there that cover Spring Cloud Data Flow.
Could you please help me understand whether I can use Spring Cloud Data Flow and Azure Service Bus together?
I was able to run examples with Kafka and RabbitMQ, but I cannot find anything about using Azure Service Bus as the integration solution for Spring Cloud Data Flow.
Spring Cloud Data Flow doesn't necessarily need to know what messaging layer you choose to run your applications on. Hence, your question is more likely related to how one can run the application with Azure Service Bus using Spring Cloud Stream.
Here is one such sample: https://learn.microsoft.com/en-us/java/azure/spring-framework/configure-spring-cloud-stream-binder-java-app-azure-event-hub?view=azure-java-stable
Once the application is built using Spring Cloud Stream (using the Azure event hub binder), you can then manage these applications using Spring Cloud Data Flow as Streaming applications.
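As a sketch of why Data Flow stays broker-agnostic: the stream-level configuration of such an app uses the standard Spring Cloud Stream binding properties, and only the connection credentials (covered in the linked guide) are binder-specific. The destination and group names below are placeholders:

```yaml
# application.yml (sketch): binder-agnostic Spring Cloud Stream bindings.
# "my-eventhub" and "my-group" are placeholder names; the Azure connection
# properties themselves come from the binder-specific configuration keys
# described in the linked Microsoft guide.
spring:
  cloud:
    stream:
      bindings:
        input:
          destination: my-eventhub
          group: my-group
        output:
          destination: my-other-eventhub
```

Because only these properties change between brokers, the same app definition can be registered and deployed in Data Flow regardless of which binder sits underneath.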

Kafka ItemWriter for Spring Batch for Spring Cloud Data Flow

I've created a Spring Batch application which reads data from a file and writes it to a database. I want to write the output data to Kafka. I've tried to use Spring Cloud Data Flow but could not even run it.
I've followed this tutorial: https://spring.io/guides/gs/batch-processing/
When I register the app with Spring Cloud Data Flow, it shows the status N/A. I've checked this tutorial: http://www.baeldung.com/spring-cloud-data-flow-batch-processing and added @EnableTask to my application class, but the result is the same.
My question is: I want to write data to Kafka and run that Spring Batch job on Spring Cloud Data Flow. How can I do that?
