Spring Cloud Data Flow: deploy stream defined with Java DSL - spring

I'm using Spring Cloud Data Flow.
Is there any way to deploy a stream defined with the Java DSL providing the .jar uri instead of executing it as a client?
I want to treat it like the applications I can register (e.g. log-sink), because it is in fact an application.
Thanks!

You can use the Java DSL, which is built on top of the Spring Cloud Data Flow REST client; see the Java DSL section of the reference documentation for details.
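For reference, defining, creating and deploying a stream through the Java DSL looks roughly like this sketch; the server URI, stream name and definition are placeholders, and it assumes the `spring-cloud-dataflow-rest-client` dependency is on the classpath:

```java
import java.net.URI;

import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;
import org.springframework.cloud.dataflow.rest.client.dsl.Stream;

public class DeployStream {
    public static void main(String[] args) {
        // Connect to the Data Flow server (adjust the URI to your environment)
        DataFlowTemplate dataFlow = new DataFlowTemplate(URI.create("http://localhost:9393"));

        // Define, create and deploy the stream in one fluent chain
        Stream stream = Stream.builder(dataFlow)
                .name("ticktock")
                .definition("time | log")
                .create()
                .deploy();
    }
}
```

Note that a program using the Java DSL acts as a REST client of the Data Flow server; the stream apps it references (e.g. time, log) still have to be registered with the server as usual.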

Related

Can Spring Cloud Stream work with Spring Cloud Kubernetes?

I haven't seen any example of combining the two, even though it makes sense that they'd work together (both being Spring Cloud subprojects). I want to use Spring Cloud Stream (reactive processing) for reading from Kafka and writing to MongoDB with the reactive Mongo driver. Can I use Spring Cloud Config and Spring Cloud Kubernetes for a remote Git configuration server, even though the application is event-driven and not a request-based API?
It's pretty unclear how to plug the Kubernetes and Config projects into Spring Cloud Stream, and in general it's unclear to me whether all of the other Spring Cloud projects work with Spring Cloud Stream reactively. For instance, I also couldn't find a reference for using Spring Cloud Sleuth & Zipkin with Spring Cloud Stream. Can someone make this clearer for me and reference code examples if they exist?

Can we use Spring Boot to expose Rest API's for Spark Jobs?

I have been given a task where I have Apache Spark SQL queries, and they want me to expose the results as Spring REST APIs using Spring Boot.
Is this possible? What I would be given is Spark SQL queries.
The data currently runs on our Databricks, but it also needs to be exposed as REST APIs.
Yes, you can do that. You can look up Spring for Apache Hadoop's Spark integration.
https://docs.spring.io/spring-hadoop/docs/current/reference/html/springandhadoop-spark.html
Also, you can look at this question for more references.
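As one concrete shape (embedding a SparkSession directly in the Boot app, rather than the Spring for Apache Hadoop route linked above), a controller exposing a Spark SQL result might look like this sketch; the app name, master and endpoint are placeholders, and it assumes `spark-sql` and `spring-boot-starter-web` on the classpath:

```java
import java.util.List;

import org.apache.spark.sql.SparkSession;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SparkQueryController {

    // Reuse one session; building it per request would be prohibitively slow
    private final SparkSession spark = SparkSession.builder()
            .appName("rest-over-spark")
            .master("local[*]")   // replace with your cluster master / Databricks connectivity
            .getOrCreate();

    @GetMapping("/query")
    public List<String> runQuery(@RequestParam String sql) {
        // WARNING: executing raw SQL from a request parameter is only for demos;
        // validate or whitelist queries in real code
        return spark.sql(sql).toJSON().collectAsList();
    }
}
```

Each row of the result is returned as a JSON string; for long-running queries you would want to run them asynchronously rather than inside the request thread.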

Implement Spring Cloud Sleuth's TraceWebFilter in Spring WebFlux

I have a Spring Boot reactive application using functional endpoints (Kotlin) that uses Spring Cloud Sleuth. I want to customize the response headers to include the trace id. I looked at the manual and saw the section on using the TraceFilter.
I tried to do the same using a TraceWebFilter, but it doesn't expose its constructors. Is there a way to implement this customization when using the reactive web framework?
I am using version 2.0.0.M5 of Spring Cloud Sleuth.
Thanks in advance!
Please check out the latest master - we've migrated to Brave. You can write your own WebFilter that delegates to TraceWebFilter, or you can just copy our TraceWebFilter and alter it the way you need to.
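A filter along the lines suggested above could look like this sketch. It assumes Sleuth 2.x with Brave on the classpath (so `brave.Tracer` is a bean), and the order value is an assumption based on TraceWebFilter's order (`HIGHEST_PRECEDENCE + 5`) in the 2.0.x line - this filter must run after it so a span is already in scope:

```java
import brave.Span;
import brave.Tracer;
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
import org.springframework.web.server.ServerWebExchange;
import org.springframework.web.server.WebFilter;
import org.springframework.web.server.WebFilterChain;
import reactor.core.publisher.Mono;

@Component
@Order(Ordered.HIGHEST_PRECEDENCE + 6) // just after Sleuth's TraceWebFilter
public class TraceIdResponseHeaderFilter implements WebFilter {

    private final Tracer tracer;

    public TraceIdResponseHeaderFilter(Tracer tracer) {
        this.tracer = tracer;
    }

    @Override
    public Mono<Void> filter(ServerWebExchange exchange, WebFilterChain chain) {
        // Copy the current trace id into a response header, if a span exists
        Span span = tracer.currentSpan();
        if (span != null) {
            exchange.getResponse().getHeaders()
                    .add("X-B3-TraceId", span.context().traceIdString());
        }
        return chain.filter(exchange);
    }
}
```

The header name here is illustrative; use whatever your clients expect.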

Kafka bindings without @EnableBinding annotations in Spring

I'm using Spring Cloud Stream to connect to my Kafka broker. It works fine. Now I want to create my binding in code instead of with the annotation.
Is there a convenient way to do it?
Could you elaborate on why you want to do the binding programmatically instead of using @EnableBinding?
While Spring Cloud Stream simplifies exactly that, if you prefer your own way of connecting (for any specific reason), you might want to check the Spring Integration adapters to do the binding. But in that case you are on your own for setting up the lifecycle and all the other goodies that Spring Cloud Stream provides.
If you still want to use Spring Cloud Stream but don't want to use the annotation, then check here to see all the configuration that Spring Cloud Stream does when you annotate and apply your use case.
Please follow https://github.com/spring-cloud/spring-cloud-stream/issues/954. We plan to add this feature to 1.3.0.RC1.
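For the Spring Integration route mentioned above, an inbound Kafka adapter wired in plain Java config might look like this sketch; the broker address, topic and group are placeholders, and package locations (e.g. for ContainerProperties) vary across spring-kafka versions:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.messaging.MessageChannel;

@Configuration
public class KafkaInboundConfig {

    @Bean
    public MessageChannel fromKafka() {
        // Messages received from Kafka land on this channel
        return new DirectChannel();
    }

    @Bean
    public KafkaMessageListenerContainer<String, String> container() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new KafkaMessageListenerContainer<>(
                new DefaultKafkaConsumerFactory<>(props),
                new ContainerProperties("my-topic"));
    }

    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(
            KafkaMessageListenerContainer<String, String> container,
            MessageChannel fromKafka) {
        KafkaMessageDrivenChannelAdapter<String, String> adapter =
                new KafkaMessageDrivenChannelAdapter<>(container);
        adapter.setOutputChannel(fromKafka);
        return adapter;
    }
}
```

As noted above, with this approach you take over error handling, retries and the rest of the lifecycle that Spring Cloud Stream would otherwise manage.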

Existing Spring Application to Cloud Foundry

We are planning to move a Spring based application to Cloud Foundry.
The application currently runs on a WAS server and accesses data sources via JNDI lookup.
We are using spring features like MVC, AOP etc.
I have certain questions in mind :
Is it possible to switch to Tomcat and configure data sources using Spring Cloud Connectors, and what conflicts might we run into?
Currently, data sources are configured in XML files. Should I keep the same XML files or switch to annotations?
Can anyone please provide some clarity over this and other known issues with this approach?
Spring Cloud Connectors are by far the easiest way to bind to data sources in Cloud Foundry. I would recommend converting your JNDI lookups to use these service connections as described here:
http://docs.cloudfoundry.org/buildpacks/java/spring-service-bindings.html
Cloud Foundry automatically reconfigures your DataSource when it finds a database service bound to the app. It's super cool.
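With Spring Cloud Connectors, the JNDI lookup typically becomes a small Java-config class like this sketch; it assumes the `spring-cloud-spring-service-connector` and Cloud Foundry connector jars are on the classpath (there is also a `<cloud:data-source/>` XML namespace if you'd rather keep XML configuration):

```java
import javax.sql.DataSource;

import org.springframework.cloud.config.java.AbstractCloudConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
@Profile("cloud") // only active when running in Cloud Foundry
public class CloudDataSourceConfig extends AbstractCloudConfig {

    @Bean
    public DataSource dataSource() {
        // Binds to the single database service attached to the app;
        // use connectionFactory().dataSource("service-name") if there are several
        return connectionFactory().dataSource();
    }
}
```

Keeping this behind a "cloud" profile lets the existing XML-configured data sources continue to work in local or WAS environments.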

Resources