What are the advantages of using Spring Integration style within Spring Cloud Stream

I'm tasked with the requirement of using Spring Cloud Stream (with Kafka bindings). Going through Spring Cloud Stream's Quick Start and judging from the configuration class it uses, LogSinkConfiguration, the recommendation there seems to be to use Spring Integration patterns, e.g. @ServiceActivator(inputChannel = Sink.INPUT), to connect to the input channel. However, all the tutorials I've found use a different set of annotations from the Spring Cloud Stream library (rather than Spring Integration), i.e. @StreamListener(Processor.INPUT), as in this walk-through.
So which is better/newer/preferred (i.e. what's the "best practice")? Should I configure the sink the Spring Integration way with @ServiceActivator(inputChannel = Sink.INPUT), or use the alternative @StreamListener(Processor.INPUT)? Are these two ways of hooking up the sink virtually the same, or are there advantages to using one over the other?

With 2.0 we introduced spring-cloud-function support, and with the imminent release of 3.0 we are now fully committed to functional support over any annotation-driven approach (Spring Integration style included).
So, to answer your question, functional support is the preferred way. Here is the link to more info; feel free to follow up with more concrete questions.
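For illustration, a minimal sketch of the functional style (the bean name and payload type are illustrative; with a single Consumer bean the binder binds it to an input destination named logSink-in-0 by convention):

import java.util.function.Consumer;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class LogSinkApplication {

    public static void main(String[] args) {
        SpringApplication.run(LogSinkApplication.class, args);
    }

    // Bound to the destination "logSink-in-0" by the binder's naming convention
    @Bean
    public Consumer<String> logSink() {
        return payload -> System.out.println("Received: " + payload);
    }
}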

Related

Using Java functional API with Spring Cloud Data Flow and Polled Consumers

I am working on a project that is trying to use the polled consumer API. However, the existing documentation, blog posts, and sample code seem to use deprecated annotations (such as org.springframework.cloud.stream.annotation.Input). This appears to be because they rely on the older style of Spring Cloud Stream applications rather than the Java functional API (e.g., java.util.function.Function), as shown in other examples, such as this one from the same repo.
Is there a way to use functional style with polled consumers in Spring Cloud Stream?
You are using outdated documentation. The most current is available from the project site - https://spring.io/projects/spring-cloud-stream#learn.
The section you are looking for is - https://docs.spring.io/spring-cloud-stream/docs/3.1.5/reference/html/spring-cloud-stream.html#spring-cloud-streams-overview-using-polled-consumers
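For illustration, a rough sketch of a polled consumer without the deprecated annotations, assuming the spring.cloud.stream.pollable-source property described in that section (destination, bean, and class names here are all illustrative; check the linked docs for the exact configuration):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.binder.PollableMessageSource;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

// Assumes spring.cloud.stream.pollable-source=orders is set, per the linked section,
// and that @EnableScheduling is declared on a configuration class.
@Component
public class OrdersPoller {

    @Autowired
    private PollableMessageSource ordersSource;

    @Scheduled(fixedDelay = 5000)
    public void poll() {
        // poll() returns false if no message was available
        ordersSource.poll(message -> System.out.println("Received: " + message.getPayload()));
    }
}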

Why StreamListener is deprecated

I am using Spring Cloud Stream 3.1.2 with Kafka Streams. The programming models are:
Functional Programming
Imperative Programming
The latter uses annotations, just like the many other annotations Spring provides. However, it is mentioned that:
Starting with 3.1.0 version of the binder, we recommend using the functional programming model described above for Kafka Streams binder based applications. The support for StreamListener is deprecated starting with 3.1.0 of Spring Cloud Stream.
I find the older model more readable (at least to me). Can anyone explain why it was deprecated in favor of the functional programming model, and will it be removed?
The Spring blog post (https://spring.io/blog/2019/10/17/spring-cloud-stream-functional-and-reactive) describes the functional programming model in Spring Cloud Stream (SCSt): it's less code and less configuration, and, most importantly, your code is completely decoupled and independent from the internals of SCSt. It also favors the event-stream abstractions (such as Flux and Mono) provided by Project Reactor (https://projectreactor.io/). Imperative functions are triggered on each individual event, while reactive functions are triggered only once and operate on the whole stream.
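To make the difference concrete, a minimal sketch of both styles for the same transformation (bean names are illustrative); the imperative function runs once per record, while the reactive one receives the whole Flux and runs once:

import java.util.function.Function;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import reactor.core.publisher.Flux;

@Configuration
public class UppercaseFunctions {

    // Imperative: invoked once for every incoming record
    @Bean
    public Function<String, String> uppercase() {
        return String::toUpperCase;
    }

    // Reactive: invoked once, operating on the stream of records.
    // With more than one function bean, select the active one via spring.cloud.function.definition.
    @Bean
    public Function<Flux<String>, Flux<String>> reactiveUppercase() {
        return flux -> flux.map(String::toUpperCase);
    }
}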

What is Spring MVC Based on Reactor?

I have just been reading everything I can about Spring and Reactor, and I realize that Reactor is supposed to be included in the upcoming Spring Framework 5 (is anyone using this in production, by the way?).
My interest is in using it with Spring MVC. Since Reactor is not currently part of the framework, how can it be used in Spring MVC? From online examples, it appears that the way to use Reactor in Spring now, while waiting for Framework 5, is to use reactor-bus.
Is Spring MVC + Reactor in its current state just a matter of adding reactor-bus to an MVC app?
A look at GitHub shows that reactor-bus seems to be in legacy mode?
What is the current way to give reactive capabilities to an existing Spring MVC application?
Spring 5 and WebFlux will give you the most benefit, because the framework itself uses reactive programming and is fully non-blocking, with opportunities for end-to-end asynchronicity if your database is also async-capable (think Cassandra, Redis, MongoDB, Couchbase, along with the reactive Spring Data Kay release train).
That said, a library like Reactor can have benefits even if your app is not fully reactive, for example if you have a service layer with a lot of orchestration. If these services represent tasks as asynchronous types (ideally Flux/Mono/Publisher, but also Future), you can bridge them into Reactor and use its powerful operators to build up complex asynchronous processing pipelines.
The last piece is to let Spring 4.x work with these asynchronous results. The framework has support for that in the DeferredResult<T> type, which you can populate from a Flux or Mono (the example below is simplistic and doesn't show the composition of operators mentioned above, which would be hidden in the service):
@GetMapping
public DeferredResult<User> getCurrentUser() {
    DeferredResult<User> result = new DeferredResult<>();
    Mono<User> mono = myService.getCurrentUser();
    // bridge the Mono to Spring MVC's async support
    mono.subscribe(
            value -> result.setResult(value),
            error -> result.setErrorResult(error)
    );
    return result;
}
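The composition mentioned above might look roughly like this inside the service; every name here is hypothetical, the point is simply bridging a Future-based call into Reactor and combining results with operators:

// Hypothetical service method combining two asynchronous sources
public Mono<UserProfile> getCurrentUserProfile() {
    Mono<User> user = Mono.fromFuture(userClient.findCurrentUser()); // a CompletableFuture<User>
    Mono<List<Order>> orders = orderService.recentOrders();          // already returns a Mono
    return Mono.zip(user, orders, UserProfile::new);                 // combine when both complete
}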

Which should I use: mail outbound-channel-adapter or org.springframework.mail.MailSender? [duplicate]

I have a large number of email addresses and need to write a scheduler that sends messages to them. The messages are different. I am using Spring Framework 4.x.
I could write a simple class that connects to an SMTP server, but in that case I would also have to write my own threading code in order to send the emails in parallel.
Does Spring already provide a library that gives me a more flexible way to do this? I do not want to manage threads myself; it would be nice if Spring already had this functionality.
Do I need Spring Integration for this?
Yes, you definitely can do that with Spring Integration, because there is an ExecutorChannel implementation which can be supplied with a TaskExecutor from Spring Core:
<channel id="sendEmailChannel">
    <dispatcher task-executor="threadPoolTaskExecutor"/>
</channel>
<int-mail:outbound-channel-adapter channel="sendEmailChannel" mail-sender="mailSender"/>
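The referenced threadPoolTaskExecutor can be declared with the task namespace, for example (the pool size here is just an illustration):

<task:executor id="threadPoolTaskExecutor" pool-size="10"/>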
But keep in mind that all Spring Integration components are based on plain Java, and an ExecutorService is used in the background anyway.
On the other hand, if you only need the mail-sending piece of Spring Integration, it would be overhead, and you can simply use the core Spring Framework: a JavaMailSender bean plus @Async on the sendMail method to meet your parallelism requirement.
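A minimal sketch of that simpler approach, assuming @EnableAsync is declared on a configuration class (class and method names are illustrative):

import org.springframework.mail.SimpleMailMessage;
import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class MailService {

    private final JavaMailSender mailSender;

    public MailService(JavaMailSender mailSender) {
        this.mailSender = mailSender;
    }

    @Async // each call runs on the async executor, so sends happen in parallel
    public void sendMail(SimpleMailMessage message) {
        mailSender.send(message);
    }
}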
UPDATE
could you tell me whether I need JMS for this situation?
I don't see any JMS-related requirements here. You don't have (or at least don't show) any real integration points in your solution, and the same can be said of using Spring Integration just for email sending. That said, with Spring Boot your SI configuration will be short enough. On the other hand, if you study Spring Integration further, you will eventually gain more by relying on its components for your systems, both internally and externally, integrating with other systems through JMS, AMQP, Kafka, etc.
To be honest, many years ago my first acquaintance with Spring Integration came from a requirement to fetch files from FTP and pick up new files automatically. I found the solution only in Spring Integration 1.0.0.M1. After that short XML config for <int-ftp:inbound-channel-adapter>, I fell in love with Spring Integration, and since then it has been a part of my life. :-)
So, it's up to you whether to go ahead with Spring Integration in your simple app, or stick with the more straightforward solution of using JavaMailSender directly.
You should use the Java executors framework. For example, you can write something like the code below:
// work-stealing pool sized to the number of available processors
ExecutorService executor = Executors.newWorkStealingPool();
executor.execute(() -> mailSender.send(mail));

Spring Integration endpoint for Gigaspaces

Is there a Spring Integration endpoint which connects to GigaSpaces?
As a general point, I am also interested to know what the best documentation is for using Spring together with GigaSpaces. I am surprised that there does not appear to be much material written on this. Is GigaSpaces still the preferred option for scaling Spring applications, or are there better solutions?
GigaSpaces XAP uses Spring as its native configuration mechanism; the XAP container runs a Spring container internally.
Every XAP component (data grid node, data grid proxy, event handler, ...) is exposed via Spring. IDE integration and unit tests are done via a Spring application context.
You can deploy a Spring app as-is into XAP. XAP will scale it and make it highly available (HA).
See more:
http://docs.gigaspaces.com/xap102tut/spring-integration.html
http://docs.gigaspaces.com/sbp/spring-data.html
Is the question "How can my spring-integration-based application easily push objects into GigaSpaces XAP?" or "How can I use the spring-integration framework from code deployed in GigaSpaces XAP (that is, collocated with the data)?"
For the first question, I am unaware of any off-the-shelf endpoints, but it is very easy to program: you will have to decorate your POJO (with annotations, for example, to declare where the indexes are, etc.).
You could use JMS integration (using GigaSpaces XAP as a JMS broker), but I don't think it is the best way here...
For the second question, a GigaSpaces XAP application is essentially a Spring context. By default it does not use Spring Integration, but it is very easy to integrate, since you are already in a Spring stack.
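As an illustration of the POJO decoration mentioned for the first question, a rough sketch assuming the com.gigaspaces.annotation.pojo annotations (class and field names are made up):

import com.gigaspaces.annotation.pojo.SpaceClass;
import com.gigaspaces.annotation.pojo.SpaceId;
import com.gigaspaces.annotation.pojo.SpaceIndex;

@SpaceClass
public class Trade {

    private String id;
    private String symbol;

    @SpaceId(autoGenerate = true) // identifies the entry in the space
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    @SpaceIndex // indexed to speed up queries on this field
    public String getSymbol() { return symbol; }
    public void setSymbol(String symbol) { this.symbol = symbol; }
}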
