Implementing polling in the middle of a Spring Integration DSL flow

I am writing a Spring Integration DSL flow, which will look like the diagram below.
As you can see in the flow, I need to read 1 million entities from a database, and I want to avoid reading them all in a single go.
I want to implement polling that reads N entities at a fixed interval and sends them on for processing.
In the examples I have read about polling, the polling is used as the first step of the flow; in my case I want to implement it in the middle of the flow.
Please let me know how I can implement this.
Any help is appreciated.
Thanks in advance.

If you want to trigger the start of some polled flow using some external stimulus, use a control bus:
@SpringBootApplication
public class So63337649Application {

    public static void main(String[] args) {
        SpringApplication.run(So63337649Application.class, args);
    }

    @Bean
    IntegrationFlow trigger(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Amqp.inboundAdapter(connectionFactory, "foo"))
                .transform(msg -> "@poller.start()")
                .channel("control.input")
                .get();
    }

    @Bean
    IntegrationFlow control() {
        return f -> f.controlBus();
    }

    @Bean
    IntegrationFlow mainFlow() {
        return IntegrationFlows.from(() -> "foo", e -> e
                        .id("poller")
                        .autoStartup(false)
                        .poller(Pollers.fixedDelay(5000)))
                .handle(System.out::println)
                .get();
    }

}
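Since the data source in the question is a database, the same start-on-demand pattern can be combined with a polled JDBC inbound adapter so each poll only reads a bounded chunk. The following is a rough sketch under assumptions: the table, query, `DataSource` bean, and batch size are hypothetical, and `Jdbc` refers to the Spring Integration JDBC DSL factory.

```java
// Hypothetical sketch: poll a bounded chunk of rows every 5 seconds,
// started on demand via the control bus (as in the example above).
@Bean
IntegrationFlow batchedDbFlow(DataSource dataSource) {
    return IntegrationFlows
            .from(Jdbc.inboundAdapter(dataSource,
                            "SELECT * FROM my_entities WHERE processed = 0") // hypothetical query
                            .maxRows(100), // cap rows per poll (maxRowsPerPoll in older versions)
                    e -> e.id("dbPoller")
                            .autoStartup(false) // start via "@dbPoller.start()"
                            .poller(Pollers.fixedDelay(5000)))
            .split() // the adapter emits a List per poll; split to one message per row
            .handle(System.out::println)
            .get();
}
```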

Related

Reactive Programming: Spring WebFlux: How to build a chain of micro-service calls?

Spring Boot application:
a @RestController receives the following payload:
{
"cartoon": "The Little Mermaid",
"characterNames": ["Ariel", "Prince Eric", "Sebastian", "Flounder"]
}
I need to process it in the following way:
1. Get the unique id for each character name: make an HTTP call to the "cartoon-characters" microservice, which returns ids by names.
2. Transform the data received by the controller: replace the character names with the appropriate ids received in the previous step from the "cartoon-characters" microservice:
{
"cartoon": "The Little Mermaid",
"characterIds": [1, 2, 3, 4]
}
3. Send an HTTP POST request to the "cartoon-db" microservice with the transformed data.
4. Map the response from "cartoon-db" to the internal representation that is the controller's return value.
The problem I have:
I need to implement all these steps using the Reactive Programming paradigm (non-blocking/async processing) with Spring WebFlux (Mono/Flux) and the Spring reactive WebClient, but I have zero experience with that stack. I am trying to read about it as much as I can, plus googling a lot, but I still have a bunch of unanswered questions, for example:
Q1. I have already configured reactive webClient that sends a request to "cartoon-characters" microservice:
public Mono<Integer> getCartoonCharacterIdbyName(String characterName) {
    return WebClient.builder().baseUrl("http://cartoon-characters").build()
            .get()
            .uri("/character/{characterName}", characterName)
            .retrieve()
            .bodyToMono(Integer.class);
}
As you may see, I have a list of cartoon character names, and for each of them I need to call the getCartoonCharacterIdbyName(String name) method. I am not sure that calling it in series is the right option; I believe the right option is parallel execution.
I wrote the following method:
public List<Integer> getCartoonCharacterIds(List<String> names) {
    Flux<Integer> flux = Flux.fromStream(names.stream())
            .flatMap(this::getCartoonCharacterIdbyName);
    return StreamSupport.stream(flux.toIterable().spliterator(), false)
            .collect(Collectors.toList());
}
However, I have doubts that this code performs the WebClient calls in parallel. Also, the code calls flux.toIterable(), which blocks the thread, so with this implementation I have lost the non-blocking mechanism.
Are my assumptions correct?
How should I rewrite it to get parallelism and non-blocking behavior?
Q2.
Is it technically possible to transform the input data received by the controller (I mean replace the names with ids) in reactive style, operating on a Flux<Integer> of characterIds rather than a List<Integer> of characterIds?
Q3. Is it possible to get not just a transformed data object, but a Mono<> after step 2 that can be consumed by another WebClient in step 3?
Actually it's a good question, since understanding WebFlux, or the Project Reactor framework, when it comes to chaining micro-services, requires a couple of steps.
The first is to realize that a WebClient call should take a publisher in and return a publisher. Extrapolate this to 4 different method signatures to help with thinking:
Mono -> Mono
Flux -> Flux
Mono -> Flux
Flux -> Mono
For sure, in all cases it is just Publisher -> Publisher, but leave that until you understand things better. The first two are obvious: you just use .map(...) to handle objects in the flow. You need to learn how to handle the second two. As commented above, going from Flux -> Mono can be done with .collectList() or with .reduce(...). Going from Mono -> Flux is generally done with .flatMapMany(...) or .flatMapIterable(...) or some variation of those; there are probably other techniques. You should never use .block() in any WebFlux code, and generally you will get a runtime error if you try to do so.
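To make those four shapes concrete, here is a short sketch of the conversions, assuming reactor-core is on the classpath (the sample values are hypothetical):

```java
// Hypothetical sample stream
Flux<String> names = Flux.just("Ariel", "Sebastian");

// Flux -> Mono: collect the whole stream into a single value
Mono<List<String>> asList = names.collectList();

// Mono -> Flux: expand a single value back into a stream
Flux<String> backToFlux = asList.flatMapIterable(list -> list);

// Flux -> Flux (and Mono -> Mono): just map element-wise
Flux<Integer> lengths = names.map(String::length);
```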
In your example you want to go to
(Mono->Flux)->(Flux->Flux)->(Flux->Flux)
As you said, you want
Mono->Flux->Flux
The second part is to understand chaining flows. You could do
p3(p2(p1(object)));
which chains p1->p2->p3, but I always found it more understandable to make a "Service Layer" instead:
o2 = p1(object);
o3 = p2(o2);
result = p3(o3);
This code is just much easier to read and maintain, and with some maturity you come to understand the worth of that statement.
The only problem I had with your example was using a Flux<String> as a @RequestBody with WebClient. That doesn't work; see "WebClient bodyToFlux(String.class) for string list doesn't separate individual values". Other than that, it's a pretty straightforward application. You'll find when you debug it that it reaches the .subscribe(System.out::println) line before it reaches the Flux<Integer> ids = mapNamesToIds(fn) line. This is because the flow is set up before it is executed. It takes a while to understand this, but it is the point of the Project Reactor framework.
@SpringBootApplication
@RestController
@RequestMapping("/demo")
public class DemoApplication implements ApplicationRunner {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    Map<Integer, CartoonCharacter> characters;

    @Override
    public void run(ApplicationArguments args) throws Exception {
        String[] names = new String[] { "Ariel", "Prince Eric", "Sebastian", "Flounder" };
        characters = Arrays.asList(new CartoonCharacter[] {
                        new CartoonCharacter(names[0].hashCode(), names[0], "Mermaid"),
                        new CartoonCharacter(names[1].hashCode(), names[1], "Human"),
                        new CartoonCharacter(names[2].hashCode(), names[2], "Crustacean"),
                        new CartoonCharacter(names[3].hashCode(), names[3], "Fish") })
                .stream().collect(Collectors.toMap(CartoonCharacter::getId, Function.identity()));
        CartoonRequest cr = CartoonRequest.builder()
                .cartoon("The Little Mermaid")
                .characterNames(Arrays.asList(names))
                .build();
        thisLocalClient
                .post()
                .uri("cartoonDetails")
                .body(Mono.just(cr), CartoonRequest.class)
                .retrieve()
                .bodyToFlux(CartoonCharacter.class)
                .subscribe(System.out::println);
    }

    @Bean
    WebClient localClient() {
        return WebClient.create("http://localhost:8080/demo/");
    }

    @Autowired
    WebClient thisLocalClient;

    @PostMapping("cartoonDetails")
    Flux<CartoonCharacter> getDetails(@RequestBody Mono<CartoonRequest> cartoonRequest) {
        Flux<StringWrapper> fn = cartoonRequest.flatMapIterable(cr -> cr.getCharacterNames()
                .stream().map(StringWrapper::new).collect(Collectors.toList()));
        Flux<Integer> ids = mapNamesToIds(fn);
        Flux<CartoonCharacter> details = mapIdsToDetails(ids);
        return details;
    }

    // Service Layer Methods
    private Flux<Integer> mapNamesToIds(Flux<StringWrapper> names) {
        return thisLocalClient
                .post()
                .uri("findIds")
                .body(names, StringWrapper.class)
                .retrieve()
                .bodyToFlux(Integer.class);
    }

    private Flux<CartoonCharacter> mapIdsToDetails(Flux<Integer> ids) {
        return thisLocalClient
                .post()
                .uri("findDetails")
                .body(ids, Integer.class)
                .retrieve()
                .bodyToFlux(CartoonCharacter.class);
    }

    // Services
    @PostMapping("findIds")
    Flux<Integer> getIds(@RequestBody Flux<StringWrapper> names) {
        return names.map(name -> name.getString().hashCode());
    }

    @PostMapping("findDetails")
    Flux<CartoonCharacter> getDetails(@RequestBody Flux<Integer> ids) {
        return ids.map(characters::get);
    }

}
Also:
@Data
@NoArgsConstructor
@AllArgsConstructor
@Builder
public class StringWrapper {
    private String string;
}

@Data
@Builder
public class CartoonRequest {
    private String cartoon;
    private List<String> characterNames;
}

@Data
@Builder
@NoArgsConstructor
@AllArgsConstructor
public class CartoonCharacter {
    Integer id;
    String name;
    String species;
}

Correlate messages between 2 JMS queues using Spring integration components

I have 2 JMS queues, and my application subscribes to both of them with the Jms.messageDrivenChannelAdapter(...) component.
The first queue receives messages of type Paid. The second queue receives messages of type Reversal.
The business scenario defines a correlation between messages of type Paid and type Reversal:
a Reversal should wait for its Paid in order to be processed.
How can I achieve such a "wait" pattern with Spring Integration?
Is it possible to correlate messages between 2 JMS queues?
See the documentation about the Aggregator.
The aggregator correlates messages using some correlation strategy and releases the group based on some release strategy.
The Aggregator combines a group of related messages, by correlating and storing them, until the group is deemed to be complete. At that point, the aggregator creates a single message by processing the whole group and sends the aggregated message as output.
The output payload is a list of the grouped message payloads by default, but you can provide a custom output processor.
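The correlate-and-release idea, stripped of Spring, is small enough to sketch in plain Java. This is only a toy simulation of the behavior described above (group by a correlation key, release when the group holds 2 messages), not the real AggregatingMessageHandler:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy aggregator: messages are grouped by a correlation key and a
// group is "released" once it holds 2 messages (release strategy:
// size() == 2), after which the group is removed (expire upon completion).
class ToyAggregator {

    private final Map<String, List<String>> groups = new HashMap<>();

    // Returns the completed group, or null if the group is still waiting.
    List<String> accept(String correlationKey, String payload) {
        List<String> group = groups.computeIfAbsent(correlationKey, k -> new ArrayList<>());
        group.add(payload);
        if (group.size() == 2) {
            return groups.remove(correlationKey);
        }
        return null;
    }
}
```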
EDIT
@SpringBootApplication
public class So55299268Application {

    public static void main(String[] args) {
        SpringApplication.run(So55299268Application.class, args);
    }

    @Bean
    public IntegrationFlow in1(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("queue1"))
                .channel("aggregator.input")
                .get();
    }

    @Bean
    public IntegrationFlow in2(ConnectionFactory connectionFactory) {
        return IntegrationFlows.from(Jms.messageDrivenChannelAdapter(connectionFactory)
                        .destination("queue2"))
                .channel("aggregator.input")
                .get();
    }

    @Bean
    public IntegrationFlow aggregator() {
        return f -> f
                .aggregate(a -> a
                        .correlationExpression("headers.jms_correlationId")
                        .releaseExpression("size() == 2")
                        .expireGroupsUponCompletion(true)
                        .expireGroupsUponTimeout(true)
                        .groupTimeout(5_000L)
                        .discardChannel("discards.input"))
                .handle(System.out::println);
    }

    @Bean
    public IntegrationFlow discards() {
        return f -> f.handle((p, h) -> {
            System.out.println("Aggregation timed out for " + p);
            return null;
        });
    }

    @Bean
    public ApplicationRunner runner(JmsTemplate template) {
        return args -> {
            send(template, "one", "two");
            send(template, "three", null);
        };
    }

    private void send(JmsTemplate template, String one, String two) {
        template.convertAndSend("queue1", one, m -> {
            m.setJMSCorrelationID(one);
            return m;
        });
        if (two != null) {
            template.convertAndSend("queue2", two, m -> {
                m.setJMSCorrelationID(one);
                return m;
            });
        }
    }

}
and the output:
GenericMessage [payload=[two, one], headers={jms_redelivered=false, jms_destination=queue://queue1, jms_correlationId=one, id=784535fe-8861-1b22-2cfa-cc2e67763674, priority=4, jms_timestamp=1553290921442, jms_messageId=ID:Gollum2.local-55540-1553290921241-4:1:3:1:1, timestamp=1553290921457}]
2019-03-22 17:42:06.460 INFO 55396 --- [ask-scheduler-1] o.s.i.a.AggregatingMessageHandler : Expiring MessageGroup with correlationKey[three]
Aggregation timed out for three

Right way to split, enrich items then send each item to another channel?

Is this the right way to split a list of items, enrich each item, and then send each of those enriched items to another channel?
It seems like even though each item is being enriched, only the last one is sent to the output channel...
Here is the snippet from my test where I see the flow being invoked for only page2.
this.sitePackage = new Package();
this.sitePackage.add(page1);
this.sitePackage.add(page2);
this.sitePackage.add(page3);
//Publish using gateway
this.publishingService.publish(sitePackage);
If I do this however...
this.sitePackage.add(page1);
this.sitePackage.add(page1);
this.sitePackage.add(page2);
this.sitePackage.add(page2);
this.sitePackage.add(page3);
this.sitePackage.add(page3);
I see all the pages being published, but the last one is page2, not page3 (even though from debugging I can see the instance has page3's properties).
It seems like every other item is being seen by the flows...
My flows go like this...
Starting with the PublishPackage flow. This is the main entry flow, intended to split the items out of the package and send each of them, after enriching the payload, to flows attached to the publishPackageItem channel...
@Bean
IntegrationFlow flowPublishPackage() {
    return flow -> flow
            .channel(this.publishPackageChannel())
            .<Package>handle((p, h) -> this.savePackage(p))
            .split(Package.class, this::splitPackage)
            .channel(this.publishPackageItemChannel());
}

@Bean
@PublishPackageChannel
MessageChannel publishPackageChannel() {
    return MessageChannels.direct().get();
}

@Bean
@PublishPackageItemChannel
MessageChannel publishPackageItemChannel() {
    return MessageChannels.direct().get();
}

@Splitter
List<PackageEntry> splitPackage(final Package bundle) {
    final List<PackageEntry> enrichedEntries = new ArrayList<>();
    for (final PackageEntry entry : bundle.getItems()) {
        enrichedEntries.add(entry);
    }
    return enrichedEntries;
}

@Bean
GatewayProxyFactoryBean publishingGateway() {
    final GatewayProxyFactoryBean proxy = new GatewayProxyFactoryBean(PublishingService.class);
    proxy.setBeanFactory(this.beanFactory);
    proxy.setDefaultRequestChannel(this.publishPackageChannel());
    proxy.setDefaultReplyChannel(this.publishPackageChannel());
    proxy.afterPropertiesSet();
    return proxy;
}
Next, the CMS publish flows are attached to the publishPackageItem channel and, based on the type after splitting, routed to a specific element channel for handling. After splitting the page, only specific element types may have a subscribing flow.
@Inject
public CmsPublishFlow(@PublishPackageItemChannel final MessageChannel channelPublishPackageItem) {
    this.channelPublishPackageItem = channelPublishPackageItem;
}

@Bean
@PublishPageChannel
MessageChannel channelPublishPage() {
    return MessageChannels.direct().get();
}

@Bean
IntegrationFlow flowPublishContent() {
    return flow -> flow
            .channel(this.channelPublishPackageItem)
            .filter(PackageEntry.class, p -> p.getEntry() instanceof Page)
            .transform(PackageEntry.class, PackageEntry::getEntry)
            .split(Page.class, this::traversePageElements)
            .<Content, String>route(Content::getType, mapping -> mapping
                    .resolutionRequired(false)
                    .subFlowMapping(PAGE, sf -> sf.channel(channelPublishPage()))
                    .subFlowMapping(IMAGE, sf -> sf.channel(channelPublishAsset()))
                    .defaultOutputToParentFlow());
    // .channel(IntegrationContextUtils.NULL_CHANNEL_BEAN_NAME);
}
Finally, my goal is to subscribe to the channel and handle each element accordingly. I subscribe this flow to channelPublishPage. Each subscriber may handle the element differently.
@Inject
@PublishPageChannel
private MessageChannel channelPublishPage;

@Bean
IntegrationFlow flowPublishPage() {
    return flow -> flow
            .channel(this.channelPublishPage)
            .publishSubscribeChannel(c -> c
                    .subscribe(s -> s
                            .<Page>handle((p, h) -> this.generatePage(p))));
}
I somehow feel that the problem is here:
proxy.setDefaultRequestChannel(this.publishPackageChannel());
proxy.setDefaultReplyChannel(this.publishPackageChannel());
Consider not using the same channel for requests and for awaiting replies. This way you introduce a loop and really unexpected behavior.
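A minimal corrected version of the gateway bean from the question, as a sketch: simply drop the default reply channel. When no reply channel is configured, the framework creates a temporary, anonymous reply channel per message, so requests and replies cannot collide on the same channel.

```java
@Bean
GatewayProxyFactoryBean publishingGateway() {
    final GatewayProxyFactoryBean proxy = new GatewayProxyFactoryBean(PublishingService.class);
    proxy.setBeanFactory(this.beanFactory);
    proxy.setDefaultRequestChannel(this.publishPackageChannel());
    // no setDefaultReplyChannel(...): a temporary reply channel is
    // created per message, avoiding the request/reply loop
    proxy.afterPropertiesSet();
    return proxy;
}
```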

Convert @JmsListener code to Spring Integration DSL

@JmsListener(destination = "myListener")
public void receive(Event event) {
    if (event.myObj().isComp()) {
        service1.m1(event);
    }
    if (event.myObj2().isdone()) {
        service2.m2(event);
    }
}
I tried various combinations, and one of them is below
@Bean
public IntegrationFlow flow1() {
    return IntegrationFlows
            .from(Jms.messageDrivenChannelAdapter(connectionFactory).destination("incomingQueue"))
            .<Event>filter(e -> e.myObj().isComp()).handle(service1, "m1")
            .<Event>filter(e -> e.myObj2().isdone()).handle(service2, "m2") // looks like it's not called
            .get();
}
But the second filter/condition is never executed. Please suggest what I am missing here.
It worked after I put the @ServiceActivator annotation on m1 as well as m2. My bad, I missed this annotation while converting the code to Spring Integration.
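Note that chaining the filters as above also changes the semantics: the second filter only sees the output of the first handle, so an Event failing the first condition never reaches the second one. One way to evaluate both conditions independently, sketched here using the beans from the question, is a publish-subscribe channel with two subscribers:

```java
@Bean
public IntegrationFlow flow1() {
    return IntegrationFlows
            .from(Jms.messageDrivenChannelAdapter(connectionFactory).destination("incomingQueue"))
            .publishSubscribeChannel(c -> c
                    // each subscriber gets its own copy of the message,
                    // so both conditions are checked for every Event
                    .subscribe(s -> s.<Event>filter(e -> e.myObj().isComp())
                            .handle(service1, "m1"))
                    .subscribe(s -> s.<Event>filter(e -> e.myObj2().isdone())
                            .handle(service2, "m2")))
            .get();
}
```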

Spring integration Java DSL : creating sftp inbound adapter

I want to create a flow using the DSL. The flow is: from the adapter, the message will flow to a channel.
@Bean
public IntegrationFlow sftpInboundFlow() {
    prepareSftpServer();
    return IntegrationFlows
            .from(Sftp.inboundAdapter(this.sftpSessionFactory).getId("SftpInboundAdapter")
                    .preserveTimestamp(true)
                    .remoteDirectory("sftpSource")
                    .regexFilter(".*\\.txt$")
                    .localFilenameExpression("#this.toUpperCase() + '.a'")
                    .localDirectory(file)
                    .channel(MessageChannels.queue("sftpInboundResultChannel"))
                    .get());
}
I am not sure about the compilation error at the getId() method. I tried to convert from a Java 8 lambda to Java 7.
I think you want to add an id attribute for your component, to register it with that bean name in the application context. Your config must look like:
return IntegrationFlows
        .from(Sftp.inboundAdapter(this.sftpSessionFactory)
                        .preserveTimestamp(true)
                        .remoteDirectory("sftpSource")
                        .regexFilter(".*\\.txt$")
                        .localFilenameExpression("#this.toUpperCase() + '.a'")
                        .localDirectory(file),
                new Consumer<SourcePollingChannelAdapterSpec>() {

                    @Override
                    public void accept(SourcePollingChannelAdapterSpec e) {
                        e.id("SftpInboundAdapter");
                    }

                })
        .channel(MessageChannels.queue("sftpInboundResultChannel"))
        .get();
There is no such getId(String) method.
Yes, I'll fix its JavaDocs eventually, but what you are facing is really a compilation error, hence wrong language usage.
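For reference, since the question mentions converting from Java 8, the same Consumer can be written as a lambda; a sketch of just the changed part (the surrounding flow is as in the answer above):

```java
// Java 8 lambda form of the Consumer<SourcePollingChannelAdapterSpec>
.from(Sftp.inboundAdapter(this.sftpSessionFactory)
                .preserveTimestamp(true)
                .remoteDirectory("sftpSource")
                .regexFilter(".*\\.txt$")
                .localFilenameExpression("#this.toUpperCase() + '.a'")
                .localDirectory(file),
        e -> e.id("SftpInboundAdapter"))
```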
