Spring Integration RecipientListRouter doesn't create multiple payloads

Can anyone help me with this issue? I configured my RecipientListRouter as the documentation suggests:
@Bean
public IntegrationFlow routerFlow() {
    return IntegrationFlows.from(CHANNEL_INPUT)
            .routeToRecipients(r -> r
                    .applySequence(true)
                    .ignoreSendFailures(true)
                    .recipient(CHANNEL_OUTPUT_1)
                    .recipient(CHANNEL_OUTPUT_2)
                    .sendTimeout(1_234L))
            .get();
}

@ServiceActivator(inputChannel = CHANNEL_OUTPUT_1, outputChannel = CHANNEL_END)
public Object foo(Message<?> message) {
    message.getPayload();
    // processing1() ...
    return message;
}

@ServiceActivator(inputChannel = CHANNEL_OUTPUT_2, outputChannel = CHANNEL_END)
public Object bar(Message<?> message) {
    message.getPayload();
    // processing2() ...
    return message;
}
I expect to get this workflow:
CHANNEL_INPUT(payload-1) |----> CHANNEL_OUTPUT_1(payload-2)
                         |----> CHANNEL_OUTPUT_2(payload-3)
where payload-2 on the input of the foo activator equals payload-1, and payload-3 on the input of the bar activator also equals payload-1.
But the actual workflow is:
payload-2 on the input of the foo activator equals payload-1, but payload-3 on the input of the bar activator equals payload-2, the output of the foo activator.
It seems like this is the actual workflow:
CHANNEL_INPUT(payload-1) ----> CHANNEL_OUTPUT_1(payload-2) ----> CHANNEL_OUTPUT_2(payload-3)
After debugging I noticed that the message headers are not the same (they actually contain distinct "sequenceNumber" and "sequenceSize" values), but the payloads behave as described above.

While the message is immutable, the payload is not (unless it's an immutable object such as a String).
If you mutate the payload in service1, the mutation will be seen in service2.
You need to clone/copy the payload before mutating it if you don't want service2 to see the mutation.
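For example, a minimal sketch of the copy approach (MyPayload and its copy constructor are hypothetical stand-ins for your payload type):
@ServiceActivator(inputChannel = CHANNEL_OUTPUT_1, outputChannel = CHANNEL_END)
public Object foo(Message<MyPayload> message) {
    // work on a copy, so the recipient on CHANNEL_OUTPUT_2 still sees the original payload
    MyPayload copy = new MyPayload(message.getPayload()); // hypothetical copy constructor
    copy.setProcessed(true);                              // hypothetical mutation
    return copy;
}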

Spring Cloud Stream Kafka Streams Binder 3.x: No output to the second output topic in case of multiple output bindings

I have the following processor bean method signature:
@Bean
public BiFunction<KStream<String, MyClass>, KStream<String, String>, KStream<String, MyClass>[]> myStream() {
    return (inputStream1, inputStream2) -> {
        inputStream2
                .peek((k, v) -> {
                    log.debug(...);
                });
        return inputStream1
                .mapValues(...)
                .branch((k, v) -> true, (k, v) -> true);
    };
}
The relevant properties:
spring.cloud.stream.function.definition: ...;myStream
spring.cloud.stream.bindings:
  myStream-in-0:
    destination: inp0
  myStream-in-1:
    destination: inp1
  myStream-out-0:
    destination: out0
  myStream-out-1:
    destination: out1
Spring Cloud version: Hoxton.SR4 (spring-cloud-stream-binder-kafka-streams:jar:3.0.4.RELEASE); embedded Kafka version: 2.5.0.
I am testing my topology using embedded Kafka:
@RunWith(SpringRunner.class)
@SpringBootTest(
        properties = "spring.cloud.stream.kafka.binder.brokers=${spring.embedded.kafka.brokers}"
)
@EmbeddedKafka(partitions = 1,
        topics = {
                "inp0", "inp1", "out0", "out1"
        },
        brokerPropertiesLocation = "kafka.properties"
)
@Slf4j
public class MyApplicationTests {

    @Autowired
    private EmbeddedKafkaBroker embeddedKafka; // injected by @EmbeddedKafka

    private CountDownLatch latch;

    @Test
    public void embeddedKafkaTest() throws IOException, InterruptedException {
        Consumer<String, MyClass> out0Consumer = createConsumer("out0ConsumerGroup");
        Consumer<String, MyClass> out1Consumer = createConsumer("out1ConsumerGroup");
        this.embeddedKafka.consumeFromAnEmbeddedTopic(out0Consumer, "out0");
        this.embeddedKafka.consumeFromAnEmbeddedTopic(out1Consumer, "out1");
        latch = new CountDownLatch(1);
        // ... publish ...
        latch.await(15, TimeUnit.SECONDS);
        ConsumerRecords<String, MyClass> out0 = KafkaTestUtils.getRecords(out0Consumer);
        assertThat(out0.count(), is(greaterThanOrEqualTo(1)));
        ConsumerRecords<String, MyClass> out1 = KafkaTestUtils.getRecords(out1Consumer);
        assertThat(out1.count(), is(greaterThanOrEqualTo(1)));
    }

    private <K, V> Consumer<K, V> createConsumer(String groupName) {
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps(groupName, "true", this.embeddedKafka);
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return new DefaultKafkaConsumerFactory<K, V>(consumerProps).createConsumer();
    }
}
My tests show that the messages from myStream reach and land in the topic "out0" as expected, but the "out1" topic remains empty and the unit test fails on the second assertion.
I've tried a couple of things, but it looks like the output to the second output topic is simply not being produced (the output to the first output topic is produced fine).
Can you see any mistakes in my setup?
And one more thing: the return statement in the myStream bean method definition shows a compiler warning:
Unchecked generics array creation for varargs parameter
But it looks like that's how the Spring Cloud Kafka Stream 3.x API requires the return type to be defined?
You are passing two predicates to the branch method and both of them always evaluate to true. The first predicate always wins and produces data to the first output binding; branch routes each record to the first predicate that evaluates to true. See the javadoc for more details. You should use different predicates (possibly checking certain conditions on the key/value). If the first predicate fails and the second one succeeds, then you will see data produced to the second output topic.
With respect to that compiler warning, I think you can safely ignore it, as the API will ensure that the predicate objects passed into the branch invocation have the proper type. Since the implementation of the method uses generic varargs, you get that warning. See this thread for details on that compiler warning.
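For example, a sketch with distinct predicates, assuming a hypothetical isUrgent() flag on MyClass (records matching the first predicate go to myStream-out-0, the rest to myStream-out-1):
return inputStream1
        .mapValues(v -> v)          // stand-in for your existing mapValues(...)
        .branch(
                (k, v) -> v.isUrgent(), // hypothetical condition for the first output binding
                (k, v) -> true);        // catch-all for the second output binding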

How to send partial source data attributes to different targets

I am building an integration between a source and two targets. The source data object has 10 attributes, of which one target needs around 6 attributes and the other target needs only 4. I'd appreciate any help on how I can achieve this with Spring.
You can configure the source to send the Message to a PublishSubscribeChannel.
Then configure two Transformers to subscribe to this pub-sub channel. One of the transformers will transform the message to the 6-attribute version, while the other will transform it to the 4-attribute version. Both transformers then send the transformed messages to separate channels. The two target systems will look for the messages sent to these channels and process them.
In terms of annotation configuration, it looks like the following (assuming the message the source sends out is Foo):
@Bean
public MessageChannel pubSubChannel() {
    return new PublishSubscribeChannel();
}

@Bean
public MessageChannel outputChannelWith4Attributes() {
    return new DirectChannel();
}

@Bean
public MessageChannel outputChannelWith6Attributes() {
    return new DirectChannel();
}

@Component
public class MyTransformer {

    @Transformer(inputChannel = "pubSubChannel", outputChannel = "outputChannelWith4Attributes")
    public Foo transformTo4Attribute(Foo foo) {
        // do the transformation logic here
        return result;
    }

    @Transformer(inputChannel = "pubSubChannel", outputChannel = "outputChannelWith6Attributes")
    public Foo transformTo6Attribute(Foo foo) {
        // do the transformation logic here
        return result;
    }
}
And configure the source to send the message with payload Foo to pubSubChannel. Also configure the targets to process messages from outputChannelWith4Attributes and outputChannelWith6Attributes.
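If you prefer the Java DSL over annotations, roughly the same wiring could look like this sketch (the 6-attribute flow is analogous):
@Bean
public IntegrationFlow fourAttributeFlow(MyTransformer myTransformer) {
    return IntegrationFlows.from("pubSubChannel") // subscribes to the pub-sub channel
            .transform(myTransformer, "transformTo4Attribute")
            .channel("outputChannelWith4Attributes")
            .get();
}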

Use JSON transformer in Spring Integration

I have a problem that seems unaddressed in any of the examples I can find.
My application reads an ActiveMQ topic of JSON messages. It will build a completely new outbound REST call based on this data. Note that this is not a "transformation". It is: given "X", produce "Y", i.e. a ServiceActivator.
My flows thus far are:
@Bean
public IntegrationFlow splitInputFlow() {
    return IntegrationFlows.from("inboundJmsChannel")
            .split()
            .log(LoggingHandler.Level.DEBUG)
            .route(Message.class, m -> m.getHeaders().get("x-bn-class").equals("Healthcheck.class")
                    ? "healthcheckChannel" : "metricChannel")
            .get();
}

@Bean
public IntegrationFlow healthcheckFlow() {
    return IntegrationFlows.from("healthcheckChannel")
            .log(LoggingHandler.Level.DEBUG)
            .transform(Transformers.fromJson(Healthcheck.class))
            .handle("healthcheckActivator", "process")
            .get();
}
There are dozens of examples of how to use Spring transformers. I have even considered trying a MessageConverter, but I don't see why it would help and it doesn't seem to be the normal approach.
The main problem here is that Integration calls healthcheckActivator.process(String payload). The payload itself is the expected valid JSON string.
I am a little surprised it does not call healthcheckActivator.process(Message payload), but that wouldn't help, so it doesn't much matter.
The real question is: why does it not call healthcheckActivator.process(Healthcheck healthcheck)?
Well, actually I understand "why". It is because the DSL generates an internal channel to tie the steps together, and as far as I understand anything on a channel is an org.springframework.messaging.Message.
I can easily instantiate my Healthcheck object once I get inside the SA. But that leaves the nagging question: what possible good is the entire transform step? If it always "serializes" the object back into a Message, what's the point?
Like I said, I think I'm missing something fundamental here.
EDIT
My new (and probably last) idea is that maybe I'm publishing it wrong.
To publish it I am using:
jmsTemplate.convertAndSend(topicName, healthcheck, messagePostProcessor -> {
    messagePostProcessor.setJMSType("TextMessage");
    messagePostProcessor.setStringProperty("x-bn-class", "Healthcheck.class");
    messagePostProcessor.setStringProperty("x-bn-service-name", restEndpoint.getServiceName());
    messagePostProcessor.setStringProperty("x-bn-service-endpoint-name", restEndpoint.getEndpointName());
    messagePostProcessor.setLongProperty("x-bn-heathcheck-timestamp", queryDate);
    messagePostProcessor.setStringProperty("x-bn-healthcheck-status", subsystemStatus.getStatus(subsystemStatus));
    messagePostProcessor.setIntProperty("httpStatus", httpStatus.value());
    return messagePostProcessor;
});
What arrives in the SI process(String payload) method is:
LoggingHandler - GenericMessage [payload={"healthcheckType":"LOCAL","outcome":"PASS","dependencyType":"DB","endpoint":"NODE TABLE","description":"Read from DB","durationSecs":0.025}, headers={x-bn-service-name=TG10-CS2, x-bn-service-endpoint-name=TG Q10-CS2 Ready Check, jms_destination=topic://HEALTH_MONITOR, _type=com.healthcheck.response.Healthcheck, x-bn-heathcheck-timestamp=1558356538000, priority=4, jms_timestamp=1558356544244, x-bn-healthcheck-status=SEV0, jms_redelivered=false, x-bn-class=Healthcheck.class, httpStatus=200, jms_type=TextMessage, id=b29ffea7-7128-c543-9a14-8bab450f0ac6, jms_messageId=ID:39479-1558356520091-1:2:1:1:1, timestamp=1558356544409}]
I hadn't noticed the _type header before. But before I started screwing around with this (because it didn't work), that was the correct class name for what the other team provided.
I have not implemented a JMS message converter, but the supplied SimpleMessageConverter seems like it should do exactly what I want.
Your understanding is correct; works fine for me, so something else is going on...
@SpringBootApplication
public class So56169938Application {

    public static void main(String[] args) {
        SpringApplication.run(So56169938Application.class, args);
    }

    @Bean
    public IntegrationFlow flow() {
        return IntegrationFlows.from(() -> "{\"foo\":\"bar\"}", e -> e.poller(Pollers.fixedDelay(5000)))
                .transform(Transformers.fromJson(Foo.class))
                .handle("myBean", "method")
                .get();
    }

    @Bean
    public MyBean myBean() {
        return new MyBean();
    }

    public static class MyBean {

        public void method(Foo foo) {
            System.out.println(foo);
        }
    }

    public static class Foo {

        private String foo;

        String getFoo() {
            return this.foo;
        }

        void setFoo(String foo) {
            this.foo = foo;
        }

        @Override
        public String toString() {
            return "Foo [foo=" + this.foo + "]";
        }
    }
}
and
Foo [foo=bar]
Foo [foo=bar]
Foo [foo=bar]
Foo [foo=bar]
Foo [foo=bar]
Foo [foo=bar]
Well, Spring Integration is a messaging framework. It transfers messages from endpoint to endpoint via channels in between. It's already the target endpoint's responsibility to deal with the consumed message the proper way. The framework doesn't care about the payload; it is really a business part of the target application. That's how we can make framework components as generic as possible, leaving room for target business types for end-users.
Anyway, the Framework provides some mechanisms to interact with payloads. We call it POJO method invocation. So, you provide some business logic with an arbitrary contract, however following some Spring Integration rules: https://docs.spring.io/spring-integration/docs/current/reference/html/#service-activator.
So, according to your description it is really a surprise that it doesn't work for healthcheckActivator.process(Healthcheck healthcheck). Your transform(Transformers.fromJson(Healthcheck.class)) should really produce a Message with a Healthcheck object as its payload. The framework consults the method signature and tries to map the payload and/or headers to the method invocation arguments, treating the whole message as a container for the data to delegate to the method call.
From here it would be great to see your healthcheckActivator.process() method to determine why the transform(Transformers.fromJson(Healthcheck.class)) result cannot be mapped to that method's arguments.
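For comparison, a signature the framework should be able to map that result onto might look like this sketch (the bean name matches the handle("healthcheckActivator", "process") reference above):
@Component("healthcheckActivator")
public class HealthcheckActivator {

    public void process(Healthcheck healthcheck) {
        // after transform(Transformers.fromJson(Healthcheck.class)) the payload is
        // already a Healthcheck, so it maps straight onto this parameter
    }
}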

Spring Integration Channeling With Bean Name vs Method Name

I have a PublishSubscribeChannel like this:
@Bean(name = {"publishCha.input", "publishCha2.input"}) // 2 subscribers
public MessageChannel publishAction() {
    PublishSubscribeChannel ps = MessageChannels.publishSubscribe().get();
    ps.setMaxSubscribers(8);
    return ps;
}
I also have subscriber flows:
@Bean
public IntegrationFlow publishCha() {
    return f -> f
            .handle(m -> System.out.println("In publishCha channel..."));
}

@Bean
public IntegrationFlow publishCha2() {
    return f -> f
            .handle(m -> System.out.println("In publishCha2 channel..."));
}
And finally another subscriber:
@Bean
public IntegrationFlow anotherChannel() {
    return IntegrationFlows.from("publishAction")
            .handle(m -> System.out.println("ANOTHER CHANNEL IS HERE!"))
            .get();
}
The problem is, when I call the channel with the method name "publishAction" (as below) from another flow, it only prints "ANOTHER CHANNEL IS HERE!" and ignores the other subscribers. However, if I call it with .channel("publishCha.input"), it enters the publishCha and publishCha2 subscribers but ignores the third subscriber.
@Bean
public IntegrationFlow flow() {
    return f -> f
            .channel("publishAction");
}
My question is: why do those two different channeling methods yield different results?
.channel("publishAction")    // channeling with the method name executes the third subscriber
.channel("publishCha.input") // channeling with the bean name executes the first and second subscribers
Edit: narayan-sambireddy asked how I send messages to the channel. I send them via a Gateway:
@MessagingGateway
public interface ExampleGateway {

    @Gateway(requestChannel = "flow.input")
    void flow(Order orders);
}
In Main:
Order order = new Order();
order.addItem("PC", "TTEL", 2000, 1);
ConfigurableApplicationContext ctx = SpringApplication.run(Start.class, args);
ctx.getBean(ExampleGateway.class).flow(order);
Your problem with the third subscriber is that you miss the purpose of the name attribute in @Bean:
/**
 * The name of this bean, or if several names, a primary bean name plus aliases.
 * <p>If left unspecified, the name of the bean is the name of the annotated method.
 * If specified, the method name is ignored.
 * <p>The bean name and aliases may also be configured via the {@link #value}
 * attribute if no other attributes are declared.
 * @see #value
 */
@AliasFor("value")
String[] name() default {};
So, the method name as a bean name is ignored in this case; therefore the Spring Integration Java DSL doesn't find a bean named publishAction and creates one, a DirectChannel.
You can use a method reference though:
IntegrationFlows.from(publishAction())
Or, if that is in a different configuration class, you can re-use one of the predefined names:
IntegrationFlows.from("publishCha.input")
This way the DSL will re-use the existing bean and will just add one more subscriber to that pub-sub channel.
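For example, the third flow from the question could be rewired like this sketch:
@Bean
public IntegrationFlow anotherChannel() {
    return IntegrationFlows.from("publishCha.input") // one of the names declared on the pub-sub @Bean
            .handle(m -> System.out.println("ANOTHER CHANNEL IS HERE!"))
            .get();
}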

Right way to split, enrich items then send each item to another channel?

Is this the right way to split a list of items, enrich each item, and then send each of those enriched items to another channel?
It seems like even though each item is being enriched, only the last one is sent to the output channel...
Here is the snippet from my test where I see the flow being invoked for only page2.
this.sitePackage = new Package();
this.sitePackage.add(page1);
this.sitePackage.add(page2);
this.sitePackage.add(page3);
//Publish using gateway
this.publishingService.publish(sitePackage);
If I do this however...
this.sitePackage.add(page1);
this.sitePackage.add(page1);
this.sitePackage.add(page2);
this.sitePackage.add(page2);
this.sitePackage.add(page3);
this.sitePackage.add(page3);
I see all the pages being published, but the last one is page2, not page3 (even though from debugging I can see the instance has page3's properties).
It seems like every other item is being seen by the flows...
My flows go like this...
Starting with the PublishPackage flow: this is the main entry flow, intended to split the items out of the package and send each of them, after enriching the payload, to the flows attached to the publishPackageItem channel...
@Bean
IntegrationFlow flowPublishPackage()
{
    return flow -> flow
            .channel(this.publishPackageChannel())
            .<Package>handle((p, h) -> this.savePackage(p))
            .split(Package.class, this::splitPackage)
            .channel(this.publishPackageItemChannel());
}

@Bean
@PublishPackageChannel
MessageChannel publishPackageChannel()
{
    return MessageChannels.direct().get();
}

@Bean
@PublishPackageItemChannel
MessageChannel publishPackageItemChannel()
{
    return MessageChannels.direct().get();
}

@Splitter
List<PackageEntry> splitPackage(final Package bundle)
{
    final List<PackageEntry> enrichedEntries = new ArrayList<>();
    for (final PackageEntry entry : bundle.getItems())
    {
        enrichedEntries.add(entry);
    }
    return enrichedEntries;
}

@Bean
GatewayProxyFactoryBean publishingGateway()
{
    final GatewayProxyFactoryBean proxy = new GatewayProxyFactoryBean(PublishingService.class);
    proxy.setBeanFactory(this.beanFactory);
    proxy.setDefaultRequestChannel(this.publishPackageChannel());
    proxy.setDefaultReplyChannel(this.publishPackageChannel());
    proxy.afterPropertiesSet();
    return proxy;
}
Next, the CMS publish flows are attached to the publishPackageItem channel and, based on the type after splitting, routed to a specific element channel for handling. After splitting the page, only specific element types may have a subscribing flow.
@Inject
public CmsPublishFlow(@PublishPackageItemChannel final MessageChannel channelPublishPackageItem)
{
    this.channelPublishPackageItem = channelPublishPackageItem;
}

@Bean
@PublishPageChannel
MessageChannel channelPublishPage()
{
    return MessageChannels.direct().get();
}

@Bean
IntegrationFlow flowPublishContent()
{
    return flow -> flow
            .channel(this.channelPublishPackageItem)
            .filter(PackageEntry.class, p -> p.getEntry() instanceof Page)
            .transform(PackageEntry.class, PackageEntry::getEntry)
            .split(Page.class, this::traversePageElements)
            .<Content, String>route(Content::getType, mapping -> mapping
                    .resolutionRequired(false)
                    .subFlowMapping(PAGE, sf -> sf.channel(channelPublishPage()))
                    .subFlowMapping(IMAGE, sf -> sf.channel(channelPublishAsset()))
                    .defaultOutputToParentFlow());
    //.channel(IntegrationContextUtils.NULL_CHANNEL_BEAN_NAME);
}
Finally, my goal is to subscribe to the channel and handle each element accordingly. I subscribe this flow to the channelPublishPage. Each subscriber may handle the element differently.
@Inject
@PublishPageChannel
private MessageChannel channelPublishPage;

@Bean
IntegrationFlow flowPublishPage()
{
    return flow -> flow
            .channel(this.channelPublishPage)
            .publishSubscribeChannel(c -> c
                    .subscribe(s -> s
                            .<Page>handle((p, h) -> this.generatePage(p))));
}
I somehow feel that the problem is here:
proxy.setDefaultRequestChannel(this.publishPackageChannel());
proxy.setDefaultReplyChannel(this.publishPackageChannel());
Consider not using the same channel for requests and for waiting for replies. This way you introduce a loop and really unexpected behavior.
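For example, a sketch with a dedicated reply channel (publishReplyChannel() is a hypothetical extra bean):
@Bean
MessageChannel publishReplyChannel()
{
    return MessageChannels.direct().get();
}

@Bean
GatewayProxyFactoryBean publishingGateway()
{
    final GatewayProxyFactoryBean proxy = new GatewayProxyFactoryBean(PublishingService.class);
    proxy.setBeanFactory(this.beanFactory);
    proxy.setDefaultRequestChannel(this.publishPackageChannel());
    proxy.setDefaultReplyChannel(this.publishReplyChannel()); // replies no longer loop back into the request flow
    proxy.afterPropertiesSet();
    return proxy;
}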
