Hi, I am new to Spring Integration and I was wondering if anyone knew how to work around an issue I am having. When I set up the mongo outbound-channel-adapter, I receive a list of objects in the custom MappingMongoConverter write method. Does anyone know how to write these to Mongo as individual documents? At the moment I can store them all in one document, but I would prefer one document per object in the list.
Many thanks,
Regards.
Simply add a <splitter/> upstream of the outbound channel adapter (it only needs an input channel and output channel - the default splitter will split a Collection into its individual elements).
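For illustration, a minimal sketch of that arrangement (the channel names are placeholders; your existing mongo outbound-channel-adapter would simply consume from the splitter's output channel, so each element of the list is written as its own document):
<int:splitter input-channel="objectListChannel" output-channel="mongoAdapterChannel"/>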
I say 'not working as expected', but actually it's more like 'I don't really know if I'm doing the right thing here'; I feel like I'm mixing stuff from different approaches that doesn't really fit together.
Right now I've been using Spring Cloud Stream to process String-type messages from a PubSub subscription, and so far so good: message in, message out without much hassle.
What I'm trying to achieve now is to gather, let's say, 1000 messages, process them and send them all together to another PubSub topic. I'm still unsure whether to send them as a List or individually like now, but all at the same time (this shouldn't be related to this question, though).
Now I just discovered the following property.
spring.cloud.stream.bindings.input.consumer.batch-mode=true
Together with the following ones, which are more specific to the GCP side.
spring.cloud.gcp.pubsub.publisher.batching.enabled=true
spring.cloud.gcp.pubsub.publisher.batching.delay-threshold-seconds=300
spring.cloud.gcp.pubsub.publisher.batching.element-count-threshold=100
So the first question is: are they linked in any way? Must I set the first one together with the other three?
After I added the previous properties to my application.properties file, nothing actually changed. Messages keep arriving and leaving the application without any issue, and with no batching whatsoever.
I am currently using the functional features in the following way.
@Bean
public Function<Message<String>, String> sampleFunction() {
    return message -> {
        ... // Stream processing in here
        return processedString;
    };
}
I was expecting this to crash with some message, since the method only receives a String, not a list of Strings. Since it didn't crash, I modified the method above to receive a list of Strings (maybe Spring does some magic behind the scenes to still receive the messages as Strings but collect them in a list for the method to process afterwards?).
@Bean
public Function<Message<List<String>>, String> sampleFunction() {
    return message -> {
        ... // Stream processing in here
        return processedString;
    };
}
But this just crashes since it's trying to parse a single String message as a List of String.
How could I prepare the code to batch all those String messages into a List? Is there any example on this?
...batch-mode only works with binders that support it (e.g. Kafka, RabbitMQ). It doesn't look like the GCP binder supports it (I see no references to the property).
https://github.com/spring-cloud/spring-cloud-gcp/blob/master/spring-cloud-gcp-pubsub-stream-binder/src/main/java/org/springframework/cloud/gcp/stream/binder/pubsub/PubSubMessageChannelBinder.java
https://docs.spring.io/spring-cloud-stream/docs/3.1.0/reference/html/spring-cloud-stream.html#_batch_consumers
Publisher batching is not related to consumer batching.
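For reference, a minimal sketch of what a batch consumer could look like with a binder that does support batch mode (for example the Kafka binder); the function name, binding property and aggregation logic are illustrative assumptions, not taken from the question:
import java.util.List;
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BatchConsumerConfig {

    // Enabled with: spring.cloud.stream.bindings.batchFunction-in-0.consumer.batch-mode=true
    // The binder then delivers a whole batch as a single List<String> payload.
    @Bean
    public Function<List<String>, String> batchFunction() {
        return batch -> {
            // process the entire batch here, e.g. collapse it into one outgoing payload
            return String.join(",", batch);
        };
    }
}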
This is my first post here and I am not sure if this has been covered before, but here goes: I have a Kafka Streams application, using the Processor API, with the following topology:
1. Consume data from an input topic (processor.addSource())
2. Insert data into a DB (processor.addProcessor())
3. Produce its process status to an output topic (processor.addSink())
The app works great; however, for traceability purposes, I need to log the moment Kafka Streams produces a message to the output topic, as well as its RecordMetadata (topic, partition, offset).
Example below:
KEY="MY_KEY" OUTPUT_TOPIC="MY-OUTPUT-TOPIC" PARTITION="1" OFFSET="1000" STATUS="SUCCESS"
I am not sure if there is a way to override the default Kafka Streams producer to add this logging, or maybe to create my own producer and plug it into the addSink process. I partially achieved it by implementing my own ExceptionHandler (default.producer.exception.handler), but it only covers exceptions.
Thanks in advance,
Guilherme
If you configure the Streams application to use a ProducerInterceptor, you should be able to get the information you need. Specifically, implementing onAcknowledgement() will provide access to everything you listed above.
To configure interceptors in a streams application:
Properties props = new Properties();
// add this configuration in addition to your other streams configs
props.put(StreamsConfig.producerPrefix(ProducerConfig.INTERCEPTOR_CLASSES_CONFIG), Collections.singletonList(MyProducerInterceptor.class));
You can provide more than one interceptor if desired, just add the class name and change the list implementation from a singleton to a regular List. Execution of the interceptors follows the order of the classes in the list.
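A sketch of what such an interceptor could look like (the class name and log format are placeholders; note that onAcknowledgement() exposes the RecordMetadata but not the record key, so the key would have to be carried over from onSend() if you need it in the log line):
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerInterceptor;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class MyProducerInterceptor implements ProducerInterceptor<String, String> {

    @Override
    public ProducerRecord<String, String> onSend(ProducerRecord<String, String> record) {
        return record; // pass the record through unchanged
    }

    @Override
    public void onAcknowledgement(RecordMetadata metadata, Exception exception) {
        if (exception == null && metadata != null) {
            // log the metadata of the record that was just acknowledged by the broker
            System.out.printf("OUTPUT_TOPIC=\"%s\" PARTITION=\"%d\" OFFSET=\"%d\" STATUS=\"SUCCESS\"%n",
                    metadata.topic(), metadata.partition(), metadata.offset());
        }
    }

    @Override
    public void close() {
    }

    @Override
    public void configure(Map<String, ?> configs) {
    }
}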
EDIT: Just to be clear, you can override the provided Producer in Kafka Streams via the KafkaClientSupplier interface, but IMHO using an interceptor is the cleaner approach. But which direction to go is up to you. You pass in your KafkaClientSupplier in an overloaded Kafka Streams constructor.
I am using Spring Integration with the default correlation strategy, that is, I am not explicitly writing code for the correlation strategy. Everything works fine up to the splitter. After the splitter there is a service activator which does some processing and then puts the message into a channel from which the aggregator has to pick it up, but the aggregator doesn't pick it up. So I put an interceptor in place to find out what was going on and found that before the message is put into the aggregator channel, the aggregation-related headers such as the correlation id are present, but once it is put into the channel the headers are lost. Now I am not sure why the aggregator, or the channel before it, is losing the headers. Any help would be much appreciated.
UPDATE: I am using a splitter, then an activator, then another splitter, then an activator, then an aggregator, and then another aggregator. The code below is for the inner splitter and aggregator combination.
Thanks for your help.
I was able to finally solve this.
The problem was that I was passing org.json.JSONObject to and from the Spring Integration components.
Now, JSONObject is not serializable, and I guess the splitter and aggregator components only work with serializable objects. The simplest way was to convert the JSONObjects to Strings by calling the toString() method on them. It would have been so much easier if the stack trace had told me that I was using a non-serializable object instead of telling me "Null correlation not allowed. Maybe the CorrelationStrategy is failing?"
I have removed the code that I had put here, to be safe.
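For anyone hitting the same issue, the workaround boils down to something like this hypothetical transformer placed in front of the splitter (the channel names are made up):
import org.json.JSONObject;

import org.springframework.integration.annotation.Transformer;

public class JsonToStringTransformer {

    // Convert the non-serializable JSONObject into its String form before it
    // reaches the splitter/aggregator components.
    @Transformer(inputChannel = "jsonChannel", outputChannel = "splitterInputChannel")
    public String toJsonString(JSONObject payload) {
        return payload.toString();
    }
}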
I am a newbie at Spring Batch and have recently started using it.
I have a requirement where I need to post/write the messages read from each DB record to different queues using a single Job. I have to use a reader to read the messages from the DB and a processor to decide which queue to post each one to.
So my question is: can I use a single JmsItemWriter to post the messages to different queues, given that I have to use a single Job and DB reader?
Thanks in Advance
As far as I know, JmsItemWriter does not support it (it writes to the default destination of its JmsTemplate).
But you can implement your own ItemWriter, inject all the JmsTemplates into it, and write custom decision logic to select the appropriate destination and write to it.
Another way is to use a ClassifierCompositeItemWriter: put a set of JMS writers into it and let your classifier select one.
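As an illustration, a minimal sketch of the custom ItemWriter approach (Spring Batch 4.x-style write signature; the routing rule, item type and template names are placeholder assumptions):
import java.util.List;

import org.springframework.batch.item.ItemWriter;
import org.springframework.jms.core.JmsTemplate;

public class RoutingJmsItemWriter implements ItemWriter<String> {

    private final JmsTemplate queueATemplate;
    private final JmsTemplate queueBTemplate;

    public RoutingJmsItemWriter(JmsTemplate queueATemplate, JmsTemplate queueBTemplate) {
        this.queueATemplate = queueATemplate;
        this.queueBTemplate = queueBTemplate;
    }

    @Override
    public void write(List<? extends String> items) throws Exception {
        for (String item : items) {
            // placeholder routing rule: the processor could mark each item, e.g. with a prefix
            if (item.startsWith("A:")) {
                queueATemplate.convertAndSend(item);
            } else {
                queueBTemplate.convertAndSend(item);
            }
        }
    }
}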
I am using Spring Integration and have a large XML file containing a collection of child items. I want to split the file into a set of messages, where the payload of each message will be one of the child XML fragments.
Using a splitter is the obvious approach, but this requires returning a collection of messages, which will exhaust the memory; I need to split the file into individual messages but process them one at a time (or, more likely, with a multi-threaded task executor).
Is there a standard way to do this without writing a custom component that writes the sub-messages to a channel programmatically?
I have been looking for a similar solution and I have not found any standard way of doing this either.
Here is a rather dirty fix, if anyone needs this behavior implemented:
Split the files manually using a Service Activator or a Splitter with a custom bean.
<int:splitter input-channel="rawChannel" output-channel="splitChannel" id="splitter" >
<bean class="com.a.b.c.MYSplitter" />
</int:splitter>
Your custom bean should implement ApplicationContextAware so the application context can be injected by Spring.
Manually retrieve the output channel and send each sub-message
MessageChannel splitChannel = (MessageChannel) applicationContext.getBean("splitChannel");
Message<String> message = new GenericMessage<String>(payload);
splitChannel.send(message);
For people coming across this very old question: splitters can now handle results of type Iterable, Iterator, Stream, and Flux (Project Reactor). If any of these types are returned, messages are emitted one at a time.
Iterator/Iterable since 4.0.4; Stream/Flux since 5.0.0.
There is also now a FileSplitter, which emits file contents a line at a time via an Iterator - since 4.1.2.
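For example, a minimal sketch of an Iterator-returning splitter (4.0.4+); the channel names, the child element name and the use of Scanner as a lazy Iterator<String> are illustrative assumptions, not production-grade XML parsing:
import java.io.File;
import java.io.FileNotFoundException;
import java.util.Iterator;
import java.util.Scanner;

import org.springframework.integration.annotation.Splitter;

public class StreamingXmlSplitter {

    // Returning an Iterator makes the splitter emit one child message at a time
    // instead of building the whole collection of fragments in memory.
    @Splitter(inputChannel = "rawChannel", outputChannel = "splitChannel")
    public Iterator<String> split(File xmlFile) throws FileNotFoundException {
        // Scanner implements Iterator<String>; the lookbehind delimiter lazily yields
        // one fragment per closing </item> tag ("item" is a placeholder element name).
        return new Scanner(xmlFile).useDelimiter("(?<=</item>)");
    }
}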