Spring XD custom job execution event

I can receive predefined job execution events for my Spring XD job as follows:
xd>job create --name myHttpJob --definition "httpJob" --deploy
xd>stream create --name jobExecutionEvents --definition "tap:job:myHttpJob.job >log" --deploy
Reference: http://docs.spring.io/spring-xd/docs/1.0.3.RELEASE/reference/html/#_retrieve_job_notifications
However, I would like to fire my own custom event and be able to do something with it: perhaps create my own custom JobExecution event and publish it, or create a totally new custom event and a new listener. I'm having trouble finding the best way to do this; my question is, what is the best way to do this in Spring XD?

It's not so easy to create a new tap for a custom event (you'd need a plugin to bind it to the bus).
However, you can easily add a listener to your job config and publish events to the aggregated events channel bean.
See https://github.com/spring-projects/spring-xd/blob/master/spring-xd-dirt/src/main/resources/META-INF/spring-xd/plugins/job/job-module-beans.xml
Only your events (and any others you enable) will go to the aggregated events channel.
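For illustration, here's a rough sketch of such a listener; only the xd.job.aggregatedEvents channel name comes from the job-module-beans.xml linked above, the class name and payloads are made up. It's a Spring Batch JobExecutionListener, wired into the job module's config, that sends custom payloads to the aggregated events channel:

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.MessageBuilder;

// Hypothetical listener; wire it into the job and inject the
// xd.job.aggregatedEvents channel bean from the job module context.
public class CustomEventListener implements JobExecutionListener {

    private final MessageChannel aggregatedEvents;

    public CustomEventListener(MessageChannel aggregatedEvents) {
        this.aggregatedEvents = aggregatedEvents;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // Custom payload; it flows to the job's tap channel.
        this.aggregatedEvents.send(
                MessageBuilder.withPayload("myCustomEvent: job starting").build());
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        this.aggregatedEvents.send(
                MessageBuilder.withPayload("myCustomEvent: finished with status "
                        + jobExecution.getStatus()).build());
    }
}

Anything published this way shows up on the job's tap stream alongside the standard notifications (see the EDIT below for a demonstration).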
If you want to explore adding your own tap, see https://github.com/spring-projects/spring-xd/blob/master/spring-xd-dirt/src/main/java/org/springframework/xd/dirt/plugins/job/JobEventsListenerPlugin.java for how the standard channels are bound to the bus.
Of course, you could always have your listener publish to RabbitMQ outside of the XD infrastructure, via an <int-amqp:outbound-channel-adapter/> from within your job configuration (but don't use a bus-based queue for that).
EDIT in response to your comment below.
I just tried it with Spring XD 1.1.0.RELEASE with no problems.
I added this
<int:inbound-channel-adapter expression="'foo'" channel="xd.job.aggregatedEvents">
    <int:poller fixed-delay="5000"/>
</int:inbound-channel-adapter>
to the timestampfile job (I added the int namespace too).
This sends the literal foo to the aggregated events channel.
I then did this...
xd:>job create --name jobxxx --definition timestampfile
Successfully created job 'jobxxx'
xd:>job deploy jobxxx
Deployed job 'jobxxx'
xd:>stream create foo --definition "tap:job:jobxxx > log" --deploy
Created and deployed new stream 'foo'
...and saw this on the console...
18:29:13,392 INFO xdbus.tap:job:jobxxx.a1de5739-4399-4186-94de-33c5290a8411-1 sink.foo - foo
18:29:18,388 INFO xdbus.tap:job:jobxxx.a1de5739-4399-4186-94de-33c5290a8411-1 sink.foo - foo
18:29:23,390 INFO xdbus.tap:job:jobxxx.a1de5739-4399-4186-94de-33c5290a8411-1 sink.foo - foo
18:29:28,390 INFO xdbus.tap:job:jobxxx.a1de5739-4399-4186-94de-33c5290a8411-1 sink.foo - foo

Related

Spring Boot: start listening to messages on application start

I have a Spring Boot application that starts listening on Azure IoT Hub at application start. It is done this way:
@EventListener
public void subscribeEventMessages(ContextRefreshedEvent event) {
    client
        .receive(false) // set this to false to read only the newly available events
        .subscribe(this::hubAllEventsCallback);
}
My problem is that this uses ContextRefreshedEvent, but in fact I only want to start it once, on application start.
I also checked other methods to start something at the beginning, like CommandLineRunner.
On the other hand, when implementing listeners for more standard stuff like JMS, there are specific annotations like @JmsListener, or you provide beans of specific types.
My question is: can I leverage some of these more message(subscribe)-related mechanisms to start my method?
If you don't want your @EventListener to listen on "context refreshed" but only on "context started", (try to) replace:
ContextRefreshedEvent
with ContextStartedEvent
...which is a sibling class with exactly this semantic difference.
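For illustration, a minimal sketch of that change (the class name is made up; note that ContextStartedEvent is only published when the application context's start() method is explicitly invoked):

import org.springframework.context.event.ContextStartedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class HubSubscriber {

    // Fires on explicit context start, not on every context refresh.
    @EventListener
    public void subscribeEventMessages(ContextStartedEvent event) {
        // client.receive(false).subscribe(this::hubAllEventsCallback);
    }
}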

Need some guidance with Spring Integration Flow

I am new to Spring Integration and have read quite a lot of documentation and other topics here on Stack Overflow. But I am still a bit overwhelmed about how to apply the newly acquired knowledge in a Spring Boot application.
This is what should happen:
receive a message from a Kafka topic, e.g. from "request-topic" (payload is a custom Job POJO). InboundChannelAdapter?
do some preparation (checkout from a git repo)
process files using a batch job
commit & push to git, update the Job object with the commit-id
publish a message to Kafka with the updated Job object, e.g. to "reply-topic". OutboundChannelAdapter?
Using DSL or plain Java configuration does not matter. My problem after trying several variants is that I could not achieve the desired result. For example, handlers would be called too early, or not at all, and thus the reply in step 5 would not be updated.
Also, there should only be one flow running at any given time, so I guess a queue should be involved at some point, probably at step 1(?).
Where and when should I use QueueChannels, DirectChannels (or any other), and do I need GatewayHandlers, e.g. to reply with a commit-id?
Any hints are appreciated.
Something like this:
@Bean
IntegrationFlow flow() {
    return IntegrationFlows.from(Kafka.inboundGateway(...))
            .handle(...)      // prep (e.g. the git checkout)
            .transform(...)   // to JobLaunchRequest
            .handle(...)      // JobLaunchingGateway
            .handle(...)      // clean up and return the result
            .get();
}
It will only process one request at a time (with default concurrency).
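For the transform step, a hedged sketch of mapping the custom Job POJO to a Spring Batch JobLaunchRequest for the JobLaunchingGateway (MyJob, repoUrl, and the parameter names are assumptions):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.integration.launch.JobLaunchRequest;

public class JobRequestTransformer {

    // Hypothetical POJO standing in for the custom Job payload from Kafka.
    public record MyJob(String repoUrl) {}

    // Builds the request that the JobLaunchingGateway consumes.
    public JobLaunchRequest toRequest(MyJob payload, Job batchJob) {
        return new JobLaunchRequest(batchJob,
                new JobParametersBuilder()
                        .addString("repoUrl", payload.repoUrl())   // carried into the batch job
                        .addLong("ts", System.currentTimeMillis()) // keeps each run's parameters unique
                        .toJobParameters());
    }
}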

Set MDC properties only for rabbitmq events

I want to apply logging filters only for RabbitMQ events, using MDC properties, and set the trace id and correlation id from the event headers.
I already have a RequestResponseLoggingFilter which is used for setting the tenant id.
I am not sure how to trigger this filter only for async RabbitMQ events.
If you are using a MessageListener or @RabbitListener you can add a MessagePostProcessor to the listener container (or listener container factory, respectively) in the afterReceivePostProcessors property.
The post processor(s) are called after a message is received and before the listener is called.
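A minimal sketch, assuming a @RabbitListener container factory; the traceId header name and the MDC keys are assumptions:

import org.slf4j.MDC;
import org.springframework.amqp.core.Message;
import org.springframework.amqp.core.MessagePostProcessor;

public class MdcMessagePostProcessor implements MessagePostProcessor {

    @Override
    public Message postProcessMessage(Message message) {
        // Runs after the message is received, before the listener is invoked.
        Object traceId = message.getMessageProperties().getHeader("traceId"); // assumed header
        if (traceId != null) {
            MDC.put("traceId", traceId.toString());
        }
        String correlationId = message.getMessageProperties().getCorrelationId();
        if (correlationId != null) {
            MDC.put("correlationId", correlationId);
        }
        return message;
    }
}

Register it with factory.setAfterReceivePostProcessors(new MdcMessagePostProcessor()), and remember to clear the MDC after the listener returns.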

How to start/stop int-aws:s3-inbound-channel-adapter manually

How do I customize the start/stop of the AWS S3 inbound channel adapter? I want to set auto-startup="false" initially and start the adapter manually when the server starts. I'm looking for a solution similar to the one below for the file inbound channel adapter:
inboundFileAdapterChannel.send(new GenericMessage<>("@'s3FilesChannelId.adapter'.start()"));
Config:
If I try the same approach for the S3 inbound adapter channel, I get the below error:
APPLICATION FAILED TO START
Description:
A component required a bean named 's3FilesChannelId.adapter' that could not be found.
Action:
Consider defining a bean named 's3FilesChannelId.adapter' in your configuration.
Let's assume we have a channel adapter like this:
<int-aws:s3-inbound-channel-adapter id="s3Inbound"
        channel="s3Channel"
        session-factory="s3SessionFactory"
        auto-create-local-directory="true"
        auto-startup="false"
        delete-remote-files="true"
        preserve-timestamp="true"
        filename-pattern="*.txt"
        local-directory="."
        remote-file-separator="\"
        local-filename-generator-expression="#this.toUpperCase() + '.a' + #fooString"
        comparator="comparator"
        temporary-file-suffix=".foo"
        local-filter="acceptAllFilter"
        remote-directory-expression="'foo/bar'">
    <int:poller fixed-rate="1000"/>
</int-aws:s3-inbound-channel-adapter>
Pay attention to the auto-startup="false" and to the id="s3Inbound".
So, it isn't going to be started automatically after application context initialization.
However, using that s3Inbound id we can start it manually whenever it is convenient for us.
Your story about inboundFileAdapterChannel is not clear, but you can still inject a Lifecycle for the mentioned channel adapter and call its start():
@Autowired
@Qualifier("s3Inbound")
private Lifecycle s3Inbound;
...
this.s3Inbound.start();
The piece of code about inboundFileAdapterChannel looks like a reference to the Control Bus approach, but that's a slightly different story: https://docs.spring.io/spring-integration/docs/current/reference/html/system-management-chapter.html#control-bus
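For reference, a hedged sketch of that control bus variant; it assumes a <int:control-bus input-channel="controlBusChannel"/> endpoint is declared, and s3Inbound is the adapter id from the config above:

import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.GenericMessage;

public class S3AdapterControl {

    // The control bus evaluates the SpEL expression and starts the endpoint.
    public void start(MessageChannel controlBusChannel) {
        controlBusChannel.send(new GenericMessage<>("@s3Inbound.start()"));
    }
}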

Spring Boot and long running tasks

In my Spring Boot application I have to implement an import service. Users can submit a bunch of JSON files and the application will try to import the data from these files. Depending on the amount of data in the JSON files, a single import process can take 1 or 2 hours.
I do not want to block the users during the import process, so I plan to accept the task for importing and notify the user that the data is scheduled for processing. I'll put the data into a queue and a free queue consumer on the other end will start the import process. I also need the ability to monitor the jobs in the queue and terminate them if needed.
Right now I'm thinking of using embedded Apache ActiveMQ to introduce the message producer and consumer logic, but before that I'd like to ask, from an architectural point of view: is this a good choice for the described task, or can it be implemented with more appropriate tools, like plain Spring @Async, and so on?
It is possible to process files concurrently with Camel like this:
from("file://incoming?maxMessagesPerPoll=1&idempotent=true&moveFailed=failed&move=processed&readLock=none")
    .threads(5)
    .process(...) // your Processor here
Take a look at http://camel.apache.org/file2.html
But I think that for your requirements it is better to use a standalone ActiveMQ, a standalone service to move files to ActiveMQ, and a standalone consumer, so you can kill or restart each one independently.
It is better to use ActiveMQ, as you said, and you can easily create a service to move messages to a queue with Camel like this:
CamelContext context = new DefaultCamelContext();
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory("vm://localhost?broker.persistent=true");
context.addComponent("test-jms", JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
context.addRoutes(new RouteBuilder() {
    public void configure() {
        // convertBodyTo(String.class) to use TextMessage, or maybe send them as files to the queue
        from("file://testFolderPath").convertBodyTo(String.class).to("test-jms:queue:test.queue");
    }
});
context.start();
Here are some examples:
http://www.programcreek.com/java-api-examples/index.php?api=org.apache.camel.component.jms.JmsComponent
https://skills421.wordpress.com/2014/02/08/sending-local-files-to-a-jms-queue/
https://github.com/apache/camel/blob/master/examples/camel-example-jms-file/src/main/java/org/apache/camel/example/jmstofile/CamelJmsToFileExample.java
https://github.com/apache/camel/tree/master/examples
To monitor and manage, you can use JMX with VisualVM or hawtio: http://hawt.io/getstarted/index.html
http://camel.apache.org/camel-jmx.html
To consume, you can use a DefaultMessageListenerContainer with concurrent consumers on the queue; for that you need to change the prefetchPolicy on the ConnectionFactory used by the DefaultMessageListenerContainer (see Multithreaded JMS client ActiveMQ).
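A minimal sketch of that consumer side, assuming an embedded broker URL and a queue named import.queue (both assumptions):

import javax.jms.MessageListener;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class ImportConsumerConfig {

    public DefaultMessageListenerContainer importContainer(MessageListener listener) {
        ActiveMQConnectionFactory cf = new ActiveMQConnectionFactory("vm://localhost");
        // Lower the prefetch so messages are shared fairly across the concurrent consumers.
        cf.getPrefetchPolicy().setQueuePrefetch(1);

        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(cf);
        container.setDestinationName("import.queue"); // assumed queue name
        container.setConcurrentConsumers(5);
        container.setMessageListener(listener);
        return container;
    }
}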
