Extending Spring Integration while maintaining previous functionality? - spring

So I'm new to Spring Integration, and fairly new to Spring as well, so I might not be up on all of the terminology, but I've run into the following scenario:
I have a small Spring Integration application with three SI flows... each flow has its own Gateway, and each Gateway has its own request channel and reply channel. These flows receive a null invocation (for all intents and purposes just a 'GO' signal / empty message) and reply with a status message, depending upon the (trivial) business logic results.
I would now like to wire each of these flows together to run in one 'master flow', given one request, without taking away their ability to run separately, and I'd like to wire it up completely through annotation / XML (i.e. given a controller that invokes the main gateway's service interface, no additional code needs to be written outside of annotation / XML configuration).
Is this feasible? Which Integration components should I be using to do so, or should I just be adjusting the expected channels for each of these gateways so that they meet each other end-to-end (and if so, how would that strategy still allow each of the flows to be called on a case-by-case basis)?
In addition, if this is not feasible, would it be appropriate to use a service activator to invoke each of the child flows? I wanted to avoid coding more, but if that is the only option, I guess that it'll have to do.
Thanks!

Probably the simplest way to do this is to use Spring profiles (a Spring 3.1 feature). When deployed in stand-alone mode, the final element can be a "bridge to nowhere"...
<int:bridge input-channel="app1Final" />
... when the final element in a flow has no output channel, the message is returned to the gateway's reply channel. If you prefer to explicitly configure the bridge to point to the gateway's reply-channel, that's ok too; it's just not needed.
In the "linked" profile, you configure the bridge thus...
<int:bridge input-channel="app1Final" output-channel="app2Inbound"/>
...where app2Inbound is the same as that app's gateway's request-channel.
<beans profile="default">
<int:bridge input-channel="app1Final" />
</beans>
<beans profile="linked">
<int:bridge input-channel="app1Final" output-channel="app2Inbound"/>
</beans>
To run with the linked profile, set the system property spring.profiles.active to linked.
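For illustration, here is a minimal usage sketch (the gateway interface, method and context file names are hypothetical, not taken from the question): with the linked profile active, a single call to the first flow's gateway runs the whole chain, because each flow's final bridge feeds the next flow's request channel, and the last flow's final endpoint, having no output-channel, returns its result to this same gateway call.
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class LinkedFlowsDemo {

    // the first flow's existing gateway interface (hypothetical name),
    // registered via <int:gateway service-interface=.../> with request-channel app1Inbound
    public interface App1Gateway {
        String go(String signal);
    }

    public static void main(String[] args) {
        // equivalent to passing -Dspring.profiles.active=linked on the command line
        System.setProperty("spring.profiles.active", "linked");
        ClassPathXmlApplicationContext ctx =
                new ClassPathXmlApplicationContext("si-flows-context.xml");
        App1Gateway gateway = ctx.getBean(App1Gateway.class);
        // the reply is the status message produced by the last flow in the chain
        System.out.println(gateway.go("GO"));
        ctx.close();
    }
}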

Related

Spring Mqtt - Publish messages to multiple topics programmatically

How can I publish messages with different topics programmatically?
<mqtt:outbound-channel-adapter id="mqttOut"
auto-startup="true"
client-id="foo"
url="tcp://localhost:1883"
client-factory="clientFactory"
default-qos="0"
default-retained="false"
default-topic="bar"
async="true"
async-events="true" />
I tried Spring integration MQTT publish & subscribe to multiple topics, but was not able to get it configured.
I also tried MqttPahoMessageHandlerAdapter, which has a publish() method, but it is protected.
Going with org.eclipse.paho.client.mqttv3.MqttAsyncClient and org.eclipse.paho.client.mqttv3.MqttCallback directly is very easy, but I would like to stick with Spring all the way.
I'd appreciate it if somebody could point me in the right direction.
Declare a <publish-subscribe-channel id="toMqtt" />; set it as the channel attribute on each outbound channel adapter; the message will be sent to each adapter.
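If it helps, here is a rough sketch of the same fan-out using Spring Integration's Java configuration (bean names, client ids, broker URL and topics are my assumptions, and it presumes a Spring Integration version that supports @EnableIntegration); each adapter subscribed to the publish-subscribe channel gets its own copy of the message and publishes it to its own default topic:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.channel.PublishSubscribeChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.mqtt.outbound.MqttPahoMessageHandler;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessageHandler;

@Configuration
@EnableIntegration
public class MqttFanOutConfig {

    @Bean
    public MessageChannel toMqtt() {
        // every subscriber (each MQTT adapter below) receives each message
        return new PublishSubscribeChannel();
    }

    @Bean
    @ServiceActivator(inputChannel = "toMqtt")
    public MessageHandler mqttOutTopicA() {
        MqttPahoMessageHandler handler = new MqttPahoMessageHandler("tcp://localhost:1883", "fooA");
        handler.setDefaultTopic("topicA");
        return handler;
    }

    @Bean
    @ServiceActivator(inputChannel = "toMqtt")
    public MessageHandler mqttOutTopicB() {
        MqttPahoMessageHandler handler = new MqttPahoMessageHandler("tcp://localhost:1883", "fooB");
        handler.setDefaultTopic("topicB");
        return handler;
    }
}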
You can do that with Spring Integration anyway! With a lot of EIP component implementations and Spring power on board (injection, SpEL, etc.), plus a bit of imagination, we can meet almost any end-application requirement even without Java code.
So, <mqtt:outbound-channel-adapter> allows you to determine the topic at runtime: instead of relying on default-topic, you supply the MqttHeaders.TOPIC message header.
So, if you have a requirement to send the same message to several topics, you just build a copy of that message for each topic. The <splitter> can help us here:
<int:splitter input-channel="enricheMessage" output-channel="sendMessage" apply-sequence="false">
<int-groovy:script>
['topic1', 'topic2', 'topic3'].collect {
org.springframework.integration.support.MessageBuilder.withPayload(payload)
.copyHeaders(headers)
.setHeader(org.springframework.integration.mqtt.support.MqttHeaders.TOPIC, it)
.build()
}
</int-groovy:script>
</int:splitter>
sendMessage can be an ExecutorChannel to achieve parallel publishing.
UPDATE
You can achieve the same iteration and message enrichment logic with similar Java code using ref and method on <splitter>.
Of course, we could do the same with SpEL, but it would look a bit complex with a Collection Projection.
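A hedged sketch of that Java variant (class name and topics are placeholders, and Spring Integration 4.x package names are assumed), wired with something like <int:splitter input-channel="enricheMessage" output-channel="sendMessage" ref="topicSplitter" method="split" apply-sequence="false"/>:
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.springframework.integration.mqtt.support.MqttHeaders;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.Message;

public class TopicSplitter {

    public List<Message<?>> split(Message<?> message) {
        List<Message<?>> copies = new ArrayList<Message<?>>();
        for (String topic : Arrays.asList("topic1", "topic2", "topic3")) {
            // one copy of the original message per topic, differing only in the TOPIC header
            copies.add(MessageBuilder.withPayload(message.getPayload())
                    .copyHeaders(message.getHeaders())
                    .setHeader(MqttHeaders.TOPIC, topic)
                    .build());
        }
        return copies;
    }
}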

Spring Integration/RabbitMQ/AMQP: How do I create outbound-channel-adapters for dynamic input channels?

I'm working on abstracting out any sort of messaging framework for some code I'm working on. Basically, I'm using a combination of Spring AOP and Spring Integration to generate messages without the Java code knowing anything about RabbitMQ, JMS, or even Spring Integration. That said, what I'm using to generate the messages is contained in its own .jar, and it is re-used by several other areas of the application.
I currently have the messaging system set up such that the channels on which messages are sent are specified by the code that calls the system (i.e., channels are generated automatically based on the external method invocation) by specifying the channel name in the message header and using a header-value router to create the channels if they don't exist.
My issue comes in on the endpoint of these channels - the intention of the current structure is to allow Spring to change to any messaging structure as requirements specify or change. I know how to take a static channel and use outbound channel converters/gateways to send it to a pre-specified RabbitMQ/JMS queue and process from there; what I'm struggling with is how to tell Spring that I need every channel created by the router to have a RabbitMQ (or whatever other messaging system gets implemented) outbound channel adapter that's dynamically generated based on the channel name, since we don't know channel names beforehand.
Is this possible? And if not, would you mind providing input as to what could perhaps be a better way?
Thanks ahead of time!
Here's a basic template of what my config file looks like - I have an initial channel ("messageChannel") whose messages are sent to a publish-subscribe channel or a queue channel, depending on one of the message headers, and are routed onward from there.
<!--Header value based channel configurations-->
<int:channel id="messageChannel" />
<int:channel id="queue" />
<int:publish-subscribe-channel id="topic" />
<!--Header-based router to route to queue or topic channels-->
<int:header-value-router input-channel="messageChannel"
header-name="#{ T(some.class.with.StringConstants).CHANNEL_TYPE}" />
<!--Re-routes messages according to their destination and messaging type-->
<int:header-value-router input-channel="queue"
header-name="#{ T(some.class.with.StringConstants).MESSAGE_DESTINATION}" />
<int:header-value-router input-channel="topic"
header-name="#{ T(some.class.with.StringConstants).MESSAGE_DESTINATION}" />
<!--AOP configuration - picks up on any invocation of some.class.which.generates.Messages.generateMessage()
from a Spring-managed context.-->
<aop:config>
<aop:pointcut id="eventPointcut"
expression="execution(* some.class.which.generates.Messages.generateMessage(..))" />
<aop:advisor advice-ref="interceptor" pointcut-ref="eventPointcut"/>
</aop:config>
<int:publishing-interceptor id="interceptor" default-channel="messageChannel">
<int:method pattern="generateMessage" payload="#return" channel="messageChannel" />
</int:publishing-interceptor>
See the dynamic-ftp sample; it uses a dynamic router that creates new outbound endpoints/channels on demand.
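As a rough illustration of that pattern, adapted to AMQP rather than FTP (the header name, child context file and bean names below are assumptions, not part of the sample, and Spring Integration 4.x package names are assumed): a POJO router, wired with <int:router input-channel="queue" ref="dynamicDestinationRouter" method="route"/>, that spins up a small child context containing one outbound adapter the first time it sees a destination, and caches the resulting channel.
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;
import org.springframework.core.env.PropertiesPropertySource;
import org.springframework.core.env.StandardEnvironment;
import org.springframework.messaging.Message;
import org.springframework.messaging.MessageChannel;

public class DynamicDestinationRouter {

    private final Map<String, MessageChannel> channels = new ConcurrentHashMap<String, MessageChannel>();

    public MessageChannel route(Message<?> message) {
        String destination = (String) message.getHeaders().get("destination");
        MessageChannel channel = channels.get(destination);
        if (channel == null) {
            channel = createOutboundAdapterFor(destination);
            channels.put(destination, channel);
        }
        return channel;
    }

    private synchronized MessageChannel createOutboundAdapterFor(String destination) {
        // rabbit-outbound-context.xml (hypothetical) defines a channel "toRabbit" feeding an
        // <int-amqp:outbound-channel-adapter> whose routing key is the ${routing.key} placeholder
        ConfigurableApplicationContext child = new ClassPathXmlApplicationContext(
                new String[] { "rabbit-outbound-context.xml" }, false);
        StandardEnvironment env = new StandardEnvironment();
        Properties props = new Properties();
        props.setProperty("routing.key", destination);
        env.getPropertySources().addLast(new PropertiesPropertySource("destination", props));
        child.setEnvironment(env);
        child.refresh();
        return child.getBean("toRabbit", MessageChannel.class);
    }
}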

Spring Integration: Can we have two Inbound Channel Adapters at the same time?

I am using
<int:inbound-channel-adapter id="dummyMessageA" channel="messages" method="getMessage" auto-startup="true" ref="messageGenerator">
<int:poller error-channel="errorChannel" fixed-rate="10000"/>
</int:inbound-channel-adapter>
<int:inbound-channel-adapter id="dummyNotif" channel="notifs" method="gtNotif" auto-startup="true" ref="notifGenerator">
<int:poller error-channel="errorChannel" fixed-rate="10000"/>
</int:inbound-channel-adapter>
These inbound channel adapters are independent, but when I deploy my web application, only the second inbound channel adapter is taken into consideration (although the first one was working before I added dummyNotif). Is this normal? Should I add something in the config? (NB: I don't aggregate the messages.)
My guess is that you've hit this issue: https://jira.spring.io/browse/INT-3240 - 'Inbound Channel Adapter Parser doesn't generate unique bean Id for MessageSources'. That means you are using Spring Integration 3.0.
So, just upgrade to the latest - 3.0.2.RELEASE - and let us know.
UPDATE
Regarding the same id for several beans: by default Spring allows it, and the last bean wins; all the others are ignored and skipped.
This can be disabled via AbstractRefreshableApplicationContext#setAllowBeanDefinitionOverriding(false).
On the other hand, if you set the DEBUG logging level for the org.springframework category, you'll see messages in the logs saying that your beans are overridden.
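For example, in a web application, one way to flip that switch is an ApplicationContextInitializer registered via the contextInitializerClasses context-param in web.xml (a minimal sketch, assuming an XML-based web application context; the class name is mine):
import org.springframework.context.ApplicationContextInitializer;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.support.AbstractRefreshableApplicationContext;

public class NoBeanOverridingInitializer
        implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext applicationContext) {
        if (applicationContext instanceof AbstractRefreshableApplicationContext) {
            // duplicate bean ids now fail fast at startup instead of the last one silently winning
            ((AbstractRefreshableApplicationContext) applicationContext)
                    .setAllowBeanDefinitionOverriding(false);
        }
    }
}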
As far as your question is concerned, Spring Integration does allow multiple inbound-channel-adapter definitions in a single context.
However, from your comments, it seems that you have a different issue in your configuration: multiple service activators with the same id.
That bean overriding can be disabled as @Artem described in his answer.

Passing a Spring bean to a Camel component

I have a custom component of type FooComponent which is added to the route by the following lines:
from("foo://bar?args=values&etc")
.bean(DownstreamComponent.class)
...
FooComponent creates an endpoint and consumer (of type FooConsumer) which in turn emits messages which get to the DownstreamComponent and the rest of the pipeline.
For monitoring, I need the FooComponent consumer to call a method on a non-Camel object, which I'm creating as a Spring bean. The Camel pipeline is very performance sensitive so I'm unable to divide the FooComponent into two halves and insert the monitor call as a Camel component between them (my preferred solution, since FooComponent shouldn't really have to know about the monitor). And I'm reluctant to turn the method call into a Camel Message that will be picked up by the monitoring component later in the pipeline, as the pipeline filtering becomes complicated later and I don't want to meddle with it more than necessary.
Somewhere inside FooConsumer, I have:
// in the class
@Autowired
Monitor monitor;
// inside the consumer's run method
monitor.noticeSomething();
The problem is that monitor will never be set to the Monitor bean which is created in the rest of the application. As I understand it, it's because FooConsumer itself is not visible to Spring -- an object of that type is created normally inside FooComponent.
So, how can I get FooComponent to find the Monitor instance that it needs to use?
Can I pass it in when the route is created? This seems tricky because the definition is a faux URL "foo://bar?args=values&etc"; I haven't found how to pass Java objects that way.
Can I get Spring to find that @Autowired annotation inside FooConsumer and inject the monitor object somehow?
If you have a singleton instance of Monitor, you ought to be able to @Autowire it into the FooComponent class, since Camel will let Spring dependency-inject the FooComponent when it is created.
Then you can pass the monitor instance on when you create the endpoint / consumer from your component.
The easiest thing to do is to create a Monitor property on the FooComponent class, and wire it in like any other bean.
<bean id="monitorBean" class="my.Monitor"/>
<bean id="foo" class="my.FooComponent">
<property name="monitor" ref="monitorBean"/>
</bean>
Then in your FooConsumer, when you need to get hold of the monitor, call:
Monitor monitor = ((FooComponent) getEndpoint().getComponent()).getMonitor();
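To make that concrete, here is a hedged sketch of the component side of this wiring (Monitor and FooEndpoint are the question's own classes; everything else is an assumption): FooComponent just holds the monitor as a plain bean property, and the consumer reaches it through getEndpoint().getComponent() as shown above.
import java.util.Map;

import org.apache.camel.Endpoint;
import org.apache.camel.impl.DefaultComponent;

public class FooComponent extends DefaultComponent {

    private Monitor monitor; // injected via the <property name="monitor" ref="monitorBean"/> above

    public Monitor getMonitor() {
        return monitor;
    }

    public void setMonitor(Monitor monitor) {
        this.monitor = monitor;
    }

    @Override
    protected Endpoint createEndpoint(String uri, String remaining, Map<String, Object> parameters)
            throws Exception {
        // the existing endpoint/consumer creation stays as-is
        return new FooEndpoint(uri, this);
    }
}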
If you were changing the monitor bean on a per-endpoint basis, you could use Camel's nifty # syntax to locate a bean with that id, and inject it into an Endpoint property.
foo:something?monitor=#monitorBean
Then to use it in your FooConsumer you simply say:
Monitor monitor = ((FooEndpoint) getEndpoint()).getMonitor();
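A hedged sketch of that per-endpoint variant (again reusing the question's FooComponent/FooConsumer names; it assumes the component applies the URI parameters to the endpoint, e.g. via DefaultComponent's setProperties, so that Camel resolves #monitorBean from the registry into this setter):
import org.apache.camel.Consumer;
import org.apache.camel.Processor;
import org.apache.camel.Producer;
import org.apache.camel.impl.DefaultEndpoint;

public class FooEndpoint extends DefaultEndpoint {

    private Monitor monitor; // bound from foo:something?monitor=#monitorBean

    public FooEndpoint(String uri, FooComponent component) {
        super(uri, component);
    }

    public Monitor getMonitor() {
        return monitor;
    }

    public void setMonitor(Monitor monitor) {
        this.monitor = monitor;
    }

    @Override
    public Producer createProducer() throws Exception {
        throw new UnsupportedOperationException("consumer-only endpoint in this sketch");
    }

    @Override
    public Consumer createConsumer(Processor processor) throws Exception {
        return new FooConsumer(this, processor); // the question's existing consumer (constructor assumed)
    }

    @Override
    public boolean isSingleton() {
        return true;
    }
}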

Exposing multiple implementations of a interface as OSGI service

I have an interface with two implementations. I want to expose both implementations as OSGi services, but when I do that, one overrides the other. Here is the configuration I am using:
<bean id="formService" class="com.dbt.form.service.FormService"/>
<bean id="formAPIService" class="com.dbt.form.service.FormAPIService"/>
<osgi:service
ref="formAPIService"
interface="com.dbt.form.service.ifc.IFormService"/>
<osgi:service
ref="formService"
interface="com.dbt.form.service.ifc.IFormService" />
Here formService is overridden by the formAPIService implementation.
Please help me sort out this issue.
The second service does NOT override the first... both of these services will be published separately, and you can confirm this by typing the inspect cap service command in the OSGi Gogo shell.
What MAY happen is that your consumer code only chooses one of the available service instances. In that case you need to write your consumer to either bind to all instances, or use a combination of rankings or target filters to determine which particular service you want. You should give more information on how you are using these services, since that is where the problem (probably) lies.
Read this page... Chapter 8, The Service Registry, section 8.2.2.3.
You can use the bean-name attribute of the osgi:reference tag. When importing a service, bean-name refers to the id of the bean that was exported as that service.
