How to stop a poller in Spring Integration while processing files

<bean id="inFileHandler"
class="com.yahoo.FileProcessHandler" />
<bean id="executorService" class="java.util.concurrent.Executors" factory-method="newSingleThreadExecutor" destroy-method="shutdownNow" />
<int:channel id="inChannel" />
<int:channel id="outChannel">
<int:queue capacity="5" />
</int:channel>
<bean id="sftpFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="${SFTP_HOST}"></property>
<property name="port" value="${SFTPPORT}"></property>
<property name="user" value="${SFTPUSERNAME}"></property>
<property name="password" value="${SFTPPASSWORD}"></property>
</bean>
<sftp:inbound-channel-adapter id="ftpInBound" channel="inChannel"
session-factory="sftpFactory"
delete-remote-files="true" remote-directory="/Files"
local-directory="file:C:/Bhaji">
<int:poller id="poller" fixed-rate="10000"/>
</sftp:inbound-channel-adapter>
<int:service-activator input-channel="inChannel"
output-channel="nullChannel" ref="inFileHandler" method="handler" />
and the FileProcessHandler is:
@Autowired
private ExecutorService executorService;
private static Logger log = LoggerFactory.getLogger(FileProcessHandler.class);
public File handler(File input) {
    // doing some time-consuming processing
    return input;
}
Here, while the time-consuming process is running, I don't want the SFTP inbound-channel-adapter to poll. After the process completes, the poller should start again automatically.
Is there any way to achieve this?

What are you using the executor service for? Let the flow run on the poller thread and the next poll won't happen until the current poll completes.

It looks like it would be enough for you to use fixed-delay instead of fixed-rate on the <poller> and do the task on the TaskScheduler thread. In that case you will have only one process at a time, and the next one will start only after the previous one finishes.
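A minimal sketch of that approach, assuming the executorService bean and the @Autowired ExecutorService field are dropped so the work runs on the poller thread, and the <int:poller> is switched to fixed-delay="10000" (doTimeConsumingWork is just a placeholder for the existing logic):
import java.io.File;

public class FileProcessHandler {

    // Invoked by the service activator on the poller's TaskScheduler thread.
    public File handler(File input) {
        doTimeConsumingWork(input); // done inline, no ExecutorService hand-off
        return input;
    }

    private void doTimeConsumingWork(File file) {
        // placeholder for the long-running processing logic
    }
}
With fixed-delay, the next poll is scheduled only after the current flow returns, so no new file is picked up while one is still being processed.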

Related

Not able to Stop MQueue listener

I have the following configuration for my MQueue:
<jms:listener-container container-type="default" connection-factory="cachedConnectionFactory" acknowledge="auto">
<jms:listener id="myListenerId" destination="myDestination" ref="myListener" method="onMessage" />
</jms:listener-container>
When I try to stop the reception of JMS messages, I write the following code
jmsManagement = myProject.instance.getContext().getBean('myListenerId',Lifecycle.class);
jmsManagement.stop();
PS:
When I stop() my listener, isRunning() returns false, but I still get messages from the queue: onMessage keeps getting triggered.
jmsManagement is typed as Lifecycle. Even when I changed it to DefaultMessageListenerContainer, same thing.
I'm receiving messages before calling start(), even when autoStartup is set to false.
jmsManagement.shutdown(); didn't stop the listener from being triggered.
Does anyone have an idea about how to stop this MQ listener ?
Is there something I'm missing ?
I actually had to set autoStartup to false.
Since I can't do that using jms:listener-container, I instantiated a DefaultMessageListenerContainer bean and set its autoStartup property to false.
Here's the code that worked for me:
<bean class="org.springframework.jms.listener.DefaultMessageListenerContainer" id="pitagorCPYListener">
<property name="autoStartup" value="false" />
<property name="connectionFactory" ref="cachedConnectionFactory" />
<property name="destination" ref="defaultDestination" />
<property name="messageListener" ref="listenerPitagorCPY" />
</bean>
<bean id="defaultDestination" class="com.ibm.mq.jms.MQQueue">
<constructor-arg value="#{mqConnectionFactory.destination}"/>
</bean>
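With autoStartup set to false the container then has to be started explicitly. A minimal sketch of controlling it at runtime, assuming ctx is the running ApplicationContext and reusing the pitagorCPYListener bean id from the snippet above (the surrounding class is only illustrative):
import org.springframework.context.ApplicationContext;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class ListenerControl {

    public void startThenStop(ApplicationContext ctx) {
        DefaultMessageListenerContainer container =
                ctx.getBean("pitagorCPYListener", DefaultMessageListenerContainer.class);
        container.start();   // begin consuming from the queue
        // ... later ...
        container.stop();    // pause delivery to the MessageListener
        // container.shutdown() additionally releases the shared connection and consumer resources
    }
}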

Spring integration outbound channel adapters not closing the open sockets and leaving the file handles open

We are using Spring Integration adapters for file FTP in our project. The problem we are facing is that the adapters are not closing the open socket connections.
As a result, other modules in the same managed server are failing with a "Too many open files" socket connection exception. Is there a way to close the unused open socket connections from the channel adapters, or can we get the underlying JSch connections and close the sockets from the SFTP channel adapters?
We have tried the caching session factory and it did not close the open sockets; the file handles kept piling up. Thanks in advance for the inputs.
We have two XMLs, one with an outbound adapter and the other with an inbound adapter. These are in different XMLs as they are different jobs that are run using Spring Batch. We are expected to send files to a location.
We are using Spring Batch 2.2.0 and Spring Integration 2.1.6.
Here is the configuration:
We have one session factory and it is wrapped by a caching session factory:
<beans:bean id="sftpSessionFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<beans:property name="host" value="hostname"/>
<beans:property name="privateKey" value="somepath"/>
<beans:property name="port" value="22"/>
</beans:bean>
<bean id="cachingSessionFactory"
class="org.springframework.integration.file.remote.session.CachingSessionFactory">
<constructor-arg ref="sftpSessionFactory"/>
<constructor-arg value="10"/>
<property name="sessionWaitTimeout" value="1000"/>
</bean>
and then we have a channel:
<int:channel id="ftpChannel" />
and then we have the following outbound channel adapter:
<int-sftp:outbound-channel-adapter id="sftpOutboundAdapter"
session-factory="cachingSessionFactory"
channel="inputChannel"
charset="UTF-8"
use-temporary-filename="false"/>
With the above configuration we are using the ftpChannel to send the files by constructing a payload like this:
Message<File> message = MessageBuilder.withPayload(f).build(); // MessageBuilder is org.springframework.integration.support.MessageBuilder and f is the file
ftpChannel.send(message);
In another inbound job, the following is the configuration of the adapters.
Session factory:
<beans:bean id="sftpSessionFactory2"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<beans:property name="host" value="hostname"/>
<beans:property name="privateKey" value="somepath"/>
<beans:property name="port" value="22"/>
</beans:bean>
Caching session factory:
<bean id="cachingSessionFactory2"
class="org.springframework.integration.file.remote.session.CachingSessionFactory">
<constructor-arg ref="sftpSessionFactory2"/>
<constructor-arg value="10"/>
<property name="sessionWaitTimeout" value="1000"/>
</bean>
and another channel:
<int:channel id="ftpChannel2" />
Now we have the following adapter in this xml:
<int-sftp:outbound-channel-adapter id="sftpInboundAdapter"
session-factory="cachingSessionFactory2"
channel="inputChannel"
charset="UTF-8"
use-temporary-filename="false"/>
With this configuration in the above xml we are trying to get a session from the cachingSessionFactory configured in the first xml, get a list of files, and then send some files with ftpChannel2.send(), calling session.close() in the finally block. When I call session.isOpen() after session.close(), it returns true.
With these two jobs I see a lot of open file handles, which are socket connections, and I am clueless as to how to close those open sockets.
The session will be closed when the operation is complete as long as you don't use the caching session factory - that is intended to keep the session open for the next use.
If you turn on DEBUG logging, you should get some insight into what is wrong.
EDIT
Just ran this with no problems:
@Test
public void test() throws Exception {
    DefaultFtpSessionFactory sf = new DefaultFtpSessionFactory();
    sf.setHost("10.0.0.3");
    sf.setUsername("ftptest");
    sf.setPassword("ftptest");
    FtpSession session = sf.getSession();
    Thread.sleep(10000);
    session.close();
    assertFalse(session.isOpen());
    System.out.println("closed");
    Thread.sleep(10000);
}
During the first sleep netstat -ntp shows the socket open; socket is gone after the close.
The session is the socket...
public void disconnect() throws IOException
{
closeQuietly(_socket_);
...
}
EDIT2
I had forgotten that with 2.1.x there was the cache-sessions attribute (2.1.x is very old).
I just tested with this (and 2.1.6) ...
<bean id="sftpSessionFactory"
class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="10.0.0.3" />
<property name="privateKey" value="file:/somPathTo/.ssh/id_rsa" />
<property name="port" value="22" />
<property name="user" value="ftptest" />
</bean>
<int:channel id="inputChannel" />
<int-sftp:outbound-channel-adapter id="sftpOutboundAdapter"
session-factory="sftpSessionFactory"
channel="inputChannel"
charset="UTF-8"
cache-sessions="false"
use-temporary-file-name="false"
remote-directory="." />
public class Main {

    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext context = new ClassPathXmlApplicationContext("context.xml");
        File f = new File("foo.txt");
        FileOutputStream fos = new FileOutputStream(f);
        fos.write("bar".getBytes());
        fos.close();
        context.getBean("inputChannel", MessageChannel.class).send(MessageBuilder.withPayload(f).build());
        System.out.println("Sleeping - check socket");
        Thread.sleep(60000); // check socket
        context.close();
        System.exit(0);
    }
}
With no problems (the socket is closed); if I set the cache-sessions to true, the socket remains open as expected.
I do notice you don't have a remote-directory attribute - that's illegal:
exactly one of 'remote-directory' or 'remote-directory-expression' is required on a remote file outbound adapter

How to stop/start spring DefaultMessageListenerContainer?

I have developed a project using Spring JMS to receive messages from a queue, and it is deployed in a WebSphere Application Server (WAS 7.5) clustered environment.
It was working fine once deployed on the server. Later I updated my logger information and redeployed, but the server does not seem to pick up the latest code base, even though I have stopped/started the cluster.
Please refer to the config xml below.
<bean id="connectionFactory" class="com.ibm.mq.jms.MQQueueConnectionFactory">
<property name="hostName" value="${hostName}"/>
<property name="port" value="${port}"/>
<property name="queueManager" value="${queueManager}"/>
<property name="transportType" value="${transportType}"/>
<property name="channel" value="${channel}"/>
</bean>
<jms:listener-container container-type="default"
connection-factory="connectionFactory" acknowledge="auto" concurrency="5" >
<jms:listener destination="DEV.TESTQUEUE" ref="jmsMessageListener"
</jms:listener-container>
<bean id="jmsMessageListener" class="JmsMessageListener"/>
public class JmsMessageListener implements MessageListener {
    public void onMessage(Message message) {
    }
}
Could you please advise how to stop/start the container?
Here is my solution:
final Map<String, DefaultMessageListenerContainer> containers = ctx.getBeansOfType(DefaultMessageListenerContainer.class);
if (containers != null && !containers.isEmpty()) {
    for (DefaultMessageListenerContainer container : containers.values()) {
        container.stop();
    }
}
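To resume consumption later, the same containers can be started again; a sketch that mirrors the loop above (ctx is assumed to be the same application context, and the wrapping class is only illustrative):
import java.util.Map;
import org.springframework.context.ApplicationContext;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class ContainerRestarter {

    public void startAll(ApplicationContext ctx) {
        Map<String, DefaultMessageListenerContainer> containers =
                ctx.getBeansOfType(DefaultMessageListenerContainer.class);
        for (DefaultMessageListenerContainer container : containers.values()) {
            container.start(); // resumes message consumption
        }
    }
}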
At last I found the answer.
The default executor of the DMLC is SimpleAsyncTaskExecutor.
From the Spring Framework Task Execution and Scheduling documentation: "This implementation does not reuse any threads, rather it starts up a new thread for each invocation. However, it does support a concurrency limit which will block any invocations that are over the limit until a slot has been freed up. If you're looking for true pooling, keep scrolling further down the page."
So the old threads kept running in the container; this was the root cause of my issue. I then restarted my JVM (with the support of the WAS admin) and switched to a ThreadPoolTaskExecutor.
<jms:listener-container container-type="default"
connection-factory="connectionFactory" acknowledge="auto" concurrency="5" task-executor="taskExecutor">
<jms:listener destination="topCli_Service" ref="jmsMessageListener"
</jms:listener-container>
<bean id="taskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="5" />
<property name="maxPoolSize" value="10" />
<property name="WaitForTasksToCompleteOnShutdown" value="true" />
</bean>

Spring Integration, jms Inbound gateway for WMQ; Unable to consume messages

I have recently started exploring Spring Integration as it is one of the options we want to evaluate for our project.
The issue I am facing is below.
I have created a JMS inbound gateway to listen to a WMQ queue, and I expect the inbound gateway (using a DMLC) to pick up messages as and when they are available on the queue (event-driven).
But somehow the example isn't working; it fails to pick up messages from the queue. However, I can see (using a tool) that there are consumers created on the queue.
Help here is really appreciated.
Code snippet below.
<bean id="mqFactory" class="com.ibm.mq.jms.MQConnectionFactory">
<property name="hostName" value="${mq.hostName}"/>
<property name="port" value="${mq.port}"/>
<property name="queueManager" value="${mq.queueManager}"/>
<property name="channel" value="${mq.channel}"/>
<property name="transportType" value="${mq.transportType}"/>
<property name="SSLCipherSuite" value="${mq.SSLCipherSuite}"/>
</bean>
<bean id="inCachingConnectionFactory" class="org.springframework.jms.connection.CachingConnectionFactory">
<property name="targetConnectionFactory" ref="mqFactory" />
<property name="sessionCacheSize" value="5" />
</bean>
<bean id="requestQueue-mq" class="com.ibm.mq.jms.MQQueue">
<constructor-arg value="${mq.example.queue}"/>
</bean>
<bean id="demoBean" class="com.jpmchase.example.spring.DemoBean">
</bean>
<jms:inbound-gateway id="wMQ_in_gateway" concurrent-consumers="2" max-concurrent-consumers="5" connection-factory="inCachingConnectionFactory" request-destination="requestQueue-mq"
request-channel="demoChannel" />
<integration:channel id="demoChannel">
</integration:channel>
<integration:service-activator input-channel="demoChannel" ref="demoBean"/>
Below is the service-activator java code.
@MessageEndpoint
public class DemoBean {

    @ServiceActivator
    public String upperCase(String input) {
        System.out.println("inside the service activator " + input);
        return "JMS response: " + input.toUpperCase();
    }
}

Spring integration MQTT publish & subscribe to multiple topics

I am trying to build an application which subscribes to multiple MQTT topics, gets the information, processes it, and forms XMLs; upon processing it triggers an event so these can be sent to a cloud server, and the successful response from there is sent back to the MQTT channel.
<int-mqtt:message-driven-channel-adapter
id="mqttAdapter" client-id="${clientId}" url="${brokerUrl}" topics="${topics}"
channel="startCase" auto-startup="true" />
<int:channel id="startCase" />
<int:service-activator id="startCaseService"
input-channel="startCase" ref="msgPollingService" method="pollMessages" />
<bean id="mqttTaskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="5" />
<property name="maxPoolSize" value="10" />
</bean>
<bean id="msgPollingService" class="com.xxxx.xxx.mqttclient.mqtt.MsgPollingService">
<property name="taskExecutor" ref="mqttTaskExecutor" />
<property name="vendorId" value="${vendorId}" />
</bean>
My question is how I can publish to multiple topics, i.e. whether I have an option to publish message X to topic Y. At present I have the below:
<int:channel id="outbound" />
<int-mqtt:outbound-channel-adapter
id="mqtt-publish" client-id="kj" client-factory="clientFactory"
auto-startup="true" url="${brokerUrl}" default-qos="0"
default-retained="true" default-topic="${responseTopic}" channel="outbound" />
<bean id="eventListner" class="com.xxxx.xxxx.mqttclient.event.EventListener">
<property name="sccUrl" value="${url}" />
<property name="restTemplate" ref="restTemplate" />
<property name="channel" ref="outbound" />
</bean>
I can publish it like this:
channel.send(MessageBuilder.withPayload("customResponse").build());
Can I do something like:
channel.send(Message<?>, topic)
Your configuration looks good. However, the MessageChannel is an abstraction for loose coupling and deals only with Messages.
So your request, a la channel.send(Message<?>, topic), isn't correct for messaging concepts.
However we have a trick for you. From AbstractMqttMessageHandler:
String topic = (String) message.getHeaders().get(MqttHeaders.TOPIC);
.....
this.publish(topic == null ? this.defaultTopic : topic, mqttMessage, message);
So, what you need from your code is this:
channel.send(MessageBuilder.withPayload("customResponse").setHeader(MqttHeaders.TOPIC, topic).build());
In other words, you should send a Message with an mqtt_topic header to achieve dynamic publication from the <int-mqtt:outbound-channel-adapter>.
On the other hand, we don't recommend using MessageChannels directly from the application; the <gateway> with a service interface is intended for exactly this case in an end application, where the topic can be a service method argument annotated with @Header(MqttHeaders.TOPIC).
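A minimal sketch of that gateway approach (the interface name, package, and channel wiring are illustrative assumptions; in older Spring Integration versions @Header lives in org.springframework.integration.annotation instead):
import org.springframework.integration.mqtt.support.MqttHeaders;
import org.springframework.messaging.handler.annotation.Header;

// Wired up with something like:
// <int:gateway service-interface="com.example.MqttPublisherGateway" default-request-channel="outbound"/>
public interface MqttPublisherGateway {

    // The payload becomes the MQTT message body; the annotated argument is
    // mapped to the mqtt_topic header that the outbound channel adapter reads.
    void publish(String payload, @Header(MqttHeaders.TOPIC) String topic);
}
Calling publish(xml, "some/topic") from the application code then routes the message through the outbound channel with the topic header already set.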
