Recently we introduced a task executor in our Spring Integration pollers to speed up our file reading process. However, the introduction of the task executor led to an unexpected problem where our service stopped processing messages in the Spring Integration channels.
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:integration="http://www.springframework.org/schema/integration"
xmlns:int-file="http://www.springframework.org/schema/integration/file"
xmlns:int-jdbc="http://www.springframework.org/schema/integration/jdbc"
xmlns:int-amqp="http://www.springframework.org/schema/integration/amqp"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration
http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/integration/file
http://www.springframework.org/schema/integration/file/spring-integration-file.xsd
http://www.springframework.org/schema/integration/jdbc
http://www.springframework.org/schema/integration/jdbc/spring-integration-jdbc.xsd
http://www.springframework.org/schema/integration/amqp
http://www.springframework.org/schema/integration/amqp/spring-integration-amqp.xsd">
<integration:channel id="filesIn" />
<integration:channel id="toArchive" />
<integration:channel id="toRabbitForRO" />
<integration:channel id="outputFilesROIn" />
<integration:channel id="toRabbitForSA" />
<integration:channel id="outputFilesSAIn" />
<int-file:inbound-channel-adapter directory="${rnr.file.directory}" auto-startup="true"
filter="filterFiles" channel="filesIn">
<integration:poller
cron="0 0,5,10,15,20,25,30,35,40,45,50,55 0-9,18-23 * * ?"
task-executor="largeFileTaskExecutor"
max-messages-per-poll="${max-messages}"/>
</int-file:inbound-channel-adapter>
<integration:service-activator input-channel="filesIn" output-channel="toArchive"
ref="processSingleLargeFile" method="process"></integration:service-activator>
<int-file:outbound-channel-adapter channel="toArchive" delete-source-files="true"
directory="file:${rnr.file.directory}/archive">
</int-file:outbound-channel-adapter>
<int-file:inbound-channel-adapter directory="${roOutputDir}"
auto-startup="true" filename-pattern="*.xml" channel="outputFilesROIn">
<integration:poller fixed-delay="200"
task-executor="smallFileTaskExecutor"
max-messages-per-poll="${max-messages}" ></integration:poller>
</int-file:inbound-channel-adapter>
<integration:service-activator input-channel="outputFilesROIn"
output-channel="toRabbitForRO" ref="processMultipleFiles" method="processROFile"></integration:service-activator>
<int-amqp:outbound-channel-adapter
channel="toRabbitForRO" amqp-template="rabbitTemplate" exchange-name="sample-excahnge"
routing-key="sample-key" />
</beans>
The first task executor, introduced in the first poller, works perfectly. However, the second poller doesn't seem to work: it does not read files from the specified directory.
<int-file:inbound-channel-adapter directory="${roOutputDir}"
auto-startup="true" filename-pattern="*.xml" channel="outputFilesROIn">
<integration:poller fixed-delay="200"
task-executor="smallFileTaskExecutor"
max-messages-per-poll="${max-messages}" ></integration:poller>
</int-file:inbound-channel-adapter>
// this file channel adapter is not working. No messages appear in the output channel **outputFilesROIn**
smallFileTaskExecutor and largeFileTaskExecutor each have a core pool size of 2 threads.
max-messages-per-poll for each poller is set to 2.
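For illustration, a minimal sketch of how these executors are declared (assuming the Spring task namespace; the exact definitions may differ):
<!-- sketch only: two-thread executors matching the description above -->
<task:executor id="largeFileTaskExecutor" pool-size="2"/>
<task:executor id="smallFileTaskExecutor" pool-size="2"/>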
Thread dump when our service is not processing messages : https://fastthread.io/my-thread-report.jsp?p=c2hhcmVkLzIwMjAvMDgvMTkvLS1hcGktZWQzZTJmYzMtMWFkYy00Mzk5LWJkZjgtNzk0NGQwMzdjNjIwMjg2Njk5ZDMtYTFmNC00YzIzLThmZTQtYzQ4Nzg4NmNhNGM1LnR4dC0t&
P.S.: I followed How to read and process multiple files concurrently in spring? while implementing concurrent file reading.
According to your thread dump, it looks like the default task scheduler, with its pool of 10 threads, is fully busy. Your cron expression is probably very aggressive, and together with that largeFileTaskExecutor it fires scheduled tasks non-stop. As a result, your second poller simply doesn't have any threads left in the task scheduler to do its job.
Consider adjusting your cron expression or reconfiguring the task scheduler with a bigger thread pool. Or just don't use executors, since this confirms that they don't increase your performance.
See docs for scheduler thread pool: https://docs.spring.io/spring-integration/docs/current/reference/html/configuration.html#namespace-taskscheduler
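For example, a minimal sketch of overriding that default scheduler with a larger pool (assuming the standard bean name "taskScheduler" that Spring Integration looks up; the pool size of 20 is just an illustrative value):
<bean id="taskScheduler"
    class="org.springframework.scheduling.concurrent.ThreadPoolTaskScheduler">
    <!-- larger pool so aggressive pollers don't starve the other pollers -->
    <property name="poolSize" value="20"/>
</bean>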
I'm trying to configure Spring XD's mail sink to send messages to an Outlook account. This is my stream definition:
stream create outlookMailSink --definition "http | mail --to='\"email#address.com\"' --host=outlook.office365.com --subject=payload+' world'" --deploy
I'm testing using this shell command:
http post --data Hello
I am getting the following error message:
Failed message 1: com.sun.mail.smtp.SMTPSendFailedException: 530 5.7.57 SMTP; Client was not authenticated to send anonymous mail during MAIL FROM
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:84)
I have investigated this in the Spring XD documentation and through internet searches, but I haven't found a solution that works. Can anyone help me with this, please?
I found a solution that involves creating a new mail sink module, which is a slight modification of Spring XD's supplied mail sink module. The stream must also include the to, from, host, port, username and password options.
Copy the ..\spring-xd-<version>\xd\modules\sink\mail folder and rename it to secure-mail.
In ..\spring-xd-<version>\xd\modules\sink\secure-mail rename both mail.properties and mail.xml to secure-mail.properties and secure-mail.xml respectively.
Replace the contents of secure-mail.xml with the following:
<?xml version="1.0" encoding="UTF-8"?>
<beans:beans xmlns="http://www.springframework.org/schema/integration"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:beans="http://www.springframework.org/schema/beans"
xmlns:file="http://www.springframework.org/schema/integration/file"
xmlns:int-mail="http://www.springframework.org/schema/integration/mail"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:util="http://www.springframework.org/schema/util"
xsi:schemaLocation="http://www.springframework.org/schema/integration/mail http://www.springframework.org/schema/integration/mail/spring-integration-mail.xsd
http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
http://www.springframework.org/schema/util http://www.springframework.org/schema/util/spring-util.xsd">
<channel id="input" />
<int-mail:header-enricher input-channel="input" output-channel="send">
<int-mail:to expression="${to:null}" />
<int-mail:from expression="${from:null}" />
<int-mail:subject expression="${subject:null}" />
<int-mail:cc expression="${cc:null}" />
<int-mail:bcc expression="${bcc:null}" />
<int-mail:reply-to expression="${replyTo:null}" />
<int-mail:content-type expression="${contentType:null}" />
</int-mail:header-enricher>
<channel id="send" />
<int-mail:outbound-channel-adapter
channel="send" host="${host}" port="${port}" username="${username:}"
password="${password:}" java-mail-properties="javaMailProperties"/>
<util:properties id="javaMailProperties">
<beans:prop key="mail.smtp.starttls.enable">true</beans:prop>
</util:properties>
</beans:beans>
Create the stream as follows:
stream create outlookMailSink --definition "http | secure-mail --to='\"email#address.com\"' --from='\"email#address.com\"' --host=outlook.office365.com --port=587 --username=email#address.com --password=password --subject=payload+' world'" --deploy
Test using shell command: http post --data Hello
The contents of secure-mail.xml are almost identical to mail.xml; the key is to set the property mail.smtp.starttls.enable to true in order to enable TLS encryption for communication over port 587. Of course, you could just modify Spring XD's mail sink module directly and use that instead - it's up to you.
I'd be interested to hear if anyone has a better solution to this? For example, is it possible to set the mail.smtp.starttls.enable property on start-up of Spring XD thereby allowing you to use the original mail sink module? I tried this by modifying the xd-singlenode startup script - the property was set but it didn't affect the mail sink module.
References:
https://stackoverflow.com/a/24765052/2124106
https://stackoverflow.com/a/23823863/2124106
http://docs.spring.io/spring-integration/reference/html/mail.html
Is there a way to delete / purge all queues in ActiveMQ via the command line (win/linux)?
I could only find the commands for a specific queue.
Or maybe there's a way to do this via the ActiveMQ admin? Again, I only found how to delete/purge the queues one by one, which can be very tedious.
Thanks!
You can tweak your activemq.xml a bit:
<broker deleteAllMessagesOnStartup="true" ...>
This works with KahaDB message stores (it has problems with JDBC message stores): all your messages get deleted and the queues are subsequently cleared.
Since you want all queues to be deleted, restarting the broker won't be a costly way to clean everything up.
Note that the purge will happen on every restart.
I developed my own ActiveMQ command line utility (activemq-cli) to do this. You can find it here: https://github.com/antonwierenga/activemq-cli (command 'purge-all-queues' or 'remove-all-queues').
As of version 5.0 it looks like this can be done using the CLI provided with ActiveMQ itself:
$ ActiveMQ/bin/activemq purge
1. Go to the amq bin folder; in my case:
cd /opt/amq/bin
2. Run the amq client:
./client
3. Run purge on the desired queue:
activemq:purge <QUEUE NAME HERE>
Another possibility is to deploy a small Camel route in a container (e.g. Apache ServiceMix), or simply to run a Java program that contains the route.
For example, here is the route I currently use on my development computer, where I also have ServiceMix installed:
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:ext="http://aries.apache.org/blueprint/xmlns/blueprint-ext/v1.0.0"
xmlns:cm="http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0"
xsi:schemaLocation="
http://www.osgi.org/xmlns/blueprint/v1.0.0 http://www.osgi.org/xmlns/blueprint/v1.0.0/blueprint.xsd
http://camel.apache.org/schema/blueprint http://camel.apache.org/schema/blueprint/camel-blueprint.xsd
http://aries.apache.org/blueprint/xmlns/blueprint-cm/v1.1.0 http://aries.apache.org/schemas/blueprint-cm/blueprint-cm-1.1.0.xsd">
<cm:property-placeholder persistent-id="amq.cleanup" update-strategy="reload">
<cm:default-properties>
<cm:property name="amq.local.url" value="tcp://localhost:61616" />
</cm:default-properties>
</cm:property-placeholder>
<camelContext xmlns="http://camel.apache.org/schema/blueprint">
<onException useOriginalMessage="true">
<exception>java.lang.Exception</exception>
<handled>
<constant>true</constant>
</handled>
<to uri="activemq:queue:CLEANUP_DLQ" />
</onException>
<route id="drop-all-queues" autoStartup="true">
<from uri="activemq:queue:*.>" />
<stop/>
</route>
</camelContext>
<bean id="activemq" class="org.apache.activemq.camel.component.ActiveMQComponent">
<property name="brokerURL" value="${amq.local.url}" />
</bean>
</blueprint>
I seem to be having some trouble configuring my Spring MVC backend to receive and send TCP messages. Looking at the configuration a user suggests in this question - how to plug a TCP-IP client server in a spring MVC application - I tried to place that configuration into my root-context.xml. However, for all of the tags it displays messages such as:
Unable to locate Spring NamespaceHandler for element 'int-ip:tcp-outbound-gateway' of schema namespace 'http://www.springframework.org/schema/integration/ip'
int-ip:tcp-outbound-gateway and int:gateway both display cvc-complex-type.2.4.c: The matching wildcard is strict, but no declaration can be found for element 'int:gateway' (with 'int:gateway' replaced by 'int-ip:tcp-outbound-gateway' for the other element).
Here is my root-context.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:int="http://www.springframework.org/schema/integration"
xmlns:int-ip="http://www.springframework.org/schema/integration/ip"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/integration/ip http://www.springframework.org/schema/integration/ip/spring-integration-ip.xsd
http://www.springframework.org/schema/integration/ http://www.springframework.org/schema/integration/spring-integration.xsd">
<!-- Root Context: defines shared resources visible to all other web components -->
<int:gateway id="gw"
service-interface="org.springframework.integration.samples.tcpclientserver.SimpleGateway"
default-request-channel="input"/>
<int-ip:tcp-connection-factory id="client"
type="client"
host="localhost"
port="1234"
single-use="true"
so-timeout="10000"/>
<int:channel id="input" />
<int-ip:tcp-outbound-gateway id="outGateway"
request-channel="input"
reply-channel="clientBytes2StringChannel"
connection-factory="client"
request-timeout="10000"
remote-timeout="10000"/>
<int:transformer id="clientBytes2String"
input-channel="clientBytes2StringChannel"
expression="new String(payload)"/>
</beans>
What am I doing incorrectly? Also, some general tips as to how I could use Spring to send and receive TCP communications would be appreciated :)
It appears you don't have the spring-integration-ip and spring-integration-core jars on your classpath. You need to bundle them into your war or otherwise make them available on the classpath according to your app server's requirements.
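If you build with Maven, here is a sketch of the dependency to add (the version shown is just an example; match it to your Spring Integration version - spring-integration-ip brings in spring-integration-core transitively):
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-ip</artifactId>
    <!-- example version only; align with your other Spring Integration artifacts -->
    <version>4.3.23.RELEASE</version>
</dependency>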
I have several flows in my mule-config.xml, but some beans only make sense for one flow. Is there a way to define a bean local to a flow? I understand that I can define an inline bean like the one below:
<custom-transformer name="soapFaultTransformer" class="com.xxx.xx.transformer.VelocityMessageTransformer">
<spring:property name="velocityEngine" ref="velocityEngine" />
<spring:property name="templateName" value="soapFault.vm" />
<spring:property name="beanClass">
<spring:bean class="com.xxx.services.xx.Soap11Fault">
<spring:property name="faultCode" value="Client" />
<spring:property name="faultString" value="Invalid Request" />
<spring:property name="detail" value="..." />
</spring:bean>
</spring:property>
</custom-transformer>
However, the inline Spring bean needs to be used in two places within a single flow. Can I still define it in a single place and refer to it in both places without making it a global bean?
Thank you
How about gathering all the Spring beans necessary for a single flow into a separate Spring config file that is imported only by that flow?
Your mule config will look like the following:
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:http="http://www.mulesoft.org/schema/mule/http" xmlns:jms="http://www.mulesoft.org/schema/mule/jms" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:spring="http://www.springframework.org/schema/beans" version="EE-3.4.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/jms http://www.mulesoft.org/schema/mule/jms/current/mule-jms.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd ">
<spring:import resource="encapsulated-beans.xml" />
<flow name="flow" >
...
</flow>
</mule>
where encapsulated-beans.xml is the config file that contains, for example, your com.xxx.services.xx.Soap11Fault bean.
As @David said, it's not possible to declare beans specific to a single flow. Declared beans will be available to all flows.
What I want to achieve:
I have set up a Spring Batch Job containing Hadoop Tasks to process some larger files.
To get multiple Reducers running for the job, I need to set the number of Reducers with setNumReduceTasks. I'm trying to set this via the JobFactoryBean.
My bean configuration in classpath:/META-INF/spring/batch-common.xml :
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:p="http://www.springframework.org/schema/p"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd">
<bean id="jobFactoryBean" class="org.springframework.data.hadoop.mapreduce.JobFactoryBean" p:numberReducers="5"/>
<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean" />
<bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/>
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher" p:jobRepository-ref="jobRepository" />
</beans>
The XML is included via:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
<context:property-placeholder location="classpath:batch.properties,classpath:hadoop.properties"
ignore-resource-not-found="true" ignore-unresolvable="true" />
<import resource="classpath:/META-INF/spring/batch-common.xml" />
<import resource="classpath:/META-INF/spring/hadoop-context.xml" />
<import resource="classpath:/META-INF/spring/sort-context.xml" />
</beans>
I'm getting the beans for the JUnit test via:
JobLauncher launcher = ctx.getBean(JobLauncher.class);
Map<String, Job> jobs = ctx.getBeansOfType(Job.class);
JobFactoryBean jfb = ctx.getBean(JobFactoryBean.class);
The JUnit test stops with an error:
No bean named '&jobFactoryBean' is defined
So: the JobFactoryBean is not loaded, but the others are loaded correctly and without an error.
Without the line
JobFactoryBean jfb = ctx.getBean(JobFactoryBean.class);
the project's tests run, but there is just one Reducer per job.
The method
ctx.getBean("jobFactoryBean");
returns a Hadoop Job. I would expect to get the factoryBean there...
To test this, I extended the constructor of the Reducer to log each creation of a Reducer, so that I get a notification when one is created. So far I get just one entry in the log.
I have 2 VMs with 2 assigned cores and 2 GB of RAM each, and I'm trying to sort a 75 MB file consisting of multiple books from Project Gutenberg.
EDIT:
Another thing I have tried is to set the number of reducers on the Hadoop job via its property, without any result:
<job id="search-jobSherlockOk" input-path="${sherlock.input.path}"
output-path="${sherlockOK.output.path}"
mapper="com.romediusweiss.hadoopSort.mapReduce.SortMapperWords"
reducer="com.romediusweiss.hadoopSort.mapReduce.SortBlockReducer"
partitioner="com.romediusweiss.hadoopSort.mapReduce.SortPartitioner"
number-reducers="2"
validate-paths="false" />
The following settings are in mapred-site.xml on both nodes:
<property>
<name>mapred.tasktracker.reduce.tasks.maximum</name>
<value>10</value>
</property>
...and Why:
I want to copy the example of the following blog post:
http://www.philippeadjiman.com/blog/2009/12/20/hadoop-tutorial-series-issue-2-getting-started-with-customized-partitioning/
I need different Reducers on the same machine or a fully distributed environment to test the behaviour of the Partitioner. The first approach would be easier.
P.S.: Could a user with higher reputation please create a tag "spring-data-hadoop"? Thank you!
I answered the question on the Spring forums, where it was also posted (I recommend using them for Spring Data Hadoop questions).
The full answer is here http://forum.springsource.org/showthread.php?130500-Additional-Reducers , but in short, the number of reducers is driven by the number of input splits. See http://wiki.apache.org/hadoop/HowManyMapsAndReduces