Apache Camel + Spring Boot + IBM MQ

I want to use Apache Camel to consume messages from IBM MQ in a Spring Boot project.
I use Spring Boot with annotation-based configuration.
I can't find any complete example: pom.xml, receiver, configuration class, ...
Can anyone help me? Any link, documentation, ...?
Thanks a lot.

Take a look at a new Spring Boot Starter for MQ that may help here. The README shows how to modify the JMS Getting Started sample here to use IBM MQ instead of ActiveMQ. And the MQ jars - including this starter - are all on Maven Central for easy access.
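If you follow that route (plain Spring JMS rather than Camel), the receiver side of the modified sample can be as small as the sketch below; it assumes the starter auto-configures the MQ ConnectionFactory from application properties, and the queue name is only a placeholder:

import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;

@Component
public class MqReceiver {

    // Boot's JMS auto-configuration wires the listener container to the
    // ConnectionFactory provided by the MQ starter.
    @JmsListener(destination = "DEV.QUEUE.1")
    public void receive(String message) {
        System.out.println("Received <" + message + ">");
    }
}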

You could search for an example that uses Spring Boot, Camel and ActiveMQ to get a first impression. Since you use Camel, most differences between IBM MQ and ActiveMQ should be hidden.
However, you have to use the standard JMS component instead of Camel's dedicated ActiveMQ component.
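For illustration, a receiver route with the standard JMS component could look roughly like this (the queue name is a placeholder, and the "jms" component still has to be backed by an IBM MQ ConnectionFactory, as sketched in a later answer):

import org.apache.camel.builder.RouteBuilder;
import org.springframework.stereotype.Component;

@Component
public class MqReceiverRoute extends RouteBuilder {

    @Override
    public void configure() {
        // "DEV.QUEUE.1" is only an example queue name.
        from("jms:queue:DEV.QUEUE.1")
            .log("Received from MQ: ${body}");
    }
}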

In your Application class you will need to create a bean for the IBM MQ connection factory. I have only done it for an application in Spring XML, like this:
<bean id="cf" class="com.ibm.mq.jms.MQConnectionFactory">
<property name="transportType" value="1" />
<property name="hostName" value="localhost" />
<property name="port" value="1414" />
<property name="queueManager" value="QMGRSCORE" />
<property name="channel" value="EXTAPP.SRVCONN" />
</bean>
But I once created a connection bean for MongoDB in Spring Boot, so maybe you can do something like this:
@Bean(name = "myDb")
public MongoClient myDb() {
    return new MongoClient();
}
but putting the IBM MQ values inside this bean instead.
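To make that concrete in annotation style: a minimal sketch, assuming the same connection values as the XML bean above, the IBM MQ JMS classes and camel-jms on the classpath, and that routes address the component as "jms:" (as in the route sketch earlier):

import javax.jms.ConnectionFactory;
import javax.jms.JMSException;

import org.apache.camel.component.jms.JmsComponent;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.ibm.mq.jms.MQConnectionFactory;

@Configuration
public class MqConfig {

    // Java-config translation of the Spring XML bean shown above.
    @Bean
    public ConnectionFactory mqConnectionFactory() throws JMSException {
        MQConnectionFactory cf = new MQConnectionFactory();
        cf.setTransportType(1);           // 1 = client transport
        cf.setHostName("localhost");
        cf.setPort(1414);
        cf.setQueueManager("QMGRSCORE");
        cf.setChannel("EXTAPP.SRVCONN");
        return cf;
    }

    // Expose Camel's standard JMS component under the name "jms",
    // backed by the IBM MQ connection factory.
    @Bean(name = "jms")
    public JmsComponent jms(ConnectionFactory mqConnectionFactory) {
        return JmsComponent.jmsComponentAutoAcknowledge(mqConnectionFactory);
    }
}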

Related

MongoDB Batch Job Broken in Spring XD 1.2.0+

We have a batch job running in Spring XD which reads from MongoDB using the standard MongoItemReader, which converts Mongo records to our domain model. Up to Spring XD version 1.1.3 this worked fine; however, in versions 1.2.0 and 1.2.1 the job fails with the following error (package name shortened):
java.lang.NoClassDefFoundError: c/s/r/b/b/domain/IndexId
    at c.s.r.b.b.domain.IndexId_Instantiator_hxmj4p.newInstance(Unknown Source) ~[na:na]
    at org.springframework.data.convert.BytecodeGeneratingEntityInstantiator$EntityInstantiatorAdapter.createInstance(BytecodeGeneratingEntityInstantiator.java:193) ~[spring-data-commons-1.10.0.RELEASE.jar:na]
    at org.springframework.data.convert.BytecodeGeneratingEntityInstantiator.createInstance(BytecodeGeneratingEntityInstantiator.java:76) ~[spring-data-commons-1.10.0.RELEASE.jar:na]
    at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:250) ~[spring-data-mongodb-1.7.0.RELEASE.jar:na]
    at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:231) ~[spring-data-mongodb-1.7.0.RELEASE.jar:na]
Looking into this I found the threads "NoClassDefFoundError when making a query in spring-data-solr within a play framework application" and "NoClassDefFoundError after upgrading to 1.7.0.RELEASE", which suggest this is due to a change in spring-data-mongo 1.7.0 and the underlying spring-data-commons that switched the default entity instantiation technique to bytecode generation to improve performance.
Based on the suggested fix in those threads I've modified the Mongo template in my job module XML definition as follows, and this fixes the problem:
<bean id="mappingConverter" class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg ref="dbRefResolver"/>
<constructor-arg ref="mongoMappingContext"/>
<property name="instantiators" ref="entityInstantiators" />
</bean>
<bean id="dbRefResolver" class="org.springframework.data.mongodb.core.convert.DefaultDbRefResolver">
<constructor-arg ref="mongoDbFactory"/>
</bean>
<bean id="mongoMappingContext" class="org.springframework.data.mongodb.core.mapping.MongoMappingContext"/>
<bean id="entityInstantiators" class="org.springframework.data.convert.EntityInstantiators">
<constructor-arg value="#{T(org.springframework.data.convert.ReflectionEntityInstantiator).INSTANCE}"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg name="mongoDbFactory" ref="mongoDbFactory" />
<constructor-arg name="mongoConverter" ref="mappingConverter" />
</bean>
However, this is verbose and obviously isn't an ideal fix. The problem doesn't show up in our job module integration test, so I have a hunch it's caused by a combination of the default entity instantiation change and the fact that when a module executes in Spring XD the domain classes live in the module's class loader and are not visible to the Spring Data Mongo classes in XD's main class loader.
So should this be regarded as a bug in Spring XD or Spring Data Mongo? One fix might be an improvement to the Spring Data Mongo mongo:mapping-converter XML configuration to allow forcing the use of the ReflectionEntityInstantiator, which would at least reduce the amount of XML needed above. Alternatively, maybe Spring XD should handle this scenario automatically?
I don't think there is anything we can do from the XD side since this is a custom job. We have to rely on the spring-data-mongodb functionality.
It looks like you're running into DATACMNS-710, which is fixed in Fowler SR1 (equivalent to Spring Data MongoDB 1.7.1). You might want to try the just-released Gosling release, too.
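For readers using Java config rather than an XD module's XML definition, the same workaround can be sketched roughly like this (it mirrors the XML above and assumes the Spring Data MongoDB 1.7.x constructors; upgrading as suggested is still the cleaner fix):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.convert.EntityInstantiators;
import org.springframework.data.convert.ReflectionEntityInstantiator;
import org.springframework.data.mongodb.MongoDbFactory;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.convert.DefaultDbRefResolver;
import org.springframework.data.mongodb.core.convert.MappingMongoConverter;
import org.springframework.data.mongodb.core.mapping.MongoMappingContext;

@Configuration
public class MongoConverterConfig {

    // Force the reflection-based instantiator instead of the default
    // bytecode-generating one, mirroring the XML workaround above.
    @Bean
    public MappingMongoConverter mappingConverter(MongoDbFactory mongoDbFactory) {
        MappingMongoConverter converter = new MappingMongoConverter(
                new DefaultDbRefResolver(mongoDbFactory), new MongoMappingContext());
        converter.setInstantiators(new EntityInstantiators(ReflectionEntityInstantiator.INSTANCE));
        return converter;
    }

    @Bean
    public MongoTemplate mongoTemplate(MongoDbFactory mongoDbFactory,
                                       MappingMongoConverter mappingConverter) {
        return new MongoTemplate(mongoDbFactory, mappingConverter);
    }
}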

Mule JMS connector with WebSphere default messaging provider

We are trying to connect from our Mule service to a queue. This queue is on WebSphere Application Server and we are using the WebSphere default messaging provider.
How can we set our connector configuration to match this queue?
We are using the default JMS connector.
You need to refer to the section of the IBM Knowledge Center explaining JNDI Connections to SiBus.
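As a rough illustration of what such a JNDI connection looks like from client code (the provider URL, bootstrap port and JNDI name below are assumptions for your environment, and the WebSphere thin-client jars must be on the classpath):

import java.util.Hashtable;

import javax.jms.ConnectionFactory;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class SiBusJndiLookup {

    public static ConnectionFactory lookupConnectionFactory() throws NamingException {
        Hashtable<String, String> env = new Hashtable<String, String>();
        // WebSphere's JNDI provider.
        env.put(Context.INITIAL_CONTEXT_FACTORY,
                "com.ibm.websphere.naming.WsnInitialContextFactory");
        // 2809 is the usual bootstrap port; adjust host and port for your cell.
        env.put(Context.PROVIDER_URL, "corbaloc::was-host:2809");

        Context ctx = new InitialContext(env);
        // "jms/MyCF" is a placeholder for the JNDI name of the connection
        // factory configured against the SIBus in the WAS admin console.
        return (ConnectionFactory) ctx.lookup("jms/MyCF");
    }
}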
If you are using EE, then I recommend using the WMQ connector. Documentation is available at: http://www.mulesoft.org/documentation...
If you have to use JMS, you need to create a Spring bean for the WebSphere MQ connection factory and reference it from your JMS connector's connectionFactory-ref attribute.
<spring:bean name="MQConnectionFactory" class="com.ibm.mq.jms.MQQueueConnectionFactory">
    <spring:property name="hostName" value="localhost"/>
    <spring:property name="port" value="1414"/>
    <spring:property name="queueManager" value="localmanager"/>
    <spring:property name="transportType" value="1"/>
</spring:bean>
Don't forget to copy com.ibm.mqjms.jar into your Mule's classpath.
You need to use the com.ibm.mq.jms.MQXAQueueConnectionFactory class instead if you are using XA transactions.

Unit testing based on JNDI, EJB and Spring

In my application I am injecting some EJB-based services with Spring IoC through JndiObjectFactoryBean, as shown below. When running the JUnit tests I get this exception: "java.lang.IllegalArgumentException: This JNDI operation is not implemented by the JNDI provider."
Could someone please let me know how to configure this for JUnit?
<bean id="xxxMenuItemService" class="xxxMenuItemServiceyyy">
<property name="xxxMenuItemDelegator" ref="xxxMenuItemDelegator" />
</bean>
<bean id="approveMenuItemServiceRemote"
class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiName"
value="ejb/XXXXXXXX" />
Have a look at the SimpleNamingContextBuilder from org.springframework.mock, as it provides a full context builder where you can bind mock or other objects for use by Spring's JNDI lookup.
One thing to make sure of, though, is that you build the SimpleNamingContextBuilder in a static @BeforeClass method of JUnit 4. This means that it is all initialized and waiting before the Spring application context is started and you won't have any JNDI lookup failures.
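A minimal sketch of that setup, assuming Mockito is on the test classpath; the ApproveMenuItemServiceRemote interface below is only a placeholder for the real remote interface bound under ejb/XXXXXXXX:

import static org.mockito.Mockito.mock;

import javax.naming.NamingException;

import org.junit.BeforeClass;
import org.springframework.mock.jndi.SimpleNamingContextBuilder;

public class XxxMenuItemServiceTest {

    // Placeholder for the real remote interface looked up under ejb/XXXXXXXX.
    interface ApproveMenuItemServiceRemote { }

    @BeforeClass
    public static void bindJndiMocks() throws NamingException {
        // Activates an in-memory JNDI environment before the Spring context starts.
        SimpleNamingContextBuilder builder =
                SimpleNamingContextBuilder.emptyActivatedContextBuilder();
        // Bind a mock (or stub) under the JNDI name used by JndiObjectFactoryBean.
        builder.bind("ejb/XXXXXXXX", mock(ApproveMenuItemServiceRemote.class));
    }
}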

Using Spring Transactions in MyBatis API

We have developed a data persistence framework using MyBatis. The framework uses plain MyBatis APIs. (We were prohibited from using mybatis-spring; do not ask why…)
Now we have to integrate this persistence framework with another framework developed by other teams. This other framework heavily uses Spring transactions for everything. Our persistence framework's DAOs will be used by this framework within its own API… that means the Spring-managed transactions will be propagated to the MyBatis DAOs. It is expected that our MyBatis-based persistence framework should participate in Spring-managed transactions without any issues.
There are two options for us to make this work:
(1) Change our persistence framework to use the mybatis-spring module. Change the DAOs to use mappers directly injected by Spring and Spring's SqlSessionFactoryBean. I built a small example simulating both frameworks and everything works without any issue. The problem with this approach is that it requires changing almost all the DAOs to use Spring-injected mappers and extensively re-testing the framework. We simply do not have the time available given our delivery timeline.
(2) Use mybatis-spring and define the SqlSessionFactory with Spring, setting the data source and transaction manager used by the other framework. Something like:
<bean id="smpDataSource" class="oracle.jdbc.pool.OracleDataSource" destroy-method="close">
<property name="connectionCachingEnabled" value="true" />
<property name="URL"> <value>${db.thin.url}</value></property>
<property name="user"> <value>${db.user}</value></property>
<property name="password"><value>${db.password}</value>
</property>
</bean>
<bean id="dbTransactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="smpDataSource" />
</bean>
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="smpDataSource" />
<property name="typeAliasesPackage" value="spike.smp51.domain" />
<property name="mapperLocations" value="classpath*:spike/smp51/mappers/*.xml" </bean>
Then in application code the MyBatis DAOs get the SqlSessionFactory from Spring, like:
public static SqlSessionFactory getSqlSessionFactory() throws Exception {
    DefaultSqlSessionFactory sessionFactory =
            (DefaultSqlSessionFactory) ctx.getBean("sqlSessionFactory");
    return sessionFactory;
}
All DAOs already use a SqlSessionFactory to open and close sessions. We would just replace the MyBatis-created SqlSessionFactory with the Spring-created one. That way we will have only a few lines of changes.
This approach is outlined here
http://mybatis.github.io/spring/using-api.html
The MyBatis documentation warns about this approach – specifically that it will not participate in Spring transactions.
When I tried the second approach, our framework was able to participate in Spring transactions. This is strange. Is the MyBatis documentation incorrect then? I verified it extensively by creating various transaction boundaries using Spring transactions + AOP, and the MyBatis DAOs participate in Spring-managed transactions every time. Since this second approach would save us 90% of the development time, we would really like to use it, but we are worried because MyBatis warns against it. Has anyone tried this approach? Any feedback is greatly appreciated.
Did you ever get an answer on that?
I'm wondering whether the documentation is talking about org.apache.ibatis.session.SqlSessionFactory from the plain MyBatis API, while the SqlSessionFactory you're using is built by the mybatis-spring library's org.mybatis.spring.SqlSessionFactoryBean.
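One way to read that: a factory built by SqlSessionFactoryBean is wired with a Spring-aware transaction factory, so sessions opened from it borrow the connection that Spring has bound to the current transaction. A hedged sketch of a DAO written against the plain MyBatis API (the mapper statement id and parameter names are hypothetical), called while a Spring-managed transaction is already active:

import java.util.HashMap;
import java.util.Map;

import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public class AccountDao {

    // The factory built by Spring's SqlSessionFactoryBean, obtained as in the question.
    private final SqlSessionFactory sqlSessionFactory;

    public AccountDao(SqlSessionFactory sqlSessionFactory) {
        this.sqlSessionFactory = sqlSessionFactory;
    }

    // Assumes the caller (the other framework) has already started a Spring-managed
    // transaction; the session below should then run on that transaction's connection.
    public void updateBalance(long accountId, long newBalance) {
        Map<String, Object> params = new HashMap<String, Object>();
        params.put("id", accountId);
        params.put("balance", newBalance);

        try (SqlSession session = sqlSessionFactory.openSession()) {
            session.update("spike.smp51.mappers.AccountMapper.updateBalance", params);
            // With a Spring-managed (transactional) connection this commit is effectively
            // a no-op; the surrounding Spring transaction decides commit vs. rollback.
            session.commit();
        }
    }
}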

Do I have to use an embedded Jetty server to test my ActiveMQ setup in a Spring project?

Do I have to use an embedded Jetty server to test my ActiveMQ setup in a Spring project? Any help would be appreciated.
Please see the following links from ActiveMQ:
How to unit test JMS code
Integration Tests > Example Testing Scenario
Unit testing with JMS (ActiveMQ)
First, add your listener container bean to the context, like the following:
<bean id="sampleListenerContainer" class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="concurrency" value="20-100"/>
<property name="connectionFactory" ref="connectionFactory"/>
<property name="destination" ref="sampleDestination"/>
<property name="messageListener" ref="sampleListenerConsumer"/>
</bean>
Secondly, obtain the container from the context and start it:
DefaultMessageListenerContainer container = BeanProvider.getBeanFactory().getBean("sampleListenerContainer", DefaultMessageListenerContainer.class);
container.start();
Similarly, stop it when needed:
container.stop();
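The ActiveMQ links above mostly boil down to testing against an in-memory broker instead of an external one, so no Jetty is needed. A minimal sketch, assuming activemq-broker/activemq-client (or activemq-all) and spring-jms are on the test classpath:

import static org.junit.Assert.assertEquals;

import javax.jms.ConnectionFactory;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.junit.Test;
import org.springframework.jms.core.JmsTemplate;

public class InMemoryBrokerTest {

    @Test
    public void sendAndReceiveViaEmbeddedBroker() {
        // The vm:// transport starts an embedded, non-persistent broker inside
        // the JVM, so no external ActiveMQ installation is required for the test.
        ConnectionFactory connectionFactory =
                new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");

        JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory);
        jmsTemplate.setReceiveTimeout(1000); // avoid blocking forever in a test

        jmsTemplate.convertAndSend("sampleQueue", "hello");
        assertEquals("hello", jmsTemplate.receiveAndConvert("sampleQueue"));
    }
}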
