Where does business processing datasource config go in Spring Batch?

I have been learning the Spring Batch framework through the online documentation as well as the Pro Spring Batch book by Apress, to try to put it into practice at work. I have a quick question.
Scenario
I want to do a simple test where I read from the database, do some processing, and then write to another database.
Question
I understand that there is a configuration file called launch-context.xml that contains the Job Repository database schema to maintain the state of the jobs and each of the steps for each one of them.
Say that I have a Source Database (A) where I do a read from and a Target Database (B) where I write to.
Maybe I have overlooked it but...
Where do I put the data source information for A and B?
I guess it depends on the answer to #1, but if I put them under src/main/resources, say for example source-datasource.xml and target-datasource.xml, how is Spring going to pick them up and wire them appropriately? In Spring web app development I usually register those types of files under the context-param tag.

You can define these datasources in any spring file of your choosing, so yes:
src/main/resources/db/source-datasource.xml
src/main/resources/db/target-datasource.xml
will do.
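As a sketch, source-datasource.xml might look like the following (the driver class, URL, and credentials here are placeholders to replace with your own; note that DriverManagerDataSource opens a new connection per request, so a pooled DataSource is preferable in production):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd">

    <!-- Source database (A) - the database the job reads from -->
    <bean id="sourceDataSource"
          class="org.springframework.jdbc.datasource.DriverManagerDataSource">
        <property name="driverClassName" value="com.mysql.jdbc.Driver" />
        <property name="url" value="jdbc:mysql://localhost:3306/source_db" />
        <property name="username" value="source_user" />
        <property name="password" value="source_password" />
    </bean>

</beans>
```

target-datasource.xml would follow the same shape with a targetDataSource bean pointing at database B.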
Let's say you named your datasource beans as a sourceDataSource and a targetDataSource. The way you tell Spring Batch ( or in this case just Spring ) to use them is through the "import" and "dependency injection".
Importing
You can organize your Spring configs the way you see fit, but since you already have launch-context.xml, in order for the above datasources to be visible you need to import them into launch-context.xml as:
<import resource="classpath:db/source-datasource.xml"/>
<import resource="classpath:db/target-datasource.xml"/>
Injecting / Using
<bean id="sourceReader" class="org.springframework.batch.item.database.JdbcCursorItemReader">
<property name="dataSource" ref="sourceDataSource" />
<!-- other properties here -->
</bean>
<bean id="targetWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="dataSource" ref="targetDataSource" />
<!-- other properties here -->
</bean>
where a sourceReader and a targetWriter are the beans you would inject into your step(s).
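For completeness, a step wiring them together might look roughly like this (bean ids and the commit interval are illustrative, not from the question):

```xml
<job id="copyJob" xmlns="http://www.springframework.org/schema/batch">
    <step id="copyStep">
        <tasklet>
            <!-- read from A via sourceReader, write to B via targetWriter,
                 committing every 10 items -->
            <chunk reader="sourceReader" writer="targetWriter"
                   commit-interval="10" />
        </tasklet>
    </step>
</job>
```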

Related

Spring Context File Reuse with Different Properties

I have a fairly straightforward spring-integration app that polls an external FTP site, processes any found files using some internal business logic and then posts a results file via FTP. Currently, when we add a new customer, we copy the XML config file and a corresponding properties file and change a prefix on all bean definitions, bean references and properties. We then add those to an XML config file that simply does imports for all existing customers and their properties:
<property-placeholder location="
file:config/application.properties
,file:config/test-customer1.properties
,file:config/test-customer2.properties
"/>
<import resource="test-customer1-context.xml" />
<import resource="test-customer2-context.xml" />
These xml files are identical except for a unique customer prefix in the bean names. Seems like there must be a way for me to reuse a single XML file (or a Java Config object) with different sets of properties. My first thought was to implement a custom CustomerScope, but I don't see how the scope implementation can know the proper customer.
Any ideas as to how to accomplish this?
As requested, sample of customer context file
<bean id="testClient1RequestQueue" class="org.apache.activemq.command.ActiveMQQueue">
<constructor-arg value="${test.client1.in.queue}"/>
</bean>
<bean id="testClient1ResultsQueue" class="org.apache.activemq.command.ActiveMQQueue">
<constructor-arg value="${test.client1.results.queue}"/>
</bean>
...

Can't seem to get a JNDI JDBC resource working in Liferay

No joy in the Liferay forum on this issue and the clock is ticking on this project. This may be caused by my lack of knowledge of Spring.
I have a JNDI global resource defined in server.xml and a resource link in context.xml in my Tomcat 7 /conf folder. I KNOW the JNDI resource is being loaded because I see the validation query being run as the server starts up. So far so good.
I have a portlet that just provides services to other portlets. In that portlet I have a hibernate.cfg.xml which has a session-factory that also points to the JDBC resource (don't know if this is needed or not). I also have an ext-spring.xml file in the services portlet that has the following:
<bean id="liferayHibernateSessionFactory" class="com.liferay.portal.spring.hibernate.PortletHibernateConfiguration" >
<property name="dataSource" ref="MyJDBCResource" />
</bean>
<bean id="MyJDBCResource" class="org.springframework.jndi.JndiObjectFactoryBean" >
<property name="jndiName" value="java:comp/env/jdbc/MyJDBCResource" />
</bean>
Adding the above in ext-spring.xml fixed an issue with a bean error on that services portlet upon deployment. In that service builder built portlet, a services jar was created and I put that service jar in the Tomcat_Home/lib/ext folder so that I could use the services provided by the portlet in my portlet. So far so good. But, when I invoke the portlet method which calls the services provided by the other portlet with the JNDI references, I get a "user lacks privilege or object not found" error. It is definitely object not found. When the query is run I see absolutely NO activity on the JDBC connection specified by the JNDI resource entry and in drilling down on the connection properties I only see the HSQLDB driver in use. It should be using the MSSQL driver specified in my global resource JNDI entry as far as I understand it.
SO WHAT AM I DOING WRONG? Do I need to add some configuration entries in the portlet that invokes the services?
This seems so simple. In reading the many posts that give instructions on using JNDI/JDBC resources I seem to have followed them correctly. Is there some trick to using JNDI/JDBC resources in LR 6.1.1 and Tomcat 7 that I have missed?
Thanks (and really hoping for some answers!).
First, you could try rewriting the JNDI resource reference like this:
<bean id="MyJDBCResource" class="org.springframework.jndi.JndiObjectFactoryBean" >
<property name="jndiName" value="jdbc/MyJDBCResource" />
</bean>
also, you could try different approach on JNDI resource lookup in Spring:
<jee:jndi-lookup id="MyJDBCResource" jndi-name="jdbc/MyJDBCResource" expected-type="javax.sql.DataSource" />
Not sure about the first approach, but the second will definitely fail early in case no JNDI resource can be found.
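Note that the jee:jndi-lookup element requires the jee namespace to be declared on the root beans element; a minimal sketch:

```xml
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:jee="http://www.springframework.org/schema/jee"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/jee
           http://www.springframework.org/schema/jee/spring-jee.xsd">

    <!-- Fails fast at startup if the JNDI entry is missing or of the wrong type -->
    <jee:jndi-lookup id="MyJDBCResource"
                     jndi-name="jdbc/MyJDBCResource"
                     expected-type="javax.sql.DataSource" />

</beans>
```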
Hope this helps.

How to securely store database credentials in a Spring MVC web application?

I have a Spring MVC web application which needs to connect to a database, and the data source credentials are currently (in development) being kept in the application context configuration file, i.e. in WEB-INF\spring\application_context.xml like so:
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="username" value="my_username" />
<property name="password" value="my_password" />
<property name="url" value="my_datasource_url" />
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
</bean>
Before I deploy this application to a public facing Tomcat server I want to make sure that I'm not making these database credentials visible to the world or easily discovered by a crafty hacker. Am I OK keeping things as they are, i.e. in plain text and in an XML file under WEB-INF? If not then is there a best practice for this sort of thing?
Thanks in advance for your help.
Files stored in the WEB-INF folder are by definition inaccessible to the outside world. For example, if you put a JSP file there, no user can access it directly - it can only be referenced, e.g. by importing or including it from another JSP.
That being said, your credentials are safe, but it is not good practice to hard-code them in applicationContext.xml. Are you pushing these credentials to your source control?
The safest way is to extract sensitive and frequently changing configuration in an external .properties file somewhere on your hard disk. Spring is perfectly capable of reading .properties files.
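A sketch of that approach (the file path and property names here are illustrative; the .properties file should be readable only by the user running Tomcat):

```xml
<!-- Reads key=value pairs from a file outside the webapp -->
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
    <property name="location" value="file:/etc/myapp/database.properties" />
</bean>

<!-- Credentials resolved from the external file, not hard-coded in XML -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="oracle.jdbc.driver.OracleDriver" />
    <property name="url" value="${db.url}" />
    <property name="username" value="${db.username}" />
    <property name="password" value="${db.password}" />
</bean>
```

where /etc/myapp/database.properties contains lines like db.url=..., db.username=..., db.password=...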
Another thing to consider would be a JNDI lookup. The JNDI entry would be contained within the servlet container (Tomcat in your case), allowing connections to the database only through webapps deployed in Tomcat.
This way also creates a silo'd experience between build management and development so development doesn't have the keys to the car, if you get my metaphor.
You could make it ask for the password on startup. But usually what I see is passwords in XML files which are chmodded and chowned to be accessible only by the web software itself. There are no easy good solutions.
One basic step is to make sure that the Tomcat user has only the access it needs on the database, so if it is compromised, the damage is limited.

Correct way to get transactions using Spring Data Neo4j's simple object/graph mapping?

I'm using the simple object/graph mapping in Spring Data Neo4j 2.0, where I perform persistence operations using the Spring Data repository framework. I'm working with the repositories rather than working with the Neo4jTemplate. I inject the repositories into my Spring Web MVC controllers, and the controllers call the repos directly. (No intermediate service layer--my operations are generally CRUDs and finder queries.)
When I do read operations, there are no issues. But when I do write operations, I get "NotInTransactionException". My understanding is that read ops in Neo4j don't require transactions, but write ops do.
What's the best way to get transactions into the picture here, assuming I want to stick with the simple OGM? I want to use @Transactional, but putting that on the various repository interfaces doesn't work. If I introduce an intermediate service tier between the controllers and the repositories and then annotate the service beans with @Transactional, then it works, but I'm wondering whether there's a simpler way to do it. Without Spring Data, I'd typically have access to the DAO (repository) implementations, so I'd be able to annotate the concrete DAOs with @Transactional if I wanted to avoid a pass-through service tier. With Spring Data the repos are dynamically generated, so that doesn't appear to be an option.
First, note that having transactional DAOs is not generally a good practice. But if you don't have a service layer, then let it be on the DAOs.
Then, you can enable declarative transactions. Here's how I did it:
First, define an annotation called @GraphTransactional:
@Retention(RetentionPolicy.RUNTIME)
@Transactional("neo4jTransactionManager")
public @interface GraphTransactional {
}
Update: Spring Data Neo4j has since added such an annotation, so you can reuse it instead of creating your own: @Neo4jTransactional
Then, in applicationContext.xml, have the following (where neo4jdb is your EmbeddedGraphDatabase):
<bean id="neo4jTransactionManagerService"
class="org.neo4j.kernel.impl.transaction.SpringTransactionManager">
<constructor-arg ref="neo4jdb" />
</bean>
<bean id="neo4jUserTransactionService" class="org.neo4j.kernel.impl.transaction.UserTransactionImpl">
<constructor-arg ref="neo4jdb" />
</bean>
<bean id="neo4jTransactionManager"
class="org.springframework.transaction.jta.JtaTransactionManager">
<property name="transactionManager" ref="neo4jTransactionManagerService" />
<property name="userTransaction" ref="neo4jUserTransactionService" />
</bean>
<tx:annotation-driven transaction-manager="neo4jTransactionManager" />
Bear in mind that if you use another transaction manager as well, you'll have to specify order="2" for this annotation-driven definition, and also that you won't get two-phase commit if a single method is declared to be both SQL- and Neo4j-transactional.
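If you do run a second transaction manager alongside, the ordering might be expressed like this (a sketch; transactionManager is an assumed bean name for your SQL transaction manager):

```xml
<!-- SQL transactions take precedence -->
<tx:annotation-driven transaction-manager="transactionManager" order="1" />
<!-- Neo4j transactions applied second -->
<tx:annotation-driven transaction-manager="neo4jTransactionManager" order="2" />
```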

Good example of Spring Configuration using java.util.prefs or Commons Configuration

One application I'm working on has several URLs and other information that is instance specific. The first pass uses a typical Spring PropertyPlaceholderConfigurer with a properties file:
<bean id="propertyConfigurer" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="location" value="classpath:application.properties"/>
</bean>
The main issue with this is of course that the property file is an artifact that must be checked in, and starting a new instance would require updating that artifact. For a streamlined deployment, I would like to have the ApplicationContext bootstrap itself based on database table(s). I have seen solutions like this forum post; does anyone here know of better tools, or is this the de facto approach to this problem? I would also like to be able to update/reload the settings at runtime using JMX or other facilities, but having to restart the app after changes to the database would still be better than the current solution.
The way we did it was to put some configuration information in the environment and then pull the relevant info from there.
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="searchSystemEnvironment" value="true" />
</bean>
If the configuration changes then the app will need to be restarted. You can also put all the different configurations into the environment and nest the variables like the following:
<bean id="db" class="org.DataSource"
p:databaseServer="${${MODE}_DBSERVER}"
p:databaseName="${${MODE}_DBNAME}" />
where ${MODE} is dev, qa, etc.
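For example, the environment for a dev instance might be set up like this before starting the JVM (variable names follow the ${MODE}_DBSERVER pattern above; the values are placeholders):

```shell
# Select which set of nested variables the placeholders resolve to
export MODE=dev

# Per-mode settings: ${${MODE}_DBSERVER} then resolves to ${dev_DBSERVER}
export dev_DBSERVER=localhost
export dev_DBNAME=myapp_dev
export qa_DBSERVER=qa-db.internal
export qa_DBNAME=myapp_qa
```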
