Spring PropertyPlaceholderConfigurer and passing multiple queries at startup - spring

I am working on an existing application that loads a single DB query at server startup.
Now I want to pass more queries (maybe 3-4) in the same code instead of just one, i.e. how can I pass multiple queries in Spring?
Here is the code:
myspring.xml
<property name="mypropertyfiles">
<list>
<value>test1.properties</value>
<value>test2.properties</value>
</list>
</property>
<bean id="mybean" class="com.test.MyBean">
<constructor-arg><ref bean="mypropertyfiles"/></constructor-arg>
<constructor-arg><ref bean="dataSource"/></constructor-arg>
<constructor-arg>
    <value>select pcode, pname from PRODUCT1</value>
    <!-- Here only one query, but I want to pass more queries like:
         select pcode, pname from PRODUCT2
         select ocode, oname from ORDER1
         select ocode, oname from ORDER2 -->
</constructor-arg>
<property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE" />
</bean>
MyBean.java
public class MyBean extends PropertyPlaceholderConfigurer {

    private DataSource dSource;
    private String dbQuery;

    // this is the existing map to store PRODUCT1 table details
    private Map<String, String> product1Map = new ConcurrentHashMap<String, String>();
    // similarly, I will create product2Map, order1Map and order2Map to store the
    // data from my PRODUCT2, ORDER1 and ORDER2 queries

    // here it takes only one query, but I want more queries, i.e. a list
    MyBean(Resource[] resources, DataSource dSource, String dbQuery) {
        super();
        setLocations(resources);
        this.dSource = dSource;
        this.dbQuery = dbQuery;
    }

    @Override
    public String resolvePlaceholder(String placeholder, Properties props) {
        // some code.....
        loadQuery();
        return super.resolvePlaceholder(placeholder, props);
    }

    private void loadQuery() {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dSource);
        // existing code has one query only
        jdbcTemplate.execute(dbQuery, new PreparedStatementCallback<Map<String, String>>() {
            public Map<String, String> doInPreparedStatement(PreparedStatement ps) throws SQLException {
                ResultSet rs = ps.executeQuery();
                while (rs.next()) {
                    product1Map.put(rs.getString("pcode"), rs.getString("pname")); // only one map
                }
                return product1Map;
            }
        });
    }
}
Questions:
How can I pass multiple queries from the "myspring.xml" file?
I want to reuse the existing loadQuery method to load all my queries and put the results into the other maps, i.e. product2Map, order1Map and order2Map.
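A minimal sketch of one way to do this, assuming the constructor can be changed to accept a list of queries (the class and query names come from the question; keying the result maps by the query string is my own illustrative choice):

public class MyBean extends PropertyPlaceholderConfigurer {

    private final DataSource dSource;
    private final List<String> dbQueries;   // was: a single String dbQuery
    // one result map per query, keyed by the query itself (illustrative only)
    private final Map<String, Map<String, String>> resultMaps =
            new ConcurrentHashMap<String, Map<String, String>>();

    public MyBean(Resource[] resources, DataSource dSource, List<String> dbQueries) {
        setLocations(resources);
        this.dSource = dSource;
        this.dbQueries = dbQueries;
    }

    // loop over all configured queries instead of executing just one
    private void loadQueries() {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dSource);
        for (final String query : dbQueries) {
            final Map<String, String> map = new ConcurrentHashMap<String, String>();
            jdbcTemplate.query(query, new RowCallbackHandler() {
                public void processRow(ResultSet rs) throws SQLException {
                    // assumes every query selects a code column and a name column, in that order
                    map.put(rs.getString(1), rs.getString(2));
                }
            });
            resultMaps.put(query, map);
        }
    }
}

In myspring.xml the third constructor-arg would then be a <list> of <value>select pcode, pname from PRODUCT2</value>-style entries instead of a single <value>.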

Related

Spring Batch CompositeItemProcessor get value from other delegates

I have a compositeItemProcessor as below
<bean id="compositeItemProcessor" class="org.springframework.batch.item.support.CompositeItemProcessor">
<property name="delegates">
<list>
<bean class="com.example.itemProcessor1"/>
<bean class="com.example.itemProcessor2"/>
<bean class="com.example.itemProcessor3"/>
<bean class="com.example.itemProcessor4"/>
</list>
</property>
</bean>
The issue I have is that within itemProcessor4 I require values from both itemProcessor1 and itemProcessor3.
I have looked at using the Step Execution Context, but this does not work as this is within one step. I have also looked at using @AfterProcess within ItemProcessor1, but this does not work as it isn't called until after ItemProcessor4.
What is the correct way to share data between delegates in a CompositeItemProcessor?
Would using a util:map that is updated in itemProcessor1 and read in itemProcessor4 be a solution, given that the commit-interval is set to 1?
Using the step execution context won't work, as it is persisted at the chunk boundary, so it can't be shared between processors within the same chunk.
@AfterProcess is called after the registered item processor, which is the composite processor in your case (so after ItemProcessor4). This won't work either.
The only option left is to use some data holder object that you share between item processors.
Hope this helps.
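A minimal sketch of such a holder, assuming the commit-interval is 1 so the shared values never need to outlive the current item (the class, key and Foo item type below are illustrative, not from the question):

public class ProcessingDataHolder {

    private final Map<String, Object> values = new ConcurrentHashMap<String, Object>();

    public void put(String key, Object value) { values.put(key, value); }

    public Object get(String key) { return values.get(key); }

    public void clear() { values.clear(); }
}

public class ItemProcessor1 implements ItemProcessor<Foo, Foo> {

    private ProcessingDataHolder holder;   // inject the same bean instance into itemProcessor1 and itemProcessor4

    public Foo process(Foo item) throws Exception {
        holder.put("processor1Result", item);   // store whatever itemProcessor4 needs later
        return item;
    }

    public void setHolder(ProcessingDataHolder holder) { this.holder = holder; }
}

ItemProcessor4 then calls holder.get("processor1Result") in its own process method and clears the holder once it is done with the item.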
This page seems to state that there are two types of ExecutionContexts, one at step-level, one at job-level.
https://docs.spring.io/spring-batch/trunk/reference/html/patterns.html#passingDataToFutureSteps
You should be able to get the job context and set keys on that from the step context.
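For example, from inside one of the delegate processors you could reach the job-level context roughly like this (a sketch only; the processor must also be registered as a step listener for @BeforeStep to be called, and the key name and Foo type are made up):

public class ItemProcessor1 implements ItemProcessor<Foo, Foo> {

    private StepExecution stepExecution;

    @BeforeStep
    public void saveStepExecution(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }

    public Foo process(Foo item) throws Exception {
        // job-level ExecutionContext, reachable from the step execution
        stepExecution.getJobExecution().getExecutionContext().put("processor1Result", item);
        return item;
    }
}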
I had a similar requirement in my application too. I went with creating a data transfer object ItemProcessorDto which will be shared by all the ItemProcessors. You can store data in this DTO object in the first processor, and all the remaining processors will get the information out of this DTO object. In addition, any ItemProcessor can update or retrieve data from the DTO.
Below is a code snippet:
@Bean
public ItemProcessor1<ItemProcessorDto> itemProcessor1() {
    log.info("Generating ItemProcessor1");
    return new ItemProcessor1();
}

@Bean
public ItemProcessor2<ItemProcessorDto> itemProcessor2() {
    log.info("Generating ItemProcessor2");
    return new ItemProcessor2();
}

@Bean
public ItemProcessor3<ItemProcessorDto> itemProcessor3() {
    log.info("Generating ItemProcessor3");
    return new ItemProcessor3();
}

@Bean
public ItemProcessor4<ItemProcessorDto> itemProcessor4() {
    log.info("Generating ItemProcessor4");
    return new ItemProcessor4();
}

@Bean
@StepScope
public CompositeItemProcessor<ItemProcessorDto> compositeItemProcessor() {
    log.info("Generating CompositeItemProcessor");
    CompositeItemProcessor<ItemProcessorDto> compositeItemProcessor = new CompositeItemProcessor<>();
    compositeItemProcessor.setDelegates(Arrays.asList(itemProcessor1(), itemProcessor2(), itemProcessor3(), itemProcessor4()));
    return compositeItemProcessor;
}
@Data
public class ItemProcessorDto {
    private List<String> sharedData_1;
    private Map<String, String> sharedData_2;
}
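For completeness, a delegate in this setup would then read and write the shared fields on the DTO itself, roughly like this (a sketch; the getters/setters come from the @Data annotation on ItemProcessorDto):

public class ItemProcessor1 implements ItemProcessor<ItemProcessorDto, ItemProcessorDto> {

    @Override
    public ItemProcessorDto process(ItemProcessorDto dto) throws Exception {
        // store whatever the later processors need on the shared DTO
        dto.setSharedData_1(Arrays.asList("value computed in processor 1"));
        return dto;   // the same DTO instance flows on to processors 2, 3 and 4
    }
}

public class ItemProcessor4 implements ItemProcessor<ItemProcessorDto, ItemProcessorDto> {

    @Override
    public ItemProcessorDto process(ItemProcessorDto dto) throws Exception {
        // read back what processor 1 (and 3) stored earlier in the chain
        List<String> fromProcessor1 = dto.getSharedData_1();
        // use it to build the final result
        return dto;
    }
}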

CSV File To DB2 Database - skip columns - Spring Batch project

I am working on a Spring Batch project where I have to push data from a CSV file into a DB. I managed to implement the batch and the rest; currently the data is being pushed as it should, but I wonder if there's any way to skip some of the columns in the CSV file, as some of them are irrelevant.
I did a bit of research but I wasn't able to find an answer, unless I missed something.
Sample of my code below.
<bean id="mysqlItemWriter"
class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="dataSource" ref="dataSource" />
<property name="sql">
<value>
<![CDATA[
insert into WEBREPORT.RAWREPORT(CLIENT,CLIENTUSER,GPS,EXTENSION) values (:client, :clientuser, :gps, :extension)
]]>
</value>
</property>
</bean>
You can implement your own FieldSetMapper, which maps the structure of one line to your POJO in the reader.
Let's say you have:
name, surname, email
Mike, Evans, test@test.com
And you have a Person model with only name and email; you are not interested in surname. Here is a reader example:
@Component
@StepScope
public class PersonReader extends FlatFileItemReader<Person> {

    @Override
    public void afterPropertiesSet() throws Exception {
        // load file into csvResource variable
        setResource(csvResource);
        setLineMapper(new DefaultLineMapper<Person>() {
            {
                setLineTokenizer(new DelimitedLineTokenizer());
                setFieldSetMapper(new PersonFieldSetMapper());
            }
        });
        super.afterPropertiesSet();
    }
}
And you can define PersonFieldSetMapper:
@Component
@JobScope
public class PersonFieldSetMapper implements FieldSetMapper<Person> {

    @Override
    public Person mapFieldSet(final FieldSet fieldSet) throws BindException {
        final Person person = new Person();
        person.setName(fieldSet.readString(0)); // columns are zero based
        person.setEmail(fieldSet.readString(2));
        return person;
    }
}
This is for skipping columns, which, if I understood right, is what you want. Skipping rows can be done as well; I explained how to skip blank lines, for example, in this question.
If the check for the skip is simple and does not need a database round trip, you can use a simple ItemProcessor which returns null for skipped items.
Really simple pseudo code:
public class SkipProcessor implements ItemProcessor<Foo, Foo> {

    @Override
    public Foo process(Foo foo) throws Exception {
        // check for a skip; returning null filters the item out
        if (skip(foo)) {
            return null;
        } else {
            return foo;
        }
    }
}
If the skip check is more complex and needs a database round trip, you can still use the item processor, but performance (if it matters) will suffer.
If performance is critical... well, then it depends on the setup, the requirements and your possibilities. I would try it with two steps: step one loads the CSV into the database (without any checks), and step two reads the data back from the database, with the skip check done by a clever JOIN in the SQL of the ItemReader.
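A rough sketch of what the second step's reader could look like under that two-step approach (the join table and the Foo row type are made up for illustration; the RAWREPORT table name comes from the question):

JdbcCursorItemReader<Foo> reader = new JdbcCursorItemReader<Foo>();
reader.setDataSource(dataSource);
// the JOIN does the skip check in SQL, so only the relevant rows reach the writer
reader.setSql("SELECT r.* FROM WEBREPORT.RAWREPORT r "
        + "JOIN WEBREPORT.VALIDCLIENT v ON v.CLIENT = r.CLIENT");
reader.setRowMapper(new BeanPropertyRowMapper<Foo>(Foo.class));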

Using one DataSource object for multiple DB connections - using only one connection at a time

Can we define only one DataSource object and wire it dynamically at runtime, connecting to different databases? I need to connect to only one database at a time.
I will be passing the name of the database as an argument, look up the DB URL and other details from a property file, and then connect to the DB using that URL.
In short: I do not know the number of databases I need to connect to. I will have all possible database connection details configured in the database.properties file, following a certain syntax (e.g. keys prefixed with DB01). The name of the DB will be passed as an argument and I need to execute the query against that database.
database.properties file
DB01.driver=com.ibm.db2.jcc.DB2Driver
DB01.url=jdbc:db2://localhost:50000/SAMPLE
DB01.username=db2admin
DB01.password=db2admin
DAO class
@Autowired
@Qualifier("DB01") // how do I make this dynamic ?
private DataSource datasource;

private JdbcTemplate jdbcTemplate;

// some more code

public SqlRowSet executeQuery(String sqlQuery) {
    // can I pass the DB name here (the database.properties file will have the DB details
    // with this name as given above) and set the DataSource object accordingly,
    // so that the query will be executed against that DB?
    setJdbcTemplate(new JdbcTemplate(this.datasource));
    return getJdbcTemplate().queryForRowSet(sqlQuery);
}
Using Spring v4.1.4 RELEASE. Thanks !!
You can define a Routing DataSource that redirects the getConnection method to one datasource or another based on a key; in your case, that seems to be the database name.
For instance, the spring xml:
....
<bean id="DB01DataSource" parent="parentDatasource" p:url="${DB01.url}" ... />
<bean id="DB02DataSource" parent="parentDatasource" p:url="${DB02.url}" .../>
<bean id="dataSource" class="DBRoutingDataSource">
<property name="targetDataSources">
<map key-type="java.lang.String">
<entry key="DB01" value-ref="DB01DataSource"/>
<entry key="DB02" value-ref="DB02DataSource"/>
</map>
</property>
<property name="defaultTargetDataSource" ref="DB01DataSource"/>
</bean>
....
The DBRoutingDataSource class:
public class DBRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        return DBContextHolder.getDB();
    }
}
And the DBContextHolder class:
public final class DBContextHolder {

    private static final ThreadLocal<String> CONTEXT = new ThreadLocal<String>();

    private DBContextHolder() {
        // empty
    }

    public static void setDB(final String db) {
        CONTEXT.set(db);
    }

    public static String getDB() {
        return CONTEXT.get();
    }

    public static void clearDB() {
        CONTEXT.remove();
    }
}
In your service class before calling your DAO you set the key that will enable the Routing DataSource to get the right connection:
DBContextHolder.setDB("DB01");
try {
    dao.executeQuery(sqlSentence);
} finally {
    DBContextHolder.clearDB();
}

JdbcBatchItemWriter updating last record in database

I am facing an issue: the JdbcBatchItemWriter is only picking up the last record for the update. I have created an ItemPreparedStatementSetter implementation which iterates through an array of objects and sets up the PreparedStatement. Please provide any input; the code is below.
XML:
<bean id="jdbcWriter"
class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="sql"
value="UPDATE XTABLE SET FLG = ? , LST_UPDT_DT =?
where CTGRY= ? AND TYPE =? AND SEQ_NBR =? AND SOURCE_KEY =?" />
<property name="dataSource" ref="dataSource" />
<property name="itemPreparedStatementSetter">
<bean class="com.batch.setter.XItemPrepartedStatementSetter" />
</property>
</bean>
Class
public class XItemPrepartedStatementSetter implements ItemPreparedStatementSetter<AggrPreference[]> {

    @Override
    public void setValues(X[] xArr, PreparedStatement preparedStatement) throws SQLException {
        for (int i = 0; i < xArr.length; i++) {
            X x = xArr[i];
            preparedStatement.setString(1, x.getFLG());
            preparedStatement.setTimestamp(2, x.getLST_UPDT_DT());
            preparedStatement.setString(3, x.getCTGRY());
            preparedStatement.setString(4, x.getTYPE());
            preparedStatement.setString(5, x.getSEQ_NBR());
            preparedStatement.setBigDecimal(6, x.getSOURCE_KEY());
        }
    }
}
First of all: why does setValues() have X[], PreparedStatement as its signature and not AggrPreference[], PreparedStatement?
You are using the JdbcBatchItemWriter in the wrong way, because you are not performing an update for every element in X[] xArr but only for the last one (and this is why only the last element gets updated).
If your domain object really is the array (X[] xArr), you have to write a custom ItemWriter that loops through the array and uses the real JdbcBatchItemWriter to write one element at a time (see the sketch below).
If your intention was to write each X as a single item, then your domain object is wrong: it should be a single X object and not an array.
With those modifications you have to change XItemPrepartedStatementSetter as follows:
public class XItemPrepartedStatementSetter implements ItemPreparedStatementSetter<AggrPreference> {

    @Override
    public void setValues(AggrPreference x, PreparedStatement preparedStatement) throws SQLException {...}
}
The object returned from your ItemProcessor must then be of type AggrPreference, and Spring Batch will write each chunk of objects using your new configuration.
I hope I was clear, English is not my native language.
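A minimal sketch of the wrapper ItemWriter mentioned above, for the case where the X[] domain object is kept (the class name is made up; the delegate is the real JdbcBatchItemWriter configured with a single-X ItemPreparedStatementSetter):

public class ArrayUnwrappingItemWriter implements ItemWriter<X[]> {

    private ItemWriter<X> delegate;   // the real JdbcBatchItemWriter<X>

    @Override
    public void write(List<? extends X[]> items) throws Exception {
        // flatten the arrays so the delegate performs an update for every single X
        List<X> flattened = new ArrayList<X>();
        for (X[] array : items) {
            flattened.addAll(Arrays.asList(array));
        }
        delegate.write(flattened);
    }

    public void setDelegate(ItemWriter<X> delegate) {
        this.delegate = delegate;
    }
}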

Spring 3.1 and Oracle Audit Trail: Providing application data to triggers

Problem Parameters:
Spring 3.1
Oracle 11.2.0.3
Glassfish 2.1 Application Server, providing a JDBC connection pool.
Problem Description:
I am retrofitting user auditing in an existing set of administrative applications for adding, editing and deleting customer users. I need to store the ID of the administrative user in audit records created by Oracle triggers associated with a number of tables. I want to make the administrative user ID accessible to the triggers by setting the Oracle CLIENT_IDENTIFIER attribute on each connection retrieved from the connection pool before a database operation, and then clearing the attribute after the database operation. I have something that works, but I don't really like the way it is done.
The Question:
Is there a way to access connections so an Oracle context attribute can be set before and after a database operation? Maybe some kind of listener responding to an event?
I have looked at:
A million web pages (OK, maybe that's an exaggeration, but I've googled for three or four hours).
Using DataSourceUtils to get connections. This would work, but I really don't want to manage the connections; I just want to intercept them on the way into and out of the pool to set the CLIENT_IDENTIFIER attribute value.
Overriding the getConnection method of the datasource. Since this gets called somewhere inside the JdbcTemplate, I can't get the application data to the method.
I'm hoping that the Spring and/or Oracle Gurus will say "Well it's obvious and the answer is ... " without having to hack through my implementation, but here it is anyway. If nothing else, this does work in case someone is looking for an idea.
My Implementation:
All database operations are done using a JdbcOperations reference to a JdbcTemplate object injected into a DAO. The add, edit and delete operations use the JdbcOperations update method, passing either a PreparedStatementCreator or a BatchPreparedStatementSetter. I access the java.sql.Connection object provided by the application server connection pool in the callback methods for these objects (createPreparedStatement or setValues) to set the CLIENT_IDENTIFIER attribute.
applicationContext.xml datasource configuration:
<!-- Setup the datasource -->
<bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="jndiName" value="jdbc/IpOneDatabasePool"/>
</bean>
<!-- Setup the transaction manager -->
<bean id="txManager" class="org.springframework.transaction.jta.JtaTransactionManager" />
<tx:advice id="txAdvice" transaction-manager="txManager">
<tx:attributes>
<tx:method name="get*" read-only="true"/>
<tx:method name="*"/>
</tx:attributes>
</tx:advice>
<!-- Associate the transaction manager with objects that must be managed. -->
<aop:config>
<aop:pointcut id="userDaoOperation" expression="execution(* com.myCompany.IpOne.dao.UserDaoImpl.*(..))"/>
<aop:advisor advice-ref="txAdvice" pointcut-ref="userDaoOperation"/>
</aop:config>
<!-- Bean providing access to the various prepared statement objects -->
<bean id="daoHelperFactory" class="com.myCompany.IpOne.dao.DaoHelperFactoryImpl" />
<!-- Bean that allows setting of the client identifier for the audit trail -->
<bean id="databaseContextEditor" class="com.myCompany.IpOne.dao.OracleDatabaseContextEditor" />
<!-- Dao that manages persistence of User objects -->
<bean id="userDao" class="com.myCompany.IpOne.dao.UserDaoImpl" >
<property name="dataSource" ref="dataSource"/>
<property name="licenseDao" ref="licenseDao"/>
<property name="appPropertyManager" ref="appPropertyManager"/>
<property name="maximumLicensesPerUserKey" value="max_licences_per_user"/>
<property name="daoHelperFactory" ref="daoHelperFactory"/>
</bean>
This is the user Dao interface
public interface UserDao {

    void addUser(User newUser, String adminUserId);
}
This is the user Dao class
public class UserDaoImpl implements UserDao {

    private JdbcOperations jdbcOperations;
    private DaoHelperFactory daoHelperFactory;   // injected via the daoHelperFactory property

    public void setDataSource(DataSource dataSource) {
        this.jdbcOperations = new JdbcTemplate(dataSource);
    }

    public void addUser(User newUser, String adminUserId) {
        PreparedStatementCreator insertUserStatement =
                this.daoHelperFactory.getInsertUserStatement(newUser, adminUserId);
        KeyHolder keyHolder = this.daoHelperFactory.getKeyHolder();
        this.jdbcOperations.update(insertUserStatement, keyHolder);
        newUser.setUserId(keyHolder.getKey().intValue());
    }
}
This class provides access to the application context.
public class ApplicationContextProvider implements ApplicationContextAware {

    private static ApplicationContext ctx = null;

    public static ApplicationContext getApplicationContext() {
        return ctx;
    }

    public void setApplicationContext(ApplicationContext ctx) throws BeansException {
        this.ctx = ctx;
    }
}
Interface for classes that provides various objects used by the Dao.
public interface DaoHelperFactory {

    PreparedStatementCreator getInsertUserStatement(User user, String adminUserId);

    KeyHolder getKeyHolder();
}
This class is just a factory for PreparedStatementCreator and BatchPreparedStatementSetter objects and other objects used by the Dao. I've changed it to provide the object that actually sets the database context attribute to the various objects being returned.
public class DaoHelperFactoryImpl implements DaoHelperFactory {

    private DatabaseContextEditor getDatabaseContextEditor() {
        ApplicationContext appContext = ApplicationContextProvider.getApplicationContext();
        DatabaseContextEditor databaseContextEditor = (DatabaseContextEditor) appContext.getBean("databaseContextEditor");
        return databaseContextEditor;
    }

    public KeyHolder getKeyHolder() {
        return new GeneratedKeyHolder();
    }

    public PreparedStatementCreator getInsertUserStatement(User user, String adminUserId) {
        InsertUser insertUser = new InsertUser(user, adminUserId);
        insertUser.setDatabaseContextEditor(getDatabaseContextEditor());
        return insertUser;
    }
}
This is the interface for classes that set the database context
public interface DatabaseContextEditor {

    public DatabaseContextEditor getInstance();

    public void setClientIdentifier(Connection connection, String clientIdentifier) throws SQLException;
}
This is class which does that for Oracle
public class OracleDatabaseContextEditor implements DatabaseContextEditor {

    public void setClientIdentifier(Connection connection, String clientIdentifier) throws SQLException {
        OracleJdbc4NativeJdbcExtractor extractor = new OracleJdbc4NativeJdbcExtractor();
        oracle.jdbc.OracleConnection oracleConnection = null;
        if (!(connection instanceof oracle.jdbc.OracleConnection))
            oracleConnection = (oracle.jdbc.OracleConnection) extractor.getNativeConnection(connection);
        else
            oracleConnection = (oracle.jdbc.OracleConnection) connection;

        String[] metrics = new String[OracleConnection.END_TO_END_STATE_INDEX_MAX];
        metrics[OracleConnection.END_TO_END_CLIENTID_INDEX] = clientIdentifier;
        oracleConnection.setEndToEndMetrics(metrics, (short) 0);
    }

    public DatabaseContextEditor getInstance() {
        return new OracleDatabaseContextEditor();
    }
}
This class is the PreparedStatementCreator for adding a User
public class InsertUser implements PreparedStatementCreator {

    User insertUser;
    /** This is the admin user Id I need to store */
    String adminUserId;

    private final String SQL = "INSERT INTO SC_USR (" +
            "USR_ID, USR_SSO_NAME, USR_PH_NO, USR_SIP_NAME," +
            "USR_SIP_PSWD, USR_SIP_DISP_NAME, USR_SIP_DOMAIN, USR_SIP_PROXY," +
            " USR_CREATED_BY, USR_CREATED_DATETIME) " +
            "VALUES (SEQ_SC_USR_ID.NEXTVAL, ?, ?, ?, ?, ?, ?, ?, ?, SYSTIMESTAMP)";

    private final String GENERATED_COLUMNS[] = {"USR_ID"};

    /** Object that provides functionality for setting values in the database context */
    private DatabaseContextEditor databaseContextEditor;

    public InsertUser(User user, String adminUserId) {
        this.insertUser = user;
        this.adminUserId = adminUserId;
    }

    public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
        this.databaseContextEditor.setClientIdentifier(connection, adminUserId);

        PreparedStatement preparedStatement = connection.prepareStatement(SQL, GENERATED_COLUMNS);
        int i = 1;
        preparedStatement.setString(i++, this.insertUser.getSsoName());
        preparedStatement.setString(i++, this.insertUser.getPhoneNumber());
        preparedStatement.setString(i++, this.insertUser.getSipName());
        preparedStatement.setString(i++, this.insertUser.getSipPassword());
        preparedStatement.setString(i++, this.insertUser.getSipDisplayName());
        preparedStatement.setString(i++, this.insertUser.getSipDomain());
        preparedStatement.setString(i++, this.insertUser.getSipProxy());
        preparedStatement.setString(i++, this.insertUser.getCreatedBy().name());
        return preparedStatement;
    }

    public void setDatabaseContextEditor(DatabaseContextEditor databaseContextEditor) {
        this.databaseContextEditor = databaseContextEditor;
    }
}
There are "AFTER DELETE OR INSERT OR UPDATE" triggers on each table I want to audit. Each table has a corresponding audit table. They extract the CLIENT_IDENTIFIER from the context and insert a row in the appropriate audit table. This is a sample.
CREATE OR REPLACE TRIGGER IPONE_DEV_USER.SC_USR$AUDTRG
AFTER DELETE OR INSERT OR UPDATE
ON IPONE_DEV_USER.SC_USR
REFERENCING NEW AS NEW OLD AS OLD
FOR EACH ROW
DECLARE
v_operation VARCHAR2(10) := NULL;
v_admin_user_id VARCHAR2(30);
BEGIN
v_admin_user_id := SYS_CONTEXT('USERENV', 'CLIENT_IDENTIFIER');
IF INSERTING THEN
v_operation := 'INS';
ELSIF UPDATING THEN
v_operation := 'UPD';
ELSE
v_operation := 'DEL';
END IF;
IF INSERTING OR UPDATING THEN
INSERT INTO SC_USR$AUD (
USR_ID,
USR_SSO_NAME,
USR_PH_NO,
USR_SOME_VALUE1,
USR_SOME_VALUE2,
USR_SOME_VALUE3,
USR_SOME_VALUE4,
USR_CREATED_BY,
USR_SOME_VALUE5,
USR_SOME_VALUE6,
aud_action,aud_timestamp,aud_user) VALUES (
:new.USR_ID,
:new.USR_SSO_NAME,
:new.USR_PH_NO,
:new.USR_SOME_VALUE1,
:new.USR_SOME_VALUE2,
:new.USR_SOME_VALUE3,
:new.USR_CREATED_DATETIME,
:new.USR_CREATED_BY,
:new.USR_SOME_VALUE4,
:new.USR_SOME_VALUE5,
v_operation,SYSDATE,v_admin_user_id);
ELSE
INSERT INTO SC_USR$AUD (
USR_ID,
USR_SSO_NAME,
USR_PH_NO,
USR_SIP_NAME,
USR_SIP_PSWD,
USR_SIP_DISP_NAME,
USR_CREATED_DATETIME,
USR_CREATED_BY,
USR_SIP_DOMAIN,
USR_SIP_PROXY,
aud_action,aud_timestamp,aud_user) VALUES (
:old.USR_ID,
:old.USR_SSO_NAME,
:old.USR_PH_NO,
:old.USR_SIP_NAME,
:old.USR_SIP_PSWD,
:old.USR_SIP_DISP_NAME,
:old.USR_CREATED_DATETIME,
:old.USR_CREATED_BY,
:old.USR_SIP_DOMAIN,
:old.USR_SIP_PROXY,
v_operation,SYSDATE,v_admin_user_id);
END IF;
END;
As I said, this works, but I don't like it for the following reasons:
I have to modify the connection in methods that are intended for setting up the prepared statements.
I have to add this code to every PreparedStatementCreator or a BatchPreparedStatementSetter object I want to audit.
I don't have access to the connection after the database operation so I can clear the attribute.
What I really want is a single point where I can set the attribute on the connection, before and after.
Any input or ideas would be appreciated.
Spring has an elegant way of doing this. The example is pretty much what you want:
Spring Data docs
It uses AOP to set the CLIENT_IDENTIFIER when a connection is obtained from the DataSource.
You could use another pointcut for when the connection is closed, but that is not a problem if the connection pool is used solely by your app.
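A rough sketch of what such an aspect could look like, reusing the DatabaseContextEditor from the question; AdminUserHolder is a made-up ThreadLocal helper that the service layer would populate with the admin user ID, and this assumes the DataSource is a Spring bean that can be proxied:

@Aspect
public class ClientIdentifierAspect {

    private DatabaseContextEditor databaseContextEditor;   // e.g. the OracleDatabaseContextEditor bean

    // runs after any getConnection(..) call on the pooled DataSource bean
    @AfterReturning(
            pointcut = "execution(java.sql.Connection javax.sql.DataSource.getConnection(..))",
            returning = "connection")
    public void tagConnection(Connection connection) throws SQLException {
        String adminUserId = AdminUserHolder.get();   // hypothetical ThreadLocal holder
        if (adminUserId != null) {
            databaseContextEditor.setClientIdentifier(connection, adminUserId);
        }
    }

    public void setDatabaseContextEditor(DatabaseContextEditor databaseContextEditor) {
        this.databaseContextEditor = databaseContextEditor;
    }
}

This gives a single point where the attribute is set, without touching every PreparedStatementCreator or BatchPreparedStatementSetter.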
A better way is to use connection labeling. Have a look at the oracle.ucp.jdbc.LabelableConnection interface.
