Get Statement from JDBCTemplate - spring

I have this code,
SimpleJdbcCall sql = new SimpleJdbcCall(dataSource).withProcedureName(procName);
sql.execute(parameters);
And I believe that under the hood this uses a JDBC Statement. How can I get to that object from here? (I need to call the .getWarnings() method on the statement).
In other words, how can I get SQLWarnings AND still use named parameters?

It took a lot of digging, but here is how you can get SQLWarnings (or print statements) AND named parameters. I extended JdbcTemplate, overrode the handleWarnings() method, and then passed that template into my SimpleJdbcCall.
public class JdbcTemplateLoggable extends JdbcTemplate{
List<String> warnings;
public JdbcTemplateLoggable(DataSource dataSource){
super(dataSource);
warnings = new ArrayList<String>();
}
protected void handleWarnings(Statement stmt){
try {
SQLWarning warning = stmt.getWarnings();
while(warning != null){
warnings.add(warning.getMessage());
warning = warning.getNextWarning();
}
} catch (SQLException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public List<String> getWarnings(){
return warnings;
}
}
Then in my main program
JdbcTemplateLoggable template = new JdbcTemplateLoggable(dataSource);
SimpleJdbcCall sql = new SimpleJdbcCall(template).withProcedureName(procName);
sql.execute(parameters);
for(String s : template.getWarnings()){
log.info(s);
}

You should perhaps use JdbcTemplate directly, or subclass it for use with your SimpleJdbcCall (instead of passing a DataSource). JdbcTemplate has a method execute(CallableStatementCreator, CallableStatementCallback), to which a callback can be passed that receives the Statement object being used.
You could override that method and wrap the passed callback with one of your own which stores the statement for later use.
public class CustomJdbcTemplate extends JdbcTemplate {
private CallableStatement lastStatement;
public CustomJdbcTemplate(DataSource dataSource) {
super(dataSource);
}
public CallableStatement getLastStatement() {
return lastStatement;
}
@Override
public <T> T execute(CallableStatementCreator csc, CallableStatementCallback<T> action) throws DataAccessException {
StoringCallableStatementCallback<T> callback = new StoringCallableStatementCallback<T>(action);
try {
return super.execute(csc, callback);
}
finally {
this.lastStatement = callback.statement;
}
}
private static class StoringCallableStatementCallback<T> implements CallableStatementCallback<T> {
private CallableStatementCallback<T> delegate;
private CallableStatement statement;
private StoringCallableStatementCallback(CallableStatementCallback<T> delegate) {
this.delegate = delegate;
}
@Override
public T doInCallableStatement(CallableStatement cs) throws SQLException, DataAccessException {
this.statement = cs;
return delegate.doInCallableStatement(cs);
}
}
}
Note that the statement will most probably already be closed by the time you retrieve it, so getWarnings() may cause errors, depending on the JDBC driver used. So you should perhaps store the warnings instead of the statement itself.
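For illustration, here is a minimal sketch that combines the two ideas above: the wrapping callback reads the warnings while the statement is still open and stores the messages on the template. The class name WarningCapturingJdbcTemplate is made up for this example; only the JdbcTemplate.execute(CallableStatementCreator, CallableStatementCallback) override described above is assumed.
import java.sql.CallableStatement;
import java.sql.SQLException;
import java.sql.SQLWarning;
import java.util.ArrayList;
import java.util.List;
import javax.sql.DataSource;
import org.springframework.dao.DataAccessException;
import org.springframework.jdbc.core.CallableStatementCallback;
import org.springframework.jdbc.core.CallableStatementCreator;
import org.springframework.jdbc.core.JdbcTemplate;

public class WarningCapturingJdbcTemplate extends JdbcTemplate {

    private final List<String> warnings = new ArrayList<String>();

    public WarningCapturingJdbcTemplate(DataSource dataSource) {
        super(dataSource);
    }

    public List<String> getCapturedWarnings() {
        return warnings;
    }

    @Override
    public <T> T execute(CallableStatementCreator csc, final CallableStatementCallback<T> action)
            throws DataAccessException {
        // Wrap the callback so the warnings are read before JdbcTemplate closes the statement.
        return super.execute(csc, new CallableStatementCallback<T>() {
            @Override
            public T doInCallableStatement(CallableStatement cs) throws SQLException, DataAccessException {
                T result = action.doInCallableStatement(cs);
                for (SQLWarning w = cs.getWarnings(); w != null; w = w.getNextWarning()) {
                    warnings.add(w.getMessage());
                }
                return result;
            }
        });
    }
}
Usage is then the same as in the first answer: construct the template, pass it to SimpleJdbcCall, and read getCapturedWarnings() after execute().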

Related

Manage I/O and POI exceptions in a custom ItemReader

I'm using a custom reader that implements ItemReader. My reader takes information from an xls file and treats it row by row. My constructor sets up the Iterator whose values are read on each read() call. I'm trying to find a suitable way to manage exceptions. I looked at SkipListener and ReaderListener's onReadError, but I can't use either because my exceptions are thrown in the constructor, before the read() method is ever attempted.
Is there any way to do this so that I can properly run action1/2/3 for the respective exceptions?
@Component
public class CustomReaderFile implements ItemReader<Row> {
private final Iterator<Row> data;
private static final String FILE_NAME = "";
public CustomReaderFile() throws Exception {
this.data = iteratorFromXls();
}
@Override
public Row read() throws Exception {
if (this.data.hasNext()) {
return this.data.next();
} else {
return null;
}
}
public static Iterator<Row> iteratorFromXls() throws Exception {
Iterator<Row> iterator = null;
try {
FileInputStream excelFile = new FileInputStream(new File(FILE_NAME));
Workbook workbook = new XSSFWorkbook(excelFile);
Sheet dataTypeSheet = workbook.getSheetAt(0);
iterator = dataTypeSheet.iterator();
} catch (FileNotFoundException e) {
//action1
} catch (IOException e) {
//action2
} catch (NotOfficeXmlFileException e){
//action3
}
return iterator;
}
}
This is actually the reader's initialization code. The reading part is nothing more than calling .next on the iterator.
So I would make the reader implement ItemStreamReader and put the initialization code in the open method, in which you can throw an exception to signal to Spring Batch that the reader's initialization has failed.
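A minimal sketch of that idea, reusing the names from the question (FILE_NAME stays as the empty placeholder from the original code, and the ItemStreamException messages are made up):
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.util.Iterator;

import org.apache.poi.openxml4j.exceptions.NotOfficeXmlFileException;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.ItemStreamReader;
import org.springframework.stereotype.Component;

@Component
public class CustomReaderFile implements ItemStreamReader<Row> {

    private static final String FILE_NAME = "";

    private Iterator<Row> data;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        // Initialization moved out of the constructor: any failure here is reported
        // to Spring Batch as an ItemStreamException and fails the step cleanly.
        try (FileInputStream excelFile = new FileInputStream(new File(FILE_NAME))) {
            Workbook workbook = new XSSFWorkbook(excelFile);
            Sheet dataTypeSheet = workbook.getSheetAt(0);
            data = dataTypeSheet.iterator();
        } catch (FileNotFoundException e) {
            // action1
            throw new ItemStreamException("Input file not found: " + FILE_NAME, e);
        } catch (IOException e) {
            // action2
            throw new ItemStreamException("Could not read " + FILE_NAME, e);
        } catch (NotOfficeXmlFileException e) {
            // action3
            throw new ItemStreamException(FILE_NAME + " is not a valid xlsx file", e);
        }
    }

    @Override
    public Row read() {
        return data.hasNext() ? data.next() : null;
    }

    @Override
    public void update(ExecutionContext executionContext) {
        // no restart state kept in this sketch
    }

    @Override
    public void close() {
        // nothing left open; the workbook was fully loaded in open()
    }
}
Since open() is called by Spring Batch at step start, each of the original action1/2/3 branches can still react differently before signalling failure.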

transactional unit testing with ObjectifyService - no rollback happening

We are trying to use Google Cloud Datastore in our project and trying to use Objectify as the ORM, since Google recommends it. I have carefully used and tried everything I could read about and think of, but somehow the transactions don't seem to work. Following is my code and setup.
@RunWith(SpringRunner.class)
@EnableAspectJAutoProxy(proxyTargetClass = true)
@ContextConfiguration(classes = { CoreTestConfiguration.class })
public class TestObjectifyTransactionAspect {
private final LocalServiceTestHelper helper = new LocalServiceTestHelper(
// Our tests assume strong consistency
new LocalDatastoreServiceTestConfig().setApplyAllHighRepJobPolicy(),
new LocalMemcacheServiceTestConfig(), new LocalTaskQueueTestConfig());
private Closeable closeableSession;
@Autowired
private DummyService dummyService;
@BeforeClass
public static void setUpBeforeClass() {
// Reset the Factory so that all translators work properly.
ObjectifyService.setFactory(new ObjectifyFactory());
}
/**
* @throws java.lang.Exception
*/
@Before
public void setUp() throws Exception {
System.setProperty("DATASTORE_EMULATOR_HOST", "localhost:8081");
ObjectifyService.register(UserEntity.class);
this.closeableSession = ObjectifyService.begin();
this.helper.setUp();
}
/**
* @throws java.lang.Exception
*/
@After
public void tearDown() throws Exception {
AsyncCacheFilter.complete();
this.closeableSession.close();
this.helper.tearDown();
}
@Test
public void testTransactionMutationRollback() {
// save initial list of users
List<UserEntity> users = new ArrayList<UserEntity>();
for (int i = 1; i <= 10; i++) {
UserEntity user = new UserEntity();
user.setAge(i);
user.setUsername("username_" + i);
users.add(user);
}
ObjectifyService.ofy().save().entities(users).now();
try {
dummyService.mutateDataWithException("username_1", 6L);
} catch (Exception e) {
e.printStackTrace();
}
List<UserEntity> users2 = this.dummyService.findAllUsers();
Assert.assertEquals("Size mismatch on rollback", users2.size(), 10);
boolean foundUserIdSix = false;
for (UserEntity userEntity : users2) {
if (userEntity.getUserId() == 1) {
Assert.assertEquals("Username update failed in transactional context rollback.", "username_1",
userEntity.getUsername());
}
if (userEntity.getUserId() == 6) {
foundUserIdSix = true;
}
}
if (!foundUserIdSix) {
Assert.fail("Deleted user with userId 6 but it is not rolledback.");
}
}
}
Since I am using Spring, the idea is to use an aspect with a custom annotation to weave Objectify's transact around the Spring service bean methods that call my DAOs.
But somehow the update done by ObjectifyService.ofy().save().entities(users).now(); is not getting rolled back, even though the thrown exception causes Objectify to run its rollback code. I tried printing the ObjectifyImpl instance hashcodes and they are all the same, but it still doesn't roll back.
Can someone help me understand what I am doing wrong? I haven't tried the actual web-based setup yet... if it can't pass transactional test cases there is no point in actual transaction usage in a web request scenario.
Update: adding the aspect, services and DAO as well to give the complete picture. The code uses Spring Boot.
DAO class. Note that I am not using any transactions here because, as per the code of com.googlecode.objectify.impl.TransactorNo.transactOnce(ObjectifyImpl<O>, Work<R>), a transactional ObjectifyImpl is flushed and committed in this method, which I don't want. I want the commit to happen once and everything else to join in on that transaction. Basically this is the problematic code in com.googlecode.objectify.impl.TransactorNo... I will try to explain my understanding later in the question.
@Component
public class DummyDaoImpl implements DummyDao {
@Override
public List<UserEntity> loadAll() {
Query<UserEntity> query = ObjectifyService.ofy().transactionless().load().type(UserEntity.class);
return query.list();
}
@Override
public List<UserEntity> findByUserId(Long userId) {
Query<UserEntity> query = ObjectifyService.ofy().transactionless().load().type(UserEntity.class);
//query = query.filterKey(Key.create(UserEntity.class, userId));
return query.list();
}
@Override
public List<UserEntity> findByUsername(String username) {
return ObjectifyService.ofy().transactionless().load().type(UserEntity.class).filter("username", username).list();
}
@Override
public void update(UserEntity userEntity) {
ObjectifyService.ofy().save().entity(userEntity);
}
@Override
public void update(Iterable<UserEntity> userEntities) {
ObjectifyService.ofy().save().entities(userEntities);
}
@Override
public void delete(Long userId) {
ObjectifyService.ofy().delete().key(Key.create(UserEntity.class, userId));
}
}
Below is the Service class
@Service
public class DummyServiceImpl implements DummyService {
private static final Logger LOGGER = LoggerFactory.getLogger(DummyServiceImpl.class);
@Autowired
private DummyDao dummyDao;
public void saveDummydata() {
List<UserEntity> users = new ArrayList<UserEntity>();
for (int i = 1; i <= 10; i++) {
UserEntity user = new UserEntity();
user.setAge(i);
user.setUsername("username_" + i);
users.add(user);
}
this.dummyDao.update(users);
}
/* (non-Javadoc)
* @see com.bbb.core.objectify.test.services.DummyService#mutateDataWithException(java.lang.String, java.lang.Long)
*/
@Override
@ObjectifyTransactional
public void mutateDataWithException(String usernameToMutate, Long userIdToDelete) throws Exception {
//update one
LOGGER.info("Attempting to update UserEntity with username={}", "username_1");
List<UserEntity> mutatedUsersList = new ArrayList<UserEntity>();
List<UserEntity> users = dummyDao.findByUsername(usernameToMutate);
for (UserEntity userEntity : users) {
userEntity.setUsername(userEntity.getUsername() + "_updated");
mutatedUsersList.add(userEntity);
}
dummyDao.update(mutatedUsersList);
//delete another
UserEntity user = dummyDao.findByUserId(userIdToDelete).get(0);
LOGGER.info("Attempting to delete UserEntity with userId={}", user.getUserId());
dummyDao.delete(user.getUserId());
throw new RuntimeException("Dummy Exception");
}
/* (non-Javadoc)
* @see com.bbb.core.objectify.test.services.DummyService#findAllUsers()
*/
@Override
public List<UserEntity> findAllUsers() {
return dummyDao.loadAll();
}
}
Aspect which wraps the method annotated with ObjectifyTransactional as a transact work:
@Aspect
@Component
public class ObjectifyTransactionAspect {
private static final Logger LOGGER = LoggerFactory.getLogger(ObjectifyTransactionAspect.class);
@Around(value = "execution(* *(..)) && @annotation(objectifyTransactional)")
public Object objectifyTransactAdvise(final ProceedingJoinPoint pjp, ObjectifyTransactional objectifyTransactional) throws Throwable {
try {
Object result = null;
Work<Object> work = new Work<Object>() {
@Override
public Object run() {
try {
return pjp.proceed();
} catch (Throwable throwable) {
throw new ObjectifyTransactionExceptionWrapper(throwable);
}
}
};
switch (objectifyTransactional.propagation()) {
case REQUIRES_NEW:
int limitTries = objectifyTransactional.limitTries();
if(limitTries <= 0) {
Exception illegalStateException = new IllegalStateException("limitTries must be more than 0.");
throw new ObjectifyTransactionExceptionWrapper(illegalStateException);
} else {
if(limitTries == Integer.MAX_VALUE) {
result = ObjectifyService.ofy().transactNew(work);
} else {
result = ObjectifyService.ofy().transactNew(limitTries, work);
}
}
break;
case NOT_SUPPORTED :
case NEVER :
case MANDATORY :
result = ObjectifyService.ofy().execute(objectifyTransactional.propagation(), work);
break;
case REQUIRED :
case SUPPORTS :
ObjectifyService.ofy().transact(work);
break;
default:
break;
}
return result;
} catch (ObjectifyTransactionExceptionWrapper e) {
String packageName = pjp.getSignature().getDeclaringTypeName();
String methodName = pjp.getSignature().getName();
LOGGER.error("An exception occured while executing [{}.{}] in a transactional context."
, packageName, methodName, e);
throw e.getCause();
} catch (Throwable ex) {
String packageName = pjp.getSignature().getDeclaringTypeName();
String methodName = pjp.getSignature().getName();
String fullyQualifiedmethodName = packageName + "." + methodName;
throw new RuntimeException("Unexpected exception while executing ["
+ fullyQualifiedmethodName + "] in a transactional context.", ex);
}
}
}
Now the problematic piece of code that I see is the following, in com.googlecode.objectify.impl.TransactorNo:
@Override
public <R> R transact(ObjectifyImpl<O> parent, Work<R> work) {
return this.transactNew(parent, Integer.MAX_VALUE, work);
}
@Override
public <R> R transactNew(ObjectifyImpl<O> parent, int limitTries, Work<R> work) {
Preconditions.checkArgument(limitTries >= 1);
while (true) {
try {
return transactOnce(parent, work);
} catch (ConcurrentModificationException ex) {
if (--limitTries > 0) {
if (log.isLoggable(Level.WARNING))
log.warning("Optimistic concurrency failure for " + work + " (retrying): " + ex);
if (log.isLoggable(Level.FINEST))
log.log(Level.FINEST, "Details of optimistic concurrency failure", ex);
} else {
throw ex;
}
}
}
}
private <R> R transactOnce(ObjectifyImpl<O> parent, Work<R> work) {
ObjectifyImpl<O> txnOfy = startTransaction(parent);
ObjectifyService.push(txnOfy);
boolean committedSuccessfully = false;
try {
R result = work.run();
txnOfy.flush();
txnOfy.getTransaction().commit();
committedSuccessfully = true;
return result;
}
finally
{
if (txnOfy.getTransaction().isActive()) {
try {
txnOfy.getTransaction().rollback();
} catch (RuntimeException ex) {
log.log(Level.SEVERE, "Rollback failed, suppressing error", ex);
}
}
ObjectifyService.pop();
if (committedSuccessfully) {
txnOfy.getTransaction().runCommitListeners();
}
}
}
transactOnce is, by code/design, always using a single transaction to do things. It will either commit or roll back that transaction. There is no provision to chain transactions the way a normal enterprise app would want: a service calls multiple DAO methods in a single transaction and commits or rolls back depending on how things look.
Keeping this in mind, I removed all annotations and transact method calls from my DAO methods so that they don't start an explicit transaction; the aspect on the service wraps the service method in transact and ultimately in transactOnce... so basically the service method runs in a transaction and no new transaction is started again. This is a very basic scenario; in actual production apps services can call other service methods which might have the annotation on them, and we could still end up in a chained transaction... but anyway, that is a different problem to solve.
I know NoSQL stores don't support write consistency at table or inter-table level, so am I asking too much from Google Cloud Datastore?

Choose Class in BIRT is empty even though I have added the jar in the Data Source

Even though I have added the jar, the Choose Class window is empty while creating the dataset. I am using Luna Service Release 2 (4.4.2).
From: http://yaragalla.blogspot.com/2013/10/using-pojo-datasource-in-birt-43.html
In the dataset class the three methods, “public void open(Object obj, Map map)”, “public Object next()” and “public void close()” must be implemented.
Make sure you have implemented these.
Here is a sample that I tested with:
public class UserDataSet {
public Iterator<User> itr;
public List<User> getUsers() throws ParseException {
List<User> users = new ArrayList<>();
// Add to Users
....
return users;
}
public void open(Object obj, Map<String, Object> map) {
try {
itr = getUsers().iterator();
} catch (ParseException e) {
e.printStackTrace();
}
}
public Object next() {
if (itr.hasNext())
return itr.next();
return null;
}
public void close() {
}
}

Freemarker removeIntrospectionInfo does not work with DCEVM after model hotswap

I am using Freemarker and DCEVM+HotSwapManager agent. This basically allows me to hotswap classes even when adding/removing methods.
Everything works like a charm until Freemarker uses a hotswapped class as a model. It throws freemarker.ext.beans.InvalidPropertyException: No such bean property at me, even though reflection shows that the method is there (checked during a debug session).
I am using
final Method clearInfoMethod = beanWrapper.getClass().getDeclaredMethod("removeIntrospectionInfo", Class.class);
clearInfoMethod.setAccessible(true);
clearInfoMethod.invoke(clazz);
to clear the cache, but it does not work. I even tried to obtain the classCache member field and clear it using reflection, but that does not work either.
What am I doing wrong?
I just need to force Freemarker to throw away any introspection info on the model class/classes it has already obtained.
Is there any way?
UPDATE
Example code
Application.java
// Application.java
public class Application
{
public static final String TEMPLATE_PATH = "TemplatePath";
public static final String DEFAULT_TEMPLATE_PATH = "./";
private static Application INSTANCE;
private Configuration freemarkerConfiguration;
private BeansWrapper beanWrapper;
public static void main(String[] args)
{
final Application application = new Application();
INSTANCE = application;
try
{
application.run(args);
}
catch (InterruptedException e)
{
System.out.println("Exiting");
}
catch (IOException e)
{
System.out.println("IO Error");
e.printStackTrace();
}
}
public Configuration getFreemarkerConfiguration()
{
return freemarkerConfiguration;
}
public static Application getInstance()
{
return INSTANCE;
}
private void run(String[] args) throws InterruptedException, IOException
{
final String templatePath = System.getProperty(TEMPLATE_PATH) != null
? System.getProperty(TEMPLATE_PATH)
: DEFAULT_TEMPLATE_PATH;
final Configuration configuration = new Configuration();
freemarkerConfiguration = configuration;
beanWrapper = new BeansWrapper();
beanWrapper.setUseCache(false);
configuration.setObjectWrapper(beanWrapper);
try
{
final File templateDir = new File(templatePath);
configuration.setTemplateLoader(new FileTemplateLoader(templateDir));
}
catch (IOException e)
{
throw new RuntimeException(e);
}
final RunnerImpl runner = new RunnerImpl();
try
{
runner.run(args);
}
catch (RuntimeException e)
{
e.printStackTrace();
}
}
public BeansWrapper getBeanWrapper()
{
return beanWrapper;
}
}
RunnerImpl.java
// RunnerImpl.java
public class RunnerImpl implements Runner
{
@Override
public void run(String[] args) throws InterruptedException
{
long counter = 0;
while(true)
{
++counter;
System.out.printf("Run %d\n", counter);
// Application.getInstance().getFreemarkerConfiguration().setObjectWrapper(new BeansWrapper());
Application.getInstance().getBeanWrapper().clearClassIntrospecitonCache();
final Worker worker = new Worker();
worker.doWork();
Thread.sleep(1000);
}
}
}
Worker.java
// Worker.java
public class Worker
{
void doWork()
{
final Application application = Application.getInstance();
final Configuration freemarkerConfiguration = application.getFreemarkerConfiguration();
try
{
final Template template = freemarkerConfiguration.getTemplate("test.ftl");
final Model model = new Model();
final PrintWriter printWriter = new PrintWriter(System.out);
printObjectInto(model);
System.out.println("-----TEMPLATE MACRO PROCESSING-----");
template.process(model, printWriter);
System.out.println();
System.out.println("-----END OF PROCESSING------");
System.out.println();
}
catch (IOException e)
{
e.printStackTrace();
}
catch (TemplateException e)
{
e.printStackTrace();
}
}
private void printObjectInto(Object o)
{
final Class<?> aClass = o.getClass();
final Method[] methods = aClass.getDeclaredMethods();
for (final Method method : methods)
{
System.out.println(String.format("Method name: %s, public: %s", method.getName(), Modifier.isPublic(method.getModifiers())));
}
}
}
Model.java
// Model.java
public class Model
{
public String getMessage()
{
return "Hello";
}
public String getAnotherMessage()
{
return "Hello World!";
}
}
This example does not work at all. Even changing BeansWrapper during runtime won't have any effect.
BeansWrapper's (and DefaultObjectWrapper's, etc.) introspection cache relies on java.beans.Introspector.getBeanInfo(aClass), not on reflection. (That's because it treats objects as JavaBeans.) java.beans.Introspector has its own internal cache, so it can return stale information, and in that case BeansWrapper will just recreate its own class introspection data based on that stale information. As for java.beans.Introspector's caching, it's in fact correct, as it builds on the assumption that classes in Java are immutable. If something breaks that basic rule, it should ensure that java.beans.Introspector's cache is cleared (and many other caches...), or else it's not just FreeMarker that will break. At JRebel, for example, they made a lot of effort to clear all kinds of caches. I guess DCEVM doesn't have the resources for that. So then, it seems you have to call Introspector.flushCaches() yourself.
Update: For a while (Java 7, maybe 6) java.beans.Introspector has had one cache per thread group, so you have to call flushCaches() from all thread groups. And all of this is actually an implementation detail that, in principle, can change at any time. And sadly, the JavaDoc of Introspector.flushCaches() doesn't warn you...
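To make that concrete, a flush step run after each hotswap could look roughly like the sketch below. Introspector.flushCaches() is standard java.beans API; the removeIntrospectionInfo name is copied from the question and may differ between FreeMarker versions, so treat that part as an assumption. Also note that Method.invoke() needs the wrapper instance as its first argument, followed by the class.
import java.beans.Introspector;
import java.lang.reflect.Method;

import freemarker.ext.beans.BeansWrapper;

public final class HotswapCacheFlusher {

    private HotswapCacheFlusher() {
    }

    // Call this after DCEVM has redefined the model class, before the next template.process(...) run.
    public static void flush(BeansWrapper beansWrapper, Class<?> redefinedClass) throws Exception {
        // 1. Flush the JDK bean-introspection cache. On some JDKs the cache is per thread
        //    group, so in the worst case this has to run from every thread group that
        //    rendered templates (see the update above).
        Introspector.flushCaches();

        // 2. Drop FreeMarker's own introspection data for the class. Method name taken
        //    from the question; it may be named differently in other FreeMarker versions.
        Method clearInfo = beansWrapper.getClass()
                .getDeclaredMethod("removeIntrospectionInfo", Class.class);
        clearInfo.setAccessible(true);
        clearInfo.invoke(beansWrapper, redefinedClass); // the wrapper is the invocation target
    }
}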

What is proper way to use PreparedStatementCreator of Spring JDBC?

As per my understanding, the point of a PreparedStatement in Java is that we can use it multiple times.
But I have some confusion using PreparedStatementCreator of Spring JDBC.
For example consider following code,
public class SpringTest {
JdbcTemplate jdbcTemplate;
PreparedStatementCreator preparedStatementCreator;
ResultSetExtractor<String> resultSetExtractor;
public SpringTest() throws SQLException {
jdbcTemplate = new JdbcTemplate(OracleUtil.getDataSource());
preparedStatementCreator = new PreparedStatementCreator() {
String query = "select NAME from TABLE1 where ID=?";
public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
return connection.prepareStatement(query);
}
};
resultSetExtractor = new ResultSetExtractor<String>() {
public String extractData(ResultSet resultSet) throws SQLException,
DataAccessException {
if (resultSet.next()) {
return resultSet.getString(1);
}
return null;
}
};
}
public String getNameFromId(int id){
return jdbcTemplate.query(preparedStatementCreator, new Table1Setter(id), resultSetExtractor);
}
private static class Table1Setter implements PreparedStatementSetter{
private int id;
public Table1Setter(int id) {
this.id =id;
}
@Override
public void setValues(PreparedStatement preparedStatement) throws SQLException {
preparedStatement.setInt(1, id);
}
}
public static void main(String[] args) {
try {
SpringTest springTest = new SpringTest();
for(int i=0;i<10;i++){
System.out.println(springTest.getNameFromId(i));
}
} catch (SQLException e) {
e.printStackTrace();
}
}
}
As per this code, when I call the springTest.getNameFromId(int id) method, it returns the name for the given id. Here I've used a PreparedStatementCreator for creating the PreparedStatement and a PreparedStatementSetter for setting input parameters, and I got the result from the ResultSetExtractor.
But performance is very slow.
After debugging and looking into what happens inside PreparedStatementCreator and JdbcTemplate, I got to know that PreparedStatementCreator creates a new PreparedStatement each and every time...!!!
Each and every time I call jdbcTemplate.query(preparedStatementCreator, preparedStatementSetter, resultSetExtractor), it creates a new PreparedStatement, and this slows down performance.
Is this the right way to use PreparedStatementCreator? Because with this code I am unable to reuse the PreparedStatement. And if this is the right way to use PreparedStatementCreator, then how do I get the benefit of the re-usability of PreparedStatement?
Prepared statements are usually cached by the underlying connection pool, so you don't need to worry about whether a new one is created every time or not.
So I think that your actual usage is correct.
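For example (as an illustration only, not taken from the question), with Apache Commons DBCP2 statement pooling is switched on when the DataSource is built, and the JdbcTemplate code stays exactly as it is; the URL and credentials below are placeholders:
import javax.sql.DataSource;

import org.apache.commons.dbcp2.BasicDataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class PooledDataSourceConfig {

    public static DataSource pooledDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setUrl("jdbc:oracle:thin:@//localhost:1521/XE"); // placeholder URL
        ds.setUsername("user");                             // placeholder credentials
        ds.setPassword("secret");
        // Let the pool cache PreparedStatements per connection, so repeated
        // createPreparedStatement(...) calls reuse the already compiled statement.
        ds.setPoolPreparedStatements(true);
        ds.setMaxOpenPreparedStatements(50);
        return ds;
    }

    public static void main(String[] args) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(pooledDataSource());
        // ... use jdbcTemplate as in the question
    }
}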
JdbcTemplate closes the statement after executing it, so if you really want to reuse the same prepared statement you could proxy the statement and intercept the close method in the statement creator
For example (not tested, only as an example):
public abstract class ReusablePreparedStatementCreator implements PreparedStatementCreator {
private PreparedStatement statement;
public PreparedStatement createPreparedStatement(Connection conn) throws SQLException {
if (statement != null)
return statement;
PreparedStatement ps = doPreparedStatement(conn);
ProxyFactory pf = new ProxyFactory(ps);
MethodInterceptor closeMethodInterceptor = new MethodInterceptor() {
@Override
public Object invoke(MethodInvocation invocation) throws Throwable {
return null; // don't close statement
}
};
NameMatchMethodPointcutAdvisor closeAdvisor = new NameMatchMethodPointcutAdvisor();
closeAdvisor.setMappedName("close");
closeAdvisor.setAdvice(closeMethodInterceptor);
pf.addAdvisor(closeAdvisor);
statement = (PreparedStatement) pf.getProxy();
return statement;
}
public abstract PreparedStatement doPreparedStatement(Connection conn) throws SQLException;
public void close() {
try {
PreparedStatement ps = (PreparedStatement) ((Advised) statement).getTargetSource().getTarget();
ps.close();
} catch (Exception e) {
// handle exception
}
}
}
You are using PreparedStatementCreator the right way.
In each new transaction, you should create a brand new PreparedStatement instance; that's definitely correct. PreparedStatementCreator is mainly designed to wrap the code block that creates the PreparedStatement instance easily, not to say that you should reuse the same instance each time.
PreparedStatement is mainly designed to send the templated, pre-compiled SQL statement to the DBMS, which saves some compile time during SQL execution.
To summarize, what you did is correct. Using PreparedStatement will have better performance than Statement.
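As a side note, the kind of reuse plain JDBC offers looks like the sketch below; the query and table names come from the question, and the connection handling is deliberately simplified:
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class PlainJdbcReuseExample {

    // The same query as in the question; the statement is prepared once and
    // executed many times with different parameters.
    public static void printNames(Connection connection) throws SQLException {
        String query = "select NAME from TABLE1 where ID=?";
        try (PreparedStatement ps = connection.prepareStatement(query)) {
            for (int id = 0; id < 10; id++) {
                ps.setInt(1, id);
                try (ResultSet rs = ps.executeQuery()) {
                    if (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }
}
JdbcTemplate deliberately manages the statement per call, which is why the creator is invoked each time; statement caching in the driver or connection pool is what restores the pre-compilation benefit transparently.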
After debugging and looking into what happens inside PreparedStatementCreator and JdbcTemplate, I got to know that PreparedStatementCreator creates a new PreparedStatement each and every time...!!!
I'm not sure why that's so shocking since it's your own code that creates a new PreparedStatement each time by calling connection.prepareStatement(query);. If you want to reuse the same one, then you shouldn't create a new one.
