HSQLDB - Oracle to_char(integer) throws "unexpected token: )"

I'm doing integration testing against HSQLDB; my production database is Oracle.
Versions
Hibernate: 4.1.3.Final
HSQLDB: 2.3.3 (I can't use 2.3.4 because it can't run all my JUnit tests in one click.)
My problem
To create my test database, I followed this tutorial with minor modifications. Everything works fine except for the methods that use Oracle's TO_CHAR(integer) function. For those methods, I get an unexpected token: )
This is the code that causes the exception:
String select = "select p.name, to_char(p.id) "
        + "from t_player p "
        + "inner join t_job j on j.id = p.id_job";
This is part of my unit test:
public void testFindPlayer() throws ClassNotFoundException, SQLException {
    Class.forName("org.hsqldb.jdbcDriver");
    Connection connection = DriverManager.getConnection("jdbc:hsqldb:mem:DB", "sa", "");
    String syntax_ora = "SET DATABASE SQL SYNTAX ORA TRUE";
    PreparedStatement ps_ora = connection.prepareStatement(syntax_ora);
    ps_ora.execute();
    ps_ora.close();
    connection.close();
    List<String[]> actual_player = Player.findPlayer("Name");
    List<String[]> expected_player = new ArrayList<String[]>();
    // etc.
This is the exception
Caused by: org.hsqldb.HsqlException: unexpected token: )
at org.hsqldb.error.Error.parseError(Unknown Source)
at org.hsqldb.ParserBase.unexpectedToken(Unknown Source)
at org.hsqldb.ParserDQL.readExpression(Unknown Source)
at org.hsqldb.ParserDQL.readSQLFunction(Unknown Source)
at org.hsqldb.ParserDQL.readColumnOrFunctionExpression(Unknown Source)
at org.hsqldb.ParserDQL.XreadSimpleValueExpressionPrimary(Unknown Source)
at org.hsqldb.ParserDQL.XreadAllTypesValueExpressionPrimary(Unknown Source)
at org.hsqldb.ParserDQL.XreadAllTypesPrimary(Unknown Source)
at org.hsqldb.ParserDQL.XreadAllTypesFactor(Unknown Source)
at org.hsqldb.ParserDQL.XreadAllTypesTerm(Unknown Source)
at org.hsqldb.ParserDQL.XreadAllTypesCommonValueExpression(Unknown Source)
at org.hsqldb.ParserDQL.XreadValueExpression(Unknown Source)
at org.hsqldb.ParserDQL.XreadSelect(Unknown Source)
at org.hsqldb.ParserDQL.XreadQuerySpecification(Unknown Source)
at org.hsqldb.ParserDQL.XreadSimpleTable(Unknown Source)
at org.hsqldb.ParserDQL.XreadQueryPrimary(Unknown Source)
at org.hsqldb.ParserDQL.XreadQueryTerm(Unknown Source)
at org.hsqldb.ParserDQL.XreadQueryExpressionBody(Unknown Source)
at org.hsqldb.ParserDQL.XreadQueryExpression(Unknown Source)
at org.hsqldb.ParserDQL.compileCursorSpecification(Unknown Source)
at org.hsqldb.ParserCommand.compilePart(Unknown Source)
at org.hsqldb.ParserCommand.compileStatement(Unknown Source)
at org.hsqldb.Session.compileStatement(Unknown Source)
at org.hsqldb.StatementManager.compile(Unknown Source)
at org.hsqldb.Session.execute(Unknown Source)
... 41 more
How I tried to fix it
1) I tried another Oracle function, to_char(number,'format').
Basically, I changed the select part to the following:
String select = "select p.name, to_char(p.id,'999999') "
        + "from t_player p "
        + "inner join t_job j on j.id = p.id_job";
But then I got this exception: incompatible data type in operation
2) I commented out the SET DATABASE SQL SYNTAX ORA part.
The same exceptions are raised.
Do you have any idea how to fix this issue?
Thank you for your reply.

HSQLDB's built-in TO_CHAR function supports only date and timestamp arguments.
You can create a user-defined TO_CHAR function for numeric values. For example:
CREATE FUNCTION TO_CHAR(param DECIMAL) RETURNS VARCHAR(20)
    RETURN CAST(param AS VARCHAR(20));
CREATE FUNCTION TO_CHAR(param DECIMAL, format VARCHAR(20)) RETURNS VARCHAR(20)
    RETURN CAST(param AS VARCHAR(20));
Now TO_CHAR(p.id) will return the id as a string, and the two-argument version returns the same (the format argument is ignored here).
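A minimal sketch of registering these functions from the test in the question, reusing its in-memory JDBC URL (java.sql.Statement is assumed to be imported alongside the other java.sql classes); the functions only need to be created once per database:
try (Connection connection = DriverManager.getConnection("jdbc:hsqldb:mem:DB", "sa", "");
     Statement statement = connection.createStatement()) {
    // Enable Oracle syntax compatibility, as in the original test
    statement.execute("SET DATABASE SQL SYNTAX ORA TRUE");
    // Register the numeric TO_CHAR overloads so queries using to_char(p.id) can compile
    statement.execute("CREATE FUNCTION TO_CHAR(param DECIMAL) RETURNS VARCHAR(20) "
            + "RETURN CAST(param AS VARCHAR(20))");
    statement.execute("CREATE FUNCTION TO_CHAR(param DECIMAL, format VARCHAR(20)) RETURNS VARCHAR(20) "
            + "RETURN CAST(param AS VARCHAR(20))");
}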

I had the same error using the CAST function, like this:
cast(id as VARCHAR)
But the correct form is:
cast(id as VARCHAR(20))
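Applied to the query from the question, the fix would look roughly like this (a sketch; VARCHAR(20) is an assumption about the maximum id length):
String select = "select p.name, cast(p.id as VARCHAR(20)) "
        + "from t_player p "
        + "inner join t_job j on j.id = p.id_job";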

Related

BigQuery returns "Unparseable query parameter `` in type `TYPE_INT64`" when the query projects a literal that is a parameter of a prepared statement

BigQuery returns the error "Unparseable query parameter `` in type `TYPE_INT64`" when executing a query that meets all these conditions:
I run the query as a prepared statement.
The query has a String literal in the SELECT.
The value of this literal is passed as a parameter, for example SELECT ? AS field_1 ....
For example, this code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class TestBigQueryPreparedStatement {

    private static final String CONNECTION_URI =
            "jdbc:bigquery://<connection_uri>";
    private static final String QUERY = "SELECT ? AS `_SOURCE_TABLE`, field1 FROM test_view";

    public static void main(String[] args) throws ClassNotFoundException {
        Class.forName("com.simba.googlebigquery.jdbc42.Driver");
        try (Connection connection = DriverManager.getConnection(CONNECTION_URI)) {
            try (PreparedStatement ps = connection.prepareStatement(QUERY)) {
                // The problem also occurs if I replace this line with ps.setObject(...)
                ps.setString(1, "SOURCE_TABLE");
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + " " + rs.getInt(2));
                    }
                }
            }
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }
}
Produces this error:
java.sql.SQLException: [Simba][BigQueryJDBCDriver](100032) Error executing query job. Message: Unparseable query parameter `` in type `TYPE_INT64`, Bad int64 value: SOURCE_TABLE value: 'SOURCE_TABLE'
at com.simba.googlebigquery.googlebigquery.client.requests.jobs.JobsInsertRequest.throwException(Unknown Source)
at com.simba.googlebigquery.googlebigquery.client.requests.AbstractRequestWithRetry.executeWithRetry(Unknown Source)
at com.simba.googlebigquery.googlebigquery.client.queryclient.JobsInsertClient.executeQuery(Unknown Source)
at com.simba.googlebigquery.googlebigquery.client.BQClient.executeQuery(Unknown Source)
at com.simba.googlebigquery.googlebigquery.dataengine.BQAbstractExecutor.execute(Unknown Source)
at com.simba.googlebigquery.googlebigquery.dataengine.BQSQLExecutor.execute(Unknown Source)
at com.simba.googlebigquery.jdbc.common.SPreparedStatement.executeWithParams(Unknown Source)
at com.simba.googlebigquery.jdbc.common.SPreparedStatement.executeQuery(Unknown Source)
at TestBigQueryPreparedStatement.main(TestBigQueryPreparedStatement.java:20)
Caused by: com.simba.googlebigquery.googlebigquery.client.exceptions.JobExecutionErrorException: [Simba][BigQueryJDBCDriver](100032) Error executing query job. Message: Unparseable query parameter `` in type `TYPE_INT64`, Bad int64 value: SOURCE_TABLE value: 'SOURCE_TABLE'
... 9 more
Caused by: java.lang.Exception: Unparseable query parameter `` in type `TYPE_INT64`, Bad int64 value: SOURCE_TABLE value: 'SOURCE_TABLE'
at com.simba.googlebigquery.googlebigquery.client.requests.jobs.JobsInsertRequest.execute(Unknown Source)
at com.simba.googlebigquery.googlebigquery.client.requests.jobs.JobsInsertRequest.execute(Unknown Source)
... 8 more
This is reproducible with the latest BigQuery JDBC driver (Simba v1.3.0 1001).
The error also occurs with PreparedStatement.setDate(...) and setFloat(...), but works fine with setBigDecimal(...) and setInt(...) (I have not checked all the setXXX methods of PreparedStatement).
Is it possible to execute queries in BigQuery that have a literal in the SELECT while running them as prepared statements?
This is a simplified scenario. My application has an execution engine that runs SQL queries on any database, and it always does so with prepared statements. Occasionally a query has a literal in the SELECT, and with BigQuery I get the error above (it works with any other database). I can make some changes specifically for the BigQuery query generator, but it would be very difficult to change the code so that literals in the SELECT clause are passed as literals rather than as prepared-statement parameters.
You cannot use positional query parameters in the SELECT or FROM part of the query. Positional parameters can only occur in the WHERE clause. For example:
SELECT word, word_count
FROM `bigquery-public-data.samples.shakespeare`
WHERE corpus = ?
  AND word_count >= ?
ORDER BY word_count DESC;
See also https://cloud.google.com/bigquery/docs/samples/bigquery-query-params-positional
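A minimal sketch of that pattern using the question's driver setup (the corpus value and word-count threshold are arbitrary examples):
String query = "SELECT word, word_count "
        + "FROM `bigquery-public-data.samples.shakespeare` "
        + "WHERE corpus = ? AND word_count >= ? "
        + "ORDER BY word_count DESC";
try (Connection connection = DriverManager.getConnection(CONNECTION_URI);
     PreparedStatement ps = connection.prepareStatement(query)) {
    // Both parameters sit in the WHERE clause, so the driver can bind them
    ps.setString(1, "hamlet");
    ps.setInt(2, 250);
    try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
            System.out.println(rs.getString("word") + " " + rs.getInt("word_count"));
        }
    }
}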

Add columns dynamically to Cassandra from Spring Boot

I am trying to add new columns dynamically from a Spring Boot application. Let's say that each time an event occurs, I want to add a column with a well-defined name and type to a Cassandra table. I have tried this code:
@Query("alter table attributes.attributedata add ?0 ?1")
public void addColumn(String columnName, String dataType);
Error Log:
org.springframework.cassandra.support.exception.CassandraQuerySyntaxException: line 1:41 no viable alternative at input 'gac5' (alter table attributes.attributedata add ['gac]...); nested exception is com.datastax.driver.core.exceptions.SyntaxError: line 1:41 no viable alternative at input 'gac5' (alter table attributes.attributedata add ['gac]...)
at org.springframework.cassandra.support.CassandraExceptionTranslator.translateExceptionIfPossible(CassandraExceptionTranslator.java:132)
at org.springframework.cassandra.core.CqlTemplate.potentiallyConvertRuntimeException(CqlTemplate.java:946)
at org.springframework.cassandra.core.CqlTemplate.translateExceptionIfPossible(CqlTemplate.java:930)
at org.springframework.cassandra.core.CqlTemplate.translateExceptionIfPossible(CqlTemplate.java:912)
at org.springframework.cassandra.core.CqlTemplate.doExecute(CqlTemplate.java:278)
at org.springframework.cassandra.core.CqlTemplate.doExecute(CqlTemplate.java:559)
at org.springframework.cassandra.core.CqlTemplate.doExecute(CqlTemplate.java:549)
at org.springframework.cassandra.core.CqlTemplate.query(CqlTemplate.java:485)
at org.springframework.cassandra.core.CqlTemplate.query(CqlTemplate.java:510)
at org.springframework.cassandra.core.CqlTemplate.query(CqlTemplate.java:505)
at org.springframework.data.cassandra.core.CassandraTemplate.selectOne(CassandraTemplate.java:638)
at org.springframework.data.cassandra.core.CassandraTemplate.selectOne(CassandraTemplate.java:509)
at org.springframework.data.cassandra.repository.query.CassandraQueryExecution$SingleEntityExecution.execute(CassandraQueryExecution.java:104)
at org.springframework.data.cassandra.repository.query.CassandraQueryExecution$ResultProcessingExecution.execute(CassandraQueryExecution.java:143)
at org.springframework.data.cassandra.repository.query.AbstractCassandraQuery.execute(AbstractCassandraQuery.java:113)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:483)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:461)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:56)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:57)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
I have also tried storing the entire query in a string and placing that string directly into the value, like
@Query(value = "?0")
but that does not work either.
The following code works perfectly fine; if the values bound from the method parameters were not enclosed in single quotes, the dynamic version would work too.
@Query("alter table attributes.attributedata add dummy text")
public void addColumn(String columnName, String dataType);
Is there any way to make this work? Please suggest possible alternatives.
A Cluster object can be created as shown below, followed by the Session object on which the query is executed. The name and type of the column to be added are passed in as parameters.
public void addColumn(String columnName, String columnType) {
    // Query
    String query = "ALTER TABLE keyspace.tablename ADD " + columnName + " " + columnType;
    // Creating Cluster object
    Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
    // Creating Session object
    Session session = cluster.connect();
    // Executing the query
    session.execute(query);
    System.out.println("Column added");
}
To optimize this, the Session object can be created only once, at application startup. An implementation for that scenario is given below:
private Session session;

@EventListener(ApplicationReadyEvent.class)
private void createSessionObject() {
    // Creating Cluster object
    Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
    // Creating Session object and storing it in the field (not a local variable)
    session = cluster.connect();
}

public void addColumn(String columnName, String columnType) {
    // Query
    String query = "ALTER TABLE keyspace.tablename ADD " + columnName + " " + columnType;
    if (session == null) {
        createSessionObject();
    }
    // Executing the query
    session.execute(query);
    System.out.println("Column added");
}
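Alternatively, since the stack trace shows a Spring-managed CqlTemplate is already on the classpath, the same statement could be executed through it instead of opening a separate Cluster. A hedged sketch, assuming a CqlOperations bean is exposed by the existing Spring Data Cassandra configuration:
@Autowired
private CqlOperations cqlTemplate; // assumption: provided by the existing Cassandra configuration

public void addColumn(String columnName, String dataType) {
    // Identifiers cannot be bound as query parameters, so the CQL string is assembled directly
    String cql = "ALTER TABLE attributes.attributedata ADD " + columnName + " " + dataType;
    cqlTemplate.execute(cql);
}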
You can try something like this using CONCAT:
@Query("alter table attributes.attributedata add CONCAT('%',:columnNameDataType,'%')")
public void addColumn(String columnNameDataType);
Check this link

Mybatis: IllegalArgumentException: Mapped Statements collection does not contain value for xxx

I have two entities, Vendor and Goods, with a one-to-many relation between them.
I am using MyBatis with annotations. The mappers:
GoodsMapper
public interface GoodsMapper {
    @Select("select * from goods where id=#{goodsId}")
    @Results({
        @Result(id = true, column = "id", property = "id"),
        @Result(column = "name", property = "name"),
        @Result(column = "vendor_id", property = "vendor",
                one = @One(select = "com.xxx.server.mapper.VendorMapper.getVendor"))
    })
    Goods getGoods(@Param("goodsId") String goodsId);
}
VendorMapper
public interface VendorMapper {
    @Select("select * from vendor where id=#{vendorId}")
    Vendor getVendor(@Param("vendorId") String vendorId);
}
(Entity code and other classes omitted.)
When I invoke goodsMapper.getGoods(goodsId), I get the following exception:
Caused by: org.apache.ibatis.exceptions.PersistenceException:
### Error querying database. Cause: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for com.xxx.server.mapper.VendorMapper.getVendor
### The error may exist in com/xxx/server/mapper/GoodsMapper.java (best guess)
### The error may involve com.xxx.server.mapper.GoodsMapper.getGoods
### The error occurred while handling results
### SQL: select * from goods where id=?
### Cause: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for com.xxx.server.mapper.VendorMapper.getVendor
at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:30)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:150)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:141)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectOne(DefaultSqlSession.java:77)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.mybatis.spring.SqlSessionTemplate$SqlSessionInterceptor.invoke(SqlSessionTemplate.java:433)
... 117 more
Caused by: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for com.xxx.server.mapper.VendorMapper.getVendor
at org.apache.ibatis.session.Configuration$StrictMap.get(Configuration.java:933)
at org.apache.ibatis.session.Configuration.getMappedStatement(Configuration.java:726)
at org.apache.ibatis.session.Configuration.getMappedStatement(Configuration.java:719)
at org.apache.ibatis.executor.resultset.DefaultResultSetHandler.getNestedQueryMappingValue(DefaultResultSetHandler.java:740)
at org.apache.ibatis.executor.resultset.DefaultResultSetHandler.getPropertyMappingValue(DefaultResultSetHandler.java:465)
at org.apache.ibatis.executor.resultset.DefaultResultSetHandler.applyPropertyMappings(DefaultResultSetHandler.java:441)
I have checked the fully qualified statement id com.xxx.server.mapper.VendorMapper.getVendor used in the select of @One, and it is correct.
Any help is appreciated.
In my case this was caused by the referenced mapper not having been initialized by Spring yet.
The solution is to add the @DependsOn annotation to the "parent" mapper:
@DependsOn("VendorMapper")
public interface GoodsMapper {
    ...
}

@Repository("VendorMapper")
public interface VendorMapper {
    ...
}
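If the mappers are not registered with MyBatis at all, a related thing to check (a hedged suggestion, not part of the original answer) is that both interfaces are covered by mapper scanning, for example:
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@MapperScan("com.xxx.server.mapper") // package taken from the error message; adjust as needed
public class MyBatisConfig {
}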

Spring Batch JdbcPagingItemReader missing parameter

I have a Spring Batch application in which I have to export a partition based on a config table, using Oracle Data Pump. The config table holds information such as the days to run the export, the name of the table to export, and so on.
This is my query provider:
private PagingQueryProvider queryProviderStep1() throws Exception {
    SqlPagingQueryProviderFactoryBean queryProvider = new SqlPagingQueryProviderFactoryBean();
    queryProvider.setDataSource(infraConfig.dataSourceLocal());
    queryProvider.setDatabaseType("ORACLE");
    queryProvider
            .setSelectClause("SELECT part.partition_name, config.ID_CONFIG_DATA_EXPORT, config.SERVIDOR_ORIGEM, "
                    + "config.SERVIDOR_DESTINO, config.CAMINHO_DESTINO, config.NOME_TABELA, config.TEMPO_RETENCAO_BD, "
                    + "config.TEMPO_RETENCAO_TAPELIBRARY, config.TEMPO_DELAY, config.FREQUENCIA_EXECUCAO ");
    queryProvider.setFromClause("FROM user_tab_partitions#CDRONE_RAC part "
            + "INNER JOIN CONFIG_DATA_EXPORT config ON config.NOME_TABELA = part.TABLE_NAME "
            + "LEFT JOIN CONFIG_DATA_EXPORT_LOG exlog ON config.ID_CONFIG_DATA_EXPORT = exlog.ID_CONFIG_DATA_EXPORT ");
    queryProvider.setWhereClause(" WHERE exlog.ID_CONFIG_DATA_EXPORT IS NULL");
    queryProvider.setSortKey("config.id_config_data_export");
    return queryProvider.getObject();
}
The problem is that when Spring generates the 'remainingPagesSql', it appends ' AND ((config.id_config_data_export > ?))' to the query. I expected Spring to automatically bind the sort key value as that parameter, but it does not, and I get the following error:
org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; uncategorized SQLException for SQL [SELECT * FROM (SELECT part.partition_name, config.ID_CONFIG_DATA_EXPORT, config.SERVIDOR_ORIGEM, config.SERVIDOR_DESTINO, config.CAMINHO_DESTINO, config.NOME_TABELA, config.TEMPO_RETENCAO_BD, config.TEMPO_RETENCAO_TAPELIBRARY, config.TEMPO_DELAY, config.FREQUENCIA_EXECUCAO FROM user_tab_partitions#CDRONE_RAC part INNER JOIN CONFIG_DATA_EXPORT config ON config.NOME_TABELA = part.TABLE_NAME LEFT JOIN CONFIG_DATA_EXPORT_LOG exlog ON config.ID_CONFIG_DATA_EXPORT = exlog.ID_CONFIG_DATA_EXPORT WHERE exlog.ID_CONFIG_DATA_EXPORT IS NULL ORDER BY config.id_config_data_export ASC) WHERE ROWNUM <= 10 AND ((config.id_config_data_export > ?))]; SQL state [99999]; error code [17041]; Missing IN or OUT parameter at index:: 1; nested exception is java.sql.SQLException: Missing IN or OUT parameter at index:: 1
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:684) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:716) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:726) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:776) ~[spring-jdbc-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.batch.item.database.JdbcPagingItemReader.doReadPage(JdbcPagingItemReader.java:222) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.item.database.AbstractPagingItemReader.doRead(AbstractPagingItemReader.java:108) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.read(AbstractItemCountingItemStreamItemReader.java:88) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider.doRead(SimpleChunkProvider.java:91) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider.read(SimpleChunkProvider.java:157) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:116) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:374) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:110) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:69) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133) ~[spring-tx-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:272) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:81) ~[spring-batch-core-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$ExecutingRunnable.run(TaskExecutorRepeatTemplate.java:262) ~[spring-batch-infrastructure-3.0.8.RELEASE.jar:3.0.8.RELEASE]
at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_121]
Caused by: java.sql.SQLException: Missing IN or OUT parameter at index:: 1
I've tried to map a parameter, but I don't know how to supply it, since it is based on the PAGE_SIZE, so each time the query is executed the value changes.
FIRST EDIT
I've updated the sortKey to "id_config_data_export", without the "config." prefix, but the error still happens.
Finally made it work! Instead of writing the query directly in Java, I created a view containing the query and queried that instead. I still don't know why the original approach failed, but this is the workaround.
This is the queryProvider now:
private PagingQueryProvider queryProviderStep1() throws Exception {
    SqlPagingQueryProviderFactoryBean queryProvider = new SqlPagingQueryProviderFactoryBean();
    queryProvider.setDataSource(infraConfig.dataSourceLocal());
    queryProvider.setDatabaseType("ORACLE");
    queryProvider.setSelectClause("SELECT * ");
    queryProvider.setFromClause("FROM VW_CONFIG_EXPORT_PART_CDRONE ");
    queryProvider.setSortKey("id_config_data_export");
    return queryProvider.getObject();
}
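For completeness, a hedged sketch of how a JdbcPagingItemReader might consume this provider (the item type, row mapper, and page size are assumptions, not part of the original post):
private JdbcPagingItemReader<ConfigDataExport> readerStep1() throws Exception {
    JdbcPagingItemReader<ConfigDataExport> reader = new JdbcPagingItemReader<>();
    reader.setDataSource(infraConfig.dataSourceLocal());
    reader.setQueryProvider(queryProviderStep1());
    reader.setPageSize(10);
    // Hypothetical mapping class whose properties match the view's columns
    reader.setRowMapper(new BeanPropertyRowMapper<>(ConfigDataExport.class));
    return reader;
}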

Unable to locate table meta data for table

I am using Spring and my class is annotated with @Transactional.
I am using SimpleJdbcInsert but I am getting the following warning:
TableMetaDataProvider: - Unable to locate table meta data for
'data.data_insert' -- column names must be provided
I have three tables, all related as follows:
the primary key of table 1 is a foreign key in table 2, and the primary key of table 2 is a foreign key in table 3.
Showing the table 1 insert code:
java.sql.Timestamp timestamp = getCurrentJavaSqlTimestamp();
Map<String, Object> params = new HashMap<String, Object>();
params.put("notes", task.getNotes());
params.put("recording_time", timestamp);
params.put("end_user_id", 805);
SimpleJdbcInsert insertData = new SimpleJdbcInsert(dataSource)
        .withTableName("data.data_insert")
        .usingColumns("notes", "recording_time", "end_user_id")
        .usingGeneratedKeyColumns("data_id");
long dataId = insertData.executeAndReturnKey(params).longValue();
The error logs:
2015-09-29 14:10:27,133 WARN [http-8080-2] LegacyFlexJsonExceptionMessageConverter: - Generated Key Name(s) not specificed. Using the generated keys features requires specifying the name(s) of the generated column(s) for User ID: 805, Request ID: f8da3bb5-0613-4a74-9ca8-95a6ab4f1692, clientIP: 127.0.0.1 uri: /admin/dataInsert, Request Parameters:
org.springframework.dao.InvalidDataAccessApiUsageException: Generated Key Name(s) not specificed. Using the generated keys features requires specifying the name(s) of the generated column(s)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.prepareStatementForGeneratedKeys(AbstractJdbcInsert.java:530)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.access$0(AbstractJdbcInsert.java:528)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert$1.createPreparedStatement(AbstractJdbcInsert.java:448)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:581)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:843)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.executeInsertAndReturnKeyHolderInternal(AbstractJdbcInsert.java:445)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.executeInsertAndReturnKeyInternal(AbstractJdbcInsert.java:426)
at org.springframework.jdbc.core.simple.AbstractJdbcInsert.doExecuteAndReturnKey(AbstractJdbcInsert.java:380)
at org.springframework.jdbc.core.simple.SimpleJdbcInsert.executeAndReturnKey(SimpleJdbcInsert.java:122)
at com.gridpoint.energy.datamodel.impl.PGDataFixBackUpManagerBean.backupDataInRange(PGDataFixBackUpManagerBean.java:79)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:319)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
The correct one:
java.sql.Timestamp timestamp = getCurrentJavaSqlTimestamp();
Map<String, Object> params = new HashMap<String, Object>();
params.put("notes", task.getNotes());
params.put("recording_time", timestamp);
params.put("end_user_id", 805);
SimpleJdbcInsert insertData = new SimpleJdbcInsert(dataSource)
        .withSchemaName("data")
        .withTableName("data_insert")
        .usingColumns("notes", "recording_time", "end_user_id")
        .usingGeneratedKeyColumns("data_id");
long dataId = insertData.executeAndReturnKey(params).longValue();
So it just needed the schema name, supplied via withSchemaName.

Resources