I am using queryForList to get a list from the DB. My code looks like this:
List<RoleIdBean> role = jdbcTemplate.queryForList(query, new Object[] {userId}, RoleIdBean.class);
where query is:
select * from role where userid=?
The role table has two columns and RoleIdBean has two fields.
When I run this code it fails with "expected 1, actual 2".
Could someone please check where I am going wrong and show me how to use this method.
As M. Deinum mentions, queryForList with a required element type only works for single-column result sets (hence the "column count expected 1, actual 2" error). Since your query returns two columns, you have to provide an implementation of the RowMapper interface so that Spring knows which columns of your table map to which properties of your object (RoleIdBean). For instance like this:
List<RoleIdBean> list = jdbcTemplate.query("SELECT * FROM role WHERE userid = ?", new Object[]{ userId }, new RowMapper<RoleIdBean>() {
    @Override
    public RoleIdBean mapRow(ResultSet rs, int rowNum) throws SQLException {
        RoleIdBean bean = new RoleIdBean();
        // Set properties from the ResultSet, e.g.:
        // bean.setRole(rs.getString(1));
        return bean;
    }
});
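If the column names in the role table happen to match the property names of RoleIdBean (an assumption, since the bean isn't shown in the question), a BeanPropertyRowMapper can also do the mapping for you; a minimal sketch:

// Sketch only: assumes RoleIdBean is a JavaBean with a default constructor and setters
// whose property names match the role table's column names (or their camelCase form).
List<RoleIdBean> roles = jdbcTemplate.query(
        "SELECT * FROM role WHERE userid = ?",
        new Object[]{ userId },
        new BeanPropertyRowMapper<>(RoleIdBean.class));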
I am using Spring and stored procedures to retrieve data from a MySQL database. I have the stored procedure and its parameters working OK, but I'm having problems mapping the result set. At the moment I have some truly ugly code to get the values, and I'm sure there has to be a better, cleaner, and more elegant way. Can anyone guide me to a better solution?
After the stored procedure class, I have:
List<String> outList = new ArrayList<String>();
Map<String, Object> outMap = execute(parameters_map);
List list = (List) outMap.get("#result-set-1");
for (Object object : list) {
    Map map2 = (Map) object;
    outList.add((String) map2.get("runname"));
}
return outList;
runname is the column from the database query.
Is there a better way to achieve this?
An example from the Spring docs using a RowMapper:
public class JdbcActorDao implements ActorDao {

    private SimpleJdbcCall procReadAllActors;

    public void setDataSource(DataSource dataSource) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        jdbcTemplate.setResultsMapCaseInsensitive(true);
        this.procReadAllActors = new SimpleJdbcCall(jdbcTemplate)
                .withProcedureName("read_all_actors")
                .returningResultSet("actors",
                        BeanPropertyRowMapper.newInstance(Actor.class));
    }

    public List getActorsList() {
        Map m = procReadAllActors.execute(new HashMap<String, Object>(0));
        return (List) m.get("actors");
    }

    // ... additional methods
}
It took a while to interpret the Spring docs, but I finally got there.
My solution:
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(jdbcTemplate)
        .withProcedureName("DistinctRunNames")
        .withoutProcedureColumnMetaDataAccess();
simpleJdbcCall.addDeclaredParameter(new SqlParameter("environment", Types.VARCHAR));
simpleJdbcCall.addDeclaredParameter(new SqlParameter("username", Types.VARCHAR));
simpleJdbcCall.addDeclaredParameter(new SqlParameter("test_suite", Types.VARCHAR));

SqlParameterSource parameters = new MapSqlParameterSource()
        .addValue("environment", environment)
        .addValue("username", username)
        .addValue("test_suite", testSuite);

Map map = simpleJdbcCall.returningResultSet("runnames", new ParameterizedRowMapper<RunNameBean>() {
    public RunNameBean mapRow(ResultSet rs, int rowNum) throws SQLException {
        RunNameBean runNameBean = new RunNameBean();
        runNameBean.setName(rs.getString("runname"));
        return runNameBean;
    }
}).execute(parameters);

return (List) map.get("runnames");
I had problems with expected versus actual parameters, and had to break up the simpleJdbcCall setup as shown above. It maps the results into a list beautifully.
Thank you for the answers; they helped me learn about Spring mapping.
After processing some XML files with a Spring Batch ItemProcessor, the ItemProcessor returns items like this:

MetsModsDef
{
    int id;
    String title;
    String path;
    Properties identifiers;
    ....
}

Now I need to save these items into a database, so that (id, title, path) goes into the "Work" table, and all the Properties stored in the "identifiers" field go into a key/value table called "Identifier" (work, identitytype, identityValue).
How can I achieve this?
Currently I am using a CompositeItemWriter to split the object and write it into two tables like this:
public ItemWriter<MetsModsDef> MultiTableJdbcWriter(@Qualifier("dataSource") DataSource dataSource) {
    CompositeItemWriter<MetsModsDef> cWriter = new CompositeItemWriter<MetsModsDef>();

    JdbcBatchItemWriter hsqlWorkWriter = new JdbcBatchItemWriterBuilder()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO work (id, title, path, enabled) VALUES (:id, :title, :path, 1)")
            .dataSource(dataSource)
            .build();

    JdbcBatchItemWriter hsqlIdentifierWriter = new JdbcBatchItemWriterBuilder()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO identity (work, identitytype, identityValue) VALUES (:work, :identitytype, :identityValue)")
            .dataSource(dataSource)
            .build();

    List<ItemWriter<? super MetsModsDef>> mWriter = new ArrayList<ItemWriter<? super MetsModsDef>>();
    mWriter.add(hsqlWorkWriter);
    mWriter.add(hsqlIdentifierWriter);
    cWriter.setDelegates(mWriter);
    return cWriter;
}
But this will not work for the property list, since (work, identitytype, identityValue) are not part of my domain object MetsModsDef, which only contains one map of properties that are supposed to go into the Identifier table.
I have found advice on how to do it when writing to a file, and even on using a splitter pattern from Spring Integration (Read one record/item and write multiple records/items using spring batch), but I am still not sure how to actually do it when writing out via JDBC or Hibernate (which I assume would be similar).
Thanks for your advice!
In case somebody is interested: after a while I came up with my own solution.
I found one extending HibernateItemWriter (for Hibernate writes) on the internet: Spring-Batch Multi-line record Item Writer with variable number of lines per record. But I did not want to extend classes, so I had to come up with my own (based on what I could research on the internet).
I am not sure how good it is, or how it handles transactions or rollback (probably badly), but for now it is the only one I have. So if you need one too, or have comments on how to improve it, or even have a better one, you are very welcome.
I have created my own IdentifierListWriter, which creates the key/value-pair-like objects (here each pair is called "Identifier") for each MetsModsDef item and writes them all out using the JdbcBatchItemWriter identifierWriter, which is passed in from the configuration:
public class IdentifierListWriter implements ItemWriter<MetsModsDef> {

    private ItemWriter<Identifier> _identifierWriter;

    public IdentifierListWriter(JdbcBatchItemWriter<Identifier> identifierWriter) {
        _identifierWriter = identifierWriter;
    }

    @Transactional(readOnly = false, propagation = Propagation.REQUIRED)
    public void write(List<? extends MetsModsDef> items) throws Exception {
        // Main table WRITER
        for (MetsModsDef item : items) {
            ArrayList<Identifier> ids = new ArrayList<Identifier>();
            for (String key : item.getAllIds().stringPropertyNames()) {
                ids.add(new Identifier(item.getAllIds().getProperty(key),
                        key, item.getId()));
            }
            _identifierWriter.write(ids);
        }
    }
}
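For reference, here is a minimal sketch of what the Identifier class used above might look like; the field names are not shown in the original post and are only guessed from the constructor call and the INSERT statement below, so treat them as assumptions:

// Hypothetical sketch of the Identifier bean: constructed as new Identifier(value, key, workId)
// and mapped by BeanPropertyItemSqlParameterSourceProvider onto :identifier, :type and :work.
public class Identifier {

    private String identifier; // the property value
    private String type;       // the property key
    private int work;          // id of the owning Work / MetsModsDef row

    public Identifier(String identifier, String type, int work) {
        this.identifier = identifier;
        this.type = type;
        this.work = work;
    }

    public String getIdentifier() { return identifier; }
    public String getType() { return type; }
    public int getWork() { return work; }
}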
In the Java configuration I create two JdbcBatchItemWriter beans, one for the "Work" table and one for the "Identifier" table, an IdentifierListWriter bean, and a CompositeItemWriter MultiTableJdbcWriter bean which uses them all to write out the object:
@Bean
@Primary
public ItemWriter<MetsModsDef> MultiTableJdbcWriter(@Qualifier("dataSource") DataSource dataSource) {
    IdentifierListWriter identifierListWriter = new IdentifierListWriter(identifierWriter(dataSource));
    CompositeItemWriter cWriter = new CompositeItemWriter();
    cWriter.setDelegates(Arrays.asList(hsqlWorkWriter(dataSource), identifierListWriter));
    return cWriter;
}

@Bean
public JdbcBatchItemWriter<MetsModsDef> hsqlWorkWriter(@Qualifier("dataSource") DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<MetsModsDef>()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO work (id, title, path, enabled) VALUES (:id, :title, :path, 1)")
            .dataSource(dataSource)
            .build();
}

@Bean
public JdbcBatchItemWriter<Identifier> identifierWriter(@Qualifier("dataSource") DataSource dataSource) {
    return new JdbcBatchItemWriterBuilder<Identifier>()
            .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
            .sql("INSERT INTO identifier (identifier, type, work_id) VALUES (:identifier, :type, :work)")
            .dataSource(dataSource)
            //.afterPropertiesSet()
            .build();
}
Then the multiTableJdbcWriter is used from a step:
@Bean
public Step step1(ItemWriter<MetsModsDef> multiTableJdbcWriter) {
    return stepBuilderFactory.get("step1")
            .<StreamSource, MetsModsDef>chunk(1)
            .reader(new MetsModsReader())
            .processor(metsModsFileProcessor())
            .writer(multiTableJdbcWriter)
            .build();
}
I'm using Spring and an Oracle database in my solution, and I need to execute this query:

select count(1) from ELEMENTS, table(cast(? as arrayofnumbers)) session_ids
where root_session_id in session_ids.VALUE

but I have a problem with passing the input parameter. I try to pass a List or an array of BigInteger into

jdbcTemplate.queryForObject("select count(1) from ELEMENTS, table(cast(? as arrayofnumbers)) session_ids where root_session_id in session_ids.VALUE", Integer.class, INPUT_PARAMS)

but I get an exception:
java.sql.SQLException: Invalid column type
at oracle.jdbc.driver.OraclePreparedStatement.setObjectCritical(OraclePreparedStatement.java:8861)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:8338)
at oracle.jdbc.driver.OraclePreparedStatement.setObjectInternal(OraclePreparedStatement.java:9116)
at oracle.jdbc.driver.OraclePreparedStatement.setObject(OraclePreparedStatement.java:9093)
at oracle.jdbc.driver.OraclePreparedStatementWrapper.setObject(OraclePreparedStatementWrapper.java:234)
at weblogic.jdbc.wrapper.PreparedStatement.setObject(PreparedStatement.java:357)
Does anyone have the same problem?
EDIT:
I forgot to describe arrayofnumbers. It's a custom type:
TYPE arrayofnumbers as table of number(20)
Found the solution:
final BigInteger[] ids = new BigInteger[]{BigInteger.valueOf(9137797712513092132L)};

int count = jdbc.query("select count(1) from NC_DATAFLOW_ELEMENTS\n" +
                " where root_session_id in (select /*+ cardinality(t 10) */ * from table(cast (? as arrayofnumbers)) t)",
        new PreparedStatementSetter() {
            public void setValues(PreparedStatement preparedStatement) throws SQLException {
                Connection conn = preparedStatement.getConnection();
                OracleConnection oraConn = conn.unwrap(OracleConnection.class);
                oracle.sql.ARRAY widgets = oraConn.createARRAY("ARRAYOFNUMBERS", ids);
                preparedStatement.setArray(1, widgets);
            }
        },
        new ResultSetExtractor<Integer>() {
            public Integer extractData(ResultSet resultSet) throws SQLException, DataAccessException {
                resultSet.next();
                return resultSet.getInt(1);
            }
        });

out.println(count);
Note that the name of the array type (ARRAYOFNUMBERS) should be in upper case.
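As a side note (not part of the original solution): newer ojdbc drivers deprecate oracle.sql.ARRAY in favour of the standard java.sql.Array. Assuming a 12c-or-later driver that provides OracleConnection.createOracleArray, the setter can be sketched like this:

// Sketch only, assuming an ojdbc driver that exposes OracleConnection.createOracleArray.
new PreparedStatementSetter() {
    public void setValues(PreparedStatement preparedStatement) throws SQLException {
        OracleConnection oraConn = preparedStatement.getConnection().unwrap(OracleConnection.class);
        java.sql.Array arr = oraConn.createOracleArray("ARRAYOFNUMBERS", ids);
        preparedStatement.setArray(1, arr);
    }
}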
I am using JdbcTemplate for getting data from the database in Spring MVC.
My query is:

SELECT count(ITEM_TBL.MEETING_ID), ITEM_TBL.REG_EMAIL FROM ITEM_TBL, MEETINGS_TBL WHERE ITEM_TBL.MEETING_ID=MEETINGS_TBL.MEETING_ID
GROUP BY ITEM_TBL.REG_EMAIL

This returns rows like:

11 nishant@gmail.com
12 abhilasha@yahoo.com
13 shiwani@in.com

I want to store these values into a HashMap. Can you please help me with how to do this using JdbcTemplate?
Thanks
You need a ResultSetExtractor.
You can achieve that using the code below:
String sql = "SELECT count(ITEM_TBL.MEETING_ID) AS MEETING_COUNT, ITEM_TBL.REG_EMAIL "
        + "FROM ITEM_TBL, MEETINGS_TBL "
        + "WHERE ITEM_TBL.MEETING_ID = MEETINGS_TBL.MEETING_ID "
        + "GROUP BY ITEM_TBL.REG_EMAIL";

ResultSetExtractor mapExtractor = new ResultSetExtractor() {
    public Object extractData(ResultSet rs) throws SQLException {
        Map<String, String> mapOfKeys = new HashMap<String, String>();
        while (rs.next()) {
            String key = rs.getString("MEETING_COUNT");
            String obj = rs.getString("REG_EMAIL");
            /* set the business object from the resultset */
            mapOfKeys.put(key, obj);
        }
        return mapOfKeys;
    }
};

Map map = (Map) jdbcTemplate.query(sql, mapExtractor);
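On a recent Spring/Java version the same idea can be written with a typed extractor, which avoids the raw types; this is just a sketch, under the assumption that the count fits in an int:

// Sketch: typed ResultSetExtractor building a Map of meeting count -> registration email.
ResultSetExtractor<Map<Integer, String>> extractor = rs -> {
    Map<Integer, String> result = new HashMap<>();
    while (rs.next()) {
        result.put(rs.getInt("MEETING_COUNT"), rs.getString("REG_EMAIL"));
    }
    return result;
};
Map<Integer, String> countToEmail = jdbcTemplate.query(sql, extractor);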
I use the Spring JdbcTemplate to call a stored procedure, like this:
Map receive10PrmtBill = (Map) getJdbcTemplate().execute(sql, new CallableStatementCallback() {
    @Override
    public Object doInCallableStatement(CallableStatement cs) throws SQLException, DataAccessException {
        cs.setString(1, Constants.ADD_PROC_TYPE);
        Map<String, PrmtBillInfoDatagram> tempMap = new HashMap();
        cs.execute();
        // ... (rest of the callback omitted)
        return tempMap;
    }
});
But the execute method returns false and doesn't throw any exception, so I don't know what's wrong with my program. How can I catch the exception? Any help?
This is the proc:
create proc sp_xx (@userid int)
as
begin
    select personid, personname from person where personid = @userid
    select teamid, teamname from team
end
Try setting the "IGNORE_DONE_IN_PROC" property to "true". See:
http://javabob64.wordpress.com/2011/04/12/sybase-and-the-jdbc-driver/
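Also worth noting (this is general JDBC behaviour, not part of the linked fix): CallableStatement.execute() returning false only means the first result is an update count or that there are no results; it does not by itself indicate failure. Since the procedure issues two SELECTs, the result sets can be walked explicitly inside doInCallableStatement, roughly like this:

// Sketch of iterating all results of the call inside doInCallableStatement.
boolean isResultSet = cs.execute();
while (true) {
    if (isResultSet) {
        ResultSet rs = cs.getResultSet();
        while (rs.next()) {
            // read columns, e.g. rs.getInt("personid"), rs.getString("personname")
        }
        rs.close();
    } else if (cs.getUpdateCount() == -1) {
        break; // no more results of any kind
    }
    isResultSet = cs.getMoreResults();
}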