Retrieve values from database using JDBC Template into a HashMap - jdbc

I am using JdbcTemplate for getting data from the database in Spring MVC.
My query is:
SELECT count(ITEM_TBL.MEETING_ID), ITEM_TBL.REG_EMAIL FROM ITEM_TBL, MEETINGS_TBL WHERE ITEM_TBL.MEETING_ID = MEETINGS_TBL.MEETING_ID
GROUP BY ITEM_TBL.REG_EMAIL
This returns rows like:
11 nishant@gmail.com
12 abhilasha@yahoo.com
13 shiwani@in.com
I want to store these values in a HashMap. Can you please help me with how to do this using JdbcTemplate?
Thanks

You need a ResultSetExtractor.
You can achieve that with the code below (the count column is aliased so it can be read by name):
String sql = "SELECT count(A.MEETING_ID),ITEM_TBL.REG_EMAIL FROM ITEM_TBL,MEETINGS_TBL WHERE ITEM_TBL.MEETING_ID=MEETINGS_TBL.MEETING_ID
GROUP BY ITEM_TBL.REG_EMAIL";
ResultExtractor mapExtractor = new ResultSetExtractor() {
public Object extractData(ResultSet rs) throws SQLException {
Map<String, String> mapOfKeys = new HashMap<String, String>();
while (rs.next()) {
String key = rs.getString("MEETING_ID");
String obj = rs.getString("REG_EMAIL");
/* set the business object from the resultset */
mapOfKeys.put(key, obj);
}
return mapOfKeys;
}
};
Map map = (HashMap) jdbcTemplate.query(sql.toString(), mapExtractor);
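On Java 8+ the same extractor can also be written as a lambda; a minimal sketch, assuming the aliased query above:
ResultSetExtractor<Map<String, String>> extractor = rs -> {
    Map<String, String> countsToEmails = new HashMap<>();
    while (rs.next()) {
        // key: meeting count, value: e-mail, as in the anonymous-class version
        countsToEmails.put(rs.getString("MEETING_COUNT"), rs.getString("REG_EMAIL"));
    }
    return countsToEmails;
};
Map<String, String> map = jdbcTemplate.query(sql, extractor);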

Related

How to directly use a query with MyBatis?

I want to use the query 'ANALYZE TABLE {tableName}', but I think MyBatis supports only CRUD.
How can I use 'ANALYZE TABLE' in MyBatis?
Just declare it as a normal select and specify Map as the return type.
#Select("analyze table ${tableName}")
Map<String, Object> analyzeTable(String tableName);
#Test
public void testAnalyzeTable() {
try (SqlSession sqlSession = sqlSessionFactory.openSession()) {
Mapper mapper = sqlSession.getMapper(Mapper.class);
Map<String, Object> result = mapper.analyzeTable("users");
assertEquals("test.users", result.get("Table"));
assertEquals("analyze", result.get("Op"));
assertEquals("status", result.get("Msg_type"));
assertEquals("OK", result.get("Msg_text"));
}
}
Tested using...
MariaDB 10.4.10
MariaDB Connector/J 2.5.4

How do you map the output of a Spring stored procedure execute?

I am using Spring and stored procedures to retrieve data from a MySQL database. I have the stored procedure and parameters working OK, but I'm having problems mapping the result set. At the moment I have some truly ugly code to get the values, and I'm sure there has to be a better, cleaner and more elegant way. Can anyone guide me to a better solution?
After the stored procedure class, I have:
List<String> outList = new ArrayList<String>();
Map<String, Object> outMap = execute(parameters_map);
List list = (List) outMap.get("#result-set-1");
for (Object object : list) {
    Map map2 = (Map) object;
    outList.add((String) map2.get("runname"));
}
return outList;
runname is the column from the database query.
Is there a better way to achieve this?
Example from the Spring docs using a RowMapper:
public class JdbcActorDao implements ActorDao {

    private SimpleJdbcCall procReadAllActors;

    public void setDataSource(DataSource dataSource) {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        jdbcTemplate.setResultsMapCaseInsensitive(true);
        this.procReadAllActors = new SimpleJdbcCall(jdbcTemplate)
                .withProcedureName("read_all_actors")
                .returningResultSet("actors",
                        BeanPropertyRowMapper.newInstance(Actor.class));
    }

    public List getActorsList() {
        Map m = procReadAllActors.execute(new HashMap<String, Object>(0));
        return (List) m.get("actors");
    }

    // ... additional methods
}
It took a while to interpret the Spring docs, but I finally got there.
My solution:
SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(jdbcTemplate)
        .withProcedureName("DistinctRunNames")
        .withoutProcedureColumnMetaDataAccess();
simpleJdbcCall.addDeclaredParameter(new SqlParameter("environment", Types.VARCHAR));
simpleJdbcCall.addDeclaredParameter(new SqlParameter("username", Types.VARCHAR));
simpleJdbcCall.addDeclaredParameter(new SqlParameter("test_suite", Types.VARCHAR));

SqlParameterSource parameters = new MapSqlParameterSource()
        .addValue("environment", environment)
        .addValue("username", username)
        .addValue("test_suite", testSuite);

Map map = simpleJdbcCall.returningResultSet("runnames", new ParameterizedRowMapper<RunNameBean>() {
    public RunNameBean mapRow(ResultSet rs, int rowNum) throws SQLException {
        RunNameBean runNameBean = new RunNameBean();
        runNameBean.setName(rs.getString("runname"));
        return runNameBean;
    }
}).execute(parameters);

return (List) map.get("runnames");
I had problems with expected parameters versus actual, so I had to break up the simpleJdbcCall object. It maps the results into a list beautifully.
Thank you for the answers; they helped me learn about Spring mapping.

Deserializing DynamoDB results with Gson fails

I have a specific use case where I store the results from one DynamoDB table, in serialized form, in another DynamoDB table.
When I use Gson to deserialize the retrieved data, I get this error:
java.lang.RuntimeException: Unable to invoke no-args constructor for class java.nio.ByteBuffer. Register an InstanceCreator with Gson for this type may fix this problem.
at com.google.gson.internal.ConstructorConstructor$12.construct(ConstructorConstructor.java:210)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:186)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:103)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:196)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:40)
at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:187)
at com.google.gson.internal.bind.MapTypeAdapterFactory$Adapter.read(MapTypeAdapterFactory.java:145)
at com.google.gson.Gson.fromJson(Gson.java:810)
at com.google.gson.Gson.fromJson(Gson.java:775)
My method looks like this:
public void store(MyCustomObject obj) {
    String primaryKey = obj.getKey();
    List<Map<String, AttributeValue>> results = AmazonDynamoDB.query(...).getItems();
    Gson gson = new Gson();
    List<String> records = results.stream()
            .map(mappedResult -> gson.toJson(mappedResult))
            .collect(Collectors.toList());
    Map<String, AttributeValue> attributeMap = transformToAttributeMap(records);
    PutItemRequest putItemRequest = new PutItemRequest().withItem(attributeMap);
    AmazonDynamoDB.putItem(...);
}
The method to retrieve the records looks something like this:
public void retrieve(String id) {
    QueryRequest...
    Map<String, AttributeValue> records = DynamoDB.query(...).getItems();
    List<String> serializedRecords = new ArrayList<>();
    List<AttributeValue> values = records.get("key");
    for (AttributeValue attributeValue : values) {
        serializedRecords.add(attributeValue.getS());
    }
    Gson gson = new Gson();
    Type recordType = new TypeToken<Map<String, AttributeValue>>() { }.getType();
    List<Map<String, AttributeValue>> actualRecords = serializedRecords.stream()
            .map(record -> gson.fromJson(record, recordType))
            .collect(Collectors.toList());
}
What am I doing wrong?
The problem is that the AttributeValue class has a java.nio.ByteBuffer field named b. Gson tries to deserialize data into it, but there is no default constructor for the ByteBuffer class, so Gson cannot instantiate the b field.
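As the error message suggests, one option is to register an InstanceCreator for ByteBuffer with Gson. A minimal sketch; note this only works around the instantiation error and does not restore any binary payload that was in the buffer:
Gson gson = new GsonBuilder()
        .registerTypeAdapter(ByteBuffer.class, new InstanceCreator<ByteBuffer>() {
            public ByteBuffer createInstance(java.lang.reflect.Type type) {
                // supply an empty buffer so Gson has something to populate
                return ByteBuffer.allocate(0);
            }
        })
        .create();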
An alternative is to use the newer DynamoDB Document API from the AWS SDK. The following example should work:
AmazonDynamoDBClient client = new AmazonDynamoDBClient(new ProfileCredentialsProvider());
Item item = new DynamoDB(client).getTable("user").getItem("Id", "user1");
String json = item.toJSON();
Item deserialized = Item.fromJSON(json);
You should modify the credentials provider according to your setup.
Not exactly the best workaround/answer, but I was able to do this:
Item item = new Item().withJSON("document", jsonStr);
Map<String,AttributeValue> attributes = InternalUtils.toAttributeValues(item);
return attributes.get("document").getM();

Getting IncorrectResultSetColumnCountException in queryForList

I am using queryForList to get a list from the DB.
My code looks like:
List<RoleIdBean> role = jdbcTemplate.queryForList(query, new Object[] { userId }, RoleIdBean.class);
query = select * from role where userid=?
The role table has two columns and RoleIdBean has two variables.
When I run this code it says: expected 1, actual 2.
Could someone please check where I am going wrong and advise how to use this method?
As M. Deinum mentions, you have to provide an implementation of the RowMapper interface so that Spring knows which columns from your table map to which properties of your object (RoleIdBean). For instance, like this:
List<RoleIdBean> list = jdbcTemplate.query("SELECT * FROM role WHERE userid = ?", new Object[]{ userId }, new RowMapper<RoleIdBean>() {
    @Override
    public RoleIdBean mapRow(ResultSet rs, int rowNum) throws SQLException {
        RoleIdBean bean = new RoleIdBean();
        // Set properties from the ResultSet, e.g.:
        // bean.setRole(rs.getString(1));
        return bean;
    }
});
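If the RoleIdBean property names match the role table's column names, BeanPropertyRowMapper can do the mapping for you; a minimal sketch, assuming that naming convention holds:
List<RoleIdBean> roles = jdbcTemplate.query(
        "SELECT * FROM role WHERE userid = ?",
        new Object[]{ userId },
        BeanPropertyRowMapper.newInstance(RoleIdBean.class));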

Insert byte[] into a blob field with Spring's JdbcTemplate and a stored procedure

I'm trying to insert a byte[] into a blob field with a stored procedure, and I get an exception:
Request processing failed; nested exception is org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT ID FROM sp_NEWFILE(?,?,?)]; nested exception is org.firebirdsql.jdbc.field.TypeConversionException: Error converting to object.
Model:
public class FileBody {
    private int ID;
    private byte[] BODY;
    private String FILENAME;
    // getters and setters
}
Inserting it into the database:
public class FileBodyDaoImpl implements FileBodyDao {

    public int insertData(final FileBody fileBody) throws IOException {
        JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
        LobHandler lobHandler = new DefaultLobHandler();
        final InputStream in = new ByteArrayInputStream(fileBody.getBODY());
        final int fileSize = in.available();
        Map<String, Object> out = jdbcTemplate.queryForMap("SELECT ID FROM sp_NEWFILE(?,?,?)",
                new AbstractLobCreatingPreparedStatementCallback(lobHandler) {
                    protected void setValues(PreparedStatement ps, LobCreator lobCreator)
                            throws SQLException, DataAccessException {
                        ps.setString(1, fileBody.getFILENAME());
                        lobCreator.setBlobAsBinaryStream(ps, 2, in, fileSize);
                        ps.setNull(3, java.sql.Types.INTEGER);
                    }
                });
        int last_inserted = Integer.parseInt(String.valueOf(out.get("ID")));
        return last_inserted;
    }
}
And my stored procedure
create or alter procedure sp_NEWFILE (
    FILENAME varchar(255),
    BODY blob sub_type 0 segment size 80,
    USEID integer)
returns (
    ID integer)
as
begin
  if (USEID is not null) then ID = USEID;
  else ID = GEN_ID(gen_filebody_id, 1);
  if ((FILENAME is NULL) or (FILENAME = '')) then FILENAME = 'UNDEFINED';
  INSERT INTO t_filebody(ID, BODY, FILENAME) VALUES(:ID, :BODY, :FILENAME);
  suspend;
end^
Versions:
Jaybird: jaybird-jdk17-2.2.5
Firebird: 2.5 (2.5.1.26351.ds4-2ubuntu0.1)
The problem is that queryForMap does not support a PreparedStatementCallback (unlike, for example, execute); instead your anonymous object is considered a normal parameter for the query to execute, and Jaybird does not support this object type. And even if Jaybird had supported it, you would have received an error for missing parameters 2 and 3.
Your code can be greatly simplified by passing the byte array:
Map<String, Object> out = jdbcTemplate.queryForMap("SELECT ID FROM sp_NEWFILE(?,?,?)",
fileBody.getFILENAME(), fileBody.getBODY(), null);
This works because Jaybird considers a BLOB SUB_TYPE 0 to be a java.sql.Types.LONGVARBINARY, and JDBC 4.2 appendix B declares byte[] as the default type for that (although you can also use it as a java.sql.Types.BLOB).
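If you prefer to declare the JDBC type explicitly rather than rely on that default mapping, a small sketch using Spring's SqlParameterValue with the same query would look like:
Map<String, Object> out = jdbcTemplate.queryForMap("SELECT ID FROM sp_NEWFILE(?,?,?)",
        fileBody.getFILENAME(),
        new SqlParameterValue(Types.BLOB, fileBody.getBODY()),
        null);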
As a side note, your stored procedure does not need to be selectable (removing SUSPEND makes it executable), and the procedure could also be replaced by a TRIGGER that generates the primary key; you would then retrieve the value either with INSERT .. RETURNING .. or through the JDBC generated-keys facility (which Jaybird in turn implements using INSERT .. RETURNING ..), as in the sketch below.
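A rough sketch of that generated-keys approach, assuming a trigger fills ID from gen_filebody_id when it is not supplied:
KeyHolder keyHolder = new GeneratedKeyHolder();
jdbcTemplate.update(con -> {
    // asking for the ID column makes Jaybird append RETURNING ID
    PreparedStatement ps = con.prepareStatement(
            "INSERT INTO t_filebody (BODY, FILENAME) VALUES (?, ?)",
            new String[] { "ID" });
    ps.setBytes(1, fileBody.getBODY());
    ps.setString(2, fileBody.getFILENAME());
    return ps;
}, keyHolder);
int generatedId = keyHolder.getKey().intValue();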
For those seeking a non-Jaybird solution for inserting a BLOB with Spring's JdbcTemplate, the following syntax worked for me when calling stored procedures; it differs from inserting via plain queries.
Insert via Queries
ByteArrayInputStream inputStream = new ByteArrayInputStream(file.getBytes());
ps.setBlob(1, inputStream);
Insert via Stored Procedure Call
Map<String, Object> inParams = new HashMap<>();
inParams.put("pi_some_id", id);
inParams.put("pi_file_blob", new SqlLobValue(file.getBytes()));
SqlParameterSource sqlParameterSource = new MapSqlParameterSource(inParams);

SqlParameter[] sqlParameters = {
        new SqlParameter("pi_some_id", Types.VARCHAR),
        new SqlParameter("pi_file_blob", Types.BLOB),
        new SqlOutParameter("po_error_flag", Types.VARCHAR),
        new SqlOutParameter("po_message", Types.VARCHAR) };

SimpleJdbcCall simpleJdbcCall = new SimpleJdbcCall(jdbcTemplate)
        .withoutProcedureColumnMetaDataAccess()
        .withProcedureName(storedProcName)
        .withCatalogName(packageName)
        .declareParameters(sqlParameters);

Map<String, Object> storedProcResult = simpleJdbcCall.execute(sqlParameterSource);
