jdbcTemplate.batchUpdate for a list of inserts doesn't work - Spring

I'm at the end of my nerves and I really need your help, guys.
For some reason I can't get a list of answers inserted into the database.
I use jdbcTemplate to do that, and my code looks like this:
public void insertVastaukset(List<Vastaus> vastaukset) {
    final String sql = "insert into vastaus (kysymysID, vastausteksti) values(?,?)";
    getJdbcTemplate().batchUpdate(sql,
            new BatchPreparedStatementSetter() {
                @Override
                public void setValues(PreparedStatement ps, int i) throws SQLException {
                    Vastaus vastaus = vastaukset.get(i);
                    ps.setInt(1, vastaus.getKysymysID());
                    ps.setString(2, vastaus.getVastausteksti());
                }

                @Override
                public int getBatchSize() {
                    return vastaukset.size();
                }
            });
}
And for some reason the program gives no error! When I tried it a couple of days ago it gave me a NullPointerException; that is fixed now, but still no inserts are created in the database. Help me please, thank you very much!

Use the catch block below to examine the exceptions:
int[] updateCounts;
try {
    updateCounts = getJdbcTemplate().batchUpdate(
            "insert into test123 (id, value) values (?,?)",
            new BatchPreparedStatementSetter() {
                // YOUR CODE HERE
            });
} catch (Exception sqle) {
    Throwable s2 = sqle;
    System.out.println("=============v" + s2.getClass().getName() + "=====================");
    while (s2 != null) {
        s2.printStackTrace();
        s2 = s2.getCause();
        if (s2 instanceof java.sql.BatchUpdateException) {
            System.out.println("======================^^^^^^======================");
            ((java.sql.BatchUpdateException) s2).getNextException().printStackTrace();
        }
    }
    throw sqle;
}
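If nothing surfaces in that catch block, another quick check is to print the update counts that batchUpdate returns and re-count the table afterwards. A minimal sketch, reusing the sql string from the question; "setter" here is just a name for the BatchPreparedStatementSetter shown above:

int[] counts = getJdbcTemplate().batchUpdate(sql, setter); // setter = the BatchPreparedStatementSetter from the question
System.out.println("update counts: " + java.util.Arrays.toString(counts));

// If the counts are non-zero but this number never changes, the inserts are most
// likely being rolled back (for example, a transaction that is never committed).
Integer rows = getJdbcTemplate().queryForObject("select count(*) from vastaus", Integer.class);
System.out.println("rows in vastaus: " + rows);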

Related

What's the matter with this bulk update method?

I wrote a bulk update using Spring Boot's jdbcTemplate. The response code is 200, but the MySQL records haven't been updated, and I don't know the reason. Here is my code:
@Override
@Transactional
public void updateCustomCategory(List<ItemDto> itemDtoList) {
    if (CollectionUtils.isNotEmpty(itemDtoList)) {
        jdbcTemplate.batchUpdate("update item_tab set l1_custom_category_id = ? AND l2_custom_category_id = ? where item_id = ?",
                new BatchPreparedStatementSetter() {
                    @Override
                    public void setValues(PreparedStatement ps, int i) throws SQLException {
                        ps.setLong(1, itemDtoList.get(i).getL1CustomCatId());
                        ps.setLong(2, itemDtoList.get(i).getL2CustomCatId());
                        ps.setLong(3, itemDtoList.get(i).getItemId());
                    }

                    @Override
                    public int getBatchSize() {
                        return 500;
                    }
                });
    }
}
How should I modify the logic?
The issue is fixed now: add "rewriteBatchedStatements=true" to the JDBC URL and an index on "item_id".
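For reference, a hedged sketch of where those two changes go; the host, database name, credentials and index name below are placeholders, not taken from the question:

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

// rewriteBatchedStatements=true tells MySQL Connector/J to rewrite the batch
// into multi-row statements on the driver side.
DriverManagerDataSource ds = new DriverManagerDataSource();
ds.setUrl("jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true");
ds.setUsername("user");
ds.setPassword("secret");
JdbcTemplate jdbcTemplate = new JdbcTemplate(ds);

// And the index on item_id, created once in MySQL:
// CREATE INDEX idx_item_id ON item_tab (item_id);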

Usage of a custom FreeMarker template model

I was wondering if anyone can help me with Apache FreeMarker? I'm trying to use a custom model but I can't figure it out.
Imagine I want to dump the result of a query (a Java ResultSet) in a FreeMarker template. What is the best approach?
I found the class ResultSetTemplateModel on Google:
import java.sql.ResultSet;

import freemarker.template.SimpleScalar;
import freemarker.template.TemplateHashModel;
import freemarker.template.TemplateModel;
import freemarker.template.TemplateModelException;
import freemarker.template.TemplateSequenceModel;

public class ResultSetTemplateModel implements TemplateSequenceModel {

    private ResultSet rs = null;

    public ResultSetTemplateModel(ResultSet rs) {
        this.rs = rs;
    }

    public TemplateModel get(int i) throws TemplateModelException {
        try {
            rs.next();
        } catch (Exception e) {
            throw new TemplateModelException(e.toString());
        }
        TemplateModel model = new Row(rs);
        return model;
    }

    public int size() throws TemplateModelException {
        int size = 0;
        try {
            rs.last();
            size = rs.getRow();
            rs.beforeFirst();
        } catch (Exception e) {
            throw new TemplateModelException(e.toString());
        }
        return size;
    }

    class Row implements TemplateHashModel {

        private ResultSet rs = null;

        public Row(ResultSet rs) {
            this.rs = rs;
        }

        public TemplateModel get(String s) throws TemplateModelException {
            TemplateModel model = null;
            try {
                model = new SimpleScalar(rs.getString(s));
            } catch (Exception e) {
                e.printStackTrace();
            }
            return model;
        }

        public boolean isEmpty() throws TemplateModelException {
            boolean isEmpty = false;
            if (rs == null) {
                isEmpty = true;
            }
            return isEmpty;
        }
    }
}
And I have a very simple class (I made it even simpler than before):
public static void main(String[] args) {
    try {
        Configuration cfg = new Configuration(Configuration.VERSION_2_3_27);
        cfg.setTemplateExceptionHandler(TemplateExceptionHandler.RETHROW_HANDLER);
        cfg.setClassForTemplateLoading(MyCLASS.class, "/");

        StringWriter out = new StringWriter();
        Map<String, Object> parameters = new TreeMap<>();
        ResultSet rs = getResultSet("Select foo, bar FROM my_table");
        parameters.put("hello", "World");
        parameters.put("result", rs);

        Template temp = cfg.getTemplate("template.txt");
        temp.process(parameters, out);
        System.out.println("out = " + out);
    } catch (IOException | TemplateException e) {
        e.printStackTrace();
    }
}
My template:
Hello ${hello}
<#-- how do I specify ResultSet columns here ?? -->
How can I use the custom model? Any advice? I know how to load the template file, but I don't know how to tell the template that it is dealing with a custom model.
Thank you guys for the support :)
There are two ways of using ResultSetTemplateModel for wrapping ResultSet-s:
Either extend DefaultObjectWrapper by overriding handleUnknownType, where you return new ResultSetTemplateModel((ResultSet) obj) if obj is a ResultSet and call super otherwise. Then use Configuration.setObjectWrapper to actually use it.
Or, add new ResultSetTemplateModel(rs) to parameters instead of rs; if something is already a TemplateModel, it will not be wrapped again. Note that if you get a ResultSet from somewhere else in the template, this approach will not work, since that bypasses your manual wrapping, so extending DefaultObjectWrapper is generally what you want.
Note that the ResultSetTemplateModel implementation shown is quite limited. The ObjectWrapper should be passed to the constructor as well and stored in a final field. Then, instead of new SimpleScalar(rs.getString(s)), it should do objectWrapper.wrap(rs.getObject(s)).
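For the first approach, a minimal sketch of such a wrapper; the class name ResultSetAwareObjectWrapper is made up here, while handleUnknownType is the DefaultObjectWrapper hook mentioned above:

import java.sql.ResultSet;

import freemarker.template.DefaultObjectWrapper;
import freemarker.template.TemplateModel;
import freemarker.template.TemplateModelException;
import freemarker.template.Version;

public class ResultSetAwareObjectWrapper extends DefaultObjectWrapper {

    public ResultSetAwareObjectWrapper(Version incompatibleImprovements) {
        super(incompatibleImprovements);
    }

    @Override
    protected TemplateModel handleUnknownType(Object obj) throws TemplateModelException {
        if (obj instanceof ResultSet) {
            return new ResultSetTemplateModel((ResultSet) obj); // the model from the question
        }
        return super.handleUnknownType(obj);
    }
}

Registered on the configuration from the question with cfg.setObjectWrapper(new ResultSetAwareObjectWrapper(Configuration.VERSION_2_3_27)), the existing parameters.put("result", rs) is then wrapped automatically.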

Subscriber's onNext does not contain the complete item

We are working with Project Reactor and have a huge problem right now. This is how we produce (publish) our data:
public Flux<String> getAllFlux() {
    return Flux.<String>create(sink -> {
        new Thread() {
            public void run() {
                Iterator<Cache.Entry<String, MyObject>> iterator = getAllIterator();
                ObjectMapper mapper = new ObjectMapper();
                while (iterator.hasNext()) {
                    try {
                        sink.next(mapper.writeValueAsString(iterator.next().getValue()));
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
                sink.complete();
            }
        }.start();
    });
}
As you can see, we take data from an iterator and publish each item in that iterator as a JSON string. Our subscriber does the following:
flux.subscribe(new Subscriber<String>() {
    private Subscription s;
    int amount = 1; // the amount of received flux payload at a time
    int onNextAmount;
    String completeItem = "";
    ObjectMapper mapper = new ObjectMapper();

    @Override
    public void onSubscribe(Subscription s) {
        System.out.println("subscribe");
        this.s = s;
        this.s.request(amount);
    }

    @Override
    public void onNext(String item) {
        MyObject myObject = null;
        try {
            System.out.println(item);
            myObject = mapper.readValue(completeItem, MyObject.class);
            System.out.println(myObject.toString());
        } catch (IOException e) {
            System.out.println(item);
            System.out.println("failed: " + e.getLocalizedMessage());
        }
        onNextAmount++;
        if (onNextAmount % amount == 0) {
            this.s.request(amount);
        }
    }

    @Override
    public void onError(Throwable t) {
        System.out.println(t.getLocalizedMessage());
    }

    @Override
    public void onComplete() {
        System.out.println("completed");
    }
});
As you can see, we simply print the String item we receive and parse it into an object using the Jackson mapper. The problem we have now is that for most of our items everything works fine:
{"itemId": "someId", "itemDesc", "some description"}
But for some items the String is cut off, like this for example:
{"itemId": "some"
And the next item after that would be
"Id", "itemDesc", "some description"}
There is no pattern to those cuts. It is completely random and different every time we run the code. Of course Jackson throws an "Unexpected end of input" error with this behaviour.
So what is causing this behaviour and how can we solve it?
Solution:
Send the object inside the Flux instead of the String:
public Flux<ItemIgnite> getAllFlux() {
    return Flux.create(sink -> {
        new Thread() {
            public void run() {
                Iterator<Cache.Entry<String, ItemIgnite>> iterator = getAllIterator();
                while (iterator.hasNext()) {
                    sink.next(iterator.next().getValue());
                }
                sink.complete(); // signal completion once the iterator is exhausted
            }
        }.start();
    });
}
and use the following produces type:
@RequestMapping(value = "/allFlux", method = RequestMethod.GET, produces = "application/stream+json")
The key here is to use stream+json and not only json.
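Putting the two pieces together, the controller method would look roughly like this; itemService is an illustrative name for whatever bean exposes getAllFlux():

@RequestMapping(value = "/allFlux", method = RequestMethod.GET, produces = "application/stream+json")
public Flux<ItemIgnite> getAllFlux() {
    // Each ItemIgnite is serialized by Spring as one element of the JSON stream,
    // so element boundaries are preserved instead of being split mid-string.
    return itemService.getAllFlux();
}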

include() does not work with the local datastore?

ParseQuery<Custom> query = ParseQuery.getQuery(Custom.class);
query.whereEqualTo("user", ParseUser.getCurrentUser());
query.include("address");
query.fromLocalDatastore();
query.getFirstInBackground(new GetCallback<Custom>() {
    @Override
    public void done(Custom c, ParseException e) {
        c.getAddress().getName(); // this gives me a fetchIfNeeded exception
    }
});
If I do something like this:
ParseQuery<Address> query = ParseQuery.getQuery(Address.class);
query.fromLocalDatastore();
query.findInBackground(new FindCallback<Address>() {
    @Override
    public void done(List<Address> addresses, ParseException e) {
        for (Address address : addresses) {
            address.getName(); // it works just fine
        }
    }
});
It looks like include() does not work with the local datastore.
Is this by design, or am I missing something?
Edit:
I can survive by adding:
c.getAddress().fetchFromLocalDatastore();
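In context, that workaround looks roughly like this; the null check and exception handling are mine, not from the original callback:

query.getFirstInBackground(new GetCallback<Custom>() {
    @Override
    public void done(Custom c, ParseException e) {
        if (e != null || c == null) {
            return;
        }
        try {
            // Explicitly pull the pointed-to Address out of the local datastore,
            // since include("address") alone did not populate it here.
            c.getAddress().fetchFromLocalDatastore();
            String name = c.getAddress().getName();
        } catch (ParseException fetchError) {
            fetchError.printStackTrace();
        }
    }
});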

Incorrect number of arguments for PROCEDURE; expected 1, got 0. Can't determine the error from the code

// set input parameters
Map<String, Object> inParams = new HashMap<String, Object>();
inParams.put("Sig", resourceHistoryBean.getId());
List<ResourceHistoryBean> resourceHistoryList = new ArrayList<ResourceHistoryBean>();

// define stored procedure
try {
    SimpleJdbcCall readResult = new SimpleJdbcCall(getDataSource())
            .useInParameterNames("Sig")
            .declareParameters(new SqlParameter("Sig", Types.VARCHAR))
            .withProcedureName("SP_ResourceAllocationDtls")
            .withSchemaName("hrms")
            .returningResultSet("ResourceHistory", new ParameterizedRowMapper<ResourceHistoryBean>() {
                public ResourceHistoryBean mapRow(ResultSet rs, int rowNum) throws SQLException {
                    ResourceHistoryBean bean = new ResourceHistoryBean();
                    resourceHistoryBean.setProjectName(rs.getString(RH_PROJECT_NAME));
                    return bean;
                }
            });
    readResult.compile();

    // execute stored procedure
    Map<String, Object> out = readResult.execute(inParams);
    resourceHistoryList = (List<ResourceHistoryBean>) out.get("ResourceHistory");
It looks like I was able to find an alternative solution to the above problem (passing a parameter to the stored procedure and using a mapping class as well):
public List<ResourceHistoryBean> getResourceHistory(final ResourceHistoryBean resourceHistoryBean) throws Exception {
    try {
        // call the stored procedure and pass the parameter to it
        List resourceHistoryList = getJdbcTemplate().query(
                "call hrms.SP_ResourceAllocationDtls(?)",
                new Object[] { resourceHistoryBean.getId() },
                new HistoryMapper());
        return resourceHistoryList;
    } catch (Exception e) {
        throw e;
    } finally {
        closeTemplate();
    }
}

// mapper class
class HistoryMapper implements RowMapper, IDatabaseConstants {
    public Object mapRow(ResultSet rs, int rowNum) throws SQLException {
        ResourceHistoryBean resourceHistoryBean = new ResourceHistoryBean();
        resourceHistoryBean.setProjectName(rs.getString(RH_PROJECT_NAME));
        return resourceHistoryBean;
    }
}
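For completeness, "expected 1, got 0" with SimpleJdbcCall is often caused by Spring's procedure-metadata lookup not matching the declared parameter, in which case the declared parameters are silently ignored. A hedged sketch of the original approach with metadata access turned off, reusing the HistoryMapper above and not verified against this schema:

SimpleJdbcCall readResult = new SimpleJdbcCall(getDataSource())
        .withSchemaName("hrms")
        .withProcedureName("SP_ResourceAllocationDtls")
        .withoutProcedureColumnMetaDataAccess() // rely on the declared parameters below
        .declareParameters(new SqlParameter("Sig", Types.VARCHAR))
        .returningResultSet("ResourceHistory", new HistoryMapper());

Map<String, Object> out = readResult.execute(new MapSqlParameterSource("Sig", resourceHistoryBean.getId()));
List<ResourceHistoryBean> resourceHistoryList =
        (List<ResourceHistoryBean>) out.get("ResourceHistory");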
