I am accessing a JSON array via streams and want to handle the case where that JSON array is not present in the schema - Java 8

try {
    JsonObject response = new JsonObject(IOUtils.resourceToString("/ResponseSample.json",
            Charset.defaultCharset()));
    JsonObject nameObj = response.getJsonArray("applications", new JsonArray())
            .stream()
            .map(JsonObject.class::cast)
            .filter(x -> x.getString("id").equalsIgnoreCase("2022025GSxxxxxx"))
            .findFirst()
            .get() // throws NoSuchElementException when no application matches
            .getJsonArray("applicants", new JsonArray())
            .getJsonObject(0) // throws IndexOutOfBoundsException when the array is empty
            .getJsonArray("names", new JsonArray())
            .getJsonObject(0)
            .getJsonObject("name", new JsonObject());
    System.out.println(nameObj.getString("first", "") + "--" + nameObj.getString("last", ""));
} catch (IOException ex) {
    ex.printStackTrace();
}
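The defaults passed to getJsonArray() cover a missing array, but get() on the Optional and getJsonObject(0) on an empty array still throw when nothing matches. One way to handle those cases too (a minimal sketch, assuming the same Vert.x-style JsonObject/JsonArray API) is to keep the rest of the navigation inside the Optional returned by findFirst():

// Falls through to an empty JsonObject whenever an array is missing or empty,
// or when no application matches the id.
JsonObject nameObj = response.getJsonArray("applications", new JsonArray())
        .stream()
        .map(JsonObject.class::cast)
        .filter(x -> "2022025GSxxxxxx".equalsIgnoreCase(x.getString("id", "")))
        .findFirst()
        .map(app -> app.getJsonArray("applicants", new JsonArray()))
        .filter(applicants -> applicants.size() > 0)
        .map(applicants -> applicants.getJsonObject(0).getJsonArray("names", new JsonArray()))
        .filter(names -> names.size() > 0)
        .map(names -> names.getJsonObject(0).getJsonObject("name", new JsonObject()))
        .orElseGet(JsonObject::new);
System.out.println(nameObj.getString("first", "") + "--" + nameObj.getString("last", ""));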

Related

RestTemplate exchange does not accept ParameterizedTypeReference

Hi everyone! I'm trying to get a List<byte[]> from my RestTemplate response, but my exchange call isn't accepting new ParameterizedTypeReference<List<byte[]>>() {}. Could someone help me?
ResponseEntity<List<byte[]>> response = null;
try {
    response = restTemplate.exchange(parametros.get("SERVICE_HUB2_BASE_URL") + "/fw/v1/pdf/kms/assinaturas",
            HttpMethod.POST, entity, new ParameterizedTypeReference<List<byte[]>>() {});
} catch (HttpServerErrorException | HttpClientErrorException e) {
    e.printStackTrace();
    throw new ClientException(e.getStatusCode().value(), e.getStatusText());
} catch (Exception e) {
    e.printStackTrace();
}
Hi. As the mouse hover shows, the call is resolving to a different method. Please check the import statement for RestTemplate (it should be org.springframework.web.client.RestTemplate). Both of these forms should be fine:
ResponseEntity<Collection<byte[]>> responseEntityOne = restTemplate.exchange(formattedUrl, HttpMethod.POST, entity,
new ParameterizedTypeReference<Collection<byte[]>>(){});
ResponseEntity<List<byte[]>> responseEntityOne1 = restTemplate.exchange(formattedUrl, HttpMethod.POST, entity,
new ParameterizedTypeReference<List<byte[]>>(){});
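For completeness, a minimal self-contained sketch showing the imports this overload resolves against (the class and method names here are illustrative, not from the question):

import java.util.List;

import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.web.client.RestTemplate;

// exchange(...) with a ParameterizedTypeReference keeps the generic
// List<byte[]> type information for the message converters.
class PdfClient {
    private final RestTemplate restTemplate = new RestTemplate();

    List<byte[]> fetchSignatures(String url, HttpEntity<?> requestEntity) {
        ResponseEntity<List<byte[]>> response = restTemplate.exchange(
                url, HttpMethod.POST, requestEntity,
                new ParameterizedTypeReference<List<byte[]>>() {});
        return response.getBody();
    }
}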

Java stream is doing weird things when generating a CSV file in Spring Boot

I'm generating a CSV file in my Spring Boot app for the user to download. I use streams, but something is wrong that I can't pin down: some rows are written with all their columns complete, but the next row only gets some of its columns, and the leftover columns are written below as if they were a new row. I hope you understand what I mean and can give me a hand; thank you in advance.
The code below is the controller:
.....
@RequestMapping(value = "/stream/csv/{grupo}/{iduser}", method = RequestMethod.GET)
public void generateCSVUsingStream(@PathVariable("grupo") String grupo,
        @PathVariable("iduser") String userId, HttpServletResponse response) {
    response.addHeader("Content-Type", "application/csv");
    response.addHeader("Content-Disposition", "attachment; filename=\"" + userId + "_Reporte_PayCash"
            + grupo.replaceAll("\\s", "") + ".csv\"");
    response.setCharacterEncoding("UTF-8");
    try (Stream<ReportePayCashDTO> streamPaycashdatos = capaDatosDao.ReportePayCashStream(userId, grupo);
            PrintWriter out = response.getWriter()) {
        out.write(String.join(",", "Cuenta", "Referencia", "Referencia_paycash", "Distrito", "Plaza", "Cartera"));
        out.write("\n");
        streamPaycashdatos.forEach(streamdato -> {
            out.write(streamdato.getAccount() + "," + streamdato.getReferencia() + "," + streamdato.getReferenciapaycash()
                    + "," + streamdato.getCartera() + "," + streamdato.getState() + "," + streamdato.getCity());
            out.append("\r\n");
        });
        out.flush(); // the try-with-resources closes both the writer and the stream
    } catch (IOException ix) {
        throw new RuntimeException("There is an error while downloading file", ix);
    }
}
The DAO method is this:
...
@Override
public Stream<ReportePayCashDTO> ReportePayCashStream(String userId, String grupo) {
// TODO Auto-generated method stub
Stream<ReportePayCashDTO > stream = null ;
String query ="";
//more code
try {
stream = getJdbcTemplate().queryForStream(query, (rs, rowNum) -> {
return new ReportePayCashDTO(Utils.valnull(rs.getString("account")),
Utils.valnull(rs.getString("reference")),
Utils.valnull(rs.getString("referencepaycash")),
Utils.valnull(rs.getString("state")),
Utils.valnull(rs.getString("city")),
Utils.valnull(rs.getString("cartera"))
);
});
}catch(Exception e) {
e.printStackTrace();
logger.error(e.getMessage());
}
return stream;
}
Example: this is what I expected to be written into the CSV file:
55xxxxx02,88xxxx153,1170050202662,TAMAULIPAS,TAMPICO,AmericanExpre
58xxxxx25,88xxx899,1170050202662,TAMAULIPAS,TAMPICO,AmericanClasic
but some rows were written like this:
55xxxxx02,88xxxx153,1170050202662
,TAMAULIPAS,TAMPICO,AmericanExpre
58xxxxx25,88xxx899,1170050202662
,TAMAULIPAS,TAMPICO,AmericanClasic
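No answer is attached to this question here, but rows that split after the same few columns like this usually mean one of the column values itself contains a carriage return or newline coming back from the database. A defensive sketch (the csvSafe helper is hypothetical; the DTO getters are from the controller above):

// Hypothetical helper: strip embedded CR/LF so a field value cannot split the row.
private static String csvSafe(String value) {
    return value == null ? "" : value.replaceAll("[\\r\\n]+", " ").trim();
}

// Inside the forEach of the controller:
streamPaycashdatos.forEach(streamdato -> {
    out.write(String.join(",",
            csvSafe(streamdato.getAccount()),
            csvSafe(streamdato.getReferencia()),
            csvSafe(streamdato.getReferenciapaycash()),
            csvSafe(streamdato.getCartera()),
            csvSafe(streamdato.getState()),
            csvSafe(streamdato.getCity())));
    out.append("\r\n");
});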

Does closing an Oracle physical connection impact performance?

I am developing an application for my final year project and need a quick clarification on Java JDBC. I am using an Oracle physical connection (no connection pooling). When a user searches with the search criteria, the query executes in 10 ms, but returning the result set to the UI takes up to 1 minute. The code is attached.
Connection con=null;
OracleConnection ocon = null;
ResultSet rs = null;
Gson gson = new Gson();
String gsonOrdrHeaderEntity="";
JSONObject searchObject = new JSONObject();
JSONObject returnSearchObject = new JSONObject();
CallableStatement callableStatement = null;
ArrayList<SearchEntity> searchArrayList = new ArrayList<>();
try{
con= DatasourceConfiguration.openConnection();
ocon = (OracleConnection)con.unwrap(OracleConnection.class);
callableStatement= con.prepareCall("call SEARCHPKG.QUERY(?,?,?,?,?)");
callableStatement.setString(1,entity.getSearch1());
callableStatement.setString(2,entity.getSearch2());
callableStatement.setString(3,entity.getSearch3());
callableStatement.setString(4,entity.getSearch4());
callableStatement.registerOutParameter(5, OracleTypes.CURSOR);
callableStatement.execute();
rs = ((OracleCallableStatement) callableStatement).getCursor(5); // index of the registered OUT parameter
if(rs != null)
while(rs.next()) {
OrderHeaderEntity searchEntity = new OrderHeaderEntity();
searchEntity.setSearch1(rs.getString(1));
searchEntity.setSearch2(rs.getString(3));
searchEntity.setSearch3(rs.getString(4));
searchEntity.setSearch4(rs.getString(5));
/**
* Description : since there will be multiple orders in a single search, we add each row to a list and iterate over it later
*/
searchArrayList.add(searchEntity);
}
}catch(SQLException e){
logger.info("SQLException while performing search: "+e.getMessage());
throw e;
}catch(Exception e){
logger.info("Exception while performing search: "+e.getMessage());
}finally {
try {
if(rs != null) {rs.close();}
} catch (Exception e2) {
logger.info("Error while closing the search resultSet : "+e2.getMessage());
}
try {
if(callableStatement!=null) {callableStatement.close();}
} catch (Exception e2) {
logger.info("Error while closing the search callableStatement :"+e2.getMessage());
}
try {
if(con!=null) {con.close();}
} catch (Exception e2) {
logger.info("Error while the search connection :"+e2.getMessage());
}
}
}
/**
* @Description : mapping the result set to a Java object
*/
logger.info("Inside ProcedureCall >> search Proc>> converted to JSON successfully");
gsonOrdrHeaderEntity= gson.toJson(searchArrayList);
searchObject.put("order_header", gsonOrdrHeaderEntity);
returnSearchObject.put("status", "success");
returnSearchObject.put("message","Order Header Retrieved Successfully");
returnSearchObject.put("data", searchObject);
return returnSearchObject.toString();
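No accepted answer is included above, but since the query itself finishes in 10 ms, one common culprit worth ruling out (an assumption, not a confirmed diagnosis) is the JDBC fetch size: the Oracle driver defaults to 10 rows per network round trip, so a large result set pays many round trips. Raising it is a small change:

// Fetch more rows per network round trip (the Oracle JDBC default is 10).
// 500 is illustrative; tune it against your row size and heap budget.
callableStatement.setFetchSize(500); // set before execute()
callableStatement.execute();
rs = ((OracleCallableStatement) callableStatement).getCursor(5);
rs.setFetchSize(500); // the returned REF CURSOR has its own fetch size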

"The request was rejected because no multipart boundary was found" when creating a new document in DocuShare Flex

I am trying to create a new document in DocuShare Flex using the new DocuShare REST API. My request body is supposed to be XML, which I generate from the requested data, but when I send the request I get this error: "org.apache.commons.fileupload.FileUploadException: the request was rejected because no multipart boundary was found".
HttpPost request = new HttpPost(postUrl);
String filePath = "C:/Test/CreateDocument.xml";
String createObj = helper.createDocumentXml(filePath, parentId, documentTitle, fileName, ownerId);
String createDocumentXml= null;
{
try {
createDocumentXml = FileUtils.readFileToString(new File(filePath));
} catch (IOException e) {
e.printStackTrace();
}
}
StringEntity bodyEntity = new StringEntity(createDocumentXml, ContentType.MULTIPART_FORM_DATA);
request.setEntity(bodyEntity);
CloseableHttpResponse response = client.execute(request);
System.out.println("Status is " + response.getStatusLine());
HttpEntity entity = response.getEntity();
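The error in the snippet above comes from the entity itself: the Content-Type header promises multipart/form-data, but a plain StringEntity carries neither a boundary parameter nor any parts, so the server-side multipart parser finds no boundary to split on.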
I have used this block of code to upload a document in DocuShare Flex
HttpPost request = new HttpPost(postUrl);
String filePath = "C:/Test/CreateDocument.xml";
String createObj = helper.createDocumentXml();
String createDocumentXml = null;
try {
    createDocumentXml = FileUtils.readFileToString(new File(filePath));
} catch (IOException e) {
    e.printStackTrace();
}
FileBody body = new FileBody(file.toFile());
StringBody xmlContent = new StringBody(createDocumentXml, ContentType.APPLICATION_XML);
String boundary = UUID.nameUUIDFromBytes(file.toString().getBytes()).toString();
HttpEntity entity = MultipartEntityBuilder.create()
        .setBoundary(boundary)
        .setCharset(Charset.forName("UTF-8"))
        .setMode(HttpMultipartMode.BROWSER_COMPATIBLE)
        .addPart("content", body)
        .addPart("request", xmlContent)
        .build();
request.setEntity(entity);
CloseableHttpResponse response = client.execute(request);
HttpEntity responseEntity = response.getEntity();

Saving a value in a field using Java 8

I have a list of callables
KNNQuery<O> knnq; //field
List<Callable<KNNQuery>> callables = Arrays.asList(
task1(database, relation),
task2(database, relation));
executor.invokeAll(callables) //List<Future<KNNQuery>>
.stream() //stream<Future<KNNQuery>>
.map(future -> {
try{
return future.get();
}catch (Exception e){
throw new IllegalStateException(e);
}
}) //stream<KNNQuery>
.forEach(System.out::println);
Instead of printing the output of the two Futures to the screen, I want to combine both Futures' output and save it to the knnq field. How can I do that?
You have to use a reduction operation to combine the stream elements.
A stream collect, that is, the transformation of the stream elements into a specific accumulated result (a List, a Map, a KNNQuery, and so on), is a reduction operation.
Supposing that KNNQuery has a constructor that accepts a String, KNNQuery(String value), and that you want to combine the toString() values of the KNNQuery instances, you could do the following, which uses the Collectors.joining() reduction:
KNNQuery<O> knnq =
new KNNQuery<>(
executor.invokeAll(callables) //List<Future<KNNQuery>>
.stream() //stream<Future<KNNQuery>>
.map(future -> {
try{
return future.get();
}catch (Exception e){
throw new IllegalStateException(e);
}
})
.map(Object::toString)
.collect(Collectors.joining(","))
);
With Stream.reduce() you have more freedom to combine the KNNQuery instances themselves.
For example, you could delegate the merging to a KNNQuery method: KNNQuery combine(KNNQuery other).
KNNQuery<O> knnq =
executor.invokeAll(callables) //List<Future<KNNQuery>>
.stream() //stream<Future<KNNQuery>>
.map(future -> {
try{
return future.get();
}catch (Exception e){
throw new IllegalStateException(e);
}
})
.reduce(KNNQuery::combine) // add this method to KNNQuery to combine two instances
.orElse(null);
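Since KNNQuery's API is not shown in the question, here is a self-contained toy version of the same reduce pattern; the Result class is a stand-in for KNNQuery:

import java.util.Arrays;
import java.util.Optional;

class Result {
    final String value;
    Result(String value) { this.value = value; }
    // stand-in for KNNQuery combine(KNNQuery other)
    Result combine(Result other) { return new Result(this.value + "," + other.value); }
}

class ReduceDemo {
    public static void main(String[] args) {
        Optional<Result> merged = Arrays.asList(new Result("a"), new Result("b"), new Result("c"))
                .stream()
                .reduce(Result::combine);
        System.out.println(merged.map(r -> r.value).orElse("empty")); // prints a,b,c
    }
}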
