how to "group by" messages based on a header value - spring

I'm trying to create a zip file based on the file extension, which follows this standard: filename.{NUMBER}. What I'm doing is reading a folder, grouping by .{number}, and then creating a single .zip file with that .num at the end, for example:
folder /
file.01
file2.01
file.02
file2.02
folder -> /processed
file.01.zip which contains -> file.01, file2.01
file.02.zip which contains -> file.02, file2.02
What I've done is use an outboundGateway, split the files, enrich the headers by reading the file extension, and then aggregate on that header, but it doesn't seem to work properly:
public IntegrationFlow integrationFlow() {
    return flow
            .handle(Ftp.outboundGateway(FTPServers.PC_LOCAL.getFactory(), AbstractRemoteFileOutboundGateway.Command.MGET, "payload")
                    .fileExistsMode(FileExistsMode.REPLACE)
                    .filterFunction(ftpFile -> {
                        int extensionIndex = ftpFile.getName().indexOf(".");
                        return extensionIndex != -1 && ftpFile.getName().substring(extensionIndex).matches("\\.([0-9]*)");
                    })
                    .localDirectory(new File("/tmp")))
            .split() // receiving an iterator, creates a message for each file
            .enrichHeaders(headerEnricherSpec -> headerEnricherSpec.headerExpression("warehouseId", "payload.getName().substring(payload.getName().indexOf('.') + 1)"))
            .aggregate(aggregatorSpec -> aggregatorSpec.correlationExpression("headers['warehouseId']"))
            .transform(new ZipTransformer())
            .log(message -> {
                log.info(message.getHeaders().toString());
                return message;
            });
}
It's giving me a single message containing all the files; I'd expect 2 messages.
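For clarity, the grouping I'm after can be shown with plain java.util.stream, outside Spring entirely (a standalone sketch; the class and method names are made up for illustration):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupByExtensionDemo {

    // Extract the numeric suffix after the first dot, e.g. "file.01" -> "01".
    // This mirrors the headerExpression used for the "warehouseId" header.
    public static String warehouseId(String name) {
        return name.substring(name.indexOf('.') + 1);
    }

    public static void main(String[] args) {
        List<String> files = List.of("file.01", "file2.01", "file.02", "file2.02");
        // Group file names by their numeric extension:
        // {01=[file.01, file2.01], 02=[file.02, file2.02]}
        Map<String, List<String>> byId = files.stream()
                .collect(Collectors.groupingBy(GroupByExtensionDemo::warehouseId));
        System.out.println(byId);
    }
}
```

That map with two keys is exactly what I'd expect the aggregator to produce as two separate messages.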

Due to the nature of this DSL, I have a dynamic number of files, so I couldn't count messages (files) ending with the same number, and I don't think a timeout would be a good release strategy. I just wrote the code on my own, without writing to disk:
.<List<File>, List<Message<ByteArrayOutputStream>>>transform(files -> {
    HashMap<String, ZipOutputStream> zipOutputStreamHashMap = new HashMap<>();
    HashMap<String, ByteArrayOutputStream> zipByteArrayMap = new HashMap<>();
    ArrayList<Message<ByteArrayOutputStream>> messageList = new ArrayList<>();
    files.forEach(file -> {
        String warehouseId = file.getName().substring(file.getName().indexOf('.') + 1);
        // one ZipOutputStream (backed by its own ByteArrayOutputStream) per warehouseId
        ZipOutputStream warehouseStream = zipOutputStreamHashMap.computeIfAbsent(warehouseId,
                s -> new ZipOutputStream(zipByteArrayMap.computeIfAbsent(s, s1 -> new ByteArrayOutputStream())));
        try {
            warehouseStream.putNextEntry(new ZipEntry(file.getName()));
            FileInputStream inputStream = new FileInputStream(file);
            byte[] bytes = new byte[4096];
            int length;
            while ((length = inputStream.read(bytes)) >= 0) {
                warehouseStream.write(bytes, 0, length);
            }
            inputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
    // close every zip stream so the trailing zip directory is written out
    zipOutputStreamHashMap.forEach((s, zipOutputStream) -> {
        try {
            zipOutputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    });
    zipByteArrayMap.forEach((key, byteArrayOutputStream) ->
            messageList.add(MessageBuilder.withPayload(byteArrayOutputStream).setHeader("warehouseId", key).build()));
    return messageList;
})
.split()
.transform(ByteArrayOutputStream::toByteArray)
.handle(Ftp.outboundAdapter(FTPServers.PC_LOCAL.getFactory(), FileExistsMode.REPLACE)
......

Related

java stream is making weird things to generate csv file in Spring Boot

I'm processing a CSV file through my Spring Boot app; the file is meant to be downloaded. I use streams, but there's a problem and I can't see what's wrong in my code: some rows come out complete with all the columns, but the next row only writes some columns, and the leftover columns are written below as if they were a new row. I hope you understand what I mean. I hope you can give me a hand, thank you in advance.
This code below is the controller
.....
@RequestMapping(value = "/stream/csv/{grupo}/{iduser}", method = RequestMethod.GET)
public void generateCSVUsingStream(@PathVariable("grupo") String grupo,
        @PathVariable("iduser") String userId, HttpServletResponse response) {
    response.addHeader("Content-Type", "application/csv");
    response.addHeader("Content-Disposition", "attachment; filename=\"" + userId + "_Reporte_PayCash" + grupo.replaceAll("\\s", "") + ".csv\"");
    response.setCharacterEncoding("UTF-8");
    try (Stream<ReportePayCashDTO> streamPaycashdatos = capaDatosDao.ReportePayCashStream(userId, grupo);
         PrintWriter out = response.getWriter()) {
        out.write(String.join(",", "Cuenta", "Referencia", "Referencia_paycash", "Distrito", "Plaza", "Cartera"));
        out.write("\n");
        streamPaycashdatos.forEach(streamdato -> {
            out.write(streamdato.getAccount() + "," + streamdato.getReferencia() + "," + streamdato.getReferenciapaycash()
                    + "," + streamdato.getCartera() + "," + streamdato.getState() + "," + streamdato.getCity());
            out.append("\r\n");
        });
        out.flush(); // the try-with-resources closes both the stream and the writer
    } catch (IOException ix) {
        throw new RuntimeException("There is an error while downloading file", ix);
    }
}
The method on the DAO is this:
...
@Override
public Stream<ReportePayCashDTO> ReportePayCashStream(String userId, String grupo) {
    Stream<ReportePayCashDTO> stream = null;
    String query = "";
    // more code
    try {
        stream = getJdbcTemplate().queryForStream(query, (rs, rowNum) ->
                new ReportePayCashDTO(Utils.valnull(rs.getString("account")),
                        Utils.valnull(rs.getString("reference")),
                        Utils.valnull(rs.getString("referencepaycash")),
                        Utils.valnull(rs.getString("state")),
                        Utils.valnull(rs.getString("city")),
                        Utils.valnull(rs.getString("cartera"))));
    } catch (Exception e) {
        e.printStackTrace();
        logger.error(e.getMessage());
    }
    return stream;
}
Example: this is what I hoped would be written into the csv file:
55xxxxx02,88xxxx153,1170050202662,TAMAULIPAS,TAMPICO,AmericanExpre
58xxxxx25,88xxx899,1170050202662,TAMAULIPAS,TAMPICO,AmericanClasic
but some rows were written like this:
55xxxxx02,88xxxx153,1170050202662
,TAMAULIPAS,TAMPICO,AmericanExpre
58xxxxx25,88xxx899,1170050202662
,TAMAULIPAS,TAMPICO,AmericanClasic
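One possible cause worth checking: rows split like that when a column value coming back from the database contains an embedded \r or \n. A small sanitizing helper applied to each field before writing would guard against it (a standalone sketch with a made-up class name, not part of the original code):

```java
public class CsvSanitizer {

    // Replace embedded CR/LF with spaces and quote fields containing commas
    // or quotes, so one logical record always stays on one physical CSV line.
    public static String sanitize(String field) {
        if (field == null) {
            return "";
        }
        String cleaned = field.replace("\r", " ").replace("\n", " ");
        if (cleaned.contains(",") || cleaned.contains("\"")) {
            return "\"" + cleaned.replace("\"", "\"\"") + "\"";
        }
        return cleaned;
    }

    public static void main(String[] args) {
        System.out.println(sanitize("TAMPICO\r\nAmericanExpre")); // line break replaced
        System.out.println(sanitize("a,b"));                      // quoted because of the comma
    }
}
```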

How to retrieve filtered value from a stream and use it

I'm new to Java 8:
I have to convert this piece of Java 6 code to Java 8:
List<String> unvalidnumbers = new ArrayList<>();
StringBuilder content = new StringBuilder();
String str = "current_user";
for (Iterator<String> it = numbers.iterator(); it.hasNext();) {
    String number = it.next();
    try {
        PersIdentifier persIdentifier = this.getPersIdentifierByNumber(number);
        if (persIdentifier != null) {
            content.append(number).append(";").append(str);
            if (StringUtils.equals(persIdentifier.getType(), "R")) {
                content.append(";X");
            }
            if (it.hasNext()) {
                content.append("\r\n");
            }
        }
    } catch (BusException e) {
        LOGGER.warn("Pers not found", e);
        unvalidnumbers.add(number);
    }
}
So I wrote this:
numbers.stream().filter((String number) -> {
    try {
        return this.getPersIdentifierByNumber(number) != null;
    } catch (BusException e1) {
        LOGGER.warn("Pers not found", e1);
        return false;
    }
}).forEach(number -> content.append(number).append(";").append(str));
I know it's missing this part:
if (StringUtils.equals(persIdentifier.getType(), "R")) {
    content.append(";X");
}
if (it.hasNext()) {
    content.append("\r\n");
}
But I didn't find a way to retrieve the corresponding persIdentifier object.
Any ideas, please?
You should use a more functional approach if you use Java 8.
Instead of a forEach, favor collect(); here Collectors.joining() looks suitable.
Not tested, but it should give you the overall idea:
String result =
        numbers.stream()
               .map(number -> new SimpleEntry<String, PersIdentifier>(number, this.getPersIdentifierByNumber(number)))
               .filter(entry -> entry.getValue() != null)
               .map(entry -> {
                   final String base = entry.getKey() + ";" + str;
                   return "R".equals(entry.getValue().getType()) ? base + ";X" : base;
               })
               .collect(joining("\r\n")); // or, not OS dependent: System.lineSeparator()

java 8 Stream to map

I want to convert the following into a functional program. Please help to streamline the code below.
Map<String, TreeSet<Double>> cusipMap = new HashMap<>();
String[] key = new String[1];
try {
    Files.lines(Paths.get("C:\\CUSIP.txt"))
         .forEach(l -> {
             if (isCUSIP(l)) {
                 if (cusipMap.get(l) == null) {
                     cusipMap.put(l, new TreeSet<Double>());
                 }
                 key[0] = l;
             } else {
                 cusipMap.get(key[0]).add(Double.valueOf(l));
             }
         });
} catch (IOException e) {
    e.printStackTrace();
}
Try this one:
try {
    Map<String, TreeSet<Double>> result = Files.lines(Paths.get("C:\\CUSIP.txt"))
            .collect(Collectors.groupingBy(Function.identity(), Collector.of(
                    TreeSet::new,
                    (TreeSet<Double> tree, String s) -> tree.add(Double.valueOf(s)),
                    (TreeSet<Double> tree, TreeSet<Double> other) -> { tree.addAll(other); return tree; }
            )));
} catch (IOException e) {
    e.printStackTrace();
}
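A self-contained way to see what this collector does, using an in-memory list instead of the file (note: the grouping key is the whole line, so this sketch assumes numeric lines only; the class name is made up):

```java
import java.util.List;
import java.util.Map;
import java.util.TreeSet;
import java.util.function.Function;
import java.util.stream.Collector;
import java.util.stream.Collectors;

public class CusipCollectorDemo {

    // Same collector as above: identical lines collapse into one key whose
    // TreeSet holds the line parsed as a double.
    public static Map<String, TreeSet<Double>> collect(List<String> lines) {
        return lines.stream()
                .collect(Collectors.groupingBy(Function.identity(), Collector.of(
                        TreeSet::new,
                        (TreeSet<Double> tree, String s) -> tree.add(Double.valueOf(s)),
                        (tree, other) -> { tree.addAll(other); return tree; })));
    }

    public static void main(String[] args) {
        System.out.println(collect(List.of("1.5", "2.5", "1.5")));
    }
}
```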

String stream joining: stream has already been operated upon or closed

Using Java 8 to concatenate certain object field values with a "_". The last line in the code throws "stream has already been operated upon or closed".
Stream<Field> fields = ...
Stream<String> exclusions = ...
Stream<String> stringStream = fields.filter(f -> exclusions.anyMatch(e -> e.equals(f.getName())))
        .map(f -> {
            f.setAccessible(true);
            Object value = null;
            try {
                value = f.get(obj);
            } catch (IllegalAccessException e) {
                e.printStackTrace();
            }
            return value;
        })
        .filter(v -> v != null)
        .map(Object::toString);
String suffix = stringStream.collect(Collectors.joining("_"));
EDIT: I have tried this with:
List<Foo> list = new ArrayList<>();
list.stream().filter(item -> item != null).map(item -> {
    String value = null;
    return value;
}).filter(item -> item != null).map(item -> {
    String value = null;
    return value;
}).collect(Collectors.joining(""));
And there is no such exception.
How many times is the first filter called? More than once, right? The exclusions stream that you use in the first call to filter is consumed via anyMatch; thus the second time you try to use it, you get the exception.
The way to solve it would be to stream the source on every single filter operation:
filter(f -> sourceOfExclusions.stream().anyMatch...
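Alternatively, materialize the exclusions once into a Set, which (unlike a Stream) can be queried any number of times. A standalone sketch with plain strings standing in for Field objects (names made up):

```java
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ExclusionsDemo {

    // Consume the exclusions stream exactly once, into a Set; the filter
    // predicate then queries the Set on every element without re-reading
    // the (single-use) stream.
    public static String join(Stream<String> names, Stream<String> exclusions) {
        Set<String> excluded = exclusions.collect(Collectors.toSet());
        return names.filter(excluded::contains)
                    .collect(Collectors.joining("_"));
    }

    public static void main(String[] args) {
        System.out.println(join(Stream.of("a", "b", "c"), Stream.of("a", "c")));
    }
}
```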

How to make a save action that checks whether a 'save-as' has already been performed

I have researched and tried to refer back to my fileChooser.getSelectedFile() in my save-as action, but cannot work out how to check whether or not a file has been created. Here is my attempt so far:
Save-as code (works well):
public void Save_As() {
    fileChooserTest.setApproveButtonText("Save");
    int actionDialog = fileChooserTest.showOpenDialog(this);
    if (fileChooserTest.getSelectedFile() == null) {
        return; // dialog cancelled, nothing selected
    }
    File fileName = new File(fileChooserTest.getSelectedFile() + ".txt");
    try {
        BufferedWriter outFile = new BufferedWriter(new FileWriter(fileName));
        outFile.write(this.jTextArea2.getText()); // put in text file
        outFile.close(); // close() also flushes
    } catch (IOException ex) {
    }
}
"Save" code doesn't work:
private void SaveActionPerformed(java.awt.event.ActionEvent evt) {
    File f = fileChooserTest.getSelectedFile();
    try {
        if (f.exists()) {
            BufferedWriter bw1 = new BufferedWriter(new FileWriter(fileChooserTest.getSelectedFile() + ".txt"));
            String text = ((JTextArea) jTabbedPane1.getSelectedComponent()).getText();
            bw1.write(text);
            bw1.close();
        } else {
            Save_As();
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    }
}
Instead of storing an instance of the JFileChooser, store an instance of the File (which will be null before any save has been performed). In your SaveActionPerformed method, check whether the file is null. If it is null, do a Save_As and store the selected file in your file variable; if it is not null, do a normal save into the file.
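A minimal non-GUI sketch of that idea (the JFileChooser interaction is replaced by a plain File parameter; class and method names are made up):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class SaveTracker {

    // Null until the first "save as"; afterwards a plain "save" reuses it.
    private File currentFile;

    // Stand-in for Save_As(): in the real app the File would come from the chooser.
    public void saveAs(File chosen, String text) throws IOException {
        currentFile = chosen;
        Files.writeString(chosen.toPath(), text);
    }

    // Plain "save": only valid once a file has been chosen.
    public void save(String text) throws IOException {
        if (currentFile == null) {
            throw new IllegalStateException("no file yet - call saveAs() first");
        }
        Files.writeString(currentFile.toPath(), text);
    }

    public File getCurrentFile() {
        return currentFile;
    }
}
```

In the real action listener you would call Save_As() in the null branch instead of throwing, but the null-check-on-a-File-field pattern is the same.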
