Java stream: Generate objects directly from an input file and save the objects with Spring Data - spring

I want to generate a Foo object for each line of an input file and then save all the objects in the database (Spring Data JPA).
// read file into stream
try (Stream<String> stream = Files.lines(path, Charset.forName("ISO-8859-1"))) {
    Stream<Foo> rStream = stream.map(line -> new Foo(line));
    rStream.forEach(foo -> fooRepository.save(foo));
} catch (IOException e) {
    e.printStackTrace();
}
I receive an "Unhandled exception: java.lang.Exception" error, because the constructor of the Foo class can propagate an exception from the MyParser class:
public Foo(String row) throws Exception {
    String[] split = StringUtils.split(row, "Ú");
    field = MyParser.getInt(split[0]);
}
Is it possible to use the Java stream API anyway? Maybe with only one awesome stream? Something like this:
stream.map(Foo::new).forEach(foo -> fooRepository.save(foo));
I use Java 8 with Spring Boot 1.5.8.

If you can edit the class, why not simply throw a RuntimeException instead of a checked Exception? If the API you use still throws Exception, you can wrap it:
catch (Exception e) {
    throw new RuntimeException(e);
}
Then you can simplify it to:
stream.map(Foo::new).forEach(fooRepository::save);
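As a self-contained sketch of that wrapping (my own code, not from the answer: a plain List stands in for fooRepository, and Integer.parseInt stands in for MyParser.getInt):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

public class FooStreamSketch {
    static class Foo {
        final int field;

        // Wrap the parse failure in an unchecked exception so the constructor
        // can be used as a method reference inside a stream pipeline.
        Foo(String row) {
            try {
                this.field = Integer.parseInt(row.split("Ú")[0]);
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) {
        List<Foo> saved = new ArrayList<>();          // stands in for fooRepository
        Stream.of("1Úa", "2Úb").map(Foo::new).forEach(saved::add);
        System.out.println(saved.size());             // 2
    }
}
```

A failing line still aborts the whole stream with a RuntimeException, so this only moves the failure from compile time to run time; whether that is acceptable depends on how bad lines should be treated.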

Related

When using parallelStream, a child thread throws an exception

In a Spring Boot application, I'm writing files to S3 using list.stream().parallel().forEach. When trying to get the resource using resourceLoader.getResource(filePath), it throws the exception 'com.amazonaws.services.s3.AmazonS3 referenced from a method is not visible from class loader'. I've noticed that the main thread successfully writes the files, but the child thread throws an exception.
getAccountCommonList()
    .stream().parallel()
    // .stream()
    .forEach(fileNumber -> {
        String filePath =
            "s3://xxxxx.net/Debug_"
                + LocalDateTime.now().format(DateTimeFormatter.ofPattern("yyyyMMdd_HHmm"))
                + "_"
                + fileNumber;
        WritableResource resource = (WritableResource) resourceLoader.getResource(filePath);
        try (OutputStream outputStream = resource.getOutputStream()) {
            write(outputStream);
        } catch (IOException e) {
            System.console().printf(e.getMessage(), e);
            throw new RuntimeException(e);
        }
    });
If I don't use parallel(), the code executes successfully.
I'd like to know how to proceed if I need to use parallel().
Could it be because the child thread cannot access the applicationContext?
Is it possible that the applicationContext is bound to the main thread through ThreadLocal?
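One way to probe that hypothesis (my own sketch, not code from the question): a parallel stream runs most elements on common ForkJoinPool worker threads, and those workers do not necessarily carry the same context class loader as the main thread. Recording which threads execute the lambda makes this visible:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class ParallelLoaderCheck {
    // Collect the names of every thread that executes an element of the
    // parallel stream; with enough elements this usually includes
    // ForkJoinPool.commonPool-worker-* threads in addition to the caller.
    public static Set<String> threadNames(List<Integer> items) {
        Set<String> names = ConcurrentHashMap.newKeySet();
        items.parallelStream().forEach(i ->
                names.add(Thread.currentThread().getName()));
        return names;
    }

    public static void main(String[] args) {
        Set<String> names = threadNames(Arrays.asList(1, 2, 3, 4, 5, 6, 7, 8));
        System.out.println(names);
    }
}
```

If the worker threads turn out to be the difference, one option worth testing is resolving the `WritableResource` instances on the calling thread first and parallelizing only the writes; that keeps the class-loader-sensitive `getResource` call on the main thread.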

Method should never be called while writing java object as YAML file

I am trying to write a Java object as a YAML file and am getting the exception below. Any pointers to fix this?
Code
public void writeRequestRoot(RequestRoot requestRoot, String fileName) {
    ObjectMapper mapper = new ObjectMapper(new YAMLFactory());
    try {
        mapper.writeValue(new File(fileName), requestRoot);
    } catch (IOException e) {
        e.printStackTrace();
        throw new RuntimeException(e.getMessage());
    }
}
Logs
Method should never get called
java.lang.IllegalStateException: Method should never get called
at com.fasterxml.jackson.dataformat.yaml.YAMLFactory._createUTF8Generator(YAMLFactory.java:575)
at com.fasterxml.jackson.dataformat.yaml.YAMLFactory._createUTF8Generator(YAMLFactory.java:15)
at com.fasterxml.jackson.core.JsonFactory.createGenerator(JsonFactory.java:1228)
at com.fasterxml.jackson.databind.ObjectMapper.writeValue(ObjectMapper.java:3351)
at com.scb.nexus.beats.helper.YamlReaderWriter.writeRequestRoot(YamlReaderWriter.java:45)
Dependency
implementation "com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.1.2"
It was a dependency version issue; with the dependency below it works:
implementation "com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.12.3"

Java 8 how to read file twice and fix java.io.IOException: Stream closed

I am trying to read a text file twice in order to calculate an average (calcAverage) and then to filter on the average to get a results list (processFile). When the second step is run the exception,
java.io.UncheckedIOException: java.io.IOException: Stream closed
is thrown.
Below is a simplified version of my failing code and a unit-test to drive the code.
A parameter (source) of type Reader is passed into App from a unit test and is a FileReader to a text file. I don't know how to access the file handle (or filename) from the Reader object in order to re-open it - implementing that inside App would fix the problem. The method signature of runProcess(Reader source) cannot be changed - the other method signatures, however, can be.
I am using a try-with-resources block to open the Reader object and read it through - it's then closed automatically, which is all fine. I just need a way to re-open the file from the Reader to perform the filtering for pass 2.
I have read from similar questions, that the BufferedReader is like an iterator and you can only read it once.
I have tried using the mark() and reset() methods on my source object, but this throws an exception that these aren't supported.
I could read the whole file into a List and use that for both steps, but I don't know how large my file is going to be, so if possible I would like to find a solution using the approach below.
Does anyone know how I can implement this?
public class App {
    public static void runProcess(Reader source) {
        Collection<?> col = calcAverage(source);
        processFile(source).forEach(x -> System.out.println(x));
    }

    private static Collection<?> processFile(Reader source) {
        return processFile(source, (BufferedReader reader) -> reader.lines()
                .skip(1)
                .collect(Collectors.toList()));
    }

    private static Collection<?> calcAverage(Reader source) {
        return processFile(source, (BufferedReader reader) -> reader.lines()
                .skip(1)
                .collect(Collectors.toList()));
    }

    private static Collection<?> processFile(Reader source, BufferedReaderProcessor p) {
        try (BufferedReader reader = new BufferedReader(source)) {
            return p.process(reader);
        } catch (IOException e) {
            e.printStackTrace();
            return null;
        } catch (Exception ex) {
            ex.printStackTrace();
            return null;
        }
    }

    @FunctionalInterface
    public interface BufferedReaderProcessor {
        Collection<?> process(BufferedReader b) throws IOException;
    }
}
public class AppTest {
    @Test
    public void shouldReadFileTwice() throws FileNotFoundException {
        App.runProcess(openFile("src/main/java/functions/example4/resources/list-of-fruits"));
    }

    private static Reader openFile(String filename) throws FileNotFoundException {
        return new FileReader(new File(filename));
    }
}
I believe you shouldn't use try-with-resources here. It calls close() on its BufferedReader, which causes all the encapsulated readers to be closed.
I.e. it closes the BufferedReader, the Reader and the FileReader.
When you invoke calcAverage in App::runProcess, it closes all the readers. You are then trying to read a closed Reader when calling processFile on the next line.
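A minimal sketch of one way forward given that diagnosis (my own code, assuming it is acceptable to buffer the lines in memory, which the question hoped to avoid; the average/filter computations are placeholder stand-ins for calcAverage/processFile, and a StringReader stands in for the FileReader):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.List;
import java.util.stream.Collectors;

public class TwoPassSketch {
    // Since a Reader can only be consumed once, read it a single time and
    // run both passes over the in-memory list. Returns the average so the
    // result is observable; the real code would keep runProcess void.
    public static double runBothPasses(Reader source) throws IOException {
        List<String> lines;
        try (BufferedReader reader = new BufferedReader(source)) {
            lines = reader.lines().skip(1).collect(Collectors.toList());
        }
        // pass 1: average line length (stands in for calcAverage)
        double avg = lines.stream().mapToInt(String::length).average().orElse(0);
        // pass 2: filter on the average (stands in for processFile)
        lines.stream().filter(l -> l.length() > avg).forEach(System.out::println);
        return avg;
    }

    public static void main(String[] args) throws IOException {
        System.out.println(runBothPasses(new StringReader("header\nfig\nbanana\nkiwi")));
    }
}
```

The memory cost is the same order as BufferedReader.mark()/reset() with a file-sized read-ahead limit, which is the other way to re-read a one-shot Reader, so buffering the lines is usually the simpler trade-off.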

Handling spring reactor exceptions in imperative spring application

I'm using WebFlux in an imperative Spring Boot application. In this app I need to make REST calls to various backends using WebClient and wait for all the responses before proceeding to the next step.
ClassA
public class ClassA {
    public Mono<String> restCall1() {
        return webclient....exchange()...
            .retryWhen(Retry.backoff(maxAttempts, Duration.ofSeconds(minBackOff))
                .filter(this::isTransient)
                .onRetryExhaustedThrow((retryBackoffSpec, retrySignal) -> {
                    return new MyCustomException();
                }));
    }
}
ClassB
public class ClassB {
    public Mono<String> restCall2() {
        return webclient....exchange()...
            .retryWhen(Retry.backoff(maxAttempts, Duration.ofSeconds(minBackOff))
                .filter(this::isTransient)
                .onRetryExhaustedThrow((retryBackoffSpec, retrySignal) -> {
                    return new MyCustomException();
                }));
    }
}
Mono<String> a = classAObj.restCall1();
Mono<String> b = classBObj.restCall2();
ArrayList<Mono<String>> myMonos = new ArrayList<>();
myMonos.add(a);
myMonos.add(b);
try {
    List<String> results = Flux.mergeSequential(myMonos).collectList().block();
} catch (WebClientResponseException e) {
    ....
}
The above code works as expected. The WebClient is configured to throw an error on 5xx and 4xx responses, which I'm able to catch via WebClientResponseException.
The problem is that I'm unable to catch any exceptions raised inside the reactive framework itself. For example, my web clients are configured to retry with exponential backoff and to throw an exception on exhaustion, and I have no way to catch it in my try/catch block above. I explored handling that exception inside the WebClient stream using onErrorReturn, but it does not propagate the error back to my subscriber.
I also cannot add the exception to the catch block, since it is never thrown by any part of the code.
Can anyone advise on the best way to handle these types of error scenarios? I'm new to WebFlux and reactive programming.
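For context (not from the thread): block() rethrows the pipeline's terminal error, and checked exceptions reach the caller wrapped in a RuntimeException, with reactor.core.Exceptions.unwrap available to recover the original. The catch-and-inspect-the-cause pattern looks like the sketch below; since Reactor isn't on the classpath here, CompletableFuture.join(), which wraps failures in a similar way, stands in for block():

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.CompletionException;

public class UnwrapSketch {
    // Stand-in for the custom exception thrown on retry exhaustion.
    static class MyCustomException extends Exception {}

    // join() (like block()) surfaces the pipeline's terminal error wrapped in
    // a RuntimeException; catch broadly and inspect getCause() for the
    // custom type instead of catching the custom type directly.
    public static String classify(CompletableFuture<String> f) {
        try {
            return f.join();
        } catch (CompletionException e) {
            if (e.getCause() instanceof MyCustomException) {
                return "retries exhausted";
            }
            throw e;
        }
    }

    public static void main(String[] args) {
        CompletableFuture<String> failed =
                CompletableFuture.failedFuture(new MyCustomException());
        System.out.println(classify(failed)); // retries exhausted
    }
}
```

With Reactor the equivalent would be catching RuntimeException around block() and testing Exceptions.unwrap(e) against MyCustomException; the exact wrapper class depends on whether MyCustomException is checked or unchecked.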

Exception handling using hibernate template in Spring framework

I am using Spring with Hibernate in my project. There are many methods written in the DAO implementation file, and every method uses the same try/catch/finally lines of code, which seems redundant to me.
I have been told to optimize/refactor the code, since the file exceeds 10k LOC. I read somewhere that when using HibernateDaoSupport we need not worry about exceptions or closing the session; it is taken care of by Spring itself.
Could somebody please help me with how to proceed, or suggest a better way to handle exceptions? I am pasting below the code of one method in the DAO layer.
public class CSDDaoImpl extends HibernateDaoSupport implements CSDDao {

    public Deal getDealStructure(long dealId) throws CSDServiceException {
        Session session = null;
        try {
            session = getSession();
            Deal deal = (Deal) session.createCriteria(Deal.class)
                    .add(Restrictions.eq("dealId", dealId))
                    .uniqueResult();
            return deal;
        } catch (DataAccessResourceFailureException darfex) {
            String message = "Failed to retrieve the deal object.";
            CSDServiceException ex = new CSDServiceException(message, darfex);
            ex.setStackTrace(darfex.getStackTrace());
            ex.setErrorCode(Constants.DATA_ACCESS_FAILURE_EXP);
            ex.setMessageToUser(message);
            throw ex;
        } catch (IllegalStateException isex) {
            String message = "Failed to retrieve the deal object.";
            CSDServiceException ex = new CSDServiceException(message, isex);
            ex.setStackTrace(isex.getStackTrace());
            ex.setErrorCode(Constants.ILLEGAL_STATE_EP);
            ex.setMessageToUser(message);
            throw ex;
        } catch (HibernateException hbex) {
            String message = "Failed to retrieve the deal object.";
            CSDServiceException ex = new CSDServiceException(message, hbex);
            ex.setStackTrace(hbex.getStackTrace());
            ex.setErrorCode(Constants.HIBERNATE_EXP);
            ex.setMessageToUser(message);
            throw ex;
        } finally {
            if (session != null && session.isOpen()) {
                try {
                    session.close();
                } catch (HibernateException hbex) {
                    log.error("Failed to close the Hibernate Session.", hbex);
                    hbex.printStackTrace();
                    CSDServiceException ex = new CSDServiceException(
                            "Failed to close the Hibernate Session.", hbex);
                    ex.initCause(hbex.getCause());
                    ex.setStackTrace(hbex.getStackTrace());
                    throw ex;
                }
            }
        }
    }
}
I believe the best approach to handling exceptions is to write an exception interceptor that intercepts all your DAO calls; catch only the exceptions you actually need in your application and wrap them in your own application-specific exceptions.
You definitely do not need to work directly with session once an exception is thrown. That would defeat the purpose of using HibernateDaoSupport and Spring.
Have a look at this link: http://static.springsource.org/spring/docs/current/spring-framework-reference/html/classic-spring.html
Hope that helps.
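The interceptor idea can also be approximated in plain Java by extracting the repeated catch-translate-rethrow boilerplate into a single helper that takes the DAO body as a callback (my own sketch; CSDServiceException and the error code are simplified stand-ins for the ones in the post):

```java
import java.util.concurrent.Callable;

public class DaoExceptionTranslator {
    // Simplified stand-in for the post's CSDServiceException.
    static class CSDServiceException extends RuntimeException {
        final String errorCode;

        CSDServiceException(String message, Throwable cause, String errorCode) {
            super(message, cause);
            this.errorCode = errorCode;
        }
    }

    // One place that catches, translates and rethrows, so each DAO method
    // shrinks to a single callback instead of repeating the catch blocks.
    public static <T> T translate(String message, String errorCode, Callable<T> work) {
        try {
            return work.call();
        } catch (Exception e) {
            throw new CSDServiceException(message, e, errorCode);
        }
    }

    public static void main(String[] args) {
        // A DAO method body collapses to one line ("deal-42" is a dummy result):
        String deal = translate("Failed to retrieve the deal object.",
                "DATA_ACCESS", () -> "deal-42");
        System.out.println(deal); // deal-42
    }
}
```

With HibernateDaoSupport the callback would typically be the getHibernateTemplate() call, and session opening/closing drops out entirely, which is what removes most of the 10k lines.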