Spring AOP Usage

I am thinking of replacing all my direct Slf4j logging calls with Spring AOP, but I have some questions about how to use Spring AOP in the following scenarios.
I can see that it works well for printing general method trace logs and method arguments. But consider a situation like this:
public void methodTest(User object) {
    Integer number = object.getValue();
    List<Param> params = object.getParams();
    if (number == null || number < 1000 || number > 10000) {
        LOGGER.error("Invalid Number - " + number);
        // May not throw the exception below. Just want to log and continue.
        // throw new Exception(ErrorCode.INVALID_USER_NUMBER);
    } else if (number == 10000) {
        if (params != null && params.size() > 0) {
            LOGGER.error("No params supported for number " + number);
            throw new Exception(ErrorCode.INVALID_SCALE);
        }
        return;
    }
    if (number >= 1000 && number < 10000 && (params == null || params.size() == 0)) {
        LOGGER.error("Not all params available");
        throw new Exception(ErrorCode.INVALID_PARAM);
    }
}
How can we adapt Spring AOP in this case? I have different log statements and values to print depending on the logic.
Update: Does writing the log statements in the aspect based on the ErrorCode we receive look like a good approach? If the same error code occurs in several scenarios, the log statements could still differ, right? Is it possible to handle that as well?
Lastly, for debug statements (rather than error scenarios) that depend on different conditions, do we get that level of control in an aspect?

Create an aspect class where you define all your pointcuts and advice:
@Aspect
@Component
public class Foo {

    @Pointcut("execution(* com.example.ClassName.methodName(..))")
    private void exampleMethod() {}

    @Before("exampleMethod()")
    public void exampleAdvice() {
        // your logging implementation
    }
}
You can find more examples here:
https://javamondays.com/spring-aop-beginners-guide/
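For the follow-up questions: advice only sees the join point (method signature, arguments, return value, or thrown exception), so logging that depends on intermediate local state either has to stay inside the method or be exposed through the exception or return value. One option for the error cases (a sketch of my own, not from the original answer; the BusinessException type and its getErrorCode() accessor are assumptions) is to throw an exception carrying the ErrorCode and let an @AfterThrowing advice choose the log message:

import java.util.Arrays;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterThrowing;
import org.aspectj.lang.annotation.Aspect;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class ErrorLoggingAspect {

    private static final Logger LOGGER = LoggerFactory.getLogger(ErrorLoggingAspect.class);

    // Runs only when a matched method throws. BusinessException is a hypothetical
    // exception type assumed to expose the ErrorCode values used in the question.
    @AfterThrowing(pointcut = "execution(* com.example..*Service.*(..))", throwing = "ex")
    public void logBusinessError(JoinPoint joinPoint, BusinessException ex) {
        switch (ex.getErrorCode()) {
            case INVALID_SCALE:
                LOGGER.error("No params supported - {} called with {}",
                        joinPoint.getSignature().toShortString(), Arrays.toString(joinPoint.getArgs()));
                break;
            case INVALID_PARAM:
                LOGGER.error("Not all params available in {}", joinPoint.getSignature().toShortString());
                break;
            default:
                LOGGER.error("Business error {} in {}", ex.getErrorCode(),
                        joinPoint.getSignature().toShortString());
        }
    }
}

Conditional debug logging that depends on values computed inside the method does not transfer to an aspect as cleanly; for that, plain logger calls inside the method remain the simpler tool.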

Related

Concurrent transaction issue in keycloak user attribute (java spring boot)

I manage our customers' points as a Keycloak user attribute.
I set 'point' as a user attribute and handle it with the Keycloak API in a Java Spring Boot application.
The flow for updating a point is:
point = getPointByUserEmail(userEmail); // get point to update
point -= 10; // subtract points
updatePointByUserEmail(userEmail, point); // update point
public Long getPointByUserEmail(String userEmail) {
    UserRepresentation userRepresentation = usersResource.search(userEmail, true).get(0);
    Map<String, List<String>> attributes = userRepresentation.getAttributes();
    if (attributes == null || attributes.get("point") == null)
        return null;
    return Long.parseLong(attributes.get("point").get(0));
}

public void updatePointByUserEmail(String userEmail, Long point) {
    UserRepresentation userRepresentation = usersResource.search(userEmail, true).get(0);
    UserResource userResource = usersResource.get(userRepresentation.getId());
    Map<String, List<String>> attributes = userRepresentation.getAttributes();
    attributes.put("point", Arrays.asList(point.toString()));
    userRepresentation.setAttributes(attributes);
    userResource.update(userRepresentation);
}
This works well. But when a user sends simultaneous requests to update the point at almost the same time, it does not.
For example, with two requests at once (initial points = 100, 10 points deducted per request), I expected the result to be 80, because 100 - (10 * 2) = 80, but it was 90.
So I think I need some kind of transaction isolation around the point update. In JPA there is the @Lock annotation, but how can I do this with Keycloak? Is there any way to set an isolation level (or otherwise lock) through the Keycloak API so that my function works correctly?
This is the code where I handle the point:
public class someController {

    public ResponseEntity<String> methodToHandleRequest(@RequestBody Dto param, HttpServletRequest request) {
        ...
        Long point = null;
        try {
            point = userAttributesService.getPoint();
            if (point == null)
                throw new NullPointerException();
        } catch (Exception e) {
            e.printStackTrace();
            return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).body("error");
        }
        if (point < 10)
            return ResponseEntity.status(HttpStatus.PAYMENT_REQUIRED)
                    .body("you need at least 10 points (current: " + point + ")");
        userAttributesService.updatePoint(point - 10);
        ...
    }
I also tried managing the point with JPA, handling the user attribute directly in the database. I connected JPA to the Keycloak database and found the table that stores user attributes, and the point value is there. But when I update the point in the database, the change is not reflected in Keycloak... :'(
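There is no accepted answer recorded here, but one pattern to consider (my own sketch, only valid while a single application instance performs the updates) is to serialize the read-modify-write per user so that two concurrent requests for the same email cannot interleave. The UserAttributesService type below is an assumption; the two methods on it are the ones shown in the question.

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.locks.ReentrantLock;

public class PointUpdater {

    // One lock per user email; computeIfAbsent creates it on first use.
    private final ConcurrentMap<String, ReentrantLock> locks = new ConcurrentHashMap<>();

    private final UserAttributesService userAttributesService;

    public PointUpdater(UserAttributesService userAttributesService) {
        this.userAttributesService = userAttributesService;
    }

    public Long deductPoints(String userEmail, long amount) {
        ReentrantLock lock = locks.computeIfAbsent(userEmail, key -> new ReentrantLock());
        lock.lock();
        try {
            Long point = userAttributesService.getPointByUserEmail(userEmail); // read
            if (point == null || point < amount) {
                return point; // caller decides how to report "not enough points"
            }
            long updated = point - amount;                                     // modify
            userAttributesService.updatePointByUserEmail(userEmail, updated);  // write
            return updated;
        } finally {
            lock.unlock();
        }
    }
}

With multiple application instances an in-memory map no longer protects anything; a shared lock (for example a database row lock or a distributed lock) would be needed instead.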

How to use http patch request in spring boot?

I am trying to use a PATCH request.
Below is the code I am using; it is a ladder of if statements.
@PatchMapping("/updateInvoiceByEmail/{email}")
public Mono<ResponseEntity<Invoice>> updateInvoiceByEmail(
        @PathVariable String email,
        @RequestBody Invoice invoice) {
    return invoiceRepository
            .findByEmail(email)
            .flatMap(existing -> {
                if (invoice.getInvoiceStatus() != null) {
                    existing.setInvoiceStatus(invoice.getInvoiceStatus());
                }
                if (invoice.getCanRaise() != null) {
                    existing.setCanRaise(invoice.getCanRaise());
                }
                if (invoice.getAttachmentId() != null) {
                    existing.setAttachmentId(invoice.getAttachmentId());
                }
                if (invoice.getInvoiceId() != null) {
                    existing.setInvoiceId(invoice.getInvoiceId());
                }
                // ... and so on.
                return invoiceRepository.save(existing);
            })
            .map(updatedInvoice -> new ResponseEntity<>(updatedInvoice, HttpStatus.OK))
            .defaultIfEmpty(new ResponseEntity<>(HttpStatus.NOT_FOUND));
}
I am using Spring WebFlux and MongoDB.
How can I make this shorter and cleaner?
Thanks
You can use reflection to reduce the number of lines, although reflection should always be the last resort. What I would do is move this if-get-set logic into a separate component (a separate class or method). Aside from that, simplicity is not the only problem here: if null is a valid value sent by the client, you will need some mechanism to detect whether a value was omitted or explicitly set to null. Moving this code to a separate component could look something like this:
class InvoiceAssembler {

    public static void assemble(Invoice existing, Invoice newInvoice) {
        if (newInvoice.getInvoiceId() != null) {
            existing.setInvoiceId(newInvoice.getInvoiceId());
        }
        ...
    }
}
The only other improvement I see is to use an UpdateDTO containing just the fields that are likely to be updated, which reduces the number of fields you have to check.
There is a good description of how to use an UpdateDTO here.
Otherwise you are left with reflection, but I don't know whether that would make your code more readable or more confusing.
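If you do go the reflection route, a minimal sketch using Spring's BeanWrapper (my own illustration, not code from either answer) can copy all non-null properties from the request body onto the existing entity. It inherits the same limitation mentioned above: an intentional null cannot be distinguished from an omitted field.

import java.beans.PropertyDescriptor;
import org.springframework.beans.BeanWrapper;
import org.springframework.beans.BeanWrapperImpl;

public final class PatchUtils {

    private PatchUtils() {
    }

    // Copies every readable, writable, non-null property from source onto target.
    public static <T> void copyNonNullProperties(T source, T target) {
        BeanWrapper src = new BeanWrapperImpl(source);
        BeanWrapper dst = new BeanWrapperImpl(target);
        for (PropertyDescriptor descriptor : src.getPropertyDescriptors()) {
            String name = descriptor.getName();
            if (descriptor.getReadMethod() == null || descriptor.getWriteMethod() == null) {
                continue; // skip getClass() and other non read/write properties
            }
            Object value = src.getPropertyValue(name);
            if (value != null) {
                dst.setPropertyValue(name, value);
            }
        }
    }
}

In the handler, the flatMap body would then shrink to something like PatchUtils.copyNonNullProperties(invoice, existing) followed by invoiceRepository.save(existing).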

spring cloud stream file source app - History of Processed files and polling files under sub directory

I'm building a data pipeline with the Spring Cloud Stream File Source app at the start of the pipeline, and I need some help working around some missing features.
My file source app (based on org.springframework.cloud.stream.app:spring-cloud-starter-stream-source-file) works perfectly well except for the missing features I need help with. I need:
To delete files after they have been polled and turned into messages
To poll files in subdirectories
Regarding item 1, I read that the delete feature doesn't exist in the file source app (it is available in the SFTP source). Also, every time the app is restarted, files that were already processed are picked up again; can the history of processed files be made permanent? Is there an easy alternative?
To support those requirements you definitely need to modify the code of the mentioned File Source project: https://docs.spring.io/spring-cloud-stream-app-starters/docs/Einstein.BUILD-SNAPSHOT/reference/htmlsingle/#_patching_pre_built_applications
I would suggest forking the project and pulling it from GitHub as is, since you are going to modify its existing code. Then follow the instructions in the doc above on how to build the target binder-specific artifact that will be compatible with the SCDF environment.
Now about the questions:
To poll sub-directories for the same file pattern, you need to configure a RecursiveDirectoryScanner on the Files.inboundAdapter():
/**
 * Specify a custom scanner.
 * @param scanner the scanner.
 * @return the spec.
 * @see FileReadingMessageSource#setScanner(DirectoryScanner)
 */
public FileInboundChannelAdapterSpec scanner(DirectoryScanner scanner) {
Note that all the filters must be configured on this DirectoryScanner instead.
There is going to be a warning otherwise:
// Check that the filter and locker options are _NOT_ set if an external scanner has been set.
// The external scanner is responsible for the filter and locker options in that case.
Assert.state(!(this.scannerExplicitlySet && (this.filter != null || this.locker != null)),
() -> "When using an external scanner the 'filter' and 'locker' options should not be used. " +
"Instead, set these options on the external DirectoryScanner: " + this.scanner);
To keep track of the files, it is better to use a FileSystemPersistentAcceptOnceFileListFilter backed by an external persistence store through the ConcurrentMetadataStore abstraction: https://docs.spring.io/spring-integration/reference/html/#metadata-store. This should be used instead of preventDuplicates(), because FileSystemPersistentAcceptOnceFileListFilter already ensures accept-only-once logic for us.
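As a rough sketch of that idea (the bean names and the base directory below are my assumptions, not part of the original answer), a file-backed metadata store survives restarts and can back the accept-once filter; these beans would live in a configuration class such as the one shown further down:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter;
import org.springframework.integration.metadata.ConcurrentMetadataStore;
import org.springframework.integration.metadata.PropertiesPersistingMetadataStore;

@Configuration
public class AcceptOnceFilterConfiguration {

    @Bean
    public ConcurrentMetadataStore metadataStore() {
        // Persists the "already seen" file keys to a properties file, so a restart
        // does not re-emit previously processed files.
        PropertiesPersistingMetadataStore store = new PropertiesPersistingMetadataStore();
        store.setBaseDirectory("/var/file-source/metadata"); // assumed location
        return store;
    }

    @Bean
    public FileSystemPersistentAcceptOnceFileListFilter acceptOnceFilter(ConcurrentMetadataStore metadataStore) {
        // The prefix namespaces the keys written into the metadata store.
        return new FileSystemPersistentAcceptOnceFileListFilter(metadataStore, "file-source-");
    }
}

This filter would then be set on the RecursiveDirectoryScanner described above; for multiple app instances, a shared store (Redis, JDBC, Zookeeper) would replace the properties file.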
Deleting the file right after sending might not be an option in general, since you may just send the File reference as is, and the file then has to remain available on the consumer side.
Alternatively, you can add a ChannelInterceptor to source.output() and implement its postSend() to perform ((File) message.getPayload()).delete(), which happens after the message has been successfully sent to the binder destination.
@EnableBinding(Source.class)
@Import(TriggerConfiguration.class)
@EnableConfigurationProperties({FileSourceProperties.class, FileConsumerProperties.class,
TriggerPropertiesMaxMessagesDefaultUnlimited.class})
public class FileSourceConfiguration {
@Autowired
@Qualifier("defaultPoller")
PollerMetadata defaultPoller;
@Autowired
Source source;
@Autowired
private FileSourceProperties properties;
@Autowired
private FileConsumerProperties fileConsumerProperties;
private Boolean alwaysAcceptDirectories = false;
private Boolean deletePostSend;
private Boolean movePostSend;
private String movePostSendSuffix;
@Bean
public IntegrationFlow fileSourceFlow() {
FileInboundChannelAdapterSpec messageSourceSpec = Files.inboundAdapter(new File(this.properties.getDirectory()));
RecursiveDirectoryScanner recursiveDirectoryScanner = new RecursiveDirectoryScanner();
messageSourceSpec.scanner(recursiveDirectoryScanner);
FileVisitOption[] fileVisitOption = new FileVisitOption[1];
recursiveDirectoryScanner.setFilter(initializeFileListFilter());
initializePostSendAction();
IntegrationFlowBuilder flowBuilder = IntegrationFlows
.from(messageSourceSpec,
new Consumer<SourcePollingChannelAdapterSpec>() {
@Override
public void accept(SourcePollingChannelAdapterSpec sourcePollingChannelAdapterSpec) {
sourcePollingChannelAdapterSpec
.poller(defaultPoller);
}
});
ChannelInterceptor channelInterceptor = new ChannelInterceptor() {
@Override
public void postSend(Message<?> message, MessageChannel channel, boolean sent) {
if (sent) {
File fileOriginalFile = (File) message.getHeaders().get("file_originalFile");
if (fileOriginalFile != null) {
if (movePostSend) {
fileOriginalFile.renameTo(new File(fileOriginalFile + movePostSendSuffix));
} else if (deletePostSend) {
fileOriginalFile.delete();
}
}
}
}
//Override more interceptor methods to capture some logs here
};
MessageChannel messageChannel = source.output();
((DirectChannel) messageChannel).addInterceptor(channelInterceptor);
return FileUtils.enhanceFlowForReadingMode(flowBuilder, this.fileConsumerProperties)
.channel(messageChannel)
.get();
}
private void initializePostSendAction() {
deletePostSend = this.properties.isDeletePostSend();
movePostSend = this.properties.isMovePostSend();
movePostSendSuffix = this.properties.getMovePostSendSuffix();
if (deletePostSend && movePostSend) {
String errorMessage = "The 'delete-file-post-send' and 'move-file-post-send' attributes are mutually exclusive";
throw new IllegalArgumentException(errorMessage);
}
if (movePostSend && (movePostSendSuffix == null || movePostSendSuffix.trim().length() == 0)) {
String errorMessage = "The 'move-post-send-suffix' is required when 'move-file-post-send' is set to true.";
throw new IllegalArgumentException(errorMessage);
}
//Add additional validation to ensure the user didn't configure a file move that will result in cyclic processing of file
}
private FileListFilter<File> initializeFileListFilter() {
final List<FileListFilter<File>> filtersNeeded = new ArrayList<FileListFilter<File>>();
if (this.properties.getFilenamePattern() != null && this.properties.getFilenameRegex() != null) {
String errorMessage = "The 'filename-pattern' and 'filename-regex' attributes are mutually exclusive.";
throw new IllegalArgumentException(errorMessage);
}
if (StringUtils.hasText(this.properties.getFilenamePattern())) {
SimplePatternFileListFilter patternFilter = new SimplePatternFileListFilter(this.properties.getFilenamePattern());
if (this.alwaysAcceptDirectories != null) {
patternFilter.setAlwaysAcceptDirectories(this.alwaysAcceptDirectories);
}
filtersNeeded.add(patternFilter);
} else if (this.properties.getFilenameRegex() != null) {
RegexPatternFileListFilter regexFilter = new RegexPatternFileListFilter(this.properties.getFilenameRegex());
if (this.alwaysAcceptDirectories != null) {
regexFilter.setAlwaysAcceptDirectories(this.alwaysAcceptDirectories);
}
filtersNeeded.add(regexFilter);
}
FileListFilter<File> createdFilter = null;
if (!Boolean.FALSE.equals(this.properties.isIgnoreHiddenFiles())) {
filtersNeeded.add(new IgnoreHiddenFileListFilter());
}
if (Boolean.TRUE.equals(this.properties.isPreventDuplicates())) {
filtersNeeded.add(new AcceptOnceFileListFilter<File>());
}
if (filtersNeeded.size() == 1) {
createdFilter = filtersNeeded.get(0);
} else {
createdFilter = new CompositeFileListFilter<File>(filtersNeeded);
}
return createdFilter;
}
}

Hibernate queries getting slower and slower

I'm working on a process that checks and updates data in an Oracle database, using Hibernate and the Spring Framework.
The application reads a CSV file, processes the content, then persists the entities:
public class Main {
    public static void main(String[] args) {
        Input input = ReadCSV(path);
        EntityList resultList = Process.process(input);
        WriteResult.write(resultList);
        ...
    }
}

// Process class that loops over the input
public class Process {

    public EntityList process(Input input) {
        EntityList results = ...;
        ...
        for (Line line : input.readLine()) {
            results.add(ProcessLine.process(line));
            ...
        }
        return results;
    }
}
// retrieving and updating entities
class ProcessLine {

    @Autowired
    DomaineRepository domaineRepository;

    @Autowired
    CompanyDomaineService companydomaineService;

    @Transactional
    public MyEntity process(Line line) {
        // getCompanyByXX is a CrudRepository method with @Query that returns an entity object
        MyEntity companyToAttach = domaineRepository.getCompanyByCode(line.getCode());
        MyEntity companyToDetach = domaineRepository.getCompanyBySiret(line.getSiret());
        if (companyToDetach == null || companyToAttach == null) {
            throw new CustomException("Custom Exception");
        }
        // attachCompany retrieves a relationEntity, then removes companyToDetach and adds companyToAttach;
        // this updates the relationEntity.company attribute.
        companydomaineService.attachCompany(companyToAttach, companyToDetach);
        return companyToAttach;
    }
}
public class WriteResult {

    @Autowired
    DomaineRepository domaineRepository;

    @Transactional
    public void write(EntityList results) {
        for (MyEntity result : results) {
            domaineRepository.save(result);
        }
    }
}
The application works well on files with few lines, but when I try to process large files (200,000 lines), performance slows drastically and I get a SQL timeout.
I suspect a caching issue, but I'm also wondering whether saving all the entities at the end of the processing is bad practice.
The problem is your for loop, which saves each result individually and therefore issues single inserts, slowing things down. Hibernate and Spring support batch inserts, and these should be used whenever possible; something like domaineRepository.saveAll(results).
Since you are processing a lot of data, it might also be better to work in batches: instead of fetching one company to attach at a time, fetch a list of companies to attach and process those, then fetch a list of companies to detach and process those.
public EntityList process(Input input) {
    EntityList results;
    List<Code> companiesToAdd = new ArrayList<>();
    List<Siret> companiesToRemove = new ArrayList<>();
    for (Line line : input.readLine()) {
        companiesToAdd.add(line.getCode());
        companiesToRemove.add(line.getSiret());
        ...
    }
    results = process(companiesToAdd, companiesToRemove);
    return results;
}
public List<MyEntity> process(List<Code> companiesToAdd, List<Siret> companiesToRemove) {
    List<MyEntity> attachList = domaineRepository.getCompanyByCodeIn(companiesToAdd);
    List<MyEntity> detachList = domaineRepository.getCompanyBySiretIn(companiesToRemove);
    if (attachList.isEmpty() || detachList.isEmpty()) {
        throw new CustomException("Custom Exception");
    }
    companydomaineService.attachCompany(attachList, detachList);
    return attachList;
}
The above code is just pseudocode to point you in the right direction; you will need to work out what works for you.
For every line you read, you are currently doing two read operations here:
MyEntity companyToAttach = domaineRepository.getCompanyByCode(line.getCode());
MyEntity companyToDetach = domaineRepository.getCompanyBySiret(line.getSiret());
You can read more than one line, use an IN query, and then process that list of companies.
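As a further sketch of the batching idea (my own illustration; the chunk size, the injected EntityManager, and the class name are assumptions, not from the answer), processing and saving in fixed-size chunks with a periodic flush and clear keeps the persistence context small, which is a common reason Hibernate gets slower and slower over a long run:

import java.util.ArrayList;
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Transactional;

public class BatchWriter {

    private static final int CHUNK_SIZE = 500; // assumed value, tune for your data

    @PersistenceContext
    private EntityManager entityManager;

    @Autowired
    private DomaineRepository domaineRepository;

    @Transactional
    public void write(List<MyEntity> results) {
        List<MyEntity> chunk = new ArrayList<>(CHUNK_SIZE);
        for (MyEntity result : results) {
            chunk.add(result);
            if (chunk.size() == CHUNK_SIZE) {
                flushChunk(chunk);
            }
        }
        if (!chunk.isEmpty()) {
            flushChunk(chunk);
        }
    }

    private void flushChunk(List<MyEntity> chunk) {
        domaineRepository.saveAll(chunk); // one batched round of inserts/updates
        entityManager.flush();            // push the pending statements to the database
        entityManager.clear();            // detach entities so the first-level cache stays small
        chunk.clear();
    }
}

Note that for saveAll to actually batch the SQL statements, Hibernate's hibernate.jdbc.batch_size property also needs to be set.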

Using equal sign for HttpStatus.BAD_REQUEST not working

I'm just wondering why using == to check equality of HttpStatus.BAD_REQUEST is not working:
HttpStatus httpStatusCode ...;
if (httpStatusCode == HttpStatus.BAD_REQUEST) {}
I got it working by using equals method:
if (httpStatusCode.equals(HttpStatus.BAD_REQUEST)) {}
But, HttpStatus.OK is working as in:
if (httpStatusCode == HttpStatus.OK) {}
I discovered it when I had this code:
if (httpStatusCode == HttpStatus.OK) {
    ...
} else if (httpStatusCode == HttpStatus.BAD_REQUEST) {
    ...
} else {
    ...
}
Assuming httpStatusCode is HttpStatus.BAD_REQUEST, instead of going through the else if block, it went to the else block. But when I changed == to .equals(), it worked.
I'm using Spring Web 4.3.6.RELEASE. Is there any explanation for this? Thank you.
Use the value() method:
httpStatusCode.value() == HttpStatus.OK.value()
If you look inside HttpStatus.java, you can see it is an enum with a value() method that returns the int value of the status, so you can use that to compare status codes.
As for == versus .equals(): for enum constants the two normally behave identically, because Enum.equals() simply delegates to ==. If they give different results, the two objects are most likely not the same enum constant instance (for example, constants loaded by different class loaders), and comparing the numeric value() sidesteps that problem.
