When should I use MapStruct or converters with Java 8 to avoid error-prone code? - spring-boot

At work, we use MapStruct in many Spring Boot projects with Java 8 RESTful applications, whenever we need to map an Entity to a DTO, a DTO to a Response, or in similar cases. But today my friend showed me a great advantage of using a simple Converter instead of MapStruct.
This is a simple example using MapStruct:
@Mapper(componentModel = "spring", unmappedTargetPolicy = ReportingPolicy.IGNORE)
public interface AccountMapper {

    @Mapping(source = "customerBank.customerId", target = "customerId")
    AccountResponse toResponse(AccountBank accountBank);
}
It works perfectly, but if someone renames the customerId attribute and forgets to update this mapper, we get a runtime error.
The advantage of a Converter is that we get a compile-time error and avoid the runtime error.
Please let me know if someone has managed to avoid runtime errors with MapStruct in a scenario like the one I presented, since MapStruct does not seem to offer the same compile-time safety as a Converter.
My question is: is it possible to use MapStruct effectively, that is, without being prone to runtime errors?
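For context, this is roughly the kind of hand-written Converter I mean (a sketch only; the class and getter names mirror the mapper example above and are assumed):
// Hedged sketch of the hand-written Converter alternative.
// AccountBank, AccountResponse and their accessors are assumed from the example above.
import org.springframework.core.convert.converter.Converter;
import org.springframework.stereotype.Component;

@Component
public class AccountResponseConverter implements Converter<AccountBank, AccountResponse> {

    @Override
    public AccountResponse convert(AccountBank accountBank) {
        AccountResponse response = new AccountResponse();
        // Renaming getCustomerId() breaks this line at compile time,
        // which is the compile-time safety my friend pointed out.
        response.setCustomerId(accountBank.getCustomerBank().getCustomerId());
        return response;
    }
}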

If I understand correctly, you would like to get a compile-time error for wrong property names when using MapStruct custom mappings.
If so, you should add the necessary build plugin to your pom.xml (if you use Maven):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.5.1</version>
    <configuration>
        <source>1.8</source>
        <target>1.8</target>
        <annotationProcessorPaths>
            <path>
                <groupId>org.mapstruct</groupId>
                <artifactId>mapstruct-processor</artifactId>
                <version>${mapstruct.version}</version>
            </path>
        </annotationProcessorPaths>
    </configuration>
</plugin>
And of course declare a property for MapStruct version:
<properties>
    <mapstruct.version>1.4.1.Final</mapstruct.version>
</properties>
After compiling the project with the added plugin, the annotation processor will generate the full implementation:
public class AccountMapperImpl implements AccountMapper
in the target/generated-sources/annotations folder.
You can check the generated source of the implementation class; everything is set and carefully checked.
If a non-existent property name is used in the @Mapping annotation, the compiler will throw an error.
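For illustration, the generated implementation looks roughly like the sketch below (a sketch only, based on the mapper above; the exact code MapStruct emits may differ):
// Rough sketch of what the annotation processor generates for AccountMapper;
// the real AccountMapperImpl may differ in details.
import org.springframework.stereotype.Component;

@Component
public class AccountMapperImpl implements AccountMapper {

    @Override
    public AccountResponse toResponse(AccountBank accountBank) {
        if (accountBank == null) {
            return null;
        }
        AccountResponse accountResponse = new AccountResponse();
        // The "customerBank.customerId" path from @Mapping is resolved against real
        // getters/setters at build time, so a renamed property fails the build.
        if (accountBank.getCustomerBank() != null) {
            accountResponse.setCustomerId(accountBank.getCustomerBank().getCustomerId());
        }
        return accountResponse;
    }
}
Because the property paths are turned into plain getter/setter calls at build time, there is no reflection left to fail at runtime.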

Related

Spring - How to cache in self-invocation with AspectJ?

Thank you for looking at my question.
I want to call a caching method via self-invocation, so I need to use AspectJ.
(The cache configuration itself is fine.)
Add the AspectJ dependency:
implementation 'org.springframework.boot:spring-boot-starter-aop'
Add @EnableCaching(mode = AdviceMode.ASPECTJ) to my application class:
@EnableJpaAuditing
@EnableCaching(mode = AdviceMode.ASPECTJ) // <-- here
@SpringBootApplication
public class DoctorAnswerApplication {

    public static void main(String[] args) {
        SpringApplication.run(DoctorAnswerApplication.class, args);
    }
}
my service.java
@Service
public class PredictionService {

    @Cacheable(value = "findCompletedRecordCache")
    public HealthCheckupRecord getRecordComplete(Long memberId, String checkupDate) {
        Optional<HealthCheckupRecord> recordCheckupData;
        recordCheckupData = healthCheckupRecordRepository.findByMemberIdAndCheckupDateAndStep(memberId, checkupDate, RecordStep.COMPLETE);
        return recordCheckupData.orElseThrow(NoSuchElementException::new);
    }
}
my test code
@Test
public void getRecordCompleteCacheCreate() {
    // given
    Long memberId = (long) this.testUserId;
    List<HealthCheckupRecord> recordDesc = healthCheckupRecordRepository.findByMemberIdAndStepOrderByCheckupDateDesc(testUserId, RecordStep.COMPLETE);
    String checkupDate = recordDesc.get(0).getCheckupDate();
    String checkupDate2 = recordDesc.get(1).getCheckupDate();
    // when
    HealthCheckupRecord first = predictionService.getRecordComplete(memberId, checkupDate);
    HealthCheckupRecord second = predictionService.getRecordComplete(memberId, checkupDate);
    HealthCheckupRecord third = predictionService.getRecordComplete(memberId, checkupDate2);
    // then
    assertThat(first).isEqualTo(second);
    assertThat(first).isNotEqualTo(third);
}
What did I miss?
I didn't create any class related to AspectJ.
I think @EnableCaching(mode = AdviceMode.ASPECTJ) makes @Cacheable work via AspectJ instead of Spring AOP (proxies).
With thanks to @kriegaex, who set me straight by pointing out the spring-aspects dependency and the load-time-weaving javaagent requirement. For the convenience of others, the configuration snippets for Spring Boot and Maven follow.
(Note: In the end, I didn't feel all this (and the side-effects) were worth it for my project. See my other answer for a simple, if somewhat ugly, workaround.)
POM:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-aop</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-aspects</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-cache</artifactId>
</dependency>
Application Config:
@Configuration
@EnableCaching(mode = AdviceMode.ASPECTJ)
public class ApplicationConfig { ... }
Target Method:
@Cacheable(cacheNames = { "cache-name" })
public Thingy fetchThingy(String identifier) { ... }
Weaving mechanism:
Option 1: Load Time Weaving (Spring default)
Use the JVM javaagent argument, or add the weaver jar to your servlet container's libs:
-javaagent:<path-to-jar>/aspectjweaver-<version>.jar
Option 2: Compile Time Weaving
(This supposedly works, but I found a lack of coherent examples for use with Spring Caching - see further reading below)
Use aspectj-maven-plugin: https://www.mojohaus.org/aspectj-maven-plugin/index.html
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.11</version>
    <dependencies>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjrt</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
        <dependency>
            <groupId>org.aspectj</groupId>
            <artifactId>aspectjtools</artifactId>
            <version>${aspectj.version}</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>test-compile</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <outxml>true</outxml>
        <showWeaveInfo>false</showWeaveInfo>
        <Xlint>warning</Xlint>
        <verbose>false</verbose>
        <aspectLibraries>
            <aspectLibrary>
                <groupId>org.springframework</groupId>
                <artifactId>spring-aspects</artifactId>
            </aspectLibrary>
        </aspectLibraries>
        <complianceLevel>${java.version}</complianceLevel>
        <source>${java.version}</source>
        <target>${java.version}</target>
    </configuration>
</plugin>
For reference/search purposes, here is the error that started all this:
Caused by: java.io.FileNotFoundException: class path resource [org/springframework/cache/aspectj/AspectJCachingConfiguration.class] cannot be opened because it does not exist
More reading:
AOP and Spring: https://docs.spring.io/spring-framework/docs/current/reference/html/core.html#aop
AspectJ tutorial (Baeldung): https://www.baeldung.com/aspectj
Compile-Time Weaving vs Load-Time Weaving: https://stackoverflow.com/a/23042793/631272
CTW vs LTW in spring brief: https://stackoverflow.com/a/41370471/631272
CTW vs LTW Tutorial: https://satenblog.wordpress.com/2017/09/22/spring-aspectj-compile-time-weaving/
Getting CTW to work in Eclipse M2e: https://stackoverflow.com/a/19616845/631272
CTW and Java 11 issues (may have been part of my struggles with it): https://www.geekyhacker.com/2020/03/28/how-to-configure-aspectj-in-spring-boot/
Did you read the Javadoc for EnableCaching?
Note that if the mode() is set to AdviceMode.ASPECTJ, then the value of the proxyTargetClass() attribute will be ignored. Note also that in this case the spring-aspects module JAR must be present on the classpath, with compile-time weaving or load-time weaving applying the aspect to the affected classes. There is no proxy involved in such a scenario; local calls will be intercepted as well.
So please check if you
have spring-aspects on the class path and
started your application with the JVM parameter -javaagent:/path/to/aspectjweaver.jar.
There is an alternative to #2, but using the Java agent is the easiest. I am not a Spring user, so I am not an expert in Spring configuration, but even a Spring noob like me succeeded with the Java agent, so please give that a shot first.
TL;DR: If AspectJ is giving you headaches and you don't really need it other than to work around Spring Caching self-invocation, it might actually be cleaner/lighter/easier to add a simple "cache delegate" bean that your service layers can re-use. No extra dependencies, no performance impact, no unintended side-effects from changing the way Spring proxies work by default.
Code:
@Component
public class CacheDelegateImpl implements CacheDelegate {

    @Override
    @Cacheable(cacheNames = { "things" })
    public Object getTheThing(String id) { ... }
}

@Service
public class ThingServiceImpl implements ThingService {

    @Resource
    private CacheDelegate cacheDelegate;

    public Object getTheThing(String id) {
        // The call crosses a bean boundary, so the Spring caching proxy around
        // CacheDelegateImpl intercepts it even when this method is reached
        // via self-invocation from another method of this same service.
        return cacheDelegate.getTheThing(id);
    }

    public Collection<Object> getAllTheThings() {
        return CollectionUtils.emptyIfNull(findAllTheIds())
                .parallelStream()
                .map(this::getTheThing)
                .collect(Collectors.toSet());
    }
}
Adding another answer, because to solve this same issue for myself I ended up changing direction. The more direct solutions are noted by @kriegaex and myself earlier, but this is a different option for people that have issues getting AspectJ to work when you don't fundamentally need it.
For my project, adding AspectJ only to allow cacheable same-bean references was a disaster that caused 10 new headaches instead of one simple (but annoying) one.
A brief non-exhaustive rundown is:
Introduction of multiple new dependencies
Introduction of complex POM plugins to either compile-time weave (which never worked quite right for me) OR marshal the run-time byte-weaving jar into the correct place
Adding a runtime javaagent JVM argument to all our deployments
Much poorer performance at either build-time or start-time (to do the weaving)
AspectJ picking up and failing on Spring Transactional annotations in other areas of the codebase (where I was otherwise happy to use Spring proxies)
Java versioning issues
Somehow a dependency on the ancient Sun Microsystems tools.jar (which is not present in later OpenJDK versions)
Generally poor and scattershot doc on how to implement the caching use case in isolation from Spring Transactions and/or without full-blown AOP with AspectJ (which I don't need/want)
In the end, I just reverted to Spring Caching by proxy and introduced a "cache delegate" that both my service methods could refer to instead. This workaround is not the prettiest, but for me was preferable to all the AspectJ hoops I was jumping through when I really didn't need AspectJ. I just want seamless caching and DRY service code, which this workaround achieves.

Maven site generation using advanced Markdown?

We are using Markdown in our Maven generated site. Works like a charm. AFAIK the plugin uses Flexmark under the hood, which supports the Admonition extensions.
We would like to use them too, the infoboxes are quite helpful for documentation. Our site configuration in the pom.xml looks like this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-site-plugin</artifactId>
    <version>3.8.2</version>
</plugin>
How could we configure it to recognise the additional markdown?
This question is a little old, but I have been trying to figure this out for the GitLab extension for math equations, and the short answer is that I don't think you can do it out of the box.
The maven-site-plugin uses the doxia-module-markdown module for Markdown, which uses Flexmark internally, but it is pre-configured with the extensions it uses for this.
Here is a link to the code and the exact snippet from it:
// Initialize the Flexmark parser and renderer, once and for all
static
{
    MutableDataSet flexmarkOptions = new MutableDataSet();
    // Enable the extensions that we used to have in Pegdown
    flexmarkOptions.set( com.vladsch.flexmark.parser.Parser.EXTENSIONS, Arrays.asList(
            EscapedCharacterExtension.create(),
            AbbreviationExtension.create(),
            AutolinkExtension.create(),
            DefinitionExtension.create(),
            TypographicExtension.create(),
            TablesExtension.create(),
            WikiLinkExtension.create(),
            StrikethroughExtension.create()
    ) );
    // ...
}
I think that you could possibly fork this project, add the extensions you need, and add it as a dependency to the maven-site-plugin and it might work like this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-site-plugin</artifactId>
    <version>3.8.2</version>
    <dependencies>
        <dependency>
            <groupId>${my-forked-groupId}</groupId>
            <artifactId>${my-forked-artifactId}</artifactId>
            <version>${my-forked-version}</version>
        </dependency>
    </dependencies>
</plugin>
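If I do try it, the change in the fork would presumably be no more than extending that pre-configured list, along the lines of this sketch (assuming Flexmark's flexmark-ext-admonition module; whether the Doxia renderer copes with the resulting output is something I have not verified):
// Sketch of the hypothetical change inside the forked doxia-module-markdown:
// append the desired extension(s) to the pre-configured Flexmark list shown above.
flexmarkOptions.set( com.vladsch.flexmark.parser.Parser.EXTENSIONS, Arrays.asList(
        EscapedCharacterExtension.create(),
        AbbreviationExtension.create(),
        AutolinkExtension.create(),
        DefinitionExtension.create(),
        TypographicExtension.create(),
        TablesExtension.create(),
        WikiLinkExtension.create(),
        StrikethroughExtension.create(),
        // added: com.vladsch.flexmark.ext.admonition.AdmonitionExtension
        AdmonitionExtension.create()
) );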
This in my opinion is less than ideal and I will probably be exploring other (non-maven) options to get the result I'm looking for but will probably still try this out in the next day or so and if I get it working I'll send a link to the code in case anybody else can benefit from it.

Update code generated by Swagger code-gen

I have generated the code from swagger.yaml file using swagger code-gen in spring.
Now I have updated the swagger.yaml file for my API and added a few more HTTP operations.
Will it be possible to update the existing code generated previously automatically without merging it manually?
I guess you are talking about the Controllers generated by codegen, which you have then implemented. They are overwritten on each generation, which means you have to manually merge the code to add the changes every time... which is really annoying.
Well the best workflow I could find was to use the interfaceOnly option to generate only the model and interface classes in the target directory, and then manually create the controllers that implement those interfaces.
Let's say you update your API specification file with one more GET operation: the interface is regenerated with that new operation and you just have to adjust your controller to implement the new method (super quick and easy with a modern IDE). Everything else remains the same, and you have more control over your code (splitting controllers into different folders, etc.); see the sketch after the plugin configuration below.
Here is the config I used for the plugin:
<plugin>
    <groupId>io.swagger</groupId>
    <artifactId>swagger-codegen-maven-plugin</artifactId>
    <version>2.2.3</version>
    <executions>
        <execution>
            <goals>
                <goal>generate</goal>
            </goals>
            <configuration>
                <inputSpec>./api-contract/petstore.yml</inputSpec>
                <language>spring</language>
                <configOptions>
                    <sourceFolder>swagger</sourceFolder>
                    <java8>true</java8>
                    <interfaceOnly>true</interfaceOnly>
                </configOptions>
            </configuration>
        </execution>
    </executions>
</plugin>
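For illustration, a hand-written controller implementing a generated interface could look like the sketch below (PetsApi, Pet and PetService are placeholder names loosely based on the petstore spec; the actual names come from your own specification):
// Hypothetical controller for an interface generated with interfaceOnly=true.
// PetsApi, Pet and PetService are invented names for this sketch.
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PetsApiController implements PetsApi {

    private final PetService petService;

    public PetsApiController(PetService petService) {
        this.petService = petService;
    }

    @Override
    public ResponseEntity<Pet> getPetById(Long petId) {
        // Hand-written code lives here and is never overwritten, because only
        // the interface and model classes are regenerated from the spec.
        return ResponseEntity.ok(petService.findById(petId));
    }
}
When the spec gains a new operation, the regenerated interface simply stops compiling until you add the corresponding method to this class.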
You can check a complete example project using Spring Boot with swagger-codegen-maven-plugin here.
Cheers
Having the same issue, I found a workable starting point by using git:
commit the current status
run the generator
use git to stage the intended changes but do not stage reversal of your manual edits
commit and continue
I have just started with this approach, but it seems to work, at least for php-slim, where only one file (index.php) is changed when regenerating.

How to enable Rhino (or any JavaScript Engine) in CQ 5.6?

I have some custom logic where I need to evaluate a simple boolean expression. In my IDE I have some unit tests that run fine, but when I try to use it on my CQ 5.6.1 instance, the ScriptEngineManager can't find a JavaScript engine, even though this should be part of a standard Java installation in any environment.
ScriptEngineManager sef = new ScriptEngineManager();
ScriptEngine se = sef.getEngineByName("JavaScript");
In the POM I have the following, which usually helps:
<Import-Package>*;resolution:=optional</Import-Package>
Usually some system libraries aren't exposed in OSGi unless you put them into the bootdelegation in sling.properties, but this didn't work either:
org.osgi.framework.bootdelegation=org.w3c.*,com.sun.script.*,com.yourkit.*, ${org.apache.sling.launcher.bootdelegation}
What else could I try?
EDIT:
Also, regarding my comment on Christian's answer: I found out that there should be a service in OSGi:
http://svn.apache.org/repos/asf/sling/trunk/bundles/scripting/javascript/src/main/java/org/apache/sling/scripting/javascript/internal/RhinoJavaScriptEngineFactory.java
But when I try to reference it with the following code, my servlet isn't active anymore:
@Reference
private transient ScriptEngineFactory sef = null;
So it seems it can't inject the factory for some reason. I've seen that there could be more than one service implementing this interface; how would I target the correct one (linked above)?
EDIT2:
I now even tried to reference the Rhino factory directly:
@Reference(target = "(component.name=org.apache.sling.scripting.javascript.internal.RhinoJavaScriptEngineFactory)")
private transient ScriptEngineFactory sef = null;
With this my servlet tells me it is satisfied:
["Satisfied","Service Name: javax.script.ScriptEngineFactory","Target Filter: (component.name=org.apache.sling.scripting.javascript.internal.RhinoJavaScriptEngineFactory)","Multiple: single","Optional: mandatory","Policy: static","Bound Service ID 2004 (org.apache.sling.scripting.javascript.internal.RhinoJavaScriptEngineFactory)"]
But if I access my servlet, it doesn't get triggered and the SlingDefaultServlet takes over. Without the above @Reference it is accessible, so it must have something to do with it.
You need an OSGi-capable ScriptEngineManager. See https://devnotesblog.wordpress.com/2011/09/07/scripting-using-jsr-223-in-an-osgi-environment/
After almost a whole day of trial and error I found 2 major problems:
First: My compiler plugin was set to use 1.8; I had to revert that to 1.6 so my maven-scr-plugin would create proper manifests again and the injection of OSGi services actually works:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <source>1.6</source>
        <target>1.6</target>
        <encoding>UTF-8</encoding>
    </configuration>
</plugin>
I will have to investigate later how to increase that to at least 1.7 (I tried, but that didn't work either).
Second, and the actual answer to my question, is simple:
@Reference
private transient ScriptEngineManager sem = null;
Also, I had to use "javascript" instead of "JavaScript" for the manager's getEngineByName method. To see what is registered for the engine, you can check out the following path in your OSGi console:
/system/console/status-slingscripting
There, all available ScriptEngines are listed with their registered names, extensions and MIME types.
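Putting it together, the relevant part of my servlet ends up looking roughly like this (a sketch; the SCR/servlet boilerplate around it and the javax.script imports are omitted):
// Sketch: evaluating a simple boolean expression with the injected manager.
@Reference
private transient ScriptEngineManager sem = null;

private boolean evaluate(String expression) throws ScriptException {
    // Note the lowercase name under which Sling registers the Rhino engine.
    ScriptEngine engine = sem.getEngineByName("javascript");
    return Boolean.TRUE.equals(engine.eval(expression));
}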

Loading a partial Spring context

I am not much of a Spring expert, but was given a legacy system with a huge context file (not separated into modules).
I want to add some unit tests that validate different parts of the system with the actual production configuration.
I started using the ClassPathXmlApplicationContext/FileSystemXmlApplicationContext classes to load the context; however, that takes forever.
Is it possible to load only parts of the context file (recursively) without the need to separate the original file into modules?
Update:
I'll just post my implementation of Ralph's solution using Maven here:
my pom.xml:
<plugin>
    <groupId>com.google.code.maven-config-processor-plugin</groupId>
    <artifactId>maven-config-processor-plugin</artifactId>
    <version>2.0</version>
    <configuration>
        <namespaceContexts>
            <s>http://www.springframework.org/schema/beans</s>
        </namespaceContexts>
        <transformations>
            <transformation>
                <input>context.xml</input>
                <output>context-test.xml</output>
                <config>test-context-transformation.xml</config>
            </transformation>
        </transformations>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>process</goal>
            </goals>
            <phase>test</phase>
        </execution>
    </executions>
</plugin>
my test-context-transformation.xml:
<processor>
    <add>
        <name>/s:beans</name>
        <value>
            <![CDATA[
            default-lazy-init="true"
            ]]>
        </value>
    </add>
</processor>
If you are trying to run "unit" tests, you will not require the full application context at all. Just instantiate the class you want to test (and maybe its collaborators, though mocking may be a better option) and off you go. Unit tests should concentrate on single components in isolation - otherwise they are not unit tests.
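For example, a plain unit test without any Spring context might look like the following sketch (InvoiceService and InvoiceRepository are invented names; swap in your own classes):
// Hypothetical example: exercising one class in isolation with a mocked collaborator.
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Arrays;
import org.junit.Test;

public class InvoiceServiceTest {

    @Test
    public void sumsAmountsReturnedByTheRepository() {
        InvoiceRepository repository = mock(InvoiceRepository.class);
        when(repository.findAmountsForCustomer("42")).thenReturn(Arrays.asList(10.0, 15.0));

        // Plain "new" - no XML context is loaded, so the test starts instantly.
        InvoiceService service = new InvoiceService(repository);

        assertEquals(25.0, service.totalForCustomer("42"), 0.001);
    }
}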
If you are trying to run a full integration test by creating the complete object hierarchy defined in your application context, then it may be easiest by first refactoring your context and splitting it into modules - as you were suggesting already.
I guess it does not work out of the box, but you can try this (it is just an idea, I don't know if it works).
Spring supports so-called lazy initialization; the idea is to enable it for all the beans.
I can imagine two ways:
A simple tool that creates a copy of the original configuration XML file and adds default-lazy-init="true" to the container-level <beans> declaration.
Do it programmatically, e.g. with a post-processor that "injects" the default-lazy-init="true" behaviour (see the sketch after this list).
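Something like the following would be the programmatic variant (a sketch only; note it uses a BeanFactoryPostProcessor rather than a regular BeanPostProcessor, because it has to run before any bean is instantiated). Register it as an extra bean in a small test-only context that imports the production XML:
// Sketch: mark every bean definition as lazy-init before the context refreshes,
// so only the beans a test actually asks for get created.
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;

public class LazyInitBeanFactoryPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        for (String beanName : beanFactory.getBeanDefinitionNames()) {
            beanFactory.getBeanDefinition(beanName).setLazyInit(true);
        }
    }
}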
