Why is basePackageClasses (@ComponentScan) "Type-Safe"? - spring

Can someone help with meaning of Type-Safety in this context?
I'm somehow not very clear with the understanding of the Javadoc -
https://docs.spring.io/spring-framework/docs/3.1.4.RELEASE/javadoc-api/org/springframework/context/annotation/ComponentScan.html#basePackages()

public @interface ComponentScan {
    String[] basePackages() default {};
    Class<?>[] basePackageClasses() default {};
}
It means both basePackages and basePackageClasses serve the same function, but basePackageClasses has the advantage of being type-safe.
Type safety means the compiler checks that the value you configure really is of the correct type before you ever execute the application: if you configure the value incorrectly, the code will not compile, so you cannot run the application at all. In short, type safety helps you detect errors at compile time rather than at runtime.
Back to this example: because basePackages is of type String, you can configure it with any value, even an invalid package name, and the application will still compile and run, only to throw an exception when it tries to scan components from that non-existent package.
Because basePackageClasses is of type Class, however, you cannot point it at a package that does not exist: the reference fails to compile and you cannot execute the application. So it verifies that the packages you configure actually exist before you run the application.
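For illustration, here is a minimal sketch contrasting the two attributes; the package names and the ServiceMarker class are hypothetical:
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

// String-based: the typo in the package name compiles fine and only
// surfaces at runtime, when scanning silently finds no components.
@Configuration
@ComponentScan(basePackages = "com.example.serivce")
class StringBasedConfig {
}

// Class-based: ServiceMarker must exist in the desired package, so a typo
// or a later package rename is caught by the compiler, and IDE refactorings
// keep the reference up to date.
@Configuration
@ComponentScan(basePackageClasses = com.example.service.ServiceMarker.class)
class TypeSafeConfig {
}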

Related

List<List<String>> mapped to List<String>

I'm learning how to use MapStruct in a Spring Boot and Kotlin project.
I've got a generated DTO (ThessaurusDTO) that has a List<List<String>> and I need this mapped into a List<String> on my model (Vocab).
It makes sense that MapStruct can't map this automatically, but I know for a fact that the first list will always be size = 1. I have no control over the API the DTO model belongs to.
I found in the documentation that I can define a default method implementation within the interface, which would loosely translate to a normal function in Kotlin.
My mapper interface:
@Mapper
interface VocabMapper {

    @Mappings(
        // ...
    )
    fun thessaurusToVocab(thessaurusDTO: ThessaurusDTO): Vocab

    fun metaSyns(nestedList: List<List<String>>): List<String> =
        nestedList.flatten()
}
When I try to do a build I get the following error:
VocabMapper.java:16: error: Can't map collection element "java.util.List<java.lang.String>" to "java.lang.String ". Consider to declare/implement a mapping method: "java.lang.String map(java.util.List<java.lang.String> value)".
It looks like MapStruct is still trying to do the mapping automatically while ignoring my custom implementation. Am I missing something trivial here?
I found in the documentation that I can define a default method implementation within the interface, which would loosely translate to a normal function in Kotlin
From my understanding of what I found online, Kotlin does not properly translate an interface function into a Java default method, but actually generates a class that implements the interface.
If that's the problem, you can annotate metaSyns with @JvmDefault:
Specifies that a JVM default method should be generated for non-abstract Kotlin interface member.
Usages of this annotation require an explicit compilation argument to be specified: either -Xjvm-default=enable or -Xjvm-default=compatibility.
See the link for the difference, but you probably need -Xjvm-default=enable.
I seem to have fixed this by relying on an abstract-class-based implementation instead of using an interface.
From my understanding of what I found online, Kotlin does not properly translate an interface function into a Java default method, but actually generates a class that implements the interface.
https://github.com/mapstruct/mapstruct/issues/1577
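For comparison, this is roughly what MapStruct expects to find on the Java side: a genuine default method declared on the mapper interface is picked up as a custom mapping. The sketch below reuses the question's Vocab and ThessaurusDTO types and is otherwise an assumption, not the poster's actual code:
import java.util.List;
import java.util.stream.Collectors;
import org.mapstruct.Mapper;

@Mapper
public interface VocabMapper {

    Vocab thessaurusToVocab(ThessaurusDTO thessaurusDTO);

    // A real Java default method: MapStruct uses it as the mapping
    // from List<List<String>> to List<String> instead of generating one.
    default List<String> metaSyns(List<List<String>> nestedList) {
        return nestedList.stream()
                .flatMap(List::stream)
                .collect(Collectors.toList());
    }
}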

Problem with the EntityScan Spring annotation: it stops working when moving model classes to a new package

This is a strange case, I think. Certainly a fringe problem, but I don't know exactly where it lies, or whether it's a Spring problem, an IntelliJ problem, or even a user problem.
Here's the story:
I have a spring boot app that uses spring data and works just fine.
Running this configuration, it runs great on IntelliJ:
@EntityScan(basePackages = {"com.legosoft.disperser.event.model"})
@SpringBootApplication(scanBasePackages = {"com.legosoft.disperser.event.bean"})
@EnableJpaRepositories(basePackages = "com.legosoft.disperser.event.bean.repositories")
One of the repositories I use is this one, and again, with the above configuration everything is ok:
package com.legosoft.disperser.event.bean.repositories;
import com.legosoft.disperser.event.model.FileConfiguration;
import com.legosoft.disperser.event.model.FileConfigurationId;
import org.springframework.data.jpa.repository.JpaRepository;
public interface FileConfigurationDao extends JpaRepository<FileConfiguration, FileConfigurationId> {
}
We needed to rename certain elements of the application, among other things the packages and modules. So today I got into the office thinking it would be at most a couple of hours' worth of work, and only because there's a lot of documentation to write, assuming that the IDE's refactor -> rename option would actually do the heavy lifting for me.
So I renamed the model package, using the aforementioned option, to
com.legosoft.fileengine.core.model
which left my Application config like so:
@EntityScan(basePackages = {"com.legosoft.fileengine.core.model"})
@SpringBootApplication(scanBasePackages = {"com.legosoft.disperser.event.bean"})
@EnableJpaRepositories(basePackages = "com.legosoft.disperser.event.bean.repositories")
and the previously shown repository changed its imports accordingly:
package com.legosoft.disperser.event.bean.repositories;
import com.legosoft.fileengine.core.model.FileConfiguration;
import com.legosoft.fileengine.core.model.FileConfigurationId;
import org.springframework.data.jpa.repository.JpaRepository;
public interface FileConfigurationDao extends JpaRepository<FileConfiguration, FileConfigurationId> {
}
At first glance everything was ok; it was just a package change. Everything compiled fine, but upon trying to run the application I got:
Caused by: java.lang.IllegalArgumentException: Not a managed type: class com.legosoft.fileengine.core.model.FileConfiguration
at org.hibernate.metamodel.internal.MetamodelImpl.managedType(MetamodelImpl.java:552)
at org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.<init>(JpaMetamodelEntityInformation.java:74)
at org.springframework.data.jpa.repository.support.JpaEntityInformationSupport.getEntityInformation(JpaEntityInformationSupport.java:66)
at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getEntityInformation(JpaRepositoryFactory.java:201)
at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:151)
at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:134)
at org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:65)
at org.springframework.data.repository.core.support.RepositoryFactorySupport.getRepository(RepositoryFactorySupport.java:305)
at org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport.lambda$afterPropertiesSet$5(RepositoryFactoryBeanSupport.java:297)
at org.springframework.data.util.Lazy.getNullable(Lazy.java:211)
at org.springframework.data.util.Lazy.get(Lazy.java:94)
at org.springframework.data.repository.core.support.RepositoryFactoryBeanSupport.afterPropertiesSet(RepositoryFactoryBeanSupport.java:300)
at org.springframework.data.jpa.repository.support.JpaRepositoryFactoryBean.afterPropertiesSet(JpaRepositoryFactoryBean.java:121)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1837)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1774)
... 42 common frames omitted
Now, here's the part I don't understand. Debugging into the dependencies, I found the bit of code that produces the exception in the org.hibernate.metamodel.internal.MetamodelImpl class (see the stack trace above):
public <X> ManagedType<X> managedType(Class<X> cls) {
    ManagedType<?> type = (ManagedType) this.jpaEntityTypeMap.get(cls);
    if (type == null) {
        type = (ManagedType) this.jpaMappedSuperclassTypeMap.get(cls);
    }
    if (type == null) {
        type = (ManagedType) this.jpaEmbeddableTypeMap.get(cls);
    }
    if (type == null) {
        throw new IllegalArgumentException("Not a managed type: " + cls);
    } else {
        return type;
    }
}
From my debugging I've determined that the class it looks for is correct (it gets the class object in its new package), but the entity scanning seems to stop working: all of the maps used to look the class up in the entity model (jpaEntityTypeMap, jpaMappedSuperclassTypeMap and jpaEmbeddableTypeMap) are empty, and literally the only thing that changed is the package the classes live in.
Now, I don't know if it's a problem inherent to the refactor -> rename option in IntelliJ, since there appears to be nothing obviously wrong (the project compiles, the references are correctly updated), but clearly something strange happened because no actual scanning is going on. Or perhaps I'm not using the tool correctly, or maybe there's actually something I've done incorrectly with Spring.
I know the problem is a product of this refactor, because as soon as I revert it the project goes back to working fine, repositories and all. Does anybody know if the problem is IntelliJ, something I haven't yet seen on the Spring side of things, or something I'm doing wrong in the refactor?
Any details that clear up this mess would be greatly appreciated. It's had me grumbling all day and for something that seemed so simple!
Thanks in advance.
Found the problem. Even if you mark "refactor all places" in the refactor -> rename dialog, the package name in entitymanager.packagesToScan in application.properties was not being changed.
Since the package path in the annotations on the application config class WAS changing, it never crossed my mind to check the properties file.
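In other words, assuming entitymanager.packagesToScan is the property referenced above, the fix was simply to point it at the renamed package by hand:
entitymanager.packagesToScan=com.legosoft.fileengine.core.model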

Gemfire NoSuchBeanDefinitionException Autowiring Cache (Spring 5.0.2 / Gemfire v9.2.7)

We are migrating from Gemfire 8.2.7 to 9.2.1.
As part of Gemfire startup, we leverage SpringContextBootstrappingInitializer to initialize the Spring beans which @Autowire the Cache.
The same code, when migrated to Gemfire 9.2.1 (along with the rest of the stack), fails on server startup with the error below.
Gemfire 8.2.7 --> Gemfire 9.2.1
Spring-data-Gemfire 1.8.4 --> 2.0.2
Spring-Boot 1.4.7 --> 2.0.0.M7
Spring --> 5.0.2
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.apache.geode.cache.Cache' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
Any pointers / changes required for GemfireConfig? Below is our JavaConfig.
@Bean
public CacheFactoryBean gemfireCache() {
    return new CacheFactoryBean();
}
It looks like the @ComponentScan is kicking in prior to the Configuration processor. Any idea how to control this behavior? This was last tested to work in Spring Boot 1.4.6 (Spring 4.3.8) and gets resolved with a @DependsOn option, but I just wanted to understand whether there are any fundamental changes to the ordering of bean initialization in the newer Spring version.
@Configuration
@EnableAutoConfiguration(exclude = { HibernateJpaAutoConfiguration.class, BatchAutoConfiguration.class })
@Import(value = { GemfireServerConfig.class, JpaConfiguration.class, JpaConfigurableProperties.class })
@ComponentScan(basePackages = "com.test.gemfire", excludeFilters = @ComponentScan.Filter(type = FilterType.ANNOTATION, classes = Configuration.class))
To begin, let me give you some tips, since there are 3 issues with your problem statement above...
1) First, you have not made it clear why or how you are using the o.s.d.g.support.SpringContextBootstrappingInitializer (docs here).
I can only assume it is because you are launching your GemFire servers with Gfsh
using the following command...
gfsh> start server --name=MyServer --cache-xml-file=/path/to/cache.xml ...
Where your cache.xml is defined similar to this. After all, this was the original intent for using the SpringContextBootstrappingInitializer.
If this is the case, why not use the Gfsh start server command's --spring-xml-location option instead? For example:
gfsh> start server --name=MyServer --spring-xml-location=/by/default/a/classpath/to/applicationContext.xml --classpath=/path/to/spring-data-gemfire-2.0.2.RELEASE.jar:...
By doing so, you no longer need to provide cache.xml just to declare the SpringContextBootstrappingInitializer in order to bootstrap a Spring container inside the GemFire JVM process. You can simply use the --spring-xml-location option and put SDG on the server's classpath when starting the server.
2) Second, it is not apparent what type of application component/bean you are injecting a GemFire Cache reference into (e.g. a Region or another application component class, like a DAO, etc.). Providing a snippet of code showing how you injected the Cache reference, i.e. the injection point using the @Autowired annotation, would have been helpful. For example:
@Service
class MyService {

    @Autowired
    private Cache gemfireCache;

    ...
}
3) #2 would have been more apparent if you included the full stack trace rather than just the NoSuchBeanDefinitionException message.
Despite the issues with your problem statement, I can infer the following:
Clearly, you are using "classpath component scanning" (with the @ComponentScan annotation) and are auto-wiring "by type"; this may actually be key, and I will come back to it below.
You are using Spring's @Autowired annotation on a bean class field (field injection) or property (setter injection), maybe even a constructor.
The type of this field/property (or constructor parameter) is definitely org.apache.geode.cache.Cache.
Moving on...
In general, Spring will follow dependency order first and foremost. That is, if A depends on B, then B must be created before and destroyed after A. Typically, Spring will and can honor this without incident.
Beyond "dependency order" bean creation and satisfying dependencies between beans (including with the #DependsOn annotation), the order of bean creation is pretty loosely defined.
There are several factors that can influence it, such as "registration order" (i.e. the order in which bean definitions are declared, which is particularly true for beans defined in XML), "import order" (when using the #Import annotation on #Configuration classes), Java reflection (includes #Bean definitions declared in #Configuration classes), etc. Configuration organization is definitely important and should not be taken lightly.
This is 1 reason why I am not a big proponent of "classpath component scanning. While it may be convenient, it is always better, IMO, to be more "explicit" in your configuration, and the organization of your configuration, for reasons outlined here in addition to other non-apparent limitations. At worst, you should definitely be limiting the scope of the scan.
Ironically, you excluded/filtered the 1 thing that could actually help your organizational concerns... components of type #Configuration:
... excludeFilters = #ComponentScan.Filter(type = FilterType.ANNOTATION, classes = Configuration.class)
NOTE: given the exclusion, are you certain you did not exclude the the 1 #Configuration class containing your CacheFactoryBean definition? I suppose not since you say this worked after including the #DependsOn annotation.
Clearly there is a dependency defined between some application component of yours (??) and a bean of type o.a.g.cache.Cache (using #Autowired), yet Spring is failing to resolve it.
My thinking is, Spring cannot resolve the Cache dependency because 1) the GemFire cache bean either has not been created yet and 2) Spring cannot find an appropriate bean definition of the desired type (i.e. o.a.g.cache.Cache) in your configuration that would resolve the dependency and force the GemFire Cache to be created first, or 3) the GemFire Cache bean has been created first but Spring is unable to resolve the type as o.a.g.cache.Cache.
I have encountered both scenarios before and it is not exactly clear to me when each scenario happens because I simply have not traced this through yet. I have simply corrected it and moved on. I have noticed that it is version related though.
There are several ways to solve this problem.
If the problem is the latter, 3), then simply declaring your dependency as type o.a.g.cache.GemFireCache should resolve the problem. So, for example:
@Repository
class MyDataAccessObject {

    @Autowired
    private GemFireCache gemfireCache;

    ...
}
The reason for this is because the o.s.d.g.CacheFactoryBean class's getObjectType() method returns a Class type generically extending o.a.g.cache.GemFireCache. This was by design since o.s.d.g.client.ClientCacheFactoryBean extends o.s.d.g.CacheFactoryBean, though I probably would not have done it that way if I had created these classes. However, it is consistent with the fact that the actual cache type in GemFire is o.a.g.internal.cache.GemFireCacheImpl which indirectly implements both the o.a.g.cache.Cache interface as well as the o.a.g.cache.client.ClientCache interface.
If your problem is the former (cases 1) and 2), which is a bit trickier), then I would suggest you employ a smarter organization of your configuration, separated by concern. For example, you can encapsulate your GemFire configuration with:
@Configuration
class GemFireConfiguration {

    // define GemFire components (e.g. CacheFactoryBean) here

}
Then, your application components, where some are dependent on GemFire components, can be defined with:
@Configuration
@Import(GemFireConfiguration.class)
class ApplicationConfiguration {

    // define application beans, including beans dependent on GemFire components

}
By importing the GemFireConfiguration you are ensuring the GemFire components/beans are created (instantiated, configured and initialized) first.
You can even employ more targeted, limited "classpath component scanning" at the ApplicationConfiguration class-level in cases where you have a large number of application components (services, DAO, etc).
Then, you can have your main, Spring Boot application class drive all this:
@Configuration
@Import(ApplicationConfiguration.class)
class MySpringBootApplication {

    public static void main(String[] args) {
        SpringApplication.run(MySpringBootApplication.class, args);
    }
}
The point is, you can be as granular as you choose. I like to encapsulate configuration by concern and clearly organize the configuration (using imports) to reflect the order in which I want my components created (constructed, configured and initialized).
Honestly, I basically organize my configuration in the order of dependencies. If my application ultimately depends on a data store and cannot function without that data store, then it makes sense to ensure that it is initialized first; otherwise, what is the point of starting the application?
Finally, you can always rely on the @DependsOn annotation, as you have appropriately done, to ensure that Spring creates the dependency before the component that expects it.
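As a rough sketch of that last option (the MyService class is hypothetical; the bean name "gemfireCache" matches the @Bean method shown in the question):
import org.apache.geode.cache.Cache;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.DependsOn;
import org.springframework.stereotype.Service;

@Service
@DependsOn("gemfireCache")  // forces the CacheFactoryBean-defined bean to be created first
class MyService {

    @Autowired
    private Cache gemfireCache;
}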
Based on the fact that the @DependsOn annotation solved your problem, I would say this is an organizational problem that falls under the 1) / 2) category I outlined above.
I am going to dig into this a bit deeper and respond to my answer in comments with what I find.
Hope this helps!
-John

Spring 4 Join point to get method argument names and values

I am using Spring 4.3. Is it possible to get the method parameter names and the values passed to them? I believe this can be done using AOP (before advice). If possible, could you please give me some source code?
The following works as expected (Java 8 + Spring 5.0.4 + AspectJ 1.8.13):
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.reflect.CodeSignature;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class SomeAspect {

    @Around("@annotation(SomeAnnotation)")
    public Object aroundAdvice(ProceedingJoinPoint joinPoint) throws Throwable {
        CodeSignature codeSignature = (CodeSignature) joinPoint.getSignature();
        System.out.println("First parameter's name: " + codeSignature.getParameterNames()[0]);
        System.out.println("First argument's value: " + joinPoint.getArgs()[0]);
        return joinPoint.proceed();
    }
}
You can also get the argument names from the method signature:
CodeSignature methodSignature = (CodeSignature) joinPoint.getSignature();
String[] sigParamNames = methodSignature.getParameterNames();
Unfortunately, you can't do this reliably. It is a well-known limitation of bytecode: argument names can't always be obtained using reflection, as they are not necessarily stored in the bytecode.
As a workaround, you can add an extra annotation such as @ParamName(name = "paramName").
You can then get the parameter names in the following way:
MethodSignature.getMethod().getParameterAnnotations()
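A rough sketch of that workaround; the @ParamName annotation and the helper class below are hypothetical, not part of any library:
import java.lang.annotation.Annotation;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.reflect.MethodSignature;

// Hypothetical annotation carrying the parameter name.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.PARAMETER)
@interface ParamName {
    String name();
}

class ParamNameReader {

    // Prints the @ParamName value of every annotated parameter of the intercepted method.
    static void printParamNames(JoinPoint joinPoint) {
        MethodSignature signature = (MethodSignature) joinPoint.getSignature();
        for (Annotation[] annotations : signature.getMethod().getParameterAnnotations()) {
            for (Annotation annotation : annotations) {
                if (annotation instanceof ParamName) {
                    System.out.println(((ParamName) annotation).name());
                }
            }
        }
    }
}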
UPDATE
Since Java 8 you can do this
You can obtain the names of the formal parameters of any method or constructor with the method java.lang.reflect.Executable.getParameters. (The classes Method and Constructor extend the class Executable and therefore inherit the method Executable.getParameters.) However, .class files do not store formal parameter names by default. This is because many tools that produce and consume class files may not expect the larger static and dynamic footprint of .class files that contain parameter names. In particular, these tools would have to handle larger .class files, and the Java Virtual Machine (JVM) would use more memory. In addition, some parameter names, such as secret or password, may expose information about security-sensitive methods.
To store formal parameter names in a particular .class file, and thus
enable the Reflection API to retrieve formal parameter names, compile
the source file with the -parameters option to the javac compiler.
https://docs.oracle.com/javase/tutorial/reflect/member/methodparameterreflection.html
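For example, assuming the code was compiled with javac -parameters, the real names become available through plain reflection (SomeService and doWork are made-up names for illustration):
import java.lang.reflect.Method;
import java.lang.reflect.Parameter;

class ParameterNameDemo {

    public static void main(String[] args) throws Exception {
        Method method = SomeService.class.getMethod("doWork", String.class, int.class);
        for (Parameter parameter : method.getParameters()) {
            // Prints "userName" and "retries" when compiled with -parameters;
            // otherwise the synthetic names "arg0" and "arg1" are returned.
            System.out.println(parameter.getName());
        }
    }
}

class SomeService {
    public void doWork(String userName, int retries) {
    }
}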
In your AOP advice you can use methods of the JoinPoint to get access to methods and their parameters. There are multiple examples online and at stackoverflow.
Get method arguments using spring aop?
For getting arguments: https://docs.jboss.org/jbossaop/docs/2.0.0.GA/docs/aspect-framework/apidocs/org/jboss/aop/joinpoint/MethodInvocation.html#getArguments()
For getting method details: https://docs.jboss.org/jbossaop/docs/2.0.0.GA/docs/aspect-framework/apidocs/org/jboss/aop/joinpoint/MethodInvocation.html#getMethod%28%29

Neo4j/SDN warning: No identity field found for class of type for exception class

In my Neo4j/Spring Data Neo4j project I have a following exception class:
public class CriterionNotFoundException extends NotFoundDomainException {

    private static final long serialVersionUID = -2226285877530156902L;

    public CriterionNotFoundException(String message) {
        super(message);
    }
}
During application startup I see the following WARN:
WARN o.s.d.n.m.Neo4jPersistentProperty - No identity field found for class of type: com.example.domain.dao.decision.exception.DecisionAlreadyExistsException when creating persistent property for field: null
Why is Neo4j/SDN looking for an identity field in this class? How can I configure my application correctly in order to avoid this warning?
You can ignore this warning: it is produced by SDN when building metadata for the Spring Data REST integration. It should not be doing this for Exceptions, of course, and we'll have this fixed.
One way "to correctly configure [your] application" would be to add the @EnableNeo4jRepositories and @EntityScan annotations to your @SpringBootApplication class (or your config bean), as mentioned here, and to specify the names of your packages with the Neo4j-relevant classes.
I've only debugged the SDN/Neo4j code for 5 minutes, so my guesses may be off, but I believe those warnings are generated when you don't specify which packages to scan for your entities and repositories. I'm guessing that in that case Spring Boot + Neo4j mapping scans each and every class in your project, and if a class has some fields but nothing resembling an "id" field, it spits out this warning. (So adding a Long id field to the classes with warnings may be another, admittedly very ugly, work-around as well.)
I've seen those warnings vanish when I explicitly specified the package names in my project using Spring Boot 2.0.6 + spring-data-neo4j 5.0.11.
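A minimal sketch of that explicit configuration; the package names here are hypothetical placeholders for your own:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;

@SpringBootApplication
@EntityScan(basePackages = "com.example.domain")                  // packages holding the mapped entity classes
@EnableNeo4jRepositories(basePackages = "com.example.repository") // packages holding the Neo4j repositories
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}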
