Error with overriding KafkaAutoConfiguration - spring

I'm trying to override the KafkaAutoConfiguration because I have separated producer and consumer configurations into different property files.
@Configuration(proxyBeanMethods = false)
@ConditionalOnClass(value = org.springframework.kafka.core.KafkaTemplate.class)
@PropertySources({
    @PropertySource("classpath:producer-config2.properties"),
    @PropertySource("classpath:consumer-config.properties")
})
@EnableConfigurationProperties(value = KafkaProperties.class)
@Import(value = {org.springframework.boot.autoconfigure.kafka.KafkaAnnotationDrivenConfiguration.class, org.springframework.boot.autoconfigure.kafka.KafkaStreamsAnnotationDrivenConfiguration.class})
public class KafkaAutoConfiguration {
}
The issue is coming from line 8, the @Import of KafkaAnnotationDrivenConfiguration and KafkaStreamsAnnotationDrivenConfiguration, where I'm getting the errors:
The type org.springframework.boot.autoconfigure.kafka.KafkaAnnotationDrivenConfiguration is not visible,
The type org.springframework.boot.autoconfigure.kafka.KafkaStreamsAnnotationDrivenConfiguration is not visible
But when I F3 into these classes in my IDE, I can see both of them just fine. What's the issue here, and how can I solve it?

class KafkaAnnotationDrivenConfiguration {
The class is package-private and thus only visible to other classes in the same package; you can't reference it from your class.
The fact that you can navigate to it in your IDE is irrelevant.
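Since those internal configuration classes can't be imported, a common alternative is to leave Spring Boot's KafkaAutoConfiguration in place and bind the split property files to your own producer/consumer beans instead. A minimal sketch in Kotlin, assuming the property file names from the question and a hypothetical kafka.bootstrap-servers key (a consumer-side config would mirror this with consumer-config.properties and a ConsumerFactory):

import org.apache.kafka.clients.producer.ProducerConfig
import org.apache.kafka.common.serialization.StringSerializer
import org.springframework.beans.factory.annotation.Value
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.context.annotation.PropertySource
import org.springframework.kafka.core.DefaultKafkaProducerFactory
import org.springframework.kafka.core.KafkaTemplate
import org.springframework.kafka.core.ProducerFactory

@Configuration
@PropertySource("classpath:producer-config2.properties")
class KafkaProducerConfig(
    // Hypothetical key; use whatever producer-config2.properties actually defines.
    @Value("\${kafka.bootstrap-servers}") private val bootstrapServers: String
) {
    @Bean
    fun producerFactory(): ProducerFactory<String, String> =
        DefaultKafkaProducerFactory(
            mapOf(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG to bootstrapServers,
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG to StringSerializer::class.java,
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG to StringSerializer::class.java
            )
        )

    @Bean
    fun kafkaTemplate(producerFactory: ProducerFactory<String, String>): KafkaTemplate<String, String> =
        KafkaTemplate(producerFactory)
}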

What magic happens in Kotlin

I have a Kotlin + Spring Boot app. I did some development in my feature branch, committed, and returned to the dev branch, built the app, and tried to start it in IntelliJ IDEA, but I get an error:
Error: Main method not found in class com.test.Application, please define the main method as:
public static void main(String[] args)
or a JavaFX application class must extend javafx.application.Application
I could swear it worked before, and moreover it still works.
@SpringBootApplication
@EnableScheduling
@EnableConfigurationProperties
class Application

fun main(args: Array<String>) {
    runApplication<Application>(*args)
}
Where should I look for the cause of this issue: Kotlin or IntelliJ IDEA?
Thanks
Your main() function is a package-level (AKA ‘top-level’) function, because it's not defined within your Application class (or, more usually for main(), its companion object).
And in Kotlin/JVM, package-level functions are treated as belonging to a special class named for the source file, with a Kt suffix.
So in this case, you should specify the main class as com.test.ApplicationKt.
(The language docs describe this here. This is one of those implementation details that you normally don't need to know; it's only important when specifying your main class, or calling a package-level function from Java.)
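If you would rather keep com.test.Application as the main class name, one alternative (a sketch, not part of the original answer) is to move main() into the companion object and mark it @JvmStatic:

import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.scheduling.annotation.EnableScheduling

@SpringBootApplication
@EnableScheduling
class Application {
    companion object {
        // @JvmStatic makes the compiler emit a real static main() on the
        // Application class, so "com.test.Application" works as the main class.
        @JvmStatic
        fun main(args: Array<String>) {
            runApplication<Application>(*args)
        }
    }
}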

Kotlin sealed class with Spring @Component

I am relatively new to Kotlin but already loving it. In one of our projects we use Kotlin; when I tried to annotate a sealed class with Spring's @Component, the following exception was thrown at startup:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'ShutDownManager' available
The simple shutdown manager class:
package com.tes.streamconsumer.stream.processor

import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.SpringApplication
import org.springframework.context.ApplicationContext
import org.springframework.stereotype.Component

@Component
sealed class ShutDownManager(
    @Autowired private val applicationContext: ApplicationContext
) {
    fun shutDownApplication() {
        SpringApplication.exit(applicationContext)
    }
}
That is autowired into another class:
package com.tes.streamconsumer.stream.processor

@Component
class AccountFacade(
    @Autowired private val shutDownManager: ShutDownManager
) {
}
From the Kotlin documentation on sealed classes, I understand they are useful for restricted class hierarchies that give more control over inheritance, so my questions are:
Is the sealed class not meant to be used with Spring injection?
Or was the ApplicationContext not ready, and hence the bean was not created?
Please shed light on what I'm missing here; thanks.
Your problem has nothing to do with the sealed class; it lies elsewhere. Typically this kind of error occurs because Spring is not scanning your code for beans in the way you expect.
You have correctly annotated your ShutDownManager class with @Component, but you don't give enough information on your package structure.
This is the right kind of package structure for a Spring project:
com.mydomain.myapp
    .facades
        .AccountFacade.kt
    .managers
        .ShutDownManager.kt
    .MyApp.kt
What is important is that the Spring entry-point class sits higher than all the packages in which you declare your beans. Spring's default behaviour is to scan the packages below it for Components/Services/etc. (You can override this behaviour to scan packages explicitly, as sketched below, but my general preference is to locate the application's entry point at the top of the tree on its own, so it is easy to find, with everything beneath it.)
One other word of caution: in Java the package structure is intrinsically linked to the file-system folder structure, and you must keep them matched. There is no such restriction in Kotlin, but I recommend not making use of that freedom, since many Java devs will follow the folder structure and never notice that the package declaration differs; this could also be the source of Spring not finding your beans.
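That explicit override might look like this (a sketch; the package names come from the illustration above):

import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication

// scanBasePackages overrides the default "scan below the entry point" behaviour,
// pointing Spring at the bean packages explicitly.
@SpringBootApplication(scanBasePackages = ["com.mydomain.myapp.facades", "com.mydomain.myapp.managers"])
class MyApp

fun main(args: Array<String>) {
    runApplication<MyApp>(*args)
}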
Use of sealed classes/interfaces
I guess you might be thinking of using sealed to protect your ShutDownManager from being subclassed or overridden, but by default Kotlin makes all classes final anyway. (You have to explicitly permit subclassing using the open keyword.)
Sealed classes/interfaces have some specific benefits in other places, most often when you are creating data objects, say Apple and Pear, that implement/extend Fruit. If you declare sealed class Fruit, you can then write code that knows there can only be two fruits. Kotlin's when expression is like Java's switch...case, and with a sealed Fruit class the compiler knows there is no need for an else branch. See this article:
https://commonsware.com/Kotlin/pages/chap-sealed-002.html
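A minimal sketch of that exhaustiveness benefit (the names are illustrative):

sealed class Fruit {
    // All subclasses of a sealed class must be declared in the same file
    // (or, since Kotlin 1.5, in the same package and module), so the
    // compiler knows the complete hierarchy.
    object Apple : Fruit()
    object Pear : Fruit()
}

fun describe(fruit: Fruit): String =
    when (fruit) {
        // No else branch needed: Apple and Pear are provably the only cases.
        Fruit.Apple -> "crunchy"
        Fruit.Pear -> "juicy"
    }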

Java Configuration vs Component Scan Annotations

Java configuration allows us to manage bean creation within a configuration class. Classes annotated with @Component, @Service, etc. and picked up by component scanning do the same. However, I'm concerned about using these two mechanisms at the same time.
Should Java configuration and annotated component scans be avoided in the same project? I ask because the result is unclear in the following scenario:
@Configuration
public class MyConfig {
    @Bean
    public Foo foo() {
        return new Foo(500);
    }
}
...
@Component
public class Foo {
    private int value;

    public Foo() {
    }

    public Foo(int value) {
        this.value = value;
    }
}
...
public class Consumer {
    @Autowired
    Foo foo;
    ...
}
So, in the above situation, will the Consumer get a Foo instance with value 500 or value 0? I've tested locally, and it appears that the Java-configured Foo (with value 500) is consistently created. However, I'm concerned that my testing isn't thorough enough to be conclusive.
What is the real answer? Using both Java config and component scanning on @Component beans of the same type seems like a bad thing.
I think your concern arises from the following use case:
You have a custom spring-starter library that has its own @Configuration classes and @Bean definitions. But if you have @Component/@Service classes in that library, you will need to explicitly @ComponentScan those packages from your service, since the default component scan (see @SpringBootApplication) covers the main class's package and all sub-packages of your app, but not the packages inside the external library. For that reason, an external library should only contain @Bean definitions, and you inject those external configurations either via an @EnableSomething annotation on your app's main class (backed by @Import(YourConfigurationAnnotatedClass.class)), or via spring.factories in case you always need the external configuration to be applied; see the sketch after this answer.
Of course, you CAN have @Components in such a library, but the explicit usage of the @ComponentScan annotation may lead to unintended behaviour in some cases, so I would recommend avoiding that.
So, to answer your question: you can use both approaches for defining beans, but only inside your app; bean definitions outside your app (e.g. in a library) should be explicitly declared with @Bean inside a @Configuration class.
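A sketch of the @EnableSomething pattern described above, in Kotlin (all names are hypothetical):

import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.context.annotation.Import

// The library ships a plain @Configuration with @Bean definitions,
// deliberately left invisible to the consumer's component scan.
@Configuration
class LibraryConfiguration {
    @Bean
    fun libraryService(): LibraryService = LibraryService()
}

class LibraryService

// The library also exposes an enable-style annotation; placing it on the
// app's main class @Imports the configuration without any component scanning.
@Target(AnnotationTarget.CLASS)
@Retention(AnnotationRetention.RUNTIME)
@Import(LibraryConfiguration::class)
annotation class EnableLibrary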
It is perfectly valid to have Java configuration and annotated component scans in the same project, because they serve different purposes.
@Component (@Service, @Repository, etc.) is used to auto-detect and auto-configure beans.
The @Bean annotation is used to explicitly declare a single bean, instead of letting Spring do it automatically.
You can do the following with @Bean, but it is not possible with @Component:
@Bean
public MyService myService(boolean someCondition) {
    if (someCondition) {
        return new MyServiceImpl1();
    } else {
        return new MyServiceImpl2();
    }
}
I haven't really faced a situation where both Java config and component scanning for a bean of the same type were required.
As per the Spring documentation:
To declare a bean, simply annotate a method with the @Bean annotation. When JavaConfig encounters such a method, it will execute that method and register the return value as a bean within a BeanFactory. By default, the bean name will be the same as the method name.
So, as per this, the Java-configured Foo (with value 500) is the one returned.
In general, there is nothing wrong with component scanning and explicit bean definitions in the same application context. I tend to use component scanning where possible, and create the few beans that need more setup with #Bean methods.
There is no upside to including classes in the component scan when you create beans of their type explicitly. Component scanning can easily be targeted at certain classes and packages. If you design your packages accordingly, you can component-scan only the packages without "special" bean classes (or else use more advanced filters on scanning, as in the sketch below).
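For instance, a scan that keeps the explicitly defined Foo from the question out of scanning might look like this (a sketch in Kotlin; the base package is hypothetical, and Foo is assumed to be on the classpath):

import org.springframework.context.annotation.ComponentScan
import org.springframework.context.annotation.Configuration
import org.springframework.context.annotation.FilterType

@Configuration
@ComponentScan(
    basePackages = ["com.example.app"], // hypothetical base package
    excludeFilters = [ComponentScan.Filter(
        type = FilterType.ASSIGNABLE_TYPE,
        // Foo stays out of scanning; the @Bean method creates it instead.
        classes = [Foo::class]
    )]
)
class AppConfig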
In a quick look I didn't find any clear information about bean-definition precedence in such a case. Typically there is a deterministic and fairly stable order in which these are processed, but if it is not documented, it could change in some future Spring version.

Neo4j/SDN warning: "No identity field found for class of type" for exception class

In my Neo4j/Spring Data Neo4j project I have the following exception class:
public class CriterionNotFoundException extends NotFoundDomainException {

    private static final long serialVersionUID = -2226285877530156902L;

    public CriterionNotFoundException(String message) {
        super(message);
    }
}
During application startup I see the following WARN:
WARN o.s.d.n.m.Neo4jPersistentProperty - No identity field found for class of type: com.example.domain.dao.decision.exception.DecisionAlreadyExistsException when creating persistent property for field: null
Why is Neo4j/SDN looking for an identity field in this class? How can I correctly configure my application to avoid this warning?
You can ignore this warning: it is produced by SDN when building metadata for the Spring Data REST integration. It should not be doing this for exceptions, of course, and we'll have this fixed.
One way "to correctly configure [your] application" would be to add the @EnableNeo4jRepositories and @EntityScan annotations to your @SpringBootApplication class (or your config bean), as mentioned here, and to specify the names of the packages containing your Neo4j-relevant classes.
I've only debugged the SDN/Neo4j code for five minutes, so my guesses may be off, but I believe those warnings are generated when you don't specify the packages to scan for your entities and repositories. I'm guessing that in that case Spring Boot + Neo4j mapping scans each and every class in your project, and if a class has some fields but nothing resembling an "id" field, it spits out this warning. (So adding a Long id field to the classes with warnings may be another, admittedly very ugly, work-around.)
I've seen those warnings vanish when I explicitly specified package names in my project using Spring Boot 2.0.6 + spring-data-neo4j 5.0.11.
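A sketch of that explicit configuration in Kotlin (the package names are placeholders for your own):

import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.autoconfigure.domain.EntityScan
import org.springframework.boot.runApplication
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories

@SpringBootApplication
// Restrict entity and repository scanning to the Neo4j-relevant packages, so
// SDN does not build metadata for unrelated classes such as exceptions.
@EntityScan("com.example.domain.node")
@EnableNeo4jRepositories("com.example.repository")
class Neo4jApplication

fun main(args: Array<String>) {
    runApplication<Neo4jApplication>(*args)
}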

SD MongoDB polymorphism in subdocument

I just started developing an app in Java with spring-data-mongodb and came across an issue that I haven't been able to solve.
I have a couple of document beans like this:
@Document(collection = "myBeanBar")
public class BarImpl implements Bar {
    String id;
    Foo foo;
    // More fields and methods ...
}

@Document
public class FooImpl implements Foo {
    String id;
    String someField;
    // some more fields and methods ...
}
And I have a repository class with a method that simply invokes a find similar to this:
public List<? extends Bar> findByFooField(final String fieldValue) {
    Query query = Query.query(Criteria.where("foo.someField").is(fieldValue));
    return getMongoOperations().find(query, BarImpl.class);
}
Saving a Bar works just fine; it is stored in Mongo along with the "_class" attribute for both Foo and Bar. However, finding by some attribute of Foo throws an exception like this:
Exception in thread "main" java.lang.IllegalArgumentException: No property someField found on test.Foo!
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentPropertyPath(AbstractMappingContext.java:225)
at org.springframework.data.mongodb.core.convert.QueryMapper.getPath(QueryMapper.java:202)
at org.springframework.data.mongodb.core.convert.QueryMapper.getTargetProperty(QueryMapper.java:190)
at org.springframework.data.mongodb.core.convert.QueryMapper.getMappedObject(QueryMapper.java:86)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1336)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1322)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:495)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:486)
Which, after some digging, makes some sense, since nowhere in the query is the sub-document's concrete type specified, and the entity information for Bar says the type of foo is Foo (not FooImpl), which in turn cannot have properties because it is an interface.
My question is: is there a way to specify the concrete type, or to work around this issue, without declaring the sub-document field as a concrete type?
I've been googling it for a couple of days and looking at the documentation, the API, and the source code, but I cannot find a clear way to do it. I'd really appreciate your help.
Thank you very much.
I had a similar problem: I have a class that implements an interface, and when I used findAll I got the error:
org.springframework.data.mapping.model.MappingInstantiationException: Could not instantiate bean class [test.MetaClasse]: Specified class is an interface.
After debugging the Spring Data code, I realized that the mapper uses @TypeAlias to discover the type it has to instantiate, so I just put @TypeAlias("FullClassName") on my implementations of test.MetaClasse and it worked!
I tested with your situation, and it will work!
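Applied to the beans from the question, that would look something like this (a sketch in Kotlin, with minimal stand-ins for the Foo and Bar interfaces):

import org.springframework.data.annotation.TypeAlias
import org.springframework.data.mongodb.core.mapping.Document

interface Bar
interface Foo

// With @TypeAlias, the alias is stored in each document's "_class" field,
// letting the mapper pick the concrete type behind an interface-typed field.
@Document(collection = "myBeanBar")
@TypeAlias("test.BarImpl") // full class name, as this answer suggests
class BarImpl(
    var id: String? = null,
    var foo: Foo? = null
) : Bar

@Document
@TypeAlias("test.FooImpl")
class FooImpl(
    var id: String? = null,
    var someField: String? = null
) : Foo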
As mentioned in this comment, the solution of putting the full class name in the type alias is imperfect, as it makes refactoring cumbersome.
Instead, you can configure the type mappings and make it work automagically. Here's how.
First, you'll need to annotate BarImpl and FooImpl with @TypeAlias. It doesn't have to be a full class name; it could be anything else, for example @TypeAlias("bar_impl") and @TypeAlias("foo_impl") respectively.
Then we’re going to need the reflections library. Pick the latest version for the build tool of your choice here.
For example with Gradle:
implementation("org.reflections:reflections:0.10.2")
Now we’re going to need a small extension to DefaultMongoTypeMapper to make it easy to configure and instantiate. Here’s how it would look in Kotlin:
import org.reflections.Reflections
import org.springframework.core.annotation.AnnotationUtils
import org.springframework.data.annotation.TypeAlias
import org.springframework.data.convert.ConfigurableTypeInformationMapper
import org.springframework.data.convert.SimpleTypeInformationMapper
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper

class ReflectiveMongoTypeMapper(
    private val reflections: Reflections = Reflections("com.example")
) : DefaultMongoTypeMapper(
    DEFAULT_TYPE_KEY,
    listOf(
        ConfigurableTypeInformationMapper(
            // Map every @TypeAlias-annotated class to its alias value.
            reflections.getTypesAnnotatedWith(TypeAlias::class.java).associateWith { clazz ->
                AnnotationUtils.getAnnotation(clazz, TypeAlias::class.java)!!.value
            }
        ),
        SimpleTypeInformationMapper(),
    )
)
where com.example is either your base package or the package containing your MongoDB models.
This way we will find all classes annotated with @TypeAlias and register the alias-to-type mappings.
Next we'll need to adjust the app's Mongo configuration a bit. The configuration has to extend AbstractMongoClientConfiguration, and we need to override the mappingMongoConverter method to make use of the mapper we created above. It should look like this:
override fun mappingMongoConverter(
    databaseFactory: MongoDatabaseFactory,
    customConversions: MongoCustomConversions,
    mappingContext: MongoMappingContext,
) = super.mappingMongoConverter(databaseFactory, customConversions, mappingContext).apply {
    setTypeMapper(ReflectiveMongoTypeMapper())
}
Done!
Now all alias to type mappings will be registered automatically on context startup and all your polymorphic fields will work just fine.
You can check the full code example on GitHub.
Also, here's a blog post where you can read about the root cause of this issue as well as check other ways to solve it (in case you don't want to rely on reflection): https://blog.monosoul.dev/2022/09/16/spring-data-mongodb-polymorphic-fields/
