Can't Find A Repository - spring-boot

I have a repository interface:
@Repository
public interface WordRepository extends ReactiveCrudRepository<Word, Long> {}
And in the @SpringBootApplication class, I have
@Bean
ApplicationListener<ApplicationReadyEvent> ready(WordRepository rep) {
    ...
}
to populate some data into the database. The application fails to start. After the message "APPLICATION FAILED TO START", it says:
Action:
Consider defining a bean of type 'com.example.reactive.wordservice.WordRepository' in your configuration.
Adding or removing the @Repository annotation doesn't change the outcome. I switched to another approach with a new class instead.
@Component
class WordDataInitializer {

    private static Logger log = LoggerFactory.getLogger(WordDataInitializer.class);

    private WordRepository wordRepository;

    public WordDataInitializer(WordRepository wordRepository) {
        this.wordRepository = wordRepository;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void initializeDB() throws URISyntaxException, IOException {
        ...
    }
}
The outcome is still the same. I have done this many times before and don't know why it doesn't work this time with Reactor. Spring Boot is the latest version, 2.3.0.
What is missing?

After waking up this morning, I realized that a dependency I had added might be causing the problem: I added the Spring Boot starter for Spring Data JPA to get the @Entity annotation. Removing that dependency solves the problem.
Reactive database access works differently. The @Entity annotation is one example. Also, a schema.sql file won't be picked up automatically the way it is with JPA; I need to write some code to load the schema file, as sketched below.
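For reference, here is a minimal sketch of both points with Spring Data R2DBC. The class, table and column names are assumptions, and ConnectionFactoryInitializer has moved packages between versions (it lives in org.springframework.r2dbc.connection.init in current Spring).

import io.r2dbc.spi.ConnectionFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Table;
import org.springframework.r2dbc.connection.init.ConnectionFactoryInitializer;
import org.springframework.r2dbc.connection.init.ResourceDatabasePopulator;

// The entity is mapped with Spring Data's own annotations, not JPA's @Entity.
@Table("word")
class Word {

    @Id
    private Long id;
    private String text;

    // constructors, getters and setters omitted
}

@Configuration
class SchemaInitializationConfig {

    // Runs schema.sql against the reactive ConnectionFactory on startup,
    // since R2DBC here does not pick the file up the way the JPA/JDBC setup does.
    @Bean
    ConnectionFactoryInitializer initializer(ConnectionFactory connectionFactory) {
        ConnectionFactoryInitializer initializer = new ConnectionFactoryInitializer();
        initializer.setConnectionFactory(connectionFactory);
        initializer.setDatabasePopulator(
                new ResourceDatabasePopulator(new ClassPathResource("schema.sql")));
        return initializer;
    }
}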

Related

SpringBoot 3 native compilation not generating bean definition for second JpaRepository and failing to start with -Dspring.aot.enabled=true

I am facing an issue with Spring Boot 3 native compilation where the project contains two JpaRepository interfaces connecting to two different datasources. The creation of the second datasource configuration depends on the first datasource and its JpaRepository, as that repository holds the details about the databases to connect to.
The problem is that the Spring Boot Maven plugin's process-aot goal does not generate bean definitions for the repositories that are processed later on. As a result, the application fails to start when -Dspring.aot.enabled=true is set.
I have tried several solutions, including:
Adding the @DependsOn annotation to the second datasource configuration class, but it didn't work.
Adding the @DependsOn annotation to the second JpaRepository, but it also didn't work.
Adding a @Configuration class that contains both datasource configurations, but it also didn't work.
Here is a simplified version of my code:
package com.company.multidatabases.config;

@Configuration
public class DataSource1Config {
    // datasource1 configuration
    @Autowired
    private MyEntity1Repository repo;

    private Map<Object, Object> dataSources;
}

package com.company.config;

@Configuration
public class DataSource2Config {
    @Autowired
    private DataSource1Config dataSource1Config;

    @Bean
    public DataSource dataSource() {
        return // AbstractDataSourceRouting with datasources map from DataSource1Config
    }
    // datasource2 configuration that depends on dataSource1Config
}

package com.company.multidatabases.repository;

@Repository
public interface MyEntity1Repository extends JpaRepository<MyEntity1, Long> {
    // MyEntity1Repository definition
}

package com.company.repository;

@Repository
public interface MyEntity2Repository extends JpaRepository<MyEntity2, Long> {
    // MyEntity2Repository definition that depends on DataSource2Config
}
And here is the error message I get:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'myEntity1Repository' available
Any help or suggestion is highly appreciated. Thank you in advance.

Spring Boot 2.5 and Spring Data: #NoRepositoryBean unexpected behaviour in multi-module project

I'm facing the following issue in legacy code that I can't change. I have a multi-module project which defines, in the commons module, a Spring Data interface as below:
package commons;
...
@NoRepositoryBean
public interface MyCustomRepository<P, I extends Number> extends JpaRepository<MyEntity, Integer>
{
    MyEntity getOneAndCheck();
}
In another module I extend this interface as follows:
package data;
...
@Repository
public interface MyRepository extends MyCustomRepository<MyEntity, Integer>
{
...
}
So the idea is that I don't want Spring Data to generate any implementation for the MyEntity getOneAndCheck() method, because it is implemented like this:
package data;
...
public class MyCustomRepositoryImpl implements MyCustomRepository
{
    ...
    @Override
    public MyEntity getOneAndCheck()
    {
        ...
    }
    ...
}
However, when I'm starting the application, I get the following exception:
...
Caused by: java.lang.IllegalArgumentException: Failed to create query for method public abstract MyEntity commons.MyCustomRepository.getOneAndCheck()! No property getOne found for type MyEntity!
...
So what seems to happen is that Spring Data tries to generate a query for the MyEntity getOneAndCheck() method, despite the @NoRepositoryBean annotation. This works as expected in the application I'm going to migrate from Spring 3 with Spring Data to Spring Boot 2.5.
I'm not sure whether the described behavior has anything to do with the fact that there are multiple Maven modules and that the repositories, the entities and the DTOs live in different modules. Nor am I sure whether there should be any difference between the way it currently runs with plain Spring and the way it runs with Spring Boot. But the result is that all of the dozens of repositories in this legacy application fail with the mentioned exception.
It might be important to mention that the main class needs to use annotations in order to tune the scanning:
@SpringBootApplication(scanBasePackages = "...")
@EnableJpaRepositories(basePackages = {"...", "..."})
@EntityScan(basePackages = {"...", "..."})
public class MyApp
{
    public static void main(String[] args)
    {
        SpringApplication.run(MyApp.class, args);
    }
}
I'm not sure whether these annotations are supposed to change anything from the point of view of @NoRepositoryBean, but the issue appeared as soon as I added this Spring Boot main class. It worked fine previously without Spring Boot.
Any suggestions, please?
Many thanks in advance.
Kind regards,
Seymour
There are two things at play here:
Spring Data's default custom implementation
Repository fragments
Neither of these applies in your case because:
The default custom implementation follows the name of the actual repository. In your case, the implementation is named MyCustomRepositoryImpl whereas the repository name is MyRepository. Renaming the implementation to MyRepositoryImpl would address the issue (see the sketch after these two points).
Since Spring Data 2.0, repository detection considers interfaces defined at the repository level as fragment candidates, where each interface can contribute a fragment implementation. While the implementation name follows the fragment interface name (MyCustomRepository -> MyCustomRepositoryImpl), only interfaces without @NoRepositoryBean are considered.
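Put differently, the quick fix from the first point is just the rename, mirroring the class from the question with only the name changed (sketch only; the elided bodies stay elided):

// Renamed from MyCustomRepositoryImpl: the name now follows the
// "<RepositoryInterface>Impl" convention, so it is picked up as the custom
// implementation of MyRepository instead of triggering query derivation.
public class MyRepositoryImpl implements MyCustomRepository
{
    ...
    @Override
    public MyEntity getOneAndCheck()
    {
        ...
    }
    ...
}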
You have three options:
Extract your custom method into its own fragment interface and provide an implementation class that follows the fragment name:
interface MyCustomFragment {
    MyEntity getOneAndCheck();
}

class MyCustomFragmentImpl implements MyCustomFragment {
    public MyEntity getOneAndCheck() { … }
}

public interface MyRepository extends MyCustomRepository<MyEntity, Integer>, MyCustomFragment { … }
Set the repositoryBaseClass via @EnableJpaRepositories(repositoryBaseClass = …) to a class that implements the custom method (see the sketch after this list).
If you cannot change the existing code, you could implement a BeanPostProcessor to inspect and update the bean definition for the JpaRepositoryFactoryBean by updating repositoryFragments and adding the implementation yourself. This path is rather complex and requires the use of reflection since bean factory internals aren't exposed.
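A minimal sketch of the second option; the base-class and configuration names here are made up, and the constructor signature follows SimpleJpaRepository:

import javax.persistence.EntityManager;

import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.data.jpa.repository.support.JpaEntityInformation;
import org.springframework.data.jpa.repository.support.SimpleJpaRepository;

// Shared base class for all repositories; it carries the hand-written method,
// so Spring Data never attempts query derivation for getOneAndCheck().
class MyRepositoryBase<T, ID> extends SimpleJpaRepository<T, ID> {

    public MyRepositoryBase(JpaEntityInformation<T, ID> entityInformation, EntityManager entityManager) {
        super(entityInformation, entityManager);
    }

    public MyEntity getOneAndCheck() {
        // existing custom logic goes here
        return null;
    }
}

@Configuration
@EnableJpaRepositories(repositoryBaseClass = MyRepositoryBase.class)
class RepositoryBaseConfig {
}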

Spring boot how to exclude certain class from testing

I have implemented my own class for representing database vector data, based on the UserType class.
My example:
class MyVectorType implements UserType {
    @Override
    public int[] sqlTypes() {
        return new int[] { Types.ARRAY };
    }
    // remaining UserType methods omitted
}

@Entity
@Table(name = "MY_ENTITY")
public class MyEntity {
    private MyVectorType myVectorType;
}
However, this class cannot be used in tests with the H2 dialect, i.e. the in-memory database. The error is: No Dialect mapping for JDBC type: 2003.
Therefore I would like to exclude this entity (including its repository) from testing, but this does not work:
@SpringBootApplication
@ComponentScan(excludeFilters = {
        @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = {
                MyEntity.class, MyEntityRepository.class })
})
public class ApiApplication {
    public static void main(String[] args) {
        SpringApplication.run(ApiApplication.class, args);
    }
}
What is wrong, or is there a best practice for solving this problem?
EDIT 1: fixed examples - added correct entity and repository
SOLUTION 1:
I think the only possible solution at the moment is to move the entity classes which need to be excluded to another package, and then set @EntityScan to scan only the non-excluded packages. Exclude filters in @ComponentScan seem to work only for @Component classes, not @Entity. However, this is not really a best-practice way to solve the problem. A sketch is shown below.
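A sketch of that layout with hypothetical package names; only the packages that are listed get scanned, so the excluded entity and repository are simply left out:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

// MyEntity and MyEntityRepository live in a package that is not listed below,
// so neither Hibernate nor Spring Data sees them in this context.
@SpringBootApplication
@EntityScan(basePackages = "com.example.api.domain.included")
@EnableJpaRepositories(basePackages = "com.example.api.repository.included")
public class ApiApplication {

    public static void main(String[] args) {
        SpringApplication.run(ApiApplication.class, args);
    }
}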
Just define it as a @MockBean so the real implementation of your repository will be replaced by a functionless mock in your tests:
@MockBean
private MyVectorRepositoryType vectorRepository;
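For context, that can look like this in a test class (a sketch; the test class and repository names are assumptions):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.mock.mockito.MockBean;

@SpringBootTest
class ApiApplicationTests {

    // The real repository bean is replaced by a Mockito mock in the test context.
    @MockBean
    private MyEntityRepository myEntityRepository;

    @Test
    void contextLoads() {
        // the context starts with the mocked repository in place
    }
}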
I had a similar problem and I excluded the application configuration in my test configuration, as it seems the test config component-scans the application config, which in turn component-scans the classes you want to exclude.

Spring Boot detects 2 identical repository beans

I am using Spring Boot with Spring Data JPA, and there is only one @SpringBootApplication. I also have repository classes, for example:
package com.so;

public interface SORepository {
    //methods
}
And the implementation:
@Repository("qualifier")
@Transactional(readOnly = true)
public class SORepositoryImpl implements SORepository {
    //methods
}
The problem is, when I start the application, I get the following error:
Parameter 0 of constructor in com.so.SomeComponent required a single bean, but 2 were found:
- qualifier: defined in file [path\to\SORepositoryImpl.class]
- soRepositoryImpl: defined in file [path\to\SORepositoryImpl.class]
So, as you see, somehow 2 beans of one repository class are created. How can I fix this?
You can use Spring Data JPA methods by creating a Proxy repository interface and then injecting it into SORepositoryImpl:
public interface Proxy extends JpaRepository<Element, Long> {
    Element saveElement(Element element); // or other methods if you want
}
And then:
@Repository
@Transactional(readOnly = true)
public class SORepositoryImpl implements SORepository {
    @Autowired
    private Proxy proxy;

    // implementations of the methods from the SORepository interface
}
Try taking the @Repository annotation off the SORepositoryImpl class,
e.g.
@Transactional(readOnly = true)
public class SORepositoryImpl implements SORepository {
    //methods
}
The error message implies you have two beans, one named "qualifier" and one named "soRepositoryImpl", the latter probably defined in a Config class.
I guess you should share your SomeComponent class, supposing you have no extra configuration class/XML. My take is that you are injecting it as 'soRepositoryImpl' where you have defined it as 'qualifier', which leaves you with two options. I would say to just remove the annotation parameter 'qualifier' and it should work.
Moreover, unless you want to specify a custom DAO implementation, you can avoid @Repository altogether (that's the annotation you use to make it injectable into your services). You can just create an interface extending a Spring Data interface and define methods for queries.
For example:
public interface PersonRepository extends Repository<Person, Long> {
    List<Person> findByEmailAddressAndLastname(EmailAddress emailAddress, String lastname);
}
Then you can just inject it in your services/controller directly.
private final PersonRepository personRepository;

public PersonController(final PersonRepository personRepository) {
    this.personRepository = personRepository;
}
Check the samples:
https://spring.io/guides/gs/accessing-data-jpa/
http://docs.spring.io/spring-data/data-commons/docs/1.6.1.RELEASE/reference/html/repositories.html
OK, I've found the issue.
I just couldn't understand how Spring created the second bean (soRepositoryImpl), because I never declared it, neither explicitly nor in config classes. But I figured out that the second bean is created during the instantiation of another SORepository of mine (which is in a different package, com.another, and which extends JpaRepository).
So, when Spring tries to resolve all dependencies of com.another.SORepository, it somehow finds my com.so.SORepositoryImpl (which has nothing in common with com.another.SORepository - no extending/implementing, no JPA stuff, only a similar name!).
Well, it seems like a Spring bug to me, because it doesn't check the real inheritance of the classes backing the repositories; any class whose name matches the repository name plus Impl (even in a different package) is picked up.
The only thing I had to do was rename com.so.SORepositoryImpl, and that's it: no two beans anymore.
Thanks everyone for answers!

Why does Spring Boot BatchAutoConfiguration prevent repositories from saving during integration test?

In my Spring Boot project, which has spring-cloud-starter-parent Brixton.M4 as its parent, I have encountered a problem in my integration tests when adding this to a test class:
@ImportAutoConfiguration({BatchAutoConfiguration.class})
My problem is that I do not fully understand the startup sequence and in which order the configurations are loaded - at least I suspect that is my problem.
The behaviour I am observing is that when I try to save two different @Entity objects which have a @ManyToOne(optional = false) relation in the @Before method of another test, it fails with this message:
Attempting to save one or more entities that have a non-nullable association with an unsaved transient entity. The unsaved transient entity must be saved in an operation prior to saving these dependent entities.
My setup is as follows:
Code snippets from the application:
@Configuration
@EnableBatchProcessing
public class CollectionBatchJobConfiguration ...

@SpringBootApplication
@EnableScheduling
@EnableJms
@EnableCircuitBreaker
public class MyServiceApplication
{
    public static void main(String[] args)
    {
        SpringApplication.run(MyServiceApplication.class, args);
    }
}
Code snippets from the test packages:
@Configuration
@ImportAutoConfiguration({BatchAutoConfiguration.class, CollectionBatchJobConfiguration.class, RepositoryTestConfiguration.class})
public class CollectionBatchJobTestConfiguration
{
    @Autowired
    @Bean
    public JobLauncherTestUtils jobLauncherTestUtils()
    {
        return new JobLauncherTestUtils();
    }
    ...

@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(CollectionBatchJobTestConfiguration.class)
public class CollectionBatchJobIT
{
    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;
    ...

@Configuration
@ImportAutoConfiguration({
        DataSourceAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class,
        JpaRepositoriesAutoConfiguration.class,
        FlywayAutoConfiguration.class})
public class RepositoryTestConfiguration ...

@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = MyServiceApplication.class)
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
@WebIntegrationTest
public class MyRestServiceIT
{
    @Autowired
    private EntityARepository entityARepo;

    @Autowired
    private EntityBRepository entityBRepo;

    @Before
    public void before()
    {
        EntityA a = new EntityA();
        a = entityARepo.save(a);
        EntityB b = new EntityB(a);
        entityBRepo.save(b); // This line fails with org.hibernate.TransientPropertyValueException
    }
    ...
I am running MariaDB externally and use Flyway to clean and migrate the schema with the repository tables and the tables for Spring Batch. The service works as expected under manual testing, and initially both MyRestServiceIT and CollectionBatchJobIT ran and passed all tests with the @SpringApplicationConfiguration class set to MyServiceApplication.
But in my attempt to optimise test execution time and to be more in line with what appears to be Spring Boot best practice for testing, I am slimming down the loaded test configurations and using the new @ImportAutoConfiguration together with custom TestConfiguration classes instead of using the main @SpringBootApplication class MyServiceApplication.
I have succeeded in improving my other integration tests, but after finishing CollectionBatchJobIT, MyRestServiceIT started failing in the @Before block with this Hibernate error:
org.springframework.dao.InvalidDataAccessApiUsageException: org.hibernate.TransientPropertyValueException: Not-null property references a transient value - transient instance must be saved before current operation :
...
Not-null property references a transient value - transient instance must be saved before current operation
The stack trace shows that the save invocation is indeed made from the @Before block.
When debugging, the saved entityA is missing its ID and the database does not contain the expected row.
So to sum up: After adding the BatchAutoConfiguration to one test, another test fails because it can no longer persist entities to the underlying database.
Can anybody explain what happens, or how I can figure out the reason behind it?
Btw., if BatchAutoConfiguration is omitted from the @ImportAutoConfiguration line, CollectionBatchJobIT fails because nothing gets committed to the database.
Your problem may stem from the fact that the tests which use MyServiceApplication for configuration automatically include all your test configuration classes, because component scanning is enabled by the @SpringBootApplication annotation. This is most likely not what you want.
One way to exclude these is to replace @SpringBootApplication with the following, under the assumption that your test configuration classes are named accordingly:
@Configuration
@EnableAutoConfiguration
@ComponentScan(excludeFilters = @Filter(type = FilterType.REGEX,
        pattern = ".*TestConfiguration.*"))
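Applied to the application class from the question, that could look roughly like this (a sketch; it simply combines the suggested annotations with the ones already on MyServiceApplication):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.cloud.client.circuitbreaker.EnableCircuitBreaker;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.ComponentScan.Filter;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.FilterType;
import org.springframework.jms.annotation.EnableJms;
import org.springframework.scheduling.annotation.EnableScheduling;

@Configuration
@EnableAutoConfiguration
// Keeps component scanning, but skips any class whose fully-qualified name contains "TestConfiguration".
@ComponentScan(excludeFilters = @Filter(type = FilterType.REGEX,
        pattern = ".*TestConfiguration.*"))
@EnableScheduling
@EnableJms
@EnableCircuitBreaker
public class MyServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyServiceApplication.class, args);
    }
}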
