Modularization Of Spring Data JPA Configuration - spring-boot

We are looking for a way for different modules to separately specify @ComponentScan, @EnableJpaRepositories, and EntityManagerFactory.setPackagesToScan.
We combine multiple code modules into our web application. In addition, we allow for customer-specific extensions to the base code which can add additional packages. In my testing I found that I can add an additional Java config class, and the additional packages in its @ComponentScan and @EnableJpaRepositories are picked up. I am thinking that if I could use @EntityScan I would see similar behavior.
However, we are performing some customization in the EntityManagerFactory, so @EntityScan is no longer an option. I don't think we want to specify another EntityManagerFactory in each module. The method setPackagesToScan replaces the current package list instead of adding to it.
There have been many postings about the ability to programmatically set the packagesToScan, but that appears to increase complexity significantly.
Example base configuration class:
@Configuration
@EnableJpaRepositories(basePackages = {
        "a", "b", "c"
    },
    repositoryFactoryBeanClass = BaseRepositoryFactoryBean.class
)
@ComponentScan(basePackages = {
    "a", "b", "c"
})
public class BaseConfig {

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() throws NamingException {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setJtaDataSource(dataSource()); // dataSource() defined elsewhere, omitted here
        emf.setPackagesToScan(new String[] {"a", "b", "c"});
        return emf;
    }
}
Example extension configuration class:
@Configuration
@EnableJpaRepositories(basePackages = {
        "d"
    },
    repositoryFactoryBeanClass = BaseRepositoryFactoryBean.class
)
@ComponentScan(basePackages = {
    "d"
})
public class ExtensionConfig {

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() throws NamingException {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setJtaDataSource(dataSource());
        emf.setPackagesToScan(new String[] {"d"});
        return emf;
    }
}
Is there another way to achieve this behavior?
Thanks.

You can try the following approach, though I'm not sure it's the best one:
Create a holder class for the packages list. It must be accessible to client extension modules:
public class EmfPackages {

    private final String[] packages;

    public EmfPackages(String[] packages) {
        this.packages = packages;
    }

    public String[] getPackages() {
        return this.packages;
    }
}
Then adjust both configuration classes:
public class BaseConfig {

    @Bean
    public EmfPackages baseEmfPackages() {
        return new EmfPackages(new String[] {"a", "b", "c"});
    }

    // both "holders" are now injected here
    // AFAIK this feature works in Spring 4+
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(List<EmfPackages> emfPackages) throws NamingException {
        // actually this is Java 8+ style, adjust for lower versions if needed
        final String[] combinedPackages = emfPackages.stream()
                .flatMap(p -> Arrays.stream(p.getPackages()))
                .collect(Collectors.toList())
                .toArray(new String[0]);
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setJtaDataSource(dataSource());
        emf.setPackagesToScan(combinedPackages);
        return emf;
    }
}
public class ExtensionConfig {

    @Bean
    public EmfPackages extendedEmfPackages() {
        return new EmfPackages(new String[] {"d"});
    }
}
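Note that this scales to any number of modules: Spring collects every EmfPackages bean into the injected List, so a further module only has to declare its own holder. A minimal sketch of a hypothetical third module (class, method, and package names are made up):
@Configuration
public class AnotherExtensionConfig {

    // Contributes package "e"; BaseConfig.entityManagerFactory() picks this up
    // automatically through its List<EmfPackages> parameter.
    @Bean
    public EmfPackages anotherEmfPackages() {
        return new EmfPackages(new String[] {"e"});
    }
}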

Related

@Bean method providing a list is ignored if there is a @Bean method providing a single bean

I have a configuration providing a single bean and a configuration providing a list of beans. All these beans have the same type.
When I start up an application context with these configurations, I see that an autowired list of the bean type only contains the single bean. I want it to include all beans of that type. I use Spring 5.2.0.
I have boiled it down to one configuration: if I provide a single bean and a list of beans, only the single bean will be used.
This is reproduced in the following test. It fails because the list contains only "A" and "D" (which shows that the beans from the List-typed @Bean method were not autowired):
@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = { TestConfiguration.class })
class AutowiringListsTest {

    @Autowired
    private List<TestBean> testBeanList;

    @Test
    void testThatBothConfigurationsContributeToBeanList() {
        final List<String> idList = testBeanList.stream().map(TestBean::getId).sorted().collect(Collectors.toList());
        assertThat(idList, hasItems("A", "B", "C", "D"));
    }

    @Configuration
    public static class TestConfiguration {

        @Bean
        public TestBean someBean() {
            return new TestBean("A");
        }

        @Bean
        public List<TestBean> someMoreBeans() {
            return Arrays.asList(new TestBean("B"), new TestBean("C"));
        }

        @Bean
        public TestBean anotherBean() {
            return new TestBean("D");
        }
    }

    public static class TestBean {

        private final String id;

        public TestBean(final String id) {
            this.id = id;
        }

        private String getId() {
            return id;
        }
    }
}
I want to get this to run so that multiple modules can provide beans of a certain type.
Some modules want to provide multiple beans and their number depends on a property.
Some modules will always provide one bean.
The module using the beans (autowiring them as list) should autowire all beans.
How can I get this to run? In what scenario does Spring's behavior make sense?
I can work around the issue by introducing a TestBeanFactory. Each configuration that wants to contribute to the list of TestBeans instead provides a factory.
@Configuration
public static class TestConfiguration {

    /** Implemented once in the configuration that defines <code>TestBean</code>. */
    @Bean
    public List<TestBean> testBeansFromFactory(Collection<TestBeanFactory> factories) {
        return factories.stream().map(TestBeanFactory::createTestBeans).flatMap(Collection::stream)
                .collect(toList());
    }

    // Further methods can be defined in various configurations that want to add to the list of TestBeans.
    @Bean
    public TestBeanFactory someBean() {
        return () -> Arrays.asList(new TestBean("A"));
    }

    @Bean
    public TestBeanFactory someMoreBeans() {
        return () -> Arrays.asList(new TestBean("B"), new TestBean("C"));
    }

    @Bean
    public TestBeanFactory anotherBean() {
        return () -> Arrays.asList(new TestBean("D"));
    }
}
public static class TestBean { ... }

public static interface TestBeanFactory {
    public Collection<TestBean> createTestBeans();
}
That works and is only slightly more code.
M.Deinum makes a point in the comments that Spring's behavior here is consistent:
As you defined a bean of type List it will be used. Autowiring is based on type, it will try to detect the bean of the certain type. The collection (and map as well) is a special one looking up all dependencies of the given type.
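You can observe this directly with a type-based lookup against the container. A minimal sketch (assuming the TestConfiguration above): the elements of the List bean are invisible to it, because the container knows that bean only by its declared List type:
try (AnnotationConfigApplicationContext ctx =
        new AnnotationConfigApplicationContext(AutowiringListsTest.TestConfiguration.class)) {
    // Prints [someBean, anotherBean]: only the individually declared TestBeans.
    // "B" and "C" exist solely inside the List<TestBean> bean and are not found.
    System.out.println(ctx.getBeansOfType(AutowiringListsTest.TestBean.class).keySet());
}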

How to use a second data source in spring?

I downloaded the example code here
https://github.com/spring-projects/spring-data-examples/tree/master/jpa/multiple-datasources
but I still don't understand how a repository is connected to a datasource. Even when I look into the config class, it makes no reference to a repository. And inside the repository interface there is no reference to the datasource or the config.
So when you use two different repositories to save, how does it know which datasource to go to for each repo?
In the examples repository you linked, each config class is annotated with @EnableJpaRepositories, which scans for repositories only in the package of the annotated class plus its subpackages. This is where the relation between datasource and repository is established.
Notice the configurations (commented as /* important */)
in OrderConfig
@Configuration
/* important */
@EnableJpaRepositories(entityManagerFactoryRef = "orderEntityManagerFactory",
        transactionManagerRef = "orderTransactionManager")
class OrderConfig {

    @Bean
    PlatformTransactionManager orderTransactionManager() {
        return new JpaTransactionManager(orderEntityManagerFactory().getObject());
    }

    @Bean
    LocalContainerEntityManagerFactoryBean orderEntityManagerFactory() {
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(true);
        LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
        /* important */
        factoryBean.setDataSource(orderDataSource());
        factoryBean.setJpaVendorAdapter(vendorAdapter);
        factoryBean.setPackagesToScan(OrderConfig.class.getPackage().getName());
        return factoryBean;
    }

    @Bean
    DataSource orderDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .setName("orders")
                .build();
    }
}
and CustomerConfig:
@Configuration
/* important */
@EnableJpaRepositories(entityManagerFactoryRef = "customerEntityManagerFactory",
        transactionManagerRef = "customerTransactionManager")
class CustomerConfig {

    @Bean
    PlatformTransactionManager customerTransactionManager() {
        return new JpaTransactionManager(customerEntityManagerFactory().getObject());
    }

    @Bean
    LocalContainerEntityManagerFactoryBean customerEntityManagerFactory() {
        HibernateJpaVendorAdapter jpaVendorAdapter = new HibernateJpaVendorAdapter();
        jpaVendorAdapter.setGenerateDdl(true);
        LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
        /* important */
        factoryBean.setDataSource(customerDataSource());
        factoryBean.setJpaVendorAdapter(jpaVendorAdapter);
        factoryBean.setPackagesToScan(CustomerConfig.class.getPackage().getName());
        return factoryBean;
    }

    @Bean
    DataSource customerDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .setName("customers")
                .build();
    }
}
and finally, the @Transactional annotation in DataInitializer:
/* important */
@Transactional("customerTransactionManager")
public CustomerId initializeCustomer() {
    return customers.save(new Customer("Dave", "Matthews")).getId();
}
uses customerTransactionManager which is configured in CustomerConfig
and,
/* important */
@Transactional("orderTransactionManager")
public Order initializeOrder(CustomerId customer) {
    Assert.notNull(customer, "Customer identifier must not be null!");
    Order order = new Order(customer);
    order.add(new LineItem("Lakewood Guitar"));
    return orders.save(order);
}
uses orderTransactionManager which is configured in OrderConfig
Basically, you configure different datasources, different entity managers, and different transaction managers, and refer to each of them explicitly wherever you need a specific one.
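As a concrete illustration (hypothetical; names not taken from the linked repo): a repository interface placed in the same package as CustomerConfig is found by its @EnableJpaRepositories scan and is therefore backed by customerEntityManagerFactory and customerDataSource:
// Lives in the same package as CustomerConfig, so CustomerConfig's
// repository scan picks it up and wires it to the "customers" database.
public interface CustomerRepository extends CrudRepository<Customer, Long> {
}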

Spring Disable #Transactional from Configuration java file

I have a code base which is used by two different applications. Some of my Spring service classes have the annotation @Transactional. On server start I would like to disable @Transactional based on some configuration.
Below is my configuration class:
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:application.properties")
public class WebAppConfig {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";

    @Resource
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        dataSource.setUrl(url);
        dataSource.setUsername(userId);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager txManager() {
        DefaultTransactionDefinition def = new DefaultTransactionDefinition();
        def.setIsolationLevel(TransactionDefinition.ISOLATION_DEFAULT);
        if (appName.equals("ABC")) {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_NEVER);
        } else {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
        }
        CustomDataSourceTransactionManager txM = new CustomDataSourceTransactionManager(def);
        txM.setDataSource(dataSource());
        return txM;
    }

    @Bean
    public JdbcTemplate jdbcTemplate() {
        JdbcTemplate jdbcTemplate = new JdbcTemplate();
        jdbcTemplate.setDataSource(dataSource());
        return jdbcTemplate;
    }
}
I am trying to override methods in DataSourceTransactionManager to achieve this. But it still tries to commit/roll back the transaction at the end of the transaction, and since there is no database connection available, it throws an exception.
If I keep @Transactional(propagation=Propagation.NEVER), everything works perfectly, but I cannot modify it because another app using the same code base needs it.
I would like to know if there is a way to fully disable transactions from configuration without modifying the @Transactional annotations.
I'm not sure if it would work, but you can try to implement a custom TransactionInterceptor and override the method that wraps the invocation in a transaction, removing the transactional logic. Something like this:
public class NoOpTransactionInterceptor extends TransactionInterceptor {

    @Override
    protected Object invokeWithinTransaction(
            Method method,
            Class<?> targetClass,
            InvocationCallback invocation
    ) throws Throwable {
        // Simply invoke the original unwrapped code
        return invocation.proceedWithInvocation();
    }
}
Then declare a conditional bean in one of your @Configuration classes:
// assuming this property is stored in the Spring application properties file
@ConditionalOnProperty(name = "turnOffTransactions", havingValue = "true")
@Bean
@Role(BeanDefinition.ROLE_INFRASTRUCTURE)
public TransactionInterceptor transactionInterceptor(
        /* the default bean is injected here */
        TransactionAttributeSource transactionAttributeSource
) {
    TransactionInterceptor interceptor = new NoOpTransactionInterceptor();
    interceptor.setTransactionAttributeSource(transactionAttributeSource);
    return interceptor;
}
You'll probably need some additional configuration; I can't verify that right now.
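If it does work, switching the no-op behavior on for one application and off for the other then comes down to a single property (assuming it lives in each app's application.properties):
# application.properties of the app that should not open real transactions
turnOffTransactions=true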

Configured ObjectMapper not used in spring-boot-webflux

I have mixins configured in my ObjectMapper builder config. Using the regular Spring Web controller, the data is output according to the mixins.
However, using WebFlux, a controller method returning a Flux or Mono has its data serialized as if the ObjectMapper were a default one.
How do I get WebFlux to enforce that my ObjectMapper configuration is used?
sample config:
@Bean
JavaTimeModule javatimeModule() {
    return new JavaTimeModule();
}

@Bean
Jackson2ObjectMapperBuilderCustomizer jackson2ObjectMapperBuilderCustomizer() {
    return jacksonObjectMapperBuilder -> jacksonObjectMapperBuilder
            .featuresToEnable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
            .mixIn(MyClass.class, MyClassMixin.class);
}
I actually found my solution by stepping through the init code:
@Configuration
public class Config {

    @Bean
    JavaTimeModule javatimeModule() {
        return new JavaTimeModule();
    }

    @Bean
    Jackson2ObjectMapperBuilderCustomizer jackson2ObjectMapperBuilderCustomizer() {
        return jacksonObjectMapperBuilder -> jacksonObjectMapperBuilder
                .featuresToEnable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
                .mixIn(MyClass.class, MyClassMixin.class);
    }

    @Bean
    Jackson2JsonEncoder jackson2JsonEncoder(ObjectMapper mapper) {
        return new Jackson2JsonEncoder(mapper);
    }

    @Bean
    Jackson2JsonDecoder jackson2JsonDecoder(ObjectMapper mapper) {
        return new Jackson2JsonDecoder(mapper);
    }

    @Bean
    WebFluxConfigurer webFluxConfigurer(Jackson2JsonEncoder encoder, Jackson2JsonDecoder decoder) {
        return new WebFluxConfigurer() {
            @Override
            public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
                configurer.defaultCodecs().jackson2JsonEncoder(encoder);
                configurer.defaultCodecs().jackson2JsonDecoder(decoder);
            }
        };
    }
}
I translated the solution of @Alberto Galiana to Java and injected the configured ObjectMapper for convenience, so you avoid having to do multiple configurations:
@Configuration
@RequiredArgsConstructor
public class WebFluxConfig implements WebFluxConfigurer {

    private final ObjectMapper objectMapper;

    @Override
    public void configureHttpMessageCodecs(ServerCodecConfigurer configurer) {
        configurer.defaultCodecs().jackson2JsonEncoder(
                new Jackson2JsonEncoder(objectMapper));
        configurer.defaultCodecs().jackson2JsonDecoder(
                new Jackson2JsonDecoder(objectMapper));
    }
}
Just implement WebFluxConfigurer and override the method configureHttpMessageCodecs.
Sample code for Spring Boot 2 + Kotlin:
@Configuration
@EnableWebFlux
class WebConfiguration : WebFluxConfigurer {

    override fun configureHttpMessageCodecs(configurer: ServerCodecConfigurer) {
        configurer.defaultCodecs().jackson2JsonEncoder(Jackson2JsonEncoder(ObjectMapper()
                .setSerializationInclusion(JsonInclude.Include.NON_EMPTY)))
        configurer.defaultCodecs().jackson2JsonDecoder(Jackson2JsonDecoder(ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES)))
    }
}
Make sure all your data classes to be encoded/decoded have all their properties annotated with @JsonProperty, even if the property name is the same in the class and in the JSON data:
data class MyClass(
    @NotNull
    @JsonProperty("id")
    val id: String,
    @NotNull
    @JsonProperty("my_name")
    val name: String)
In my case, I was trying to use a customized ObjectMapper while inheriting all of the behavior from my app's default WebClient.
I found that I had to use WebClient.Builder.codecs. When I used WebClient.Builder.exchangeStrategies, the provided overrides were ignored. Not sure if this behavior is something specific to using WebClient.mutate, but this is the only solution I found that worked.
WebClient customizedWebClient = webClient.mutate()
        .codecs(clientCodecConfigurer ->
                clientCodecConfigurer.defaultCodecs()
                        .jackson2JsonDecoder(new Jackson2JsonDecoder(customObjectMapper)))
        .build();
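A quick usage sketch (MyDto and the URI are hypothetical stand-ins): responses fetched through the mutated client are now deserialized with customObjectMapper:
MyDto dto = customizedWebClient.get()
        .uri("/api/things/1")
        .retrieve()
        .bodyToMono(MyDto.class)
        .block();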
I have tried all the different solutions (@Primary @Bean for ObjectMapper, configureHttpMessageCodecs(), etc.). What worked for me in the end was specifying a MIME type. Here's an example:
@Configuration
class WebConfig : WebFluxConfigurer {

    override fun configureHttpMessageCodecs(configurer: ServerCodecConfigurer) {
        val encoder = Jackson2JsonEncoder(objectMapper, MimeTypeUtils.APPLICATION_JSON)
        val decoder = Jackson2JsonDecoder(objectMapper, MimeTypeUtils.APPLICATION_JSON)
        configurer.defaultCodecs().jackson2JsonEncoder(encoder)
        configurer.defaultCodecs().jackson2JsonDecoder(decoder)
    }
}

Spring-data-neo4j multiple graphs

I know there are some similar topics out there, but none of them gives a solution. So, if using Spring-data-neo4j, is there any way to connect to multiple graphs? NOT graphs in the same instance with different labels.
Or equivalently, I can ask this question:
How can I configure spring-data-neo4j to have multiple sessions to different Neo4j instances on different ports.
Thanks.
EDIT
Thanks to @Hunger, I think I am one step further. Now the question is: how to configure spring-data-neo4j to have multiple 'PersistenceContext's, each of them referring to an individual Neo4j instance.
You can configure different application contexts with different REST APIs declared, pointing to different databases.
You should not mix objects or sessions from those different databases, though.
So you might need qualifiers for injection, as sketched below.
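For example, something like this hedged sketch, assuming both sessions end up in one context under the bean names devSession and testSession (which the two configurations below do not produce out of the box):
public class GraphService {

    private final Session devSession;
    private final Session testSession;

    // Qualifiers pick the Session wired to the intended Neo4j instance.
    public GraphService(@Qualifier("devSession") Session devSession,
                        @Qualifier("testSession") Session testSession) {
        this.devSession = devSession;
        this.testSession = testSession;
    }
}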
How about having multiple configurations:
// First configuration
@Configuration
@EnableNeo4jRepositories(basePackages = "org.neo4j.example.repository.dev")
@EnableTransactionManagement
public class MyConfigurationDev extends Neo4jConfiguration {

    @Bean
    public Neo4jServer neo4jServer() {
        return new RemoteServer("http://localhost:7474");
    }

    @Bean
    public SessionFactory getSessionFactory() {
        // with domain entity base package(s)
        return new SessionFactory("org.neo4j.example.domain.dev");
    }

    // needed for session in view in web-applications
    @Bean
    @Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public Session getSession() throws Exception {
        return super.getSession();
    }
}
and another one:
// Second configuration
@Configuration
@EnableNeo4jRepositories(basePackages = "org.neo4j.example.repository.test")
@EnableTransactionManagement
public class MyConfigurationTest extends Neo4jConfiguration {

    @Bean
    public Neo4jServer neo4jServer() {
        return new RemoteServer("http://localhost:7475");
    }

    @Bean
    public SessionFactory getSessionFactory() {
        // with domain entity base package(s)
        return new SessionFactory("org.neo4j.example.domain.test");
    }

    // needed for session in view in web-applications
    @Bean
    @Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public Session getSession() throws Exception {
        return super.getSession();
    }
}
