Spring - Data persisted by JdbcTemplate unable to be seen by JpaRepository

I am building a Spring Boot application which requires persistence via JDBC and selecting/reading via JPA/Hibernate. I have implemented both types of operation using Spring's JdbcTemplate and Spring Data's JpaRepository.
After I persist using JdbcTemplate I am unable to see the data via JpaRepository, even though they share the same data source. I am able to read the data if I use JdbcTemplate.
NOTE: I am using two data sources. One is configured in another class without the @Primary annotation, using its own entity manager factory and transaction manager, which is why I've needed to explicitly define the beans below under Spring Boot's default bean names "transactionManager" and "entityManagerFactory".
The following is my embedded database configuration for the primary beans:
@Configuration
@EnableJpaRepositories(basePackages = {"com.repository"})
public class H2DataSourceConfiguration {

    private static final Logger log = LoggerFactory.getLogger(H2DataSourceConfiguration.class);

    @Bean(destroyMethod = "shutdown")
    @Primary
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .setName("dataSource")
                .build();
    }

    @Bean
    @Primary
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder, DataSource dataSource) {
        return builder
                .dataSource(dataSource)
                .packages("com.my.domain", "org.springframework.data.jpa.convert.threeten")
                .build();
    }

    @Bean
    @Primary
    public JpaTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager(entityManagerFactory);
        return jpaTransactionManager;
    }
}
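For reference, the secondary (non-primary) data source mentioned in the note above lives in a separate configuration class along these lines; the package and bean names here are hypothetical, the point is simply that it registers its own entity manager factory and transaction manager under non-default names:
@Configuration
@EnableJpaRepositories(
        basePackages = "com.other.repository",
        entityManagerFactoryRef = "otherEntityManagerFactory",
        transactionManagerRef = "otherTransactionManager")
public class OtherDataSourceConfiguration {

    @Bean
    public DataSource otherDataSource() {
        // non-primary data source, e.g. built from external connection properties
        return DataSourceBuilder.create().build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean otherEntityManagerFactory(EntityManagerFactoryBuilder builder) {
        return builder
                .dataSource(otherDataSource())
                .packages("com.other.domain")
                .persistenceUnit("other")
                .build();
    }

    @Bean
    public JpaTransactionManager otherTransactionManager(
            @Qualifier("otherEntityManagerFactory") EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}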
The persistence happens in a different transaction from the reading of the data, but they share the same service.
Both operations happen within @Transactional methods. Both repository beans are injected into the same service, which also carries the @Transactional annotation. The service looks as follows:
@Service
@Transactional
public class MyServiceImpl implements MyService {

    private static final Logger log = LoggerFactory.getLogger(MyServiceImpl.class);

    @Autowired
    private MyJpaRepository myJpaRepository;

    @Autowired
    private MyJdbcRepository myJdbcRepository;

    ...
}
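The two operations look roughly like this inside MyServiceImpl (the method names and the insert(...) call on the JDBC repository are hypothetical); the write goes through JdbcTemplate in one transaction and the read goes through the JPA repository in a later, separate transaction:
@Transactional
public void saveAcquisition(AcquisitionEntity acquisition) {
    // delegates to NamedParameterJdbcTemplate inside MyJdbcRepositoryImpl
    myJdbcRepository.insert(acquisition);
}

@Transactional(readOnly = true)
public List<AcquisitionEntity> findAllAcquisitions() {
    // standard Spring Data JPA finder - this is the call that returns no rows
    return myJpaRepository.findAll();
}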
MyJdbcRepositoryImpl.java:
@Repository
@Transactional(propagation = Propagation.MANDATORY)
public class MyJdbcRepositoryImpl implements MyJdbcRepository {

    @Autowired
    private NamedParameterJdbcTemplate jdbcTemplate;

    // methods within here all use jdbcTemplate.query(...)
}
MyJpaRepository.java:
@Repository
@Transactional(propagation = Propagation.MANDATORY)
public interface MyJpaRepository extends JpaRepository<AcquisitionEntity, Long> {
}
Is it at all possible that the JdbcTemplate calls are saving to a different H2 database?

The above configuration is correct!
The problem was that the JdbcTemplate calls prefixed the table name with the schema owner.
For example:
select * from I_AM_SCHEMA.KILL_ME
However, the entity had both the @Entity annotation and the @Table annotation, and only specified the table name!
Example:
@Entity
@Table(name = "KILL_ME")
So we were writing to one table with JdbcTemplate but reading from a completely different table via JPA/Hibernate, because the schema prefix was missing on the JPA side.
The fix was to make the JPA mapping schema-qualified as well, here shown via the schema attribute of @Table:
@Entity
@Table(name = "KILL_ME", schema = "I_AM_SCHEMA")
DONE!
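For completeness, a minimal sketch of the corrected entity (the field names are hypothetical); with the schema on the mapping, JPA/Hibernate now reads from the same physical table that the JdbcTemplate statements write to:
@Entity
@Table(name = "KILL_ME", schema = "I_AM_SCHEMA")
public class KillMe {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    // hypothetical column
    private String name;
}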

Related

Why is Spring Boot's @DataJpaTest scanning @Component?

I'm confident this hasn't been asked before, but reading through the Spring docs and testing utilities I found this annotation and thought I'd start using it. Reading through the fine print I found:
Regular #Component beans will not be loaded into the ApplicationContext.
That sounded good, and I even liked the idea of using H2, except that the entity I wanted to use has catalog and schema modifiers and I couldn't figure out how to support those with the default H2 setup. So I made an H2 data source for the test branch, use that, and override the replacement. I wound up with:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = ABCH2Configuration.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
public class StatusRepositoryTest {
}
However my test fails with:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type
which leads to:
Error creating bean with name 'customerServiceImpl': Unsatisfied dependency.
However the customerServiceImpl is this bean:
@Component
public class CustomerServiceImpl implements CustomerService {
}
That says @Component. The fine print for @DataJpaTest says it doesn't load @Components. Why is it doing so anyway and thus failing the test?
As Kyle and Eugene asked below here's the rest:
package com.xxx.abc.triage;

@Component
public interface CustomerService {
}
@Configuration
@ComponentScan("com.xxx.abc")
@EnableJpaRepositories("com.xxx.abc")
//@Profile("h2")
public class ABCH2Configuration {

    @Primary
    @Bean(name = "h2source")
    public DataSource dataSource() {
        EmbeddedDatabase build = new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2).setName("ABC").addScript("init.sql").build();
        return build;
    }

    @Bean
    public JpaVendorAdapter jpaVendorAdapter() {
        HibernateJpaVendorAdapter bean = new HibernateJpaVendorAdapter();
        bean.setDatabase(Database.H2);
        bean.setShowSql(true);
        bean.setGenerateDdl(true);
        return bean;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(
            DataSource dataSource, JpaVendorAdapter jpaVendorAdapter) {
        LocalContainerEntityManagerFactoryBean bean = new LocalContainerEntityManagerFactoryBean();
        bean.setDataSource(dataSource);
        bean.setJpaVendorAdapter(jpaVendorAdapter);
        bean.setPackagesToScan("com.xxx.abc");
        return bean;
    }

    @Bean
    public JpaTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}
And just to clarify the question: why is a @Component being loaded into the context within a @DataJpaTest?
@ComponentScan automatically registers all found @Component and @Service beans in the context. You can override that with a separate @Bean definition:
@Bean
CustomerService customerService() {
    return null;
}
Or remove the @Component annotation from CustomerService and CustomerServiceImpl, but then you have to declare them with @Bean methods in your production @Configuration.
@DataJpaTest does not load @Component, @Service, etc. by default, only @Repository beans and the internal pieces needed to configure Spring Data JPA.
In your test you can load any @Configuration you need, and in your case you load ABCH2Configuration, which performs a @ComponentScan; that is why Spring tries to load your CustomerService.
You should only scan for @Repository beans in this configuration class, and scan the other @Component, @Service, etc. beans in another @Configuration such as DomainConfiguration. It is always good practice to separate the different types of configuration, as sketched below.
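A minimal sketch of that split, assuming the package names from the question (PersistenceConfiguration is a hypothetical name, DomainConfiguration is the one suggested above); the test then only imports the persistence configuration, so @DataJpaTest no longer drags in CustomerServiceImpl:
@Configuration
@EnableJpaRepositories("com.xxx.abc")
public class PersistenceConfiguration {
    // data source, entity manager factory and transaction manager beans go here
}

@Configuration
@ComponentScan("com.xxx.abc")
public class DomainConfiguration {
    // picks up @Component / @Service beans such as CustomerServiceImpl
}

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = PersistenceConfiguration.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
public class StatusRepositoryTest {
    // repository-only tests, no service beans in the context
}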

Spring Boot web services with multiple data sources, only one working

I have developed two web services using the Spring Boot framework, and I have them in the same project. Each web service uses a different DB: say ws1 uses Oracle1 and ws2 uses Oracle2. I have defined a DataBaseConfig with the bean definitions, but when I run the app only one web service works (and it's always the same one).
DataBaseConfig
@Configuration
public class DataBaseConfig {

    @Bean(name = "ora1")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource mysqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "ora2")
    @ConfigurationProperties(prefix = "spring.secondDatasource")
    public DataSource sqliteDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "clients")
    @Autowired
    @ConfigurationProperties(prefix = "spring.datasource")
    @Qualifier("datasource")
    public JdbcTemplate slaveJdbcTemplate(DataSource datasource) {
        return new JdbcTemplate(datasource);
    }

    @Bean(name = "places")
    @Autowired
    @Primary
    @ConfigurationProperties(prefix = "spring.secondDatasource")
    @Qualifier("secondDatasource")
    public JdbcTemplate masterJdbcTemplate(DataSource secondDatasource) {
        return new JdbcTemplate(secondDatasource);
    }
}
I have the service definitions with the SQL statements:
@Service
public class ClientsService {

    @Autowired
    @Qualifier("clients")
    private JdbcTemplate jdbcTemplate;
and the other service
@Service
public class PlacesService {

    @Autowired
    @Qualifier("places")
    private JdbcTemplate jdbcTemplate;
Then in each controller I have the corresponding @RequestMapping. When I run the app I get no connection-related errors, and if I separate the web services into two projects, each works fine.
You have a few things going wrong here, including some unnecessary annotations. See below; note the location of @Qualifier and the qualifier name:
#Bean(name = "clients")
public JdbcTemplate slaveJdbcTemplate(#Qualifier("ora1") DataSource datasource) {
return new JdbcTemplate(datasource);
}
#Bean(name = "places")
#Primary
public JdbcTemplate masterJdbcTemplate(#Qualifier("ora2") DataSource secondDatasource) {
return new JdbcTemplate(secondDatasource);
}
Instead of resolving by bean name, which is a bad idea IMO because it's not typesafe, why don't you use constructor injection and create the services in the configuration class (ditch the @Service annotation)? Create the DataSource and JdbcTemplate beans as usual, don't give them any names (the default is the method name), and also create new PlacesService(placesJdbcTemplate()). The result is a lot simpler code.
This is assuming you want both databases active at runtime. If not, use @Profile.
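A rough sketch of that approach, assuming ClientsService and PlacesService gain constructors that take a JdbcTemplate; there is no name-based resolution left, each service is wired to its own template explicitly:
@Configuration
public class DataBaseConfig {

    @Bean
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource ora1() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "spring.secondDatasource")
    public DataSource ora2() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public ClientsService clientsService() {
        // constructor injection instead of @Service + field @Autowired
        return new ClientsService(new JdbcTemplate(ora1()));
    }

    @Bean
    public PlacesService placesService() {
        return new PlacesService(new JdbcTemplate(ora2()));
    }
}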

Spring Java-based config "chicken and egg" situation

I just recently started looking into Spring, and specifically its newer features like Java config.
I have this somewhat strange issue:
Java config Snippet:
@Configuration
@ImportResource({"classpath*:application-context.xml", "classpath:ApplicationContext_Output.xml"})
@Import(SpringJavaConfig.class)
@ComponentScan(excludeFilters = {@ComponentScan.Filter(org.springframework.stereotype.Controller.class)}, basePackages = " com.xx.xx.x2.beans")
public class ApplicationContextConfig extends WebMvcConfigurationSupport {

    private static final Log log = LogFactory.getLog(ApplicationContextConfig.class);

    @Autowired
    private Environment env;

    @Autowired
    private IExtendedDataSourceConfig dsconfig;

    @PostConstruct
    public void initApp() {
    ...
    }

    @Bean(name = "transactionManagerOracle")
    @Lazy
    public DataSourceTransactionManager transactionManagerOracle() {
        return new DataSourceTransactionManager(dsconfig.oracleDataSource());
    }
IExtendedDataSourceConfig has two implementations; based on the active Spring profile, one or the other is instantiated. For this example, let's say this is the implementation:
@Configuration
@PropertySources(value = {
        @PropertySource("classpath:MYUI.properties")})
@Profile("dev")
public class MYDataSourceConfig implements IExtendedDataSourceConfig {

    private static final Log log = LogFactory.getLog(MYDataSourceConfig.class);

    @Resource
    @Autowired
    private Environment env;

    public MYDataSourceConfig() {
        log.info("creating dev datasource");
    }

    @Bean
    public DataSource oracleDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        dataSource.setUrl(env.getProperty("oracle.url"));
        dataSource.setUsername(env.getProperty("oracle.user"));
        dataSource.setPassword(env.getProperty("oracle.pass"));
        return dataSource;
    }
The problem is that when the transactionManagerOracle bean is created (even if I try to mark it as lazy), the dsconfig field appears to be null.
I guess the @Bean methods are processed first and all the autowiring happens afterwards; is there a fix for this? How do I either tell Spring to inject the dsconfig field before creating the beans, or somehow create the @Bean-defined beans after dsconfig has been injected?
You can just specify a DataSource as a method parameter for the transaction manager bean. Spring will then automatically inject the data source that is configured in the active profile:
#Bean(name="transactionManagerOracle")
#Lazy
public DataSourceTransactionManager transactionManagerOracle(DataSource dataSource) {
return new DataSourceTransactionManager(dataSource);
}
If you still want to do this through the configuration class, specify that as the parameter:
public DataSourceTransactionManager transactionManagerOracle(IExtendedDataSourceConfig dsconfig) {
    return new DataSourceTransactionManager(dsconfig.oracleDataSource());
}
Either way you declare a direct dependency on another bean, and Spring will make sure that the dependency exists and is injected.

Spring Data JPA Transaction - No Transaction in progress - Spring Data Neo4j

I think I'm missing something obvious. I am trying to persist an entity to a database via a JUnit test case, however it doesn't seem to be persisted, because there is no active transaction.
Configuration:
@Configuration
@EnableTransactionManagement
public class TransactionConfig {

    @Inject
    private EntityManagerFactory entityManagerFactory;

    @Bean
    public JpaTransactionManager transactionManager() {
        return new JpaTransactionManager(entityManagerFactory);
    }
TestCase:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = { Application.class })
@ActiveProfiles(CommonConstants.SPRING_PROFILE_TEST)
@IntegrationTest
@WebAppConfiguration
public class UserRepositoryTest {

    @Inject
    UserRepository userRepo;

    @Test
    @Rollback(false)
    @Transactional("transactionManager")
    public void addUser() {
        User user = BootstrapDataPopulator.getUser();
        userRepo.save(user);
        System.out.println(user.getId()); // successfully outputs the id generated by hibernate
        assertNotNull(user.getId());
    }
}
This test case executes successfully, however I do not see any entities persisted in the database as expected.
When I change userRepo.save(user) to userRepo.saveAndFlush(user) I get the following exception:
javax.persistence.TransactionRequiredException: no transaction is in progress
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.checkTransactionNeeded(AbstractEntityManagerImpl.java:1171)
at org.hibernate.jpa.spi.AbstractEntityManagerImpl.flush(AbstractEntityManagerImpl.java:1332)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
Spring Boot AutoConfiguration Report: http://dumptext.com/YcGaR3Wf
Names of all Spring Beans Initialized: http://dumptext.com/jp9O6l8v
I am using Spring Data Neo4j (SDN) in my application as well. SDN comes with a default class Neo4jConfiguration which has:
@Bean(name = {"neo4jTransactionManager", "transactionManager"})
@Qualifier("neo4jTransactionManager")
public PlatformTransactionManager neo4jTransactionManager() throws Exception {
    return new JtaTransactionManagerFactoryBean(getGraphDatabaseService()).getObject();
}
The "transactionManager" overrides the bean defined in my TransactionConfig class. Hence the reason no Entity transaction was in progress. I stopped using the SDN class Neo4jConfiguration. This resolved my issue.

Injecting an object into a @Repository loaded via component scan, without @Autowired

I have the following Dao class defined:
@Repository
public class MyDao {

    private JdbcTemplate jdbcTemplate;

    private String myString;

    @Autowired
    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
I'm using component scanning over the package where MyDao is defined so I do not have a bean definition for MyDao in my Spring configuration file. Is there a way to inject a String into myString without using autowiring? What are my alternatives for this?
Spring comes with the @Value annotation that you can use to inject a string:
http://static.springsource.org/spring/docs/3.0.x/javadoc-api/org/springframework/beans/factory/annotation/Value.html
http://chrislovecnm.com/2010/03/08/spring-3-java-based-configuration-with-value/
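For example, a minimal sketch of MyDao with a property placeholder injected into the field (the property key and default value are hypothetical):
@Repository
public class MyDao {

    private JdbcTemplate jdbcTemplate;

    // resolved from the environment / property sources, no autowiring of a String bean needed
    @Value("${mydao.my-string:default-value}")
    private String myString;

    @Autowired
    public void setDataSource(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }
}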
There is also a configuration framework called Constretto that allows nested configurations (like json) to be injected.
