Spring @Transactional not starting a transaction

I have a multi-module Maven project with the following structure:
project
|-- data
|   `-- DepartmentRepository.java
|-- domain
|   `-- Department.java
|-- service
|   |-- DepartmentService.java
|   `-- DepartmentServiceImpl.java
`-- web
    `-- DepartmentController.java
The project uses Spring 4.1.4, Spring Data JPA 1.7.2 and Maven 3.1.0. The following classes are included:
@Entity
class Department {}

interface DepartmentRepository extends JpaRepository<Department, Long> {}

interface DepartmentService {
    List<Department> getAll();
}

@Service
@Transactional(readOnly = true)
class DepartmentServiceImpl implements DepartmentService {

    @Autowired
    private DepartmentRepository departmentRepository;

    @Transactional
    public List<Department> getAll() {
        return departmentRepository.findAll();
    }
}
I was hoping that as soon as the code enters DepartmentServiceImpl.getAll, a transaction would be started. However, I am finding that this is not the case: no transaction is started. I have checked this by examining TransactionSynchronizationManager.isActualTransactionActive() inside the method, which prints false, and by putting breakpoints in TransactionAspectSupport.invokeWithinTransaction. However, as soon as departmentRepository.findAll is invoked, a transaction is correctly started (since SimpleJpaRepository, the class that provides the implementation of the JPA repository interface, is itself annotated with @Transactional).
A complete sample application demonstrating the problem is available on Github.

I noticed your annotation-driven mode is set to aspectj
<transaction:annotation-driven mode="aspectj"/>
but you don't seem to have a load-time weaver defined anywhere in your context.
That may or may not be the issue, as I only took a quick look. Also, I don't see why you would need AspectJ mode over the default proxy mode with what you have, so you might be fine just removing mode="aspectj" altogether and defaulting to proxy.
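For reference, here is a minimal sketch of the default proxy-based setup in Java config (the class and bean layout is illustrative, not taken from the sample project):

import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement // defaults to AdviceMode.PROXY, so no load-time weaver is required
public class TransactionConfig {

    // Illustrative bean; point it at the EntityManagerFactory your project already defines.
    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}

In the XML configuration from the question, the equivalent change is simply dropping mode="aspectj" from transaction:annotation-driven (or, if AspectJ mode is really required, additionally declaring a load-time weaver).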

@Cacheable testing over method

I have an @Cacheable method inside a class.
I expect the cache to be populated after the first call to that method, so the second call shouldn't enter getCacheLeads at all.
@Service
public class LeadService {

    @Autowired
    private LeadRepository leadRepository;

    @Autowired
    public LeadService(LeadRepository leadRepository) {
        this.leadRepository = leadRepository;
    }

    public void calculateLead(Lead leadBean) {
        Lead lead = this.getCacheLeads(leadBean);
    }

    @Cacheable(cacheNames = "leads", key = "#leadBean.leadId")
    public Lead getCacheLeads(Lead leadBean) {
        Lead result = leadRepository.findByLeadId(leadBean.getLeadId());
        // logic to transform the Lead object
        return result;
    }
}
But during testing the cache is never used: I call the service twice with the same parameter (via serviceIsCalled) and then check how many times the repository was hit.
@ExtendWith(SpringExtension.class)
public class LeadServiceTest {

    private LeadService leadService;

    @Mock
    private LeadRepository leadRepository;

    @Autowired
    CacheManager cacheManager;

    @BeforeEach
    public void setUp() {
        leadService = new LeadService(leadRepository);
    }

    @Configuration
    @EnableCaching
    static class Config {

        @Bean
        CacheManager cacheManager() {
            return new ConcurrentMapCacheManager("leads");
        }
    }

    @Test
    public void testLead() {
        givenData();
        serviceIsCalled();
        serviceIsCalled();
        checkDataArray();
    }

    private void givenData() {
        Lead lead = new Lead();
        lead.setLeadId("DC635EA19A39EA128764BB99052E5D1A9A");
        Mockito.when(leadRepository.findByLeadId(any()))
                .thenReturn(lead);
    }

    private void serviceIsCalled() {
        Lead lead = new Lead();
        lead.setLeadId("DC635EA19A39EA128764BB99052E5D1A9A");
        leadService.calculateLead(lead);
    }

    private void checkDataArray() {
        verify(leadRepository, times(1)).findByLeadId(anyString());
    }
}
Why is the repository called twice?
You have a lot of things going on here, and someone looking at this and answering your question would definitely have to read between the lines.
First, your Spring configuration is not even correct. You are declaring the names of all the caches used by your Spring application (and tests) "statically", using the ConcurrentMapCacheManager constructor that accepts an array of cache names as its argument.
NOTE: Caches identified explicitly by name, and only these caches, are available at runtime.
@Bean
CacheManager cacheManager() {
    return new ConcurrentMapCacheManager("LEAD_DATA");
}
In this case, your one and only cache is named "LEAD_DATA".
NOTE: Only the no-arg ConcurrentMapCacheManager constructor allows caches to be created dynamically, by name, at runtime.
But then, in the @Cacheable getCacheLeads(:Lead) method of your @Service LeadService class, you declare the cache to use as "leads".
@Service
public class LeadService {

    @Cacheable(cacheNames = "leads", key = "#leadBean.leadId")
    public Lead getCacheLeads(Lead leadBean) {
        // ...
    }
}
This misconfiguration will actually lead to an exception at runtime similar to the following:
java.lang.IllegalArgumentException: Cannot find cache named 'leads' for Builder[public io.stackoverflow.questions.spring.cache.StaticCacheNamesIntegrationTests$Lead io.stackoverflow.questions.spring.cache.StaticCacheNamesIntegrationTests$LeadService.load(io.stackoverflow.questions.spring.cache.StaticCacheNamesIntegrationTests$Lead)] caches=[leads] | key='#lead.id' | keyGenerator='' | cacheManager='' | cacheResolver='' | condition='' | unless='' | sync='false'
at org.springframework.cache.interceptor.AbstractCacheResolver.resolveCaches(AbstractCacheResolver.java:92)
at org.springframework.cache.interceptor.CacheAspectSupport.getCaches(CacheAspectSupport.java:252)
at org.springframework.cache.interceptor.CacheAspectSupport$CacheOperationContext.<init>(CacheAspectSupport.java:724)
at org.springframework.cache.interceptor.CacheAspectSupport.getOperationContext(CacheAspectSupport.java:265)
at org.springframework.cache.interceptor.CacheAspectSupport$CacheOperationContexts.<init>(CacheAspectSupport.java:615)
at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:345)
at org.springframework.cache.interceptor.CacheInterceptor.invoke(CacheInterceptor.java:64)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:698)
at io.stackoverflow.questions.spring.cache.StaticCacheNamesIntegrationTests$LeadService$$EnhancerBySpringCGLIB$$86664246.load(<generated>)
...
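The straightforward fix, shown as a minimal sketch of the test configuration (assuming you keep the cache name used by @Cacheable), is to make both sides agree on the name, or to let the manager create caches on demand:

@Configuration
@EnableCaching
static class Config {

    @Bean
    CacheManager cacheManager() {
        // Register the exact name the @Cacheable annotation uses ("leads")...
        return new ConcurrentMapCacheManager("leads");
        // ...or use the no-arg constructor so caches are created lazily by name:
        // return new ConcurrentMapCacheManager();
    }
}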
Additionally, I don't see anything outside of the LeadService bean calling the @Cacheable getCacheLeads(..) method. Inside your test, you are calling:
leadService.calculateLead(lead);
As follows:
private void serviceIsCalled() {
    Lead lead = new Lead();
    lead.setLeadId("DC635EA19A39EA128764BB99052E5D1A9A");
    leadService.calculateLead(lead);
}
If the calculateLead(:Lead) LeadService method calls the @Cacheable getCacheLeads(:Lead) method internally, the caching functionality will not kick in, because you are already "behind" the AOP proxy that Spring sets up to "enable" caching behavior for your LeadService bean.
See the Spring Framework AOP documentation on this matter.
NOTE: Spring's Cache Abstraction, like Spring's Transaction Management, is built on the Spring AOP infrastructure, as are many other things in Spring.
In your case this means:
Test -> <PROXY> -> LeadService.calculateLead(:Lead) -> LeadService.getCacheLeads(:Lead)
However, between LeadService.calculateLead(:Lead) and LeadService.getCacheLeads(:Lead) NO PROXY is involved, therefore Spring's caching behavior will not be applied.
Only
Test (or some other bean) -> <PROXY> -> LeadService.getCacheLeads(:Lead)
will result in the AOP proxy decorated with the caching interceptors being invoked and the caching behavior being applied.
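One way around the self-invocation problem, shown here as a hedged sketch (LeadFacade is a made-up class, not from the original code), is to let the caller reach getCacheLeads(:Lead) through the Spring proxy, for example from another bean:

@Service
public class LeadFacade {

    private final LeadService leadService; // the injected reference is the caching proxy, not the raw object

    public LeadFacade(LeadService leadService) {
        this.leadService = leadService;
    }

    public void calculateLead(Lead leadBean) {
        // The call crosses the proxy boundary, so the @Cacheable interceptor runs.
        Lead lead = leadService.getCacheLeads(leadBean);
        // ... rest of the calculation ...
    }
}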
You can see that your use case works correctly when configured and used properly, as demonstrated in my example test class, modeled after your domain.
Look for the comments that explain why the configuration fails in your case.

Primary/secondary datasource failover in Spring MVC

I have a Java web application developed on the Spring framework which uses MyBatis. I see that the datasource is defined in beans.xml. Now I want to add a secondary datasource as a backup: if the application is not able to connect to the DB and gets an error, or if the server is down, then it should connect to a different datasource. Is there a configuration in Spring to do this, or do we have to code it manually in the application?
I have seen primary and secondary annotations mentioned for Spring Boot but nothing for plain Spring. I could achieve this in my code where the connection is created/retrieved, by connecting to the secondary datasource if the connection to the primary datasource fails or times out, but I wanted to know if this can be achieved by making changes just in the Spring configuration.
Let me clarify things one by one:
Spring (not just Spring Boot) has a @Primary annotation, but there is no @Secondary annotation.
The purpose of the @Primary annotation is not what you have described. Spring does not automatically switch data sources in any way. @Primary merely tells Spring which of several candidate beans (for example, which DataSource) to inject when no specific one is requested. For more detail on this see https://www.baeldung.com/spring-data-jpa-multiple-databases
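For illustration only (the bean names and URLs below are made up, not from the question), @Primary simply marks the default candidate when two DataSource beans exist; it adds no failover logic:

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class DataSourceConfig {

    @Bean
    @Primary // injected wherever a plain DataSource is autowired without a qualifier
    public DataSource primaryDataSource() {
        return new DriverManagerDataSource("jdbc:postgresql://primary-host/app", "user", "pass");
    }

    @Bean
    public DataSource backupDataSource() {
        // Never used automatically; something must explicitly ask for this bean.
        return new DriverManagerDataSource("jdbc:postgresql://backup-host/app", "user", "pass");
    }
}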
Now, how do we actually switch data sources when one goes down?
Most people don't manage this kind of high availability in application code. The usual approach is to run two master database instances in active-passive mode, kept in sync, and to use something like keepalived for automatic failover. This is also a highly subjective and contentious topic, and there is a lot to consider: can you afford replication lag, are there slaves running for each master (because then the slaves have to be switched as well, since the old master's slaves would now be out of sync), and so on. If you have databases spread across regions, this becomes even more difficult (read: awesome) and requires yet more engineering, planning, and design.
Now, since the question specifically mentions doing this in application code, there is one thing you can do. I don't advise using it in production though. EVER. You can create an AspectJ advice around all your primary transactional methods using your own custom annotation. Let's call this annotation @SmartTransactional for our demo.
Sample code (untested):
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface SmartTransactional {}

public class SomeServiceImpl implements SomeService {

    @SmartTransactional
    @Transactional("primaryTransactionManager")
    public boolean someMethod() {
        // call a common method here for code reusability, or create an abstract class
    }
}

public class SomeServiceSecondaryTransactionImpl implements SomeService {

    @Transactional("secondaryTransactionManager")
    public boolean someMethod() { // same signature as the primary impl so the aspect can invoke it reflectively
        // call a common method here for code reusability, or create an abstract class
    }
}

@Component
@Aspect
public class SmartTransactionalAspect {

    @Autowired
    private ApplicationContext context;

    @Pointcut("@annotation(...SmartTransactional)") // use the fully qualified annotation name here
    public void smartTransactionalAnnotationPointcut() {
    }

    @Around("smartTransactionalAnnotationPointcut()")
    public Object methodsAnnotatedWithSmartTransactional(final ProceedingJoinPoint joinPoint) throws Throwable {
        Method method = ((MethodSignature) joinPoint.getSignature()).getMethod();
        Object result = joinPoint.proceed();
        boolean failure = Boolean.TRUE; // check whether the result indicates a failure
        if (failure) {
            // get the class name from the joinPoint and append 'SecondaryTransactionImpl' instead of 'Impl'
            String secondaryTransactionManagerBeanName = "";
            Object bean = context.getBean(secondaryTransactionManagerBeanName);
            result = bean.getClass()
                    .getMethod(method.getName(), method.getParameterTypes())
                    .invoke(bean, joinPoint.getArgs());
        }
        return result;
    }
}

How to use @Autowired in a class annotated with @Entity?

I have an entity called TimeBooking. When I return this entity to the client, I want it to include a list of ActivityTimeBookings fetched from a repository. But when the function gets called, the repository is null.
So I tried to @Autowired the repository, marked it as transient, and told Spring (via @Configurable) that there is a dependency to be injected.
@Configurable(preConstruction = true)
@Entity
public class TimeBooking extends BaseEntity {

    @Autowired
    private transient ActivityTimeBookingRepository activityTimeBookingRepository;

    ...

    @JsonProperty("activityTimeBookings")
    private List<ActivityTimeBooking> activityTimeBookings() {
        return this.activityTimeBookingRepository.findByDate(this.timeFrom);
    }
}
Any suggestions?
Using @Autowired in a class annotated with @Entity is a bad practice.
The solution is given below:
1. Create a service interface:
public interface TimeBookingService {
    public List<ActivityTimeBooking> activityTimeBookings(LocalDateTime timeFrom);
}
2. Create an implementation of the service interface:
@Service
public class TimeBookingServiceImpl implements TimeBookingService {

    @Autowired
    private ActivityTimeBookingRepository activityTimeBookingRepository;

    public List<ActivityTimeBooking> activityTimeBookings(LocalDateTime timeFrom) {
        // The date to query by is passed in (e.g. TimeBooking.timeFrom) instead of being read from the entity.
        return this.activityTimeBookingRepository.findByDate(timeFrom);
    }
}
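A hedged sketch of the calling side (the controller, TimeBookingRepository, getTimeFrom() and the endpoint path are assumptions for illustration): the web layer asks the service for the related bookings instead of the entity loading them itself.

@RestController
public class TimeBookingController {

    @Autowired
    private TimeBookingRepository timeBookingRepository; // assumed repository for the TimeBooking entity

    @Autowired
    private TimeBookingService timeBookingService;

    @GetMapping("/time-bookings/{id}")
    public Map<String, Object> getTimeBooking(@PathVariable Long id) {
        TimeBooking booking = timeBookingRepository.findById(id).orElseThrow();
        Map<String, Object> response = new HashMap<>();
        response.put("timeBooking", booking);
        // The service, not the entity, loads the related ActivityTimeBookings.
        response.put("activityTimeBookings",
                timeBookingService.activityTimeBookings(booking.getTimeFrom()));
        return response;
    }
}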
Usually it's indeed a bad practice to inject anything into JPA entities.
Entities are usually created by the JPA implementation (like Hibernate), and Spring, as a DI framework, doesn't really participate in that process.
Note that many instances of this class can be created as the result of a query, so if you later use this approach while serializing a list of such objects, you might end up running N queries against the database when N entities are retrieved.
Answering your question about "getting access to the repo", I believe you should consider refactoring:
In the service class (assuming you have a "regular" controller, service and DAO layering), you can do something like:
class MyService {

    MyDao dao; // injected

    SomeResult doSomething() {
        List<TimeBooking> allTimeBookings = dao.getAllTimeBookings();
        LocalDateTime timeFrom = calculateTimeFrom(allTimeBookings);
        List<ActivityTimeBooking> allActivityTimeBookings = dao.findByDate(timeFrom);
        return calculateResults(allTimeBookings, allActivityTimeBookings);
    }
}
class MyDao {
    List<ActivityTimeBooking> findByDate(LocalDateTime timeFrom) {...}
    List<TimeBooking> getAllTimeBookings() {...}
}
Regarding the service implementation, I've assumed this use case can't be covered by a usual "JOIN between two tables", so creating an association between TimeBooking and ActivityTimeBooking is not an option.
Note 2: I've used one repository (DAO) for brevity; in a real application you might want to inject two different repositories into the service.

Spring data with multiple modules not working

I'm trying to set up a project with two data sources: one is MongoDB and the other is Postgres. I have repositories for each data source in different packages and I annotated my main class as follows:
@Import({MongoDBConfiguration.class, PostgresDBConfiguration.class})
@SpringBootApplication(exclude = {
        MongoRepositoriesAutoConfiguration.class,
        JpaRepositoriesAutoConfiguration.class
})
public class TemporaryRunner implements CommandLineRunner {
    ...
}
MongoDBConfiguration:
@Configuration
@EnableMongoRepositories(basePackages = {
        "com.example.datastore.mongo",
        "com.atlassian.connect.spring"})
public class MongoDBConfiguration {
    ...
}
PostgresDBConfiguration:
@Configuration
@EnableJpaRepositories(basePackages = {
        "com.example.datastore.postgres"
})
public class PostgresDBConfiguration {
    ...
}
And even though I specified the base packages as described in the documentation, I still get these messages in the console:
13:10:44.238 [main] [] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Multiple Spring Data modules found, entering strict repository configuration mode!
13:10:44.266 [main] [] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data MongoDB - Could not safely identify store assignment for repository candidate interface com.atlassian.connect.spring.AtlassianHostRepository.
I managed to solve this issue for all my repositories by using MongoRepository and JpaRepository, but AtlassianHostRepository comes from an external library and is a regular CrudRepository (which totally makes sense, because the consumer of the library can decide what type of DB to use). Anyway, it looks like the basePackages I specified are completely ignored and not used in any way; even though I specified the com.atlassian.connect.spring package only in @EnableMongoRepositories, Spring Data somehow can't figure out which data module should be used.
Am I doing something wrong? Is there any other way I could tell Spring Data to use Mongo for AtlassianHostRepository without changing AtlassianHostRepository itself?
The only working solution I found was to let Spring Data ignore AtlassianHostRepository (because it couldn't figure out which data source to use), then create a separate configuration for it and simply build the repository by hand:
@Configuration
@Import({MongoDBConfiguration.class})
public class AtlassianHostRepositoryConfiguration {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public AtlassianHostRepositoryConfiguration(final MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Bean
    public AtlassianHostRepository atlassianHostRepository() {
        RepositoryFactorySupport factory = new MongoRepositoryFactory(mongoTemplate);
        return factory.getRepository(AtlassianHostRepository.class);
    }
}
This solution works fine for a small or limited number of repositories coming from a library; it would be rather cumbersome to create all the repositories by hand if there were more of them. But after reading the source code of spring-data, I see no way to make it work with basePackages as stated in the documentation (I may be wrong though).
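If several such store-agnostic repositories from external libraries had to be wired to Mongo, the same factory could be reused; a hedged sketch (the second repository mentioned in the comment is hypothetical):

@Configuration
@Import({MongoDBConfiguration.class})
public class ExternalMongoRepositoriesConfiguration {

    private final MongoRepositoryFactory factory;

    @Autowired
    public ExternalMongoRepositoriesConfiguration(final MongoTemplate mongoTemplate) {
        this.factory = new MongoRepositoryFactory(mongoTemplate);
    }

    @Bean
    public AtlassianHostRepository atlassianHostRepository() {
        return factory.getRepository(AtlassianHostRepository.class);
    }

    // Any other library repository that should be Mongo-backed can be registered the same way, e.g.:
    // @Bean
    // public SomeOtherLibraryRepository someOtherLibraryRepository() {
    //     return factory.getRepository(SomeOtherLibraryRepository.class);
    // }
}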

maven-surefire-plugin runs a single method, but fails on the class

I wrote tests that require transactions; the base class looks like:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = ExchangeApp.class)
@EnableTransactionManagement(proxyTargetClass = true, mode = AdviceMode.PROXY)
@ActiveProfiles({JHipsterConstants.SPRING_PROFILE_TEST})
public abstract class AbstractServiceTest {
So when I run a single test method:
mvn test -Dtest=TestClassName#method1
it works as expected, but
mvn test -Dtest=TestClassName
fails with weird exceptions: a constraint violation on a @OneToMany association, and an ArithmeticException from a BigDecimal division. I get the same exceptions when I run the class in the IDE. It looks like transaction management goes missing. Any ideas?
java.lang.ArithmeticException: Non-terminating decimal expansion; no exact representable decimal result.
org.springframework.dao.DataIntegrityViolationException: could not execute statement; SQL [n/a]; constraint ["FK_OPEN_EXEC_ID: PUBLIC.ORDER_PAIR_OPEN_EXEC FOREIGN KEY(EXECUTIONS_ID) REFERENCES PUBLIC.ORDER_PAIR_OPEN(ID) (2)"; SQL statement:
delete from order_pair_open where id=? [23503-197]]; nested exception is org.hibernate.exception.ConstraintViolationException: could not execute statement
UPD: I have also already tried:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>${maven-surefire-plugin.version}</version>
    <configuration>
        <!-- Force alphabetical order to have a reproducible build -->
        <runOrder>alphabetical</runOrder>
        <parallel>classes</parallel>
        <threadCountClasses>1</threadCountClasses>
        <threadCountMethods>1</threadCountMethods>
        <threadCountSuites>1</threadCountSuites>
    </configuration>
</plugin>
UPD: this is specific to my case. I am trying to test a service with an @Async method inside, so it seems I have to mark the test method with @Transactional in order to enable transaction support; that's why I tried @EnableTransactionManagement(proxyTargetClass = true, mode = AdviceMode.PROXY) to enable transaction management for the whole test class. Here is the pseudo-code:
class Service1Test extends AbstractServiceTest {

    @Autowired
    Service1 service1;

    @Autowired
    Repo1 repo1;

    // Does not work when the whole class runs, but works when the method runs alone.
    // When I mark this method with @Transactional, the mentioned exceptions are gone,
    // but I can't check the result since "registerSynchronization" was not called.
    @Test
    public void test1() throws InterruptedException {
        service1.method1();
        synchronized (this) {
            wait(2000L);
        }
        assertThat(repo1.findAll().size()).isEqualTo(1);
        // repoN check
    }
}

@Service
@Transactional
class Service1 {

    @Autowired
    Service2 service2;

    @Async
    public void method1() {
        // DB operations...
        TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronizationAdapter() {
            @Override
            public void afterCommit() {
                service2.method2();
            }
        });
    }
}

@Service
class Service2 {

    @Autowired
    Repo1 repo1;

    public void method2() {
        repo1.save(new Entity1());
    }
}

@Service
class Service3 {

    @Autowired
    private ScheduledExecutorService scheduler;

    public void method3() {
        scheduler.schedule(() -> {
            // other transactional services call
        }, 1L, TimeUnit.SECONDS);
    }
}

@Repository
interface Repo1 extends JpaRepository<Entity1, Long> {
}

@Entity
class Entity1 {
}
I can't comment on the JHipster side, but one possible reason is that the "test transactional" behavior is not applied to your test code.
Just to clarify: by default, if you mark a test as @Transactional, Spring opens one transaction, the test runs, and when it's done (regardless of whether it passed or failed) the transaction is rolled back, effectively cleaning up the database.
Now, I don't see this in your tests, so you're probably not using this behavior.
Note that this is pure Spring, not Spring Boot.
Now regarding the Spring Boot part.
If you use @SpringBootTest with a concrete configuration, ExchangeApp in this case, chances are that it won't load any auto-configurations (for example those that set up transactions, data source management, etc.).
If you want to "mimic" the load of the microservice, you should run @SpringBootTest without explicit configurations, but that's beyond the scope of the question.
A "by the book" Spring Boot way to test a DAO with Hibernate is @DataJpaTest, which loads only the database-related stuff, but it can't be combined with @SpringBootTest: you have to choose one.
So for me it's clear that the test does something tricky and definitely not something that follows Spring Boot conventions, so probably Spring/Spring Boot strikes back :)
Now, regarding the asynchronous stuff. This can also contribute to the mess, because transaction support in Spring relies heavily on the ThreadLocal concept: when the code runs on a new thread (from another thread pool or similar), the transaction information does not propagate, so Spring can't tell that it is still inside the same transaction. I see that you use TransactionSynchronizationManager, but without debugging it's hard to tell what happens.
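A quick way to see this, as a hedged diagnostic sketch (TxDiagnostics is a made-up bean, not part of the original code, and assumes async execution is enabled via @EnableAsync), is to log whether a transaction is active on both sides of the async boundary:

import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.transaction.support.TransactionSynchronizationManager;

@Service
public class TxDiagnostics {

    public void calledFromTheTest() {
        System.out.println("caller thread, tx active: "
                + TransactionSynchronizationManager.isActualTransactionActive());
    }

    @Async
    public void calledAsynchronously() {
        // Runs on another thread: the caller's ThreadLocal-bound transaction is not visible here.
        System.out.println("async thread, tx active: "
                + TransactionSynchronizationManager.isActualTransactionActive());
    }
}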
Now, in order to check why the transaction doesn't get propagated, I think you should debug the application and check:
1. whether the services are wrapped in a proxy that supports transactions (that's what @Transactional does, assuming the relevant BeanPostProcessor was applied);
2. that during each step you're actually inside a transaction;
3. whether putting @Transactional on the test / test case helps, so that it also cleans up the changes applied during the test (a minimal sketch follows below).
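For the third point, a minimal sketch of what that could look like with the JUnit 4 / SpringRunner setup from the question (the test class name and body are illustrative only):

import org.junit.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.annotation.Transactional;

public class Service1TransactionalTest extends AbstractServiceTest {

    @Autowired
    private Repo1 repo1;

    @Test
    @Transactional // the test runs in one transaction that is rolled back afterwards, cleaning up the DB
    public void savesEntityWithinTestTransaction() {
        repo1.save(new Entity1());
        // Assertions made here, on the same thread, see the uncommitted data of the test transaction.
        // Work done on other threads (@Async, schedulers) is NOT part of this transaction.
    }
}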
