I am totally confused by an issue with Spring Data + Hibernate.
We have a RESTful service which we are migrating to V2.
So the old controller looks like this:
@Api(tags = {"assignments"})
@RestController
@CheckedTransactional
public class AssignmentListController {

    @Inject
    private AssignmentListService assignmentListService;

    // REST function
    public list() { .... }
}
The REST function list calls AssignmentListService to load the assignments, a collection, some of whose data is fetched lazily. It works fine.
What I did was copy this controller into a new class named AssignmentListControllerV2, which looks like this:
@Api(tags = {"assignments"})
@RestController
@CheckedTransactional
public class AssignmentListControllerV2 {

    @Inject
    private AssignmentListService assignmentListService;

    @Inject
    private AssignmentDtoMapper assignmentDtoMapper;

    public list() { ... }
}
The code is the same except for the added AssignmentDtoMapper bean, which is generated with MapStruct.
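For context, such a MapStruct mapper typically looks something like the sketch below; the Assignment/AssignmentDto types and the method names are assumptions, not the actual code:

import java.util.List;

import org.mapstruct.Mapper;

// Hypothetical sketch; componentModel = "spring" makes the generated mapper an injectable bean.
@Mapper(componentModel = "spring")
public interface AssignmentDtoMapper {

    // Mapping calls the entity's getters, including lazily fetched associations,
    // so it has to run while a Hibernate session is still open.
    AssignmentDto toDto(Assignment assignment);

    List<AssignmentDto> toDtos(List<Assignment> assignments);
}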
Now the problem: when I call this new service, I somehow get a lazy load exception. The error is
could not initialize proxy - no Session
I desperately need some help, as I have no clue what's happening. I have just copied the code into a new class and it's failing.
The exception is actually pretty clear: Hibernate can't load the lazily fetched member because there is no persistence context open when you hit it.
I suppose that in V2 the
@Inject
private AssignmentDtoMapper assignmentDtoMapper;
is there to map a JPA business entity into a DTO?
That is probably the source of the exception, if you try to map a member there that has not been loaded.
If you want to avoid the exception on an uninitialized proxy, you can try something like
// requires org.hibernate.proxy.HibernateProxy
public boolean isProxyInitialized(Object obj) {
    if (obj instanceof HibernateProxy) {
        HibernateProxy proxy = (HibernateProxy) obj;
        return !proxy.getHibernateLazyInitializer().isUninitialized();
    }
    return obj != null;
}
It should return true if the member has been fetched, otherwise false.
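Alternatively, Hibernate's own utility class covers both entity proxies and persistent collections. A minimal sketch, placed inside the transactional service or controller method; the Assignment type and its lazy getSubTasks() association are assumptions:

import org.hibernate.Hibernate;

// Hypothetical guard: make sure the lazy association is loaded while the
// session is still open, before handing the entity to the DTO mapper.
public AssignmentDto toDtoSafely(Assignment assignment) {
    if (!Hibernate.isInitialized(assignment.getSubTasks())) {
        Hibernate.initialize(assignment.getSubTasks());
    }
    return assignmentDtoMapper.toDto(assignment);
}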
Related
I'm not sure if anyone has experienced this particular twist of the LazyInitialization issue.
I have a console Spring Boot application (so: no views - everything basically happens within a single execution method). I have the standard beans and bean repositories, and the typical lazy relationship in one of the beans, and it turns out that unless I specify @Transactional, any access to any lazy collection automatically fails even though the session stays the same, is available, and is open. In fact, I can happily do any session-based operation as long as I don't try to access a lazy collection.
Here's a more detailed example:
@Entity
class Path {
    ...
    @OneToMany(mappedBy = "path", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
    @OrderBy("projectOrder")
    public List<Project> getProjects() {
        return projects;
    }
}
Now, the main method does something as simple as this:
class SpringTest {
    ...
    @Autowired
    private PathRepository pathRepository;

    void foo() {
        Path path = pathRepository.findByNameKey("...");
        System.out.println(path.getProjects()); // Boom <- LazyInitializationException
    }
}
Of course, if I slap a @Transactional on top of the method or class it works, but the point is: why should I need that? No one is closing the session, so why is Hibernate complaining that there's no session when there is one?
In fact, if I do:
void foo() {
    System.out.println(entityManager.unwrap(Session.class));
    System.out.println(entityManager.unwrap(Session.class).isOpen());
    Path basic = pathRepository.findByNameKey("...");
    System.out.println(entityManager.unwrap(Session.class));
    System.out.println(entityManager.unwrap(Session.class).isOpen());
    System.out.println(((AbstractPersistentCollection) basic.projects).getSession());
    Path p1 = pathRepository.findByNameKey("....");
}
I can see that the session object stays the same the whole time and stays open the whole time, but the internal session property of the collection is never set to anything other than null. So of course, when Hibernate tries to read that collection, its withTemporarySessionIfNeeded method immediately throws an exception:
private <T> T withTemporarySessionIfNeeded(LazyInitializationWork<T> lazyInitializationWork) {
    SharedSessionContractImplementor tempSession = null;
    if (this.session == null) {
        if (this.allowLoadOutsideTransaction) {
            tempSession = this.openTemporarySessionForLoading();
        } else {
            this.throwLazyInitializationException("could not initialize proxy - no Session");
        }
    }
    ...
So I guess my question would be - why is this happening? Why doesn't Hibernate store or access the session from which a bean was fetched so that it can load the lazy collection from it?
Digging a bit deeper, it turns out that the repository method executing the query does this:
// method here is java.lang.Object org.hibernate.query.Query.getSingleResult()
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
    ...
    if (SharedEntityManagerCreator.queryTerminatingMethods.contains(method.getName())) {
        ...
        EntityManagerFactoryUtils.closeEntityManager(this.entityManager); // <--- Why?!?!?
        this.entityManager = null;
    }
    ...
}
and the above closeEntityManager calls unsetSession on all collections:
SharedSessionContractImplementor session = this.getSession();
if (this.collectionEntries != null) {
    IdentityMap.onEachKey(this.collectionEntries, (k) -> {
        k.unsetSession(session);
    });
}
But why?!
(Spring Boot version is 2.7.8)
So, after researching more, it appears that this is standard behavior in Spring Boot: unless you use your own EntityManager, the one managed automatically by Spring is either attached to a @Transactional boundary, or opened and closed for each query.
Some relevant links:
Does Entity manager needs to be closed every query?
Do I have to close() every EntityManager?
In the end, I ended up using a TransactionTemplate to wrap my code in a transaction without having to mark the whole class @Transactional.
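For reference, a minimal sketch of that approach, reusing the PathRepository from the example above (Spring Boot auto-configures the TransactionTemplate from the JPA transaction manager):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.transaction.support.TransactionTemplate;

class SpringTest {

    @Autowired
    private PathRepository pathRepository;

    @Autowired
    private TransactionTemplate transactionTemplate;

    void foo() {
        // The callback runs inside a transaction, so the collection keeps its
        // session and the lazy access no longer throws.
        transactionTemplate.executeWithoutResult(status -> {
            Path path = pathRepository.findByNameKey("...");
            System.out.println(path.getProjects());
        });
    }
}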
I have a problem with saving some values in a @Service method.
My code:
@Service(value = "SettingsService")
public class SettingsService {
    ...
    public String getGlobalSettingsValue(Settings setting) {
        getTotalEhCacheSize();
        if (!setting.getGlobal()) {
            throw new IllegalStateException(setting.name() + " is not global setting");
        }
        GlobalSettings globalSettings = globalSettingsRepository.findBySetting(setting);
        if (globalSettings != null)
            return globalSettings.getValue();
        else
            return getGlobalEnumValue(setting);
    }

    @Cacheable(value = "noTimeCache", key = "#setting.name()")
    public String getGlobalEnumValue(Settings setting) {
        return Settings.valueOf(setting.name()).getDefaultValue();
    }
}
My repository class:
@Repository
public interface GlobalSettingsRepository extends CrudRepository<GlobalSettings, Settings> {

    @Cacheable(value = "noTimeCache", key = "#setting.name()", unless = "#result == null")
    GlobalSettings findBySetting(Settings setting);
}
It should work like this:
get the value from the DB if the data exists,
if not, fall back to the value from the enum.
But it doesn't save any data, neither from the DB nor from the enum.
My cache config:
@Configuration
@EnableCaching
public class CacheConfig {

    @Bean
    public EhCacheCacheManager cacheManager(CacheManager cm) {
        return new EhCacheCacheManager(cm);
    }

    @Bean
    public EhCacheManagerFactoryBean ehcache() {
        EhCacheManagerFactoryBean ehCacheManagerFactoryBean = new EhCacheManagerFactoryBean();
        ehCacheManagerFactoryBean.setConfigLocation(new ClassPathResource("ehcache.xml"));
        return ehCacheManagerFactoryBean;
    }
}
I have an example REST method in my project to make sure that the cache is working:
@RequestMapping(value = "/system/status", method = RequestMethod.GET, produces = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<?> systemStatus() {
    Object[] list = userPuzzleRepository.getAverageResponseByDateBetween(startDate, endDate);
    ...
}

public interface UserPuzzleRepository extends CrudRepository<UserPuzzle, Long> {

    @Cacheable(value = "averageTimeAnswer", key = "#startDate")
    @Query("select AVG(case when up.status='SUCCESS' OR up.status='FAILURE' OR up.status='TO_CHECK' then up.solvedTime else null end) from UserPuzzle up where up.solvedDate BETWEEN ?1 AND ?2")
    Object[] getAverageResponseByDateBetween(Timestamp startDate, Timestamp endDate);
}
and it works well.
What am I doing wrong?
You have two methods in your SettingsService: one that is cached (getGlobalEnumValue(...)) and one that isn't cached but calls the cached one (getGlobalSettingsValue(...)).
The Spring cache abstraction works by proxying your class (using Spring AOP). Calls to methods within the same class, however, do not go through the proxy but hit the underlying business logic directly. This means caching does not work when you call the cached method from within the same bean.
So, if you're calling getGlobalSettingsValue(), it will neither populate nor use the cache when that method calls getGlobalEnumValue(...).
The possible solutions are:
Not calling another method in the same class when using proxies
Caching the other method as well
Using AspectJ rather than Spring AOP, which weaves the advice directly into the byte code rather than proxying the class. You can switch the mode by setting @EnableCaching(mode = AdviceMode.ASPECTJ); however, you'll have to set up load-time weaving as well (see the configuration sketch after this list).
Autowiring the service into itself, and calling the cached method through that reference rather than directly. The autowired reference is the proxy.
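For the AspectJ option, a minimal configuration sketch (the spring-aspects dependency and the load-time weaving agent are assumed and not shown):

import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.AdviceMode;
import org.springframework.context.annotation.Configuration;

@Configuration
// Weaves the caching advice into the classes themselves instead of proxying them,
// so self-invocations inside the same bean are cached as well.
@EnableCaching(mode = AdviceMode.ASPECTJ)
public class AspectJCacheConfig {
}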
The problem is the place you call your cacheable method from. When you call a @Cacheable method from the same class, you call it through the this reference, which means the call is not wrapped by Spring's proxy, so Spring cannot intercept the invocation and handle it.
One way to solve this problem is to @Autowired the service into itself and call the methods you expect Spring to handle through that reference:
@Service(value = "SettingsService")
public class SettingsService {
    //...
    @Autowired
    private SettingsService settingsService;
    //...

    public String getGlobalSettingsValue(Settings setting) {
        // ...
        return settingsService.getGlobalEnumValue(setting);
        //-------------^ Look here: the call goes through the injected proxy
    }

    @Cacheable(value = "noTimeCache", key = "#setting.name()")
    public String getGlobalEnumValue(Settings setting) {
        return Settings.valueOf(setting.name()).getDefaultValue();
    }
}
But if you have problems like this, it usually means your classes are taking on too much and don't comply with the single responsibility principle. The better solution is to move the @Cacheable method to a dedicated class.
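A minimal sketch of that extraction; the GlobalSettingsDefaults name is hypothetical, the rest mirrors the code from the question:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

// Hypothetical dedicated bean (in its own file) holding only the cacheable lookup.
@Service
class GlobalSettingsDefaults {

    @Cacheable(value = "noTimeCache", key = "#setting.name()")
    public String getGlobalEnumValue(Settings setting) {
        return Settings.valueOf(setting.name()).getDefaultValue();
    }
}

@Service(value = "SettingsService")
class SettingsService {

    @Autowired
    private GlobalSettingsDefaults defaults;

    public String getGlobalSettingsValue(Settings setting) {
        // ... repository lookup as before ...
        return defaults.getGlobalEnumValue(setting); // crosses a bean boundary, so the cache proxy applies
    }
}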
I have code that looks like this:
public class EmployeeController : Controller
{
    public ContextWrapper contextWrapper;

    public EmployeeController(IContextWrapper wrapper)
    {
        contextWrapper = wrapper;
    }
}
In my dependency resolver I have the binding for IContextWrapper
kernel.Bind<IContextWrapper>().To<ContextWrapper>();
The implementation of ContextWrapper holds an object of type LINQ DataContext.
public class ContextWrapper : IContextWrapper
{
    public MyDataContext dataContext;

    public ContextWrapper(MyDataContext context)
    {
        this.dataContext = context;
    }
}
Now my action method in this controller looks like this
var empRepository = new EmployeeRepository(contextWrapper);
//do some tests with this repository.
some values = contextWrapper.datacontext.get some values from the database table
//do some tests with these values.
To be able to test this method:
I should be able to provide some sort of mock database (not literally), or
make contextWrapper.dataContext return mocked values, or
create another implementation of IContextWrapper that doesn't use a LINQ DataContext object, add another constructor to this controller, and pass that fake implementation in; in my dependency resolver I would also bind the fake object to IContextWrapper, although I do not know how to make Ninject do that, or
as a last resort, test my method against a test database, since it all boils down to this LINQ DataContext object and it seems I cannot get rid of it past a certain level.
The problem is, the more I read about it, the more confused I get. I have tried to give as much detail as possible to explain my problem. If anyone has a clear-cut idea about how to approach this, please suggest it.
I am using Spring JPA transactions in my project. One case involves inserting data in a synchronized method, and when another thread accesses it, the data is not updated. My code is given below:
public UpdatedDTO parentMethod() {
    UpdatedDTO updatedDTO = getSomeMethod();
    childmethod1(inputVal);
    return updatedDTO;
}

@Transactional
public synchronized void childmethod1(inputVal) {
    // some code
    // place where the update takes place
    TableEntityObject obj = objectRepository.findByInputVal(inputVal);
    if (obj == null) {
        childMethod2(inputVal);
    }
}

@Transactional
public void childMethod2(inputVal) {
    // code for inserting
    TableEntityObject obj = new TableEntityObject();
    obj.setName("SomeValue");
    obj.setValueSet(inputVal);
    objectRepository.save(obj);
}
Now, if two threads access this at the same time, and the first thread completes childmethod2 and childmethod1 but has not yet completed parentMethod(), and after that the second thread reaches childmethod1() and checks whether the data exists, the data is null: it is not yet visible from the first thread. I have tried many variations, such as
@Transactional(propagation = Propagation.REQUIRES_NEW)
public synchronized void childmethod1(inputVal) {
    // some code
    // place where the update takes place
    TableEntityObject obj = objectRepository.findByInputVal(inputVal);
    if (obj == null) {
        childMethod2(inputVal);
    }
}

@Transactional(propagation = Propagation.REQUIRES_NEW)
public void childMethod2(inputVal) {
    // code for inserting
    TableEntityObject obj = new TableEntityObject();
    obj.setName("SomeValue");
    obj.setValueSet(inputVal);
    objectRepository.save(obj);
}
I also tried removing @Transactional from childmethod1(), but nothing works. I know I'm doing something wrong here, but I couldn't figure out where and what exactly. Can anyone help me out with this?
@Transactional is resolved using proxies around Spring beans. This means it has no effect if the @Transactional method is called from another method of the same class. Take a look at Spring @Transaction method call by the method within the same class, does not work?
The easiest fix is to move those methods into a separate service.
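A minimal sketch of that split, reusing the entity and repository from the question (the service and repository type names are assumptions):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical dedicated service: calls from other beans go through the
// transactional proxy, so @Transactional is actually applied.
@Service
public class TableEntityService {

    @Autowired
    private ObjectRepository objectRepository;

    @Transactional
    public synchronized void createIfAbsent(String inputVal) {
        TableEntityObject obj = objectRepository.findByInputVal(inputVal);
        if (obj == null) {
            TableEntityObject fresh = new TableEntityObject();
            fresh.setName("SomeValue");
            fresh.setValueSet(inputVal);
            objectRepository.save(fresh);
        }
    }
}

parentMethod() would then inject TableEntityService and call createIfAbsent(inputVal) instead of its own childmethod1(...).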
Typical checklist I follow in cases like these (a minimal configuration sketch follows the link below):
If using Java-based configuration, make sure the @EnableTransactionManagement annotation is present on the class carrying the @Configuration annotation.
Make sure the transactionManager bean is created; again, this should be declared in the configuration class.
Put the @Transactional annotation on the method which calls the repository, typically a class in the DAO layer.
Add the @Service annotation to the class which invokes the methods on the repository.
A nice blog post which explains transaction configuration with JPA in depth: http://www.baeldung.com/2011/12/26/transaction-configuration-with-jpa-and-spring-3-1/68954
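For the first two checklist items, a minimal configuration sketch (the entityManagerFactory bean is assumed to be defined elsewhere):

import javax.persistence.EntityManagerFactory;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement // enables processing of @Transactional
public class TransactionConfig {

    // The transactionManager bean that @Transactional methods will use.
    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}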
I would like to port two projects to Spring Boot 1.1.6. They are each part of a larger project. They both need to make SQL connections to 1 of 7 production databases per web request, based on region. One of them persists configuration settings to a Mongo database. They are both functional at the moment, but the SQL configuration is XML-based and the Mongo one is application.properties-based. I'd like to move both to either XML or annotations before release to simplify maintenance.
This is my first try at this forum, so I may need some guidance in that arena as well. I put the multi-database tag on here, but most of those questions deal with two connections open at a time. Here only one is open at a time, and only the URL changes; the schema and everything else are the same.
In XML Fashion ...
@Controller
public class CommonController {

    private CommonService CommonService_i;

    @RequestMapping(value = "/rest/Practice/{enterprise_id}", method = RequestMethod.GET)
    public @ResponseBody List<Map<String, Object>> getPracticeList(@PathVariable("enterprise_id") String enterprise_id) {
        CommonService_i = new CommonService(enterprise_id);
        return CommonService_i.getPracticeList();
    }
}

@Service
public class CommonService {

    private ApplicationContext ctx = null;
    private JdbcTemplate template = null;
    private DataSource datasource = null;
    private SimpleJdbcCall jdbcCall = null;

    public CommonService(String enterprise_id) {
        ctx = new ClassPathXmlApplicationContext("database-beans.xml");
        datasource = ctx.getBean(enterprise_id, DataSource.class);
        template = new JdbcTemplate(datasource);
    }
}
Each time a request is made, a new instance of the required service is created with the appropriate database connection.
In the Spring Boot world, I've come across one article that extends TomcatDataSourceConfiguration:
http://xantorohara.blogspot.com/2013/11/spring-boot-jdbc-with-multiple.html That at least allowed me to create a Java configuration class; however, I cannot come up with a way to change the prefix for the ConfigurationProperties per request like I am doing with the XML above. I can set up multiple configuration classes, but the @Qualifier("00002") in the DAO has to be a static value. // The value for annotation attribute Qualifier.value must be a constant expression
@Configuration
@ConfigurationProperties(prefix = "Region1")
public class DbConfigR1 extends TomcatDataSourceConfiguration {

    @Bean(name = "dsRegion1")
    public DataSource dataSource() {
        return super.dataSource();
    }

    @Bean(name = "00001")
    public JdbcTemplate jdbcTemplate(DataSource dsRegion1) {
        return new JdbcTemplate(dsRegion1);
    }
}
On the Mongo side, I am able to define variables in the ConfigurationProperties class and, if there is a matching entry in the appropriate application.properties file, the value from the file overwrites the one in the code; if not, the value in the code is used. That does not work on the JDBC side: if you define a variable in your config classes, that value is what is used. (Yeah... I know it says mondoUrl.)
@ConfigurationProperties(prefix = "spring.mongo")
public class MongoConnectionProperties {

    private String mondoURL = "localhost";

    public String getMondoURL() {
        return mondoURL;
    }

    public void setMondoURL(String mondoURL) {
        this.mondoURL = mondoURL;
    }
}
There was a question answered today that got me a little closer: Spring Boot application.properties value not populating. The answer showed me how to at least get @Value to function. With that, I can set up a dbConfigProperties class that grabs the @Value. The only issue is that the value grabbed by @Value is only available when the program first starts, and I'm not certain how to use it other than seeing it in the console log at startup. What I do know now is that, at some point during the @Autowired wiring of the dbConfigProperties class, it does return the appropriate value. By the time I want to use it, though, it returns ${spring.datasource.url} instead of the value.
OK... someone please tell me that @Value is not my only choice. I put the following code in my controller, and I'm able to reliably retrieve one value, yay. I suppose I could hard-code each possible property name from my properties file as an argument to a function like this and populate a class, but I'm clearly doing something wrong.
private String url;
// private String propname = "${spring.datasource.url}"; // can't use this

@Value("${spring.datasource.url}")
public void setUrl(String val) {
    this.url = val;
    System.out.println("==== value ==== " + url);
}
This was awesome... finally some progress. I believe I am giving up on changing ConfigurationProperties, and on @Value for that matter. With this answer, I can access the beans created at startup. Y'all were probably wondering why I didn't do that in the first place... still learning. I'm upvoting it; it saved my bacon. https://stackoverflow.com/a/24595685/4028704
The plan now is to create a JdbcTemplate-producing bean for each of the regions, like this:
@Configuration
@ConfigurationProperties(prefix = "Region1")
public class DbConfigR1 extends TomcatDataSourceConfiguration {

    @Bean(name = "dsRegion1")
    public DataSource dataSource() {
        return super.dataSource();
    }

    @Bean(name = "00001")
    public JdbcTemplate jdbcTemplate(DataSource dsRegion1) {
        return new JdbcTemplate(dsRegion1);
    }
}
When I call my service, I'll use something like this:
public AccessBeans(ServletRequest request, String enterprise_id) {
    ctx = RequestContextUtils.getWebApplicationContext(request);
    template = ctx.getBean(enterprise_id, JdbcTemplate.class);
}
Still open to better ways or insight into foreseeable issues, etc., but this way seems to be about equivalent to my current XML-based approach. Thoughts?
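One possible refinement, sketched under the assumption that the region id is used as the bean name (as in the config classes above): instead of pulling the template out of the ApplicationContext on every request, let Spring inject all region templates as a map keyed by bean name. The RegionTemplateResolver name is hypothetical.

import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;

@Service
public class RegionTemplateResolver {

    // Spring injects every JdbcTemplate bean, keyed by its bean name ("00001", "00002", ...).
    private final Map<String, JdbcTemplate> templatesByRegion;

    @Autowired
    public RegionTemplateResolver(Map<String, JdbcTemplate> templatesByRegion) {
        this.templatesByRegion = templatesByRegion;
    }

    public JdbcTemplate forEnterprise(String enterpriseId) {
        JdbcTemplate template = templatesByRegion.get(enterpriseId);
        if (template == null) {
            throw new IllegalArgumentException("No datasource configured for region " + enterpriseId);
        }
        return template;
    }
}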