OpenSessionInView vs. Transactional? (Spring/Hibernate/JPA)

I have a JPA entity with a lazily loaded collection on it. I do not need the collection every time.
#Entity(name = "Foo")
#Access(AccessType.FIELD)
#Table(name = "TEST", schema = "TEST")
public class Foo implements Serializable {
private static final long serialVersionUID = 1L;
#OneToMany(mappedBy="foo", targetEntity=Bar.class, fetch=FetchType.LAZY, cascade=CascadeType.ALL)
private List<Bar> bars;
}
#Entity(name = "Bar")
#Access(AccessType.FIELD)
#Table(name = "TEST", schema = "TEST")
public class Bar implements Serializable {
private static final long serialVersionUID = 1L;
#ManyToOne(targetEntity = Foo.class)
#JoinColumn(name = "FOO_ID", referencedColumnName = "ID")
private Foo foo;
}
I have a few methods on a service class that perform a lot of database interactions and at the end save a Foo entity to the database. I need this to happen for about 100 items in a collection.
@Service
public class FooService {

    @Autowired
    private FooRepository fooRepository;

    public void processAllFoos() {
        fooRepository.findAll().forEach(foo -> {
            processFoo(foo);
        });
    }

    private void processFoo(Foo foo) {
        foo.getBars().forEach(bar -> {
            // Do a lot of time consuming stuff here that involves
            // entities of other types and modify each bar object
        });
        fooRepository.save(foo);
    }
}
processAllFoos gets called from a @RestController whenever it receives a request.
However, I do not want processAllFoos to be wrapped in a single database transaction, because that locks up the entire Foo table till the business logic is executed for all Foos.
If I make the processFoo method @Transactional, I get a LazyInitializationException which complains that the Hibernate session is non-existent. To make this work I need to make all methods in the call stack @Transactional so that the nested methods can join the calling method's transaction. But that locks the entire Foo table, as mentioned above.
Adding an OpenSessionInViewFilter for the dispatcher servlet solves my problem, but I've read that there are issues with performance and with entity detaching/reattaching (which I do in other parts of the application) with this approach.
Is there a way I can do what I want without using the OpenSessionInView approach? What other problems am I introducing by using it?
Spring/Hibernate 4.x
Based on the answer below, I was able to do the following:
@Service
public class FooService {

    @Autowired
    private FooRepository fooRepository;

    @Autowired
    private TransactionTemplate transactionTemplate;

    public void processAllFoos() {
        fooRepository.findAll().forEach(foo -> {
            transactionTemplate.execute(new TransactionCallback<Object>() {
                public Object doInTransaction(TransactionStatus status) {
                    try {
                        processFoo(foo);
                        status.flush();
                    } catch (Exception e) {
                        status.setRollbackOnly();
                    }
                    return null;
                }
            });
        });
    }

    private void processFoo(Foo foo) {
        foo.getBars().forEach(bar -> {
            // Do a lot of time consuming stuff here that involves
            // entities of other types and modify each bar object
        });
        fooRepository.save(foo);
    }
}

OpenSessionInViewFilter is commonly used to solve the LazyInitializationException problem in the view layer (UI components or page templates), because the view layer can't and must not manage transactions directly.
In your case, another way to get all the Bar objects can be applied:
First, fetch all the Foo ids instead of the full objects.
Second, use the collection of Foo ids to iterate through the related Bar objects.
Third, if you don't want one big transaction, you can use Spring's TransactionTemplate to manage transactions explicitly.
Your code example may look like this:
@Service
public class FooService {

    @Autowired
    private FooRepository fooRepository;

    @Autowired
    private BarRepository barRepository;

    @Autowired
    private TransactionTemplate transactionTemplate;

    public void processAllFoos() {
        final List<Long> fooIdList = transactionTemplate.execute(new TransactionCallback<List<Long>>() {
            public List<Long> doInTransaction(TransactionStatus status) {
                return fooRepository.findIdList();
            }
        });
        transactionTemplate.execute(new TransactionCallback<Object>() {
            public Object doInTransaction(TransactionStatus status) {
                barRepository.findByFooIdList(fooIdList).forEach(bar -> {
                    processBar(bar);
                });
                return null;
            }
        });
    }

    private void processBar(Bar bar) {
        // Do a lot of time consuming stuff here that involves
        // entities of other types and modify each bar object
        barRepository.save(bar);
    }
}
The example above shows how to solve your task without major performance overhead. But you should understand that if the Foo and Bar tables are linked with a foreign key constraint, the related record in the Foo table may be locked by the RDBMS each time you update a row in the Bar table.
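The findIdList() and findByFooIdList(..) methods used above are not standard Spring Data JPA methods, so they would have to be declared on the repositories. A minimal sketch of what those declarations might look like (the method names and JPQL queries are assumptions, not part of the original answer; both interfaces are shown together for brevity):
import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;

public interface FooRepository extends JpaRepository<Foo, Long> {
    // Fetch only the ids so the full entities (and their lazy collections) are never loaded here
    @Query("select f.id from Foo f")
    List<Long> findIdList();
}

public interface BarRepository extends JpaRepository<Bar, Long> {
    // Load the bars for a batch of Foo ids inside the surrounding transaction
    @Query("select b from Bar b where b.foo.id in :fooIds")
    List<Bar> findByFooIdList(@Param("fooIds") List<Long> fooIds);
}
Each Bar is then loaded and saved inside the second TransactionTemplate callback, so its lazy associations are available for the duration of that transaction.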

Related

Bidirectional @OneToOne Spring Data JPA, Hibernate

I am using a bidirectional @OneToOne from the Hibernate documentation. I have created an identical model for the test.
I can't get the Phone via PhoneDetails. I get an error: Request processing failed; nested exception is org.hibernate.LazyInitializationException: could not initialize proxy [com.example.model.Phone#1] - no Session.
I've tried many options and it doesn't work.
Please tell me how to get the Phone correctly. I have been sitting all day trying to do this and did not find any options on the Internet, so I am asking here.
Phone.java
#Entity(name = "Phone")
public class Phone {
#Id
#GeneratedValue
private Long id;
#Column(name = "`number`")
private String number;
#OneToOne(mappedBy = "phone",
cascade = CascadeType.ALL,
orphanRemoval = true,
fetch = FetchType.LAZY)
private PhoneDetails details;
public Phone() {
}
public Phone(String number) {
this.number = number;
}
// Getters and setters are omitted for brevity
public void addDetails(PhoneDetails details) {
details.setPhone( this );
this.details = details;
}
public void removeDetails() {
if ( details != null ) {
details.setPhone( null );
this.details = null;
}
}
}
PhoneDetails.java
#Entity(name = "PhoneDetails")
public class PhoneDetails {
#Id
#GeneratedValue
private Long id;
private String provider;
private String technology;
#OneToOne(fetch = FetchType.LAZY)
#JoinColumn(name = "phone_id")
private Phone phone;
public PhoneDetails() {
}
public PhoneDetails(String provider, String technology) {
this.provider = provider;
this.technology = technology;
}
// Getters and setters are omitted for brevity
}
LifecycleController.java
@Controller
public class LifecycleController {

    @Autowired
    ServiceJpa serviceJpa;

    @GetMapping(value = "/savePhoneAndPhoneDetails")
    public String savePersonAddress() {
        Phone phone = new Phone( "123-456-7890" );
        PhoneDetails details = new PhoneDetails( "T-Mobile", "GSM" );
        phone.addDetails( details );
        serviceJpa.savPhone( phone );
        return "/savePhoneAndPhoneDetails";
    }

    @GetMapping(value = "/getPhone")
    public String addPersonAddress() {
        PhoneDetails address = serviceJpa.findPhoneDetailsById(2L).orElseThrow();
        Phone phone = address.getPhone();
        /*
          An error appears here -
          could not initialize proxy
          [com.example.model.Phone#1] - no Session
        */
        System.out.println(phone.getNumber());
        return "/getPhone";
    }
}
ServiceJpa.java
@Service
@Transactional
public class ServiceJpa {

    @Autowired
    PhoneJpa phoneJpa;

    @Autowired
    PhoneDetailsJpa phoneDetailsJpa;

    @Transactional
    public void savPhone(Phone phone) {
        phoneJpa.save(phone);
    }

    @Transactional
    public Optional<PhoneDetails> findPhoneDetailsById(Long id) {
        return phoneDetailsJpa.findById(id);
    }
}
interface PhoneJpa.java
@Repository
public interface PhoneJpa extends JpaRepository<Phone, Long> {
}
interface PhoneDetailsJpa.java
@Repository
public interface PhoneDetailsJpa extends JpaRepository<PhoneDetails, Long> {
}
I agree with Andriy's comment, with a slight addition: "You should not access [lazily loaded] entity details outside transaction bounds". But, for starters, is there some reason you want the OneToOne to be FetchType.LAZY to begin with? If you changed it to EAGER, your "lazy" problem would be resolved by virtue of it no longer being a lazy reference but a real hydrated object.
If that is not the exact route you want to take, there are a dozen ways to EAGERLY fetch things in general, frankly too many to present a single solution here as best/ideal. As your code exists, since all the dereferencing (for now) is happening inside your controller, Andriy's suggestion to add @Transactional to the controller may suffice, in that the reference will be lazily fetched when you need it.
But in the future, if you have lazy elements in a POJO that get returned higher up the stack than the controller, say, just before they are serialized to JSON, then even the controller's @Transactional wouldn't be "high" enough in the stack and you'll end up with the same lazy-init problem.
Also, by having it be lazy and then dereferencing it elsewhere, you're guaranteeing two trips to the database. With proper FETCH/JOIN eager loads, you would limit that to one, which can be another performance benefit.
So either way, you're back to the real problem at hand: looking for ways to ensure your operations occur ENTIRELY inside a transaction boundary, or having to completely hydrate the object so no lazy danglers get dereferenced outside of it, i.e. by making them eager or by force-initializing any potential lazy proxies/collections.
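To make that concrete, here is a minimal sketch (an assumption built on the question's code, not part of the original answer) of fetching the Phone entirely inside a transactional service method, so nothing lazy escapes the session:
@Service
public class PhoneQueryService {

    @Autowired
    PhoneDetailsJpa phoneDetailsJpa;

    @Transactional(readOnly = true)
    public Phone findPhoneByDetailsId(Long detailsId) {
        PhoneDetails details = phoneDetailsJpa.findById(detailsId).orElseThrow();
        // Dereference the lazy proxy while the session is still open,
        // so the Phone is fully initialized before the method returns.
        Phone phone = details.getPhone();
        phone.getNumber();
        return phone;
    }
}
The controller can then call phone.getNumber() on the returned object without hitting the "no Session" error, because initialization already happened inside the transaction.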

Spring-data JdbcTemplate does not commit

I need to update thousands of records in the database, but I would like to commit after each batch of 5000 records.
@Service
@Transactional(rollbackFor = Throwable.class)
public class AttributeProcessorServiceImpl extends DataLoader implements AttributeProcessorService
{
    .....
    private final TransactionTemplate transTemplate;
    private final JdbcTemplate jdbcTemplate;

    @Autowired private PlatformTransactionManager platformTransactionManager;

    @Autowired
    public AttributeProcessorServiceImpl(
            TransactionTemplate transTemplate,
            JdbcTemplate jdbcTemplate,
            .....)
    {
        super();
        this.transTemplate = transTemplate;
        this.jdbcTemplate = jdbcTemplate;
        .....
    }

    @Async
    @Transactional(propagation = Propagation.NOT_SUPPORTED)
    public void reloadAttrs()
    {
        loadAttrs();
        updateAttrs();
    }

    private void loadAttrs()
    {
        ...some data fetching and processing, finally call db update.
        updateDbInBatches(rowcount, sql);
    }

    private void updateAttrs()
    {
        ...some data fetching and processing, finally call db update.
        updateDbInBatches(rowcount, sql);
    }

    private void updateDbInBatches(long rowcount, String sql)
    {
        DefaultTransactionDefinition def;
        boolean hasMore = true;
        Integer from;
        Integer to = 0;
        int batchSize = 5000; // gets from property
        while (hasMore)
        {
            from = to + 1;
            to = to + batchSize;
            def = new DefaultTransactionDefinition();
            def.setName("backCommitTx");
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
            TransactionStatus status = platformTransactionManager.getTransaction(def);
            int rows = jdbcTemplate.update(sql, paramValues, paramTypes);
            logger.debug("Loaded [" + rows + "] records.");
            platformTransactionManager.commit(status);
            if (to > rowcount)
            {
                hasMore = false;
                logger.debug("All records [" + rowcount + "] updated.");
            }
        }
    }
}
If I put a breakpoint after loadAttrs(), it shows that a bunch of records were loaded into the database and a commit() was issued, but the database does not reflect that commit until the entire public method completes. How do I ensure data is indeed written to the database after each commit? The commit doesn't give any error either.
I missed an important piece of information that solved the problem.
I had another public method which is what was called from outside.
public void reloadAttrs(TransDetail trans)
{
reloadAttrs();
}
The above method was in fact using the default transaction propagation, as I did not specify it. Since this was the first public method that was called, Spring was ignoring the transaction demarcation on the next public (async) method that was called. I changed the above signature to:
@Transactional(propagation = Propagation.NOT_SUPPORTED)
public void reloadAttrs(TransDetail trans)
{
    reloadAttrs();
}
It then worked. I was able to see changes in the database after every commit.
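For reference, the same per-batch commit can also be expressed with the already injected TransactionTemplate instead of driving the PlatformTransactionManager by hand. A rough sketch (the paging bookkeeping is simplified, and paramValues/paramTypes are the same undeclared fields assumed by the original snippet):
private void updateDbInBatches(long rowcount, String sql)
{
    int batchSize = 5000;
    for (long done = 0; done < rowcount; done += batchSize)
    {
        // Each execute(..) call runs in its own transaction and commits when the callback returns
        Integer rows = transTemplate.execute(status -> jdbcTemplate.update(sql, paramValues, paramTypes));
        logger.debug("Committed [" + rows + "] records.");
    }
}
This keeps the commit boundaries explicit per batch while letting Spring handle begin/commit/rollback of each callback.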

Spring boot cache not working in @PostConstruct

I'm building a "class cache", with classes I want to call later.
The main goal is that I don't want scan the context every time that a class instance is needed.
# Model / Repository classes
@Getter
@RequiredArgsConstructor
public class Block implements Serializable {

    private final String className;
    private final Set<String> classCandidates = new HashSet<>();

    public boolean addCandidate(final String classCandidate) {
        return this.classCandidates.add(classCandidate);
    }
}

@Slf4j
@Component
@CacheConfig(cacheNames = ConstantsCache.CACHE_BLOCK)
public class BlockRepository {

    @Cacheable(key = "#className")
    public Block findByInputClass(final String className) {
        log.info("---> Loading classes for class '{}'", className);
        val block = new Block(className);
        findCandidates(block);
        return block;
    }
}
First, to evaluate the cache, I autowired the repository with the cacheable method into a @RestController, which works fine. The cache is populated when I call the REST method.
@RestController
public class Controller {

    @Autowired
    BlockRepository blockRepository;

    @RequestMapping("/findByInputClass")
    public Block findByInputClass(@RequestParam("className") final String className) {
        return blockRepository.findByInputClass(className);
    }
}
After doing that, I moved the @Autowired object to a @Service, creating a method to self-populate the cache. But this does not work. The cache is not populated when the @PostConstruct method is called.
@Slf4j
@Component
public class BlockCacheService {

    @Autowired
    BlockRepository blockRepository;

    @PostConstruct
    private void postConstruct() {
        log.info("*** {} PostConstruct called.", this.getClass().getTypeName());
        val block = blockRepository.findByInputClass(ConstantsGenerics.BLOCK_PARENT_CLASS);
        final Set<String> inputClasses = getInputFromCandidates(block.getClassCandidates());
        appendClassesToCache(inputClasses);
    }

    private void appendClassesToCache(final Set<String> inputClasses) {
        for (val inputClass : inputClasses) {
            blockRepository.findByInputClass(inputClass);
        }
    }
}
How can I properly populate the cache using a service or component that must start with the application?
Thanks in advance.
EDIT:
I've found a possible solution here: https://stackoverflow.com/a/28311225/1703546
Then I changed the @Service code to put entries in the cache manually instead of using the @Cacheable magic abstraction.
The class now looks like this.
@Slf4j
@Component
public class BlockCacheService {

    @Autowired
    CacheManager cacheManager;

    @Autowired
    BlockRepository blockRepository;

    @PostConstruct
    private void postConstruct() {
        log.info("*** {} PostConstruct called.", this.getClass().getTypeName());
        val block = blockRepository.findByInputClass(ConstantsGenerics.BLOCK_PARENT_CLASS);
        final Set<String> inputClasses = getInputFromCandidates(block.getClassCandidates());
        appendClassesToCache(inputClasses);
    }

    private void appendClassesToCache(final Set<String> inputClasses) {
        for (val inputClass : inputClasses) {
            val block = blockRepository.findByInputClass(inputClass);
            cacheManager.getCache(ConstantsCache.CACHE_BLOCK).put(block.getClassName(), block);
        }
    }
}
Now the cache is populated correctly, but the question is: is this the best solution?
Thanks.
You can't use an aspect in @PostConstruct as it may not have been created yet (and that is documented, by the way).
One possible way to make that work is to implement SmartInitializingSingleton instead, as it gives a callback when all singletons have been fully initialized (including their aspects). Changing that on your original service should work.
Having said that, this code of yours has an impact on the startup time. Why don't you let your cache be filled lazily instead?
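A minimal sketch of that suggestion, reusing the question's BlockRepository, getInputFromCandidates and constants (everything else here is an assumption; SmartInitializingSingleton lives in org.springframework.beans.factory):
@Slf4j
@Component
public class BlockCacheService implements SmartInitializingSingleton {

    @Autowired
    BlockRepository blockRepository;

    @Override
    public void afterSingletonsInstantiated() {
        // Called after all singletons are fully initialized, including the caching proxies,
        // so these repository calls go through the @Cacheable aspect and populate the cache.
        val block = blockRepository.findByInputClass(ConstantsGenerics.BLOCK_PARENT_CLASS);
        for (val inputClass : getInputFromCandidates(block.getClassCandidates())) {
            blockRepository.findByInputClass(inputClass);
        }
    }
}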

Transactions and relationship entities mapping problems with Neo4j OGM

Versions used: spring-data-neo4j 4.2.0-BUILD-SNAPSHOT / neo4j-ogm 2.0.6-SNAPSHOT
I'm having problems correctly fetching relationship entities.
The following fetch calls don't return consistent results (executed in the same transaction):
session.query("MATCH (:A)-[b:HAS_B]-(:C) RETURN count(b) as count") returns 1
session.query("MATCH (:A)-[b:HAS_B]-(:C) RETURN b") correctly returns the relationship entity as a RelationshipModel object
session.query(B.class, "MATCH (:A)-[b:HAS_B]-(:C) RETURN b") returns null !
Important remark: When all operations (create, fetch) are done in the same transaction, it seems to be fine.
I have been able to implement a workaround by using session.query(String, Map) to query the relationship entity and map it myself into my POJO.
@NodeEntity
public class A {

    public A() {}

    public A(String name) {
        this.name = name;
    }

    @GraphId
    private Long graphId;

    private String name;

    @Relationship(type = "HAS_B", direction = Relationship.OUTGOING)
    private B b;
}

@RelationshipEntity(type = "HAS_B")
public class B {

    public B() {}

    public B(String name, A a, C c) {
        this.name = name;
        this.a = a;
        this.c = c;
    }

    @GraphId
    private Long graphId;

    @StartNode
    private A a;

    @EndNode
    private C c;

    private String name;
}

@NodeEntity
public class C {

    public C() {}

    public C(String name) {
        this.name = name;
    }

    @GraphId
    private Long graphId;

    private String name;
}
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = AnnotationConfigContextLoader.class, classes = {MyTest.TestConfiguration.class})
public class MyTest {

    @Autowired
    private MyBean myBean;

    @Configuration
    @EnableAutoConfiguration
    @EnableTransactionManagement
    @EnableNeo4jRepositories("com.nagra.ml.sp.cpm.core.repositories")
    public static class TestConfiguration {

        @Bean
        public org.neo4j.ogm.config.Configuration configuration() {
            org.neo4j.ogm.config.Configuration config = new org.neo4j.ogm.config.Configuration();
            config.driverConfiguration().setDriverClassName("org.neo4j.ogm.drivers.embedded.driver.EmbeddedDriver");
            return config;
        }

        @Bean
        public SessionFactory sessionFactory() {
            return new SessionFactory(configuration(), "com.nagra.ml.sp.cpm.model");
        }

        @Bean
        public Neo4jTransactionManager transactionManager() {
            return new Neo4jTransactionManager(sessionFactory());
        }

        @Bean
        public MyBean myBean() {
            return new MyBean();
        }
    }

    @Test
    public void alwaysFails() {
        myBean.delete();
        myBean.create("1");
        try { Thread.sleep(2000); } catch (InterruptedException e) {} // useless
        myBean.check("1"); // FAILS HERE !
    }

    @Test
    public void ok() {
        myBean.delete();
        myBean.createAndCheck("2");
    }
}
@Transactional(propagation = Propagation.REQUIRED)
public class MyBean {

    @Autowired
    private Session neo4jSession;

    public void delete() {
        neo4jSession.query("MATCH (n) DETACH DELETE n", new HashMap<>());
    }

    public void create(String suffix) {
        C c = new C("c" + suffix);
        neo4jSession.save(c);
        A a = new A("a" + suffix);
        neo4jSession.save(a);
        B bRel = new B("b" + suffix, a, c);
        neo4jSession.save(bRel);
    }

    public void check(String suffix) {
        //neo4jSession.clear(); //Not working even with this
        Number countBRels = (Number) neo4jSession.query("MATCH (:A)-[b:HAS_B]-(:C) WHERE b.name = 'b" + suffix + "' RETURN count(b) as count", new HashMap<>()).iterator().next().get("count");
        assertEquals(1, countBRels.intValue()); // OK
        Iterable<B> bRels = neo4jSession.query(B.class, "MATCH (:A)-[b:HAS_B]-(:C) WHERE b.name = 'b" + suffix + "' RETURN b", new HashMap<>());
        boolean relationshipFound = bRels.iterator().hasNext();
        assertTrue(relationshipFound); // FAILS HERE !
    }

    public void createAndCheck(String suffix) {
        create(suffix);
        check(suffix);
    }
}
This query session.query(B.class, "MATCH (:A)-[b:HAS_B]-(:C) RETURN b") returns only the relationship but not the start node or end node and so the OGM cannot hydrate this. You need to always return the start and end node along with the relationship like session.query(B.class, "MATCH (a:A)-[b:HAS_B]-(c:C) RETURN a,b,c")
The reason it appears to work when you both create and fetch data in the same transaction is that the session already has a cached copy of a and c and hence b can be hydrated with cached start and end nodes.
Firstly, please upgrade from OGM 2.0.6-SNAPSHOT to 2.1.0-SNAPSHOT. I have noticed some odd behaviour in the former which might be one part of the issue.
Now on to your test. There are several things going on here which are worth investigating.
Use of @DirtiesContext: you don't seem to be touching the context, and if you are using it to reset the context between tests so you get a new Session/Transaction, then that's going about it the wrong way. Just use @Transactional instead. The Spring JUnit runner treats this in a special manner (see next point).
Being aware that transactional tests automatically roll back: Jasper is right. Spring integration tests will always roll back by default. If you want to make sure your JUnit test commits, then you will have to @Commit it. A good example of how to set up your test can be seen here.
Knowing how Spring transaction proxies work: on top of all this confusion you have to make sure you don't simply call from one transactional method to another transactional method in the same class and expect Spring's transactional behaviour to apply. A quick write-up on why can be seen here.
If you address those issues everything should be fine.
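To illustrate the second point, a sketch of a test that actually commits its data instead of rolling it back might look like this (class name is an assumption; @Commit comes from org.springframework.test.annotation and requires Spring 4.2+):
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = AnnotationConfigContextLoader.class, classes = {MyTest.TestConfiguration.class})
@Transactional
@Commit // without this, Spring rolls the test transaction back when the test method ends
public class MyCommittingTest {

    @Autowired
    private MyBean myBean;

    @Test
    public void createThenCheck() {
        myBean.delete();
        myBean.create("1");
        myBean.check("1"); // same transaction, committed after the test succeeds
    }
}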

Neo4j eager/lazy loading with Spring data

I'm investigating Neo4j and have a question with regard to object eager/lazy loading. Let's say I have a class Trolley which has a Set<Item> (with getters/setters). If I do the following:
Trolley t = new Trolley(...); // create empty trolley
t.addItem(f); // add one item to the trolley
t.persist(); // persist the object
I then later find the trolley based on the nodeId:
repo.findOne(xxx); // returns the trolley successfully
When I do something like:
trolley.getItems().size()
this is empty. I guess this is the intended behaviour. Is there any mechanism, similar to JPA while the session/tx is open, to load the collection dynamically?
Code:
@NodeEntity
public class Trolley
{
    @Indexed
    private String name;

    @RelatedTo
    private Set<Item> items;

    public Trolley() {}

    public Trolley(String name)
    {
        this.name = name;
    }

    public void addItem(Item item)
    {
        this.items.add(item);
    }

    public Set<Item> getItems()
    {
        return items;
    }
}

@NodeEntity
public class Item
{
    private String name;

    public Item() {}

    public Item(String name)
    {
        this.name = name;
    }

    public String getName()
    {
        return name;
    }
}
@Test
public void trolleyWithItemPersist()
{
    Trolley trolley = new Trolley("trolley1").persist();
    // Persisting - however I would've expected a cascade to
    // occur when adding to the collection.
    Item item = new Item("item1").persist();
    // now add to the trolley
    trolley.addItem(item);
    // persist
    trolley.persist();
    // Now use repo to get trolley
    Trolley loadedTrolley = trolleyRepository.findOne(trolley.getNodeId());
    // should have one item
    assertEquals(1, loadedTrolley.getItems().size());
}
AFAIK, in Spring Data JPA, to populate a lazily loaded field you need to annotate the method which calls findOne(xxx) with
@Transactional
(from org.springframework.transaction.annotation.Transactional).
Maybe it also works with Neo4j...
I'm not really a skilled developer with Spring Data, but this is the only way I know to retrieve lazily loaded fields. If someone has a better solution, feel free to write it!
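Applied to the trolley example, that suggestion would look roughly like the sketch below (the service class is an assumption, and whether the related items are actually traversed may also depend on Spring Data Neo4j's fetch settings):
@Service
public class TrolleyService {

    @Autowired
    private TrolleyRepository trolleyRepository;

    @Transactional(readOnly = true)
    public int countItems(Long trolleyId) {
        Trolley trolley = trolleyRepository.findOne(trolleyId);
        // Access the related collection while the transaction (and session) is still open,
        // so it is loaded before the method returns.
        return trolley.getItems().size();
    }
}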
