I'm performing an update via a method using Hibernate and the EntityManager.
This update method is called multiple times (within a loop).
It seems like when I execute it the first time, it locks the table and does not free it.
When I try to update the table via SQL Developer after closing the application, the update hangs, so the table appears to still be locked.
What do you see as a solution to this problem? If you need more information, let me know.
Class
@Repository
@Transactional(propagation = REQUIRES_NEW)
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(String id) {
        String query = "UPDATE_QUERY";
        Query nativeQuery = entityManager.createNativeQuery(String.format(query, id));
        nativeQuery.executeUpdate();
    }
}
UPDATE
After having waited more than one hour, I launched the application again and it worked fine once but now again, it hangs.
UPDATE 2 -- I'll give a maximum bounty to whoever helps me solve this
In another place I use an application-managed entity manager, and it still gives me the same kind of error.
public void fillYirInfo() {
    File inputFile = new File("path");
    try (InputStream inputStream = new FileInputStream(inputFile);
         BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream))) {
        bufferedReader.lines().skip(1).limit(20).forEach(line -> {
            String[] data = line.split(",");
            String rnr = data[0];
            String linked = data[1];
            String email = data.length > 2 ? data[2] : "";
            String insuredId = insuredPeopleRepository.getInsuredIdFromNationalId(rnr);
            int modifiedCounter = 0;
            if (!isNullOrEmpty(insuredId)) {
                EntityManager entityManager = emf.createEntityManager();
                EntityTransaction transaction = entityManager.getTransaction();
                Query nativeQuery = entityManager.createNativeQuery(
                        "QUERY"
                );
                transaction.begin();
                nativeQuery.executeUpdate();
                entityManager.flush();
                transaction.commit();
                entityManager.close();
            }
            System.out.println(modifiedCounter + " rows modified");
        });
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Try without an update-query:
@Repository
@Transactional(propagation = REQUIRES_NEW)
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(String id) {
        // guessing your class name and method..
        final YirInfo yirInfo = entityManager.find(YirInfo.class, id);
        yirInfo.setSent();
    }
}
It might not be as fast as a single update query, but it can be made reasonably fast unless the amount of data is huge. This is the preferred way of using Hibernate/JPA: instead of thinking in terms of single values and SQL queries, you work with entities/objects and (sometimes) HQL/JPQL queries.
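If the data set does grow, a JPQL bulk update is a middle ground between single-entity updates and raw SQL. A minimal sketch, assuming YirInfo has a boolean sent field (the field name is a guess):

@Transactional(propagation = REQUIRES_NEW)
public void setSentAll(List<String> ids) {
    // one bulk UPDATE expressed against the entity model instead of a native query;
    // 'sent' is an assumed field name on YirInfo
    entityManager.createQuery("UPDATE YirInfo y SET y.sent = true WHERE y.id IN :ids")
                 .setParameter("ids", ids)
                 .executeUpdate();
}

Note that, like a native query, a JPQL bulk update bypasses the persistence context, so entities already loaded in the same EntityManager will not see the change.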
You are using the @Transactional annotation, which means you are using Spring's transaction management. But in your UPDATE 2 you are managing the transaction yourself instead of letting Spring do it (I guess it's another project or class not managed by Spring).
In any case, what I would do is update your records in a single Spring transaction, and I would put @Transactional on the service layer rather than the DAO layer. Something like this:
Service layer:
@Service
public class YirInfoService {

    @Autowired
    YirInfoRepository dao;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(List<String> ids) {
        dao.setSents(ids);
    }
}
DAO layer:
@Repository
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    // Here you can update by using an IN statement or by doing a cycle
    // Let's suppose a bulk operation
    public void setSents(List<String> ids) {
        String query = "UPDATE_QUERY";
        for (int i = 0; i < ids.size(); i++) {
            String id = ids.get(i);
            Query nativeQuery = entityManager.createNativeQuery(String.format(query, id));
            nativeQuery.executeUpdate();
            if (i % 20 == 0) {
                entityManager.flush();
                entityManager.clear();
            }
        }
    }
}
The first thing you have to understand is that in the first example you are using a native query to update rows in the DB, so you are bypassing Hibernate entirely.
In your second example, you have the same thing: you are updating via an update query. You don't need to flush the entity manager there; flushing is only needed to push pending changes made to entity objects managed by that entity manager.
Also, I don't know how your example works, since you are autowiring the entity manager instead of injecting it with the @PersistenceContext annotation. Make sure you use that properly, because you might have misconfigured the application. There is also no need to create the entity manager manually when using Spring, as the second example does. Just use @PersistenceContext to obtain an entity manager in your app.
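For example, a minimal sketch of container-managed injection (nothing here is specific to your project):

@Repository
public class YirInfoRepository {

    // Spring injects a container-managed EntityManager; you never create or close it yourself
    @PersistenceContext
    private EntityManager entityManager;
}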
You are also mixing up transaction management. In the first example, it's enough to put the @Transactional annotation on either the method or the class.
In the other example you are doing manual transaction management, which makes no sense here. If you are using Spring, you can simply rely on declarative transaction management.
The first thing I'd do here is integrate datasource-proxy into your connection management and log how your statements are actually executed. With this information you can verify whether the query reaches the DB and the DB executes it very slowly, or whether you have a network issue between your app and the DB.
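As a rough sketch, wrapping the existing DataSource with datasource-proxy could look like this (the bean wiring and names are illustrative, not taken from your setup):

import javax.sql.DataSource;
import net.ttddyy.dsproxy.listener.logging.SLF4JLogLevel;
import net.ttddyy.dsproxy.support.ProxyDataSourceBuilder;

// wraps the real DataSource so every executed statement is logged, which shows
// whether the UPDATE actually reaches the database or blocks waiting for a lock
public DataSource proxiedDataSource(DataSource actualDataSource) {
    return ProxyDataSourceBuilder
            .create(actualDataSource)
            .name("datasource-proxy")
            .logQueryBySlf4j(SLF4JLogLevel.INFO)
            .build();
}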
If you find out that the query is sent properly to the DB, you should analyze the query itself, because most probably it is simply executed very slowly and needs some optimization. For this you can use the explain plan feature to find out what your execution plan looks like, and then make it faster.
Related
I do understand the difference between the save method and the saveAndFlush method of the JpaRepository class in Spring Data JPA. As per my understanding, the save method will run and commit the SQL only at the end of the transaction, whereas the saveAndFlush method will synchronize the persistence context with the database by running the SQL statement, but without committing it. Below is a sample code where I wanted to experiment with this; please review it.
This is the repository class for the update
@Repository
public interface ClassRepository extends JpaRepository<ClassA, Long> {

    @Modifying(clearAutomatically = true)
    @Query(value = "UPDATE class e SET e.class_name = ? WHERE e.employee_id = ?", nativeQuery = true)
    int updateClassNative(String className, String empId);
}
This is the test case where I am testing the methods
@Test
void saveAndUpdateWithFlushJPA() {
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    this.classRepository.save(classA);
    int size = this.classRepository.updateClassNative("TestQ", "S0810");
    assertThat(size).isEqualTo(1);
}
In the above test case, the test passed. I was not expecting the record to be saved, since I am using the save method. In the source code, the save method is wrapped with @Transactional. Is it because of that @Transactional that the save method already commits the insert statement?
The problem with your test scenario is that JPA always flushes the persistence context before executing a native query (this is also the default behaviour for JPQL queries, though it can be overridden). The rationale is that a query should report a state that reflects the changes already made in the current unit of work.
To see the difference between save/saveAndFlush, you can use the following test cases instead:
@Repository
public interface ClassRepository extends JpaRepository<ClassA, Long> {

    @Query("SELECT COUNT(c.id) FROM ClassA c")
    @QueryHints({
            @QueryHint(name = org.hibernate.annotations.QueryHints.FLUSH_MODE, value = "COMMIT")
    })
    int countClassAEntities();
}
@Test
@Transactional
void saveAndUpdate() {
    int initialCount = classRepository.countClassAEntities();
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    classRepository.save(classA);
    int finalCount = classRepository.countClassAEntities();
    assertEquals(initialCount, finalCount);
}
@Test
@Transactional
void saveAndUpdateWithFlush() {
    int initialCount = classRepository.countClassAEntities();
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    classRepository.saveAndFlush(classA);
    int finalCount = classRepository.countClassAEntities();
    assertEquals(initialCount + 1, finalCount);
}
In the above setup, the count query has its flush mode set to COMMIT, meaning that executing the query will not trigger a flush. If you were to use the default repository.count() method instead, the first test case would fail, because by default the flush mode is set to AUTO.
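For contrast, here is a hedged sketch of what happens with the default flush mode, assuming the same test setup as above: the built-in count() query triggers an automatic flush, so the pending INSERT from save() becomes visible even though nothing has been committed yet.

@Test
@Transactional
void saveIsFlushedByDefaultCountQuery() {
    long initialCount = classRepository.count(); // default flush mode AUTO
    classRepository.save(ClassA.builder().className("Test").employeeId("S0810").build());
    // count() flushes the persistence context first, so it already sees the new row
    assertEquals(initialCount + 1, classRepository.count());
}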
In my Spring Boot app I'm deleting and inserting a large amount of data into my MySQL DB in a single transaction. Ideally, I want to commit the results to my database only at the end, so it's all or nothing. I'm running into issues where my deletions are committed before my insertions, so during that period any calls to the DB return no data (not good). Is there a way to manually commit the transaction?
My main logic is:
@Transactional
public void saveParents(List<Parent> parents) {
    parentRepo.deleteAllInBatch();
    parentRepo.resetAutoIncrement();

    // I'm setting the id manually beforehand
    String sql = "INSERT INTO parent " +
            "(id, name, address, number) " +
            "VALUES ( ?, ?, ?, ?)";
    jdbcTemplate.batchUpdate(sql, new BatchPreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement ps, int i) throws SQLException {
            Parent parent = parents.get(i);
            ps.setInt(1, parent.getId());
            ps.setString(2, parent.getName());
            ps.setString(3, parent.getAddress());
            ps.setString(4, parent.getNumber());
        }

        @Override
        public int getBatchSize() {
            return parents.size();
        }
    });
}
ParentRepo
@Repository
@Transactional
public interface ParentRepo extends JpaRepository<Parent, Integer> {

    @Modifying
    @Query(
            value = "alter table parent auto_increment = 1",
            nativeQuery = true
    )
    void resetAutoIncrement();
}
EDIT:
So I changed
parentRepo.deleteAllInBatch();
parentRepo.resetAutoIncrement();
to
jdbcTemplate.update("DELETE FROM output_stream");
jdbcTemplate.update("alter table output_stream auto_increment = 1");
in order to try to bypass JPA's transaction handling, but each operation seems to commit separately no matter what I try. I have tried TransactionTemplate and implementing PlatformTransactionManager (seen here), but I can't get these operations to commit together.
EDIT: It seems the issue I was having was with the ALTER TABLE statement, since ALTER TABLE in MySQL always causes an implicit commit.
I'm running into issues where my deletions will be committed before my insertions, so during that period any calls to the db will return no data
Did you configure JPA and JDBC to share transactions?
If not, then you're basically using two different mechanisms to access the data (EntityManager and JdbcTemplate), each maintaining a separate connection to the database. What likely happens is that only the EntityManager joins the transaction created by @Transactional; the JdbcTemplate operation executes either without a transaction context (read: in AUTOCOMMIT mode) or in a separate transaction altogether.
See this question. It is a little old, but then again, using JPA and Jdbc together is not exactly a common use case. Also, have a look at the Javadoc for JpaTransactionManager.
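If both the EntityManager and the JdbcTemplate are built on the same DataSource, a sketch of the shared setup could look like the following (bean names are illustrative): JpaTransactionManager exposes the JDBC connection of the ongoing JPA transaction, so plain JDBC code run against that same DataSource joins the transaction opened by @Transactional.

import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
public class TransactionConfig {

    // one transaction manager for both JPA and JdbcTemplate access
    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf,
                                                         DataSource dataSource) {
        JpaTransactionManager txManager = new JpaTransactionManager(emf);
        txManager.setDataSource(dataSource); // lets JdbcTemplate on this DataSource participate
        return txManager;
    }
}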
I have been working to generalize the methods of the DAO for a project using Spring, JPA and Hibernate. However, I am still very much learning Spring, Java, and coding in general.
Is the below design bad or perfectly fine? Is there a better way to accomplish the same thing? Any advice would be greatly appreciated.
I have simplified the class:
@Repository
public class TestRepository
{
    @PersistenceContext
    private EntityManager entityManager;

    public List<?> getListResults(Class<?> dtoClass, String sqlString)
    {
        List<?> returnList = null;
        Query query = entityManager.createNativeQuery(sqlString, dtoClass);
        try
        {
            returnList = (List<?>) query.getResultList();
        }
        catch (Exception e)
        {
            // any failure is swallowed and the method simply returns null
        }
        return returnList;
    }
}
Spring Data JPA is the most convenient way to interact with your databases, because it helps you avoid the common mistakes that occur when you configure your ORM mapping, entityManager, transactionManager and all the other components needed to establish communication between your entity domains and your database.
For example, say you have a POJO like this:
@Entity
public class Item {

    @Id
    private Long id;
    ......
}
You can create an interface in order to get or put information to the item repository like this:
public interface ItemRepository extends JpaRepository<Item, Long> {}
When you need to save an Item, just @Autowire the ItemRepository. This is the most important part: the interface you created without any methods now exposes ready-to-use methods that interact with your database, and this level of abstraction is what makes Spring Data JPA so useful:
@Autowired
ItemRepository itemRepo;

public void createItem() {
    Item item = new Item();
    itemRepo.save(item);
    // or you can read information back
    List<Item> itemList = itemRepo.findAll();
}
More information can be found in the Spring Data JPA documentation.
How about using Spring Data Repositories?
@Repository
public interface SomethingRepository extends JpaRepository<Something, Long> {
}
That way you get lots of methods without having to manually write your SQL query as a string, you retain type safety, and you can leverage the power of JPA queries and dynamic proxies that do this whole SQL business for you.
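As a quick, hedged sketch of what that gives you out of the box (the service class and entity details are invented for illustration):

@Service
public class SomethingService {

    @Autowired
    private SomethingRepository somethingRepository;

    public void demo() {
        Something something = new Something();
        somethingRepository.save(something);                  // generated INSERT/UPDATE
        List<Something> all = somethingRepository.findAll();  // generated SELECT
        somethingRepository.delete(something);                // generated DELETE
    }
}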
I need to write some temporary code in my existing Spring Boot 1.2.5 application that will do some complex SQL queries. By complex, I mean a single query touches about 4 different tables, and I have a number of these. We all decided to reuse the existing SQL to reduce the risk of getting new queries wrong, which in this case is a good way to go.
My application uses JPA / Hibernate and maps some entities to tables. From my research it seems like I would have to do a lot of entity mapping.
I tried writing a class that would just get the Hibernate session object and execute a native query but when it tried to configure the session factory it threw an exception complaining it could not find the config file.
Could I perhaps do this from one of my existing entities, or at least find a way to get the Hibernate session that already exists?
UPDATE:
Here is the exception, which makes perfect sense since there is no config file to find. It's all configured in the properties file.
org.hibernate.HibernateException: /hibernate.cfg.xml not found
    at org.hibernate.internal.util.ConfigHelper.getResourceAsStream(ConfigHelper.java:173)
For what it's worth, the code:
@NamedNativeQuery(name = "verifyEa", query = "select account_nm from per_person where account_nm = :accountName")
public class VerifyEaResult
{
    private SessionFactory sessionFact = null;

    String accountName;

    private void initSessionFactory()
    {
        Configuration config = new Configuration().configure();
        ServiceRegistry serviceRegistry = new ServiceRegistryBuilder().applySettings(config.getProperties()).getBootstrapServiceRegistry();
        sessionFact = config.buildSessionFactory(serviceRegistry);
    }

    public String getAccountName()
    {
        // Quick simple test query
        String sql = "SELECT * FROM PER_ACCOUNT WHERE ACCOUNT_NM = 'lynnedelete'";
        initSessionFactory();
        Session session = sessionFact.getCurrentSession();
        SQLQuery q = session.createSQLQuery(sql);
        List<Object> result = q.list();
        return accountName;
    }
}
You can use data access with JDBC (Spring's JdbcTemplate), for example:
public class Client {

    private final JdbcTemplate jdbcTemplate;

    // Quick simple test query
    final static String SQL = "SELECT * FROM PER_ACCOUNT WHERE ACCOUNT_NM = ?";

    @Autowired
    public Client(DataSource dataSource) {
        jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public List<Map<String, Object>> getData(String name) {
        return jdbcTemplate.queryForList(SQL, name);
    }
}
The short way is:
jdbcTemplate.queryForList("SELECT 1");
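Either way, each row comes back as a Map of column name to value; consuming the result of the getData method above could look like this (the column name is taken from the test query):

List<Map<String, Object>> rows = client.getData("lynnedelete");
for (Map<String, Object> row : rows) {
    System.out.println(row.get("ACCOUNT_NM")); // one column of one row
}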
I'm working with Spring Boot 1.3.0.M4 and a MySQL database.
I have a problem when using modifying queries: the EntityManager contains outdated entities after the query has executed.
Original JPA Repository:
public interface EmailRepository extends JpaRepository<Email, Long> {

    @Transactional
    @Modifying
    @Query("update Email e set e.active = false where e.active = true and e.expire <= NOW()")
    Integer deactivateByExpired();
}
Suppose we have Email [id=1, active=true, expire=2015/01/01] in DB.
After executing:
emailRepository.save(email);
emailRepository.deactivateByExpired();
System.out.println(emailRepository.findOne(1L).isActive()); // prints true!! it should print false
First approach to solve the problem: add clearAutomatically = true
public interface EmailRepository extends JpaRepository<Email, Long> {

    @Transactional
    @Modifying(clearAutomatically = true)
    @Query("update Email e set e.active = false where e.active = true and e.expire <= NOW()")
    Integer deactivateByExpired();
}
This approach clears the persistence context so that it does not hold outdated values, but it also drops all non-flushed changes still pending in the EntityManager. Since I only use save() methods and not saveAndFlush(), some changes to other entities are lost :(
Second approach to solve the problem: custom implementation for repository
public interface EmailRepository extends JpaRepository<Email, Long>, EmailRepositoryCustom {
}

public interface EmailRepositoryCustom {
    Integer deactivateByExpired();
}

public class EmailRepositoryImpl implements EmailRepositoryCustom {

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional
    @Override
    public Integer deactivateByExpired() {
        String hsql = "update Email e set e.active = false where e.active = true and e.expire <= NOW()";
        Query query = entityManager.createQuery(hsql);
        entityManager.flush();
        Integer result = query.executeUpdate();
        entityManager.clear();
        return result;
    }
}
This approach works similarly to @Modifying(clearAutomatically = true), but it first forces the EntityManager to flush all changes to the DB before executing the update, and then clears the persistence context. This way there won't be outdated entities and all changes will be saved to the DB.
I would like to know if there's a better way to execute update statements in JPA without the issue of outdated entities and without the manual flush to the DB. Perhaps disabling the 2nd level cache? How can I do that in Spring Boot?
Update 2018
Spring Data JPA approved my PR; there's a flushAutomatically option in @Modifying() now.
@Modifying(flushAutomatically = true, clearAutomatically = true)
I know this is not a direct answer to your question, since you already have built a fix and started a pull request on Github. Thank you for that!
But I would like to explain the JPA way of doing this. You would like to change all entities that match a specific criterion and update a value on each. The normal approach is simply to load all the needed entities:
#Query("SELECT * FROM Email e where e.active = true and e.expire <= NOW()")
List<Email> findExpired();
Then iterate over them and update the values:
for (Email email : findExpired()) {
email.setActive(false);
}
Now Hibernate knows about all the changes and will write them to the database when the transaction completes, or when you call EntityManager.flush() manually. I know this won't work well if you have a large amount of data, since you load all the entities into memory. But it is the best way to keep the Hibernate entity cache, second-level caches and the database in sync.
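A minimal sketch of that pattern in a service method, reusing the findExpired() query from above (the service class and method names are mine, not from the original code):

@Service
public class EmailMaintenanceService {

    @Autowired
    private EmailRepository emailRepository;

    // the dirty-checked entities are written back when the transaction commits,
    // keeping the persistence context and the database in sync
    @Transactional
    public void deactivateExpired() {
        for (Email email : emailRepository.findExpired()) {
            email.setActive(false);
        }
    }
}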
Does this answer say "the @Modifying annotation is useless"? No! If you ensure the modified entities are not in your local cache, e.g. in a write-only application, this approach is just the way to go.
And just for the record: you don't need @Transactional on your repository methods.
Just for the record v2: the active column looks as if it depends directly on expire. So why not delete active completely and just check expire in every query?
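If you go that route, a query like the following sketch would replace the flag entirely:

@Query("SELECT e FROM Email e WHERE e.expire > CURRENT_TIMESTAMP")
List<Email> findActive();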
As klaus-groenbaek said, you can inject the EntityManager and use its refresh method:
@Inject
EntityManager entityManager;

...

emailRepository.save(email);
emailRepository.deactivateByExpired();
Email email2 = emailRepository.findOne(1L);
entityManager.refresh(email2);
System.out.println(email2.isActive()); // prints false