I do understand the difference between the save and saveAndFlush methods of Spring Data JPA's JpaRepository. As per my understanding, save will run and commit the SQL only at the end of the transaction, whereas saveAndFlush synchronizes the persistence context with the database by running the SQL statement immediately, but without committing it. Below is a sample where I wanted to experiment with this; please review it.
This is the repository interface for the update
@Repository
public interface ClassRepository extends JpaRepository<ClassA, Long> {

    @Modifying(clearAutomatically = true)
    @Query(value = "UPDATE class e SET e.class_name = ? WHERE e.employee_id = ?", nativeQuery = true)
    int updateClassNative(String className, String empId);
}
This is the test case where I am testing the methods
@Test
void saveAndUpdateWithFlushJPA() {
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    this.classRepository.save(classA);
    int size = this.classRepository.updateClassNative("TestQ", "S0810");
    assertThat(size).isEqualTo(1);
}
In the above test case, the test passed. I was not expecting the record to be saved, since I am using the save method. In the source code, the save method is wrapped with @Transactional. Is that why the save method is already committing the insert statement?
The problem with your test scenario is that JPA always flushes the persistence context before executing a native query (this is also the default behaviour for JPQL queries, though it can be overridden). The rationale is that a query should report a state that reflects the changes already made in the current unit of work.
To see the difference between save/saveAndFlush, you can use the following test cases instead:
@Repository
public interface ClassRepository extends JpaRepository<ClassA, Long> {

    @Query("SELECT COUNT(c.id) FROM ClassA c")
    @QueryHints({
        @QueryHint(name = org.hibernate.annotations.QueryHints.FLUSH_MODE, value = "COMMIT")
    })
    int countClassAEntities();
}
@Test
@Transactional
void saveAndUpdate() {
    int initialCount = classRepository.countClassAEntities();
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    classRepository.save(classA);
    int finalCount = classRepository.countClassAEntities();
    assertEquals(initialCount, finalCount);
}

@Test
@Transactional
void saveAndUpdateWithFlush() {
    int initialCount = classRepository.countClassAEntities();
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    classRepository.saveAndFlush(classA);
    int finalCount = classRepository.countClassAEntities();
    assertEquals(initialCount + 1, finalCount);
}
In the above setup, the count query has flush mode set to COMMIT, meaning that executing the query will not trigger a flush. If you were to use the default repository.count() method instead, the first test case would fail because by default, the flush mode is set to AUTO.
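For completeness, here is a rough sketch (not from the original post; the test name is made up) of what happens with the default count() method: its AUTO flush mode flushes the pending insert before counting, which is exactly why the "unchanged count" assertion of the first test would fail there:
@Test
@Transactional
void saveAndCountWithDefaultFlushMode() {
    // CrudRepository.count() runs with the default AUTO flush mode, so Hibernate
    // flushes the pending INSERT (without committing it) before the count query
    long initialCount = classRepository.count();
    ClassA classA = ClassA.builder().className("Test").employeeId("S0810").build();
    classRepository.save(classA);
    long finalCount = classRepository.count();
    // the flushed row is already visible to the query, even though only save() was called
    assertEquals(initialCount + 1, finalCount);
}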
Related
I have a scenario where I am consuming an event and saving the details in the DB. The record being stored in the database has an autogenerated id field: @GeneratedValue(strategy = GenerationType.IDENTITY).
In my test case I need to check whether the data is getting stored in the DB and whether it is as per expectation.
But I am not sure how I will call findById() of the Spring Boot Crud/JPA repository, since I do not know what value got generated.
Any help would be appreciated.
Take a look at the save method of the CrudRepository interface. Spring executes this method in a transaction, and once it completes, Hibernate will have generated the identifier in the returned entity.
Suppose your entity and repository look as follows:
....
public class SomeEntity {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    public SomeEntity(String name) {
        this.name = name;
    }
    ....
}

public interface SomeRepository extends CrudRepository<SomeEntity, Long> {
}
After saving the entity:
SomeEntity someEntity = someRepository.save(new SomeEntity("Some entity"));
someEntity.getId() will contain the actual record id, which can be used further in your tests.
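As a rough sketch (assuming a Spring Data 2.x findById that returns Optional, and that SomeEntity exposes getId()/getName() getters), a test could then use the returned id like this:
@Test
void idIsGeneratedOnSave() {
    SomeEntity saved = someRepository.save(new SomeEntity("Some entity"));

    // the returned instance carries the generated identifier
    assertNotNull(saved.getId());

    // which can be used to read the row back and verify its contents
    Optional<SomeEntity> found = someRepository.findById(saved.getId());
    assertTrue(found.isPresent());
    assertEquals("Some entity", found.get().getName());
}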
I think you are looking for the @DirtiesContext annotation.
It is a Test annotation which indicates that the ApplicationContext associated with a test is dirty and should therefore be closed and removed from the context cache. - javadoc
Read Section 9.3.4 - Here
Check the example below as well:
@Test
@DirtiesContext
public void save_basic() {
    // get a course
    Course course = courseJpaRepository.findById(10001L);
    assertEquals("JPA ", course.getName());

    // update details
    course.setName("JPA - Updated");
    courseJpaRepository.saveOrUpdate(course);

    // check the value
    Course course1 = courseJpaRepository.findById(10001L);
    assertEquals("JPA - Updated", course1.getName());
}
BTW, here is how you can get the id: simply via a getter on the entity returned by save:
EmployeeDetails employeeDetails = employeeService.saveEmployeeDetails(employee);
int temp = employeeDetails.getID();
Related Post : Here
I have an Employee entity with the following column:
@Entity
class Employee {

    @Column(name = "first_name", length = 14)
    private String firstName;
and I have a Spring JPA Repository for it:
@Repository
public interface EmployeeRepository extends CrudRepository<Employee, Integer> {
In test/resources/application.properties I have the following so that I use an in-memory h2 database with tables auto-generated:
spring.jpa.hibernate.ddl-auto=create
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.url=jdbc:h2:mem:db;DB_CLOSE_DELAY=-1
spring.datasource.username=sa
spring.datasource.password=sa
I was expecting this test to fail, since the firstName is longer than what is allowed:
@DataJpaTest
public class EmployeeRepositoryTest {

    @Autowired
    private EmployeeRepository employeeRepository;

    @Test
    public void mustNotSaveFirstNameLongerThan14() {
        Employee employee = new Employee();
        employee.setFirstName("koraykoraykoray"); // 15 characters!
        employeeRepository.save(employee);
    }
}
And I was surprised to see this test was not failing; however, the following does fail:
@DataJpaTest
public class EmployeeRepositoryTest {

    @Autowired
    private EmployeeRepository employeeRepository;

    @Test
    public void testMustNotSaveFirstNameLongerThan14() {
        Employee employee = new Employee();
        employee.setFirstName("koraykoraykoray"); // 15 characters!
        employeeRepository.save(employee);
        employeeRepository.findAll();
    }
}
with the stacktrace:
Caused by: org.h2.jdbc.JdbcSQLDataException: Value too long for column "FIRST_NAME VARCHAR(14)": "'koraykoraykoray' (15)"; SQL statement:
The only difference is the second test has the additional employeeRepository.findAll(); statement, which forces Hibernate to flush as far as I understand.
This does not feel right to me, I would much rather want the test to fail immediately for save.
I can also have
@Autowired
private TestEntityManager testEntityManager;
and call
testEntityManager.flush();
but again, this does not feel correct either. How do I make this test fail without any workaround or additional statements?
The easiest option in your case is to configure the @Transactional annotation so that all changes are sent to the database during your tests (it can also be applied only to specific ones):
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

import static org.junit.jupiter.api.Assertions.assertThrows;

@Transactional(propagation = Propagation.NOT_SUPPORTED)
@DataJpaTest
public class EmployeeRepositoryTest {

    @Autowired
    private EmployeeRepository employeeRepository;

    @Test
    public void mustNotSaveFirstNameLongerThan14() {
        Employee employee = new Employee();
        employee.setId(1);
        employee.setFirstName("koraykoraykoray"); // 15 characters!
        assertThrows(DataIntegrityViolationException.class, () -> {
            employeeRepository.save(employee);
        });
    }

    @Test
    public void mustSaveFirstNameShorterThan14() {
        Employee employee = new Employee();
        employee.setId(1);
        employee.setFirstName("koraykor"); // 8 characters!
        employeeRepository.save(employee);
    }
}
PS: I have added a simple Integer property as the PK of the Employee entity, based on your repository definition.
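For reference, a minimal sketch of what that entity could look like with the assumed key (an assigned Integer id, matching the employee.setId(1) calls above; this is my assumption, not the OP's original mapping):
@Entity
public class Employee {

    @Id
    private Integer id; // assigned manually in the tests, no @GeneratedValue

    @Column(name = "first_name", length = 14)
    private String firstName;

    // getters and setters omitted
}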
You could use JpaRepository<T,ID> instead of CrudRepository<T,ID>. Something like:
@Repository
public interface EmployeeRepository extends JpaRepository<Employee, Integer>
Then you can use its saveAndFlush() method anywhere you need to send data immediately:
@Test
public void mustNotSaveFirstNameLongerThan14() {
    Employee employee = new Employee();
    employee.setFirstName("koraykoraykoray"); // 15 characters!
    employeeRepository.saveAndFlush(employee);
}
And in code where you would like to keep the optimization, you can still use the save() method.
Thanks doctore for your answer; I had a similar problem to the OP's and your solution helped. I decided to dig a little and figure out why it works, should someone else have this problem.
With a @DataJpaTest annotated test class, your class implicitly becomes @Transactional with the default propagation type Propagation.REQUIRED. That means every test method is also @Transactional with the same default configuration. Now, all CRUD methods in CrudRepository are also @Transactional, but that has nothing to do with @DataJpaTest; they are transactional due to their implementation. Whoa, that's a lot of transactions!
As soon as you annotate your whole class (or just a test method) with @Transactional(propagation = Propagation.NOT_SUPPORTED), your test method(s) are no longer @Transactional. However, the inner methods of your test method(s), that is, the CRUD operations from CrudRepository, remain transactional, meaning that they will have their own transaction scopes. Because of that, they will be committed to the database immediately after execution, since by default (in Spring Boot, which uses the HikariCP connection pool) auto-commit is turned on. Auto-commit happens after every SQL statement. And thus the tests pass as you'd expect.
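If you want to see that knob explicitly, Spring Boot exposes Hikari's auto-commit flag as a property; a purely illustrative snippet (the tests above do not require it):
# application.properties
# HikariCP connections auto-commit by default; switching this off would stop
# the per-statement commits described above
spring.datasource.hikari.auto-commit=false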
I like to visualize things, so I made a diagram of the whole process.
I hope this was helpful. URLs from the diagram:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/transaction/annotation/Propagation.html
https://docs.spring.io/spring-data/jpa/docs/current/reference/html/#transactions
https://docs.oracle.com/javase/tutorial/jdbc/basics/transactions.html#disable_auto_commit
https://github.com/brettwooldridge/HikariCP/blob/dev/src/main/java/com/zaxxer/hikari/HikariConfig.java#L126
https://dzone.com/articles/spring-boot-transactions-tutorial-understanding-tr (not from diagram, but explains transaction very well!)
The @Commit annotation can do the job (it was added in Spring 4.2):
@Test
@Commit
public void mustNotSaveFirstNameLongerThan14() {
    Employee employee = new Employee();
    employee.setId(1);
    employee.setFirstName("koraykoraykoray"); // 15 characters!
    assertThrows(DataIntegrityViolationException.class, () -> {
        employeeRepository.save(employee);
    });
}
I'm performing an update via a method using Hibernate and the EntityManager.
This update method is called multiple times (within a loop).
It seems like when I execute it the first time, it locks the table and does not free it.
When trying to update the table via SQL Developer after having closed the application, I see the table is still locked because the update is hanging.
What do you see as a solution to this problem? If you need more information, let me know.
Class
@Repository
@Transactional(propagation = REQUIRES_NEW)
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(String id) {
        String query = "UPDATE_QUERY";
        Query nativeQuery = entityManager.createNativeQuery(String.format(query, id));
        nativeQuery.executeUpdate();
    }
}
UPDATE
After having waited more than one hour, I launched the application again and it worked fine once, but now it hangs again.
UPDATE 2 -- I'll give a maximum bounty to whoever helps me solve this
In another place I use an application-managed entity manager and it still gives me the same type of errors.
public void fillYirInfo() {
    File inputFile = new File("path");
    try (InputStream inputStream = new FileInputStream(inputFile);
         BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream))) {
        bufferedReader.lines().skip(1).limit(20).forEach(line -> {
            String[] data = line.split(",");
            String rnr = data[0];
            String linked = data[1];
            String email = data.length > 2 ? data[2] : "";
            String insuredId = insuredPeopleRepository.getInsuredIdFromNationalId(rnr);
            int modifiedCounter = 0;
            if (!isNullOrEmpty(insuredId)) {
                EntityManager entityManager = emf.createEntityManager();
                EntityTransaction transaction = entityManager.getTransaction();
                Query nativeQuery = entityManager.createNativeQuery(
                        "QUERY"
                );
                transaction.begin();
                // capture the number of affected rows
                modifiedCounter = nativeQuery.executeUpdate();
                entityManager.flush();
                transaction.commit();
                entityManager.close();
            }
            System.out.println(modifiedCounter + " rows modified");
        });
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Try without an update-query:
@Repository
@Transactional(propagation = REQUIRES_NEW)
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(String id) {
        // guessing your class name and method..
        final YirInfo yirInfo = entityManager.find(YirInfo.class, id);
        yirInfo.setSent();
    }
}
It might not be as fast as a single update query, but it's possible to get it reasonably fast unless the amount of data is huge. This is the preferred way of using Hibernate/JPA: instead of thinking in terms of single values and SQL queries, you work with entities/objects and (sometimes) HQL/JPQL queries.
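A hedged usage sketch (the calling bean and the idsToMarkAsSent parameter are made up): because REQUIRES_NEW sits on the repository method, each call from another Spring bean goes through the proxy and commits in its own short transaction:
// in some other Spring-managed bean with YirInfoRepository injected as yirInfoRepository
public void markAllAsSent(List<String> idsToMarkAsSent) {
    for (String id : idsToMarkAsSent) {
        yirInfoRepository.setSent(id); // each call runs and commits in its own REQUIRES_NEW transaction
    }
}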
You are using the @Transactional annotation, which means you are using Spring transaction management. But in your UPDATE 2 you are managing the transaction yourself rather than letting Spring do it (I guess it's another project or class not managed by Spring).
In any case, what I would do is try to update your records in a single Spring transaction, and I would not use @Transactional in the DAO layer but in the service layer. Something like this:
Service layer:
@Service
public class YirInfoService {

    @Autowired
    YirInfoRepository dao;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(List<String> ids) {
        dao.setSents(ids);
    }
}
DAO layer:
@Repository
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    // Here you can update by using an IN statement or by doing a cycle
    // Let's suppose a bulk operation
    public void setSents(List<String> ids) {
        String query = "UPDATE_QUERY";
        for (int i = 0; i < ids.size(); i++) {
            String id = ids.get(i);
            Query nativeQuery = entityManager.createNativeQuery(String.format(query, id));
            nativeQuery.executeUpdate();
            if (i % 20 == 0) {
                entityManager.flush();
                entityManager.clear();
            }
        }
    }
}
The first thing you have to understand is that in the first example you are using a native query to update rows in the DB. In this case you are completely bypassing Hibernate; it does not do anything for you.
In your second example you have the same thing: you are updating via an update query. You don't need to flush the entity manager, as flushing is only necessary for transferring pending changes made to entity objects within that entity manager.
Plus, I don't know how your example works, as you are autowiring the entity manager instead of using the @PersistenceContext annotation. Make sure you use it properly, because you might have misconfigured the application. Also, there is no need to manually create the entity manager when using Spring, as in the second example; just use @PersistenceContext to get an entity manager in your app.
You are also mixing up transaction management. In the first example, it's enough to put the @Transactional annotation on either your method or the class.
For the other example, you are doing manual transaction management, which makes no sense in this case. If you are using Spring, you can simply rely on declarative transaction management.
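For illustration, a minimal sketch of the container-managed injection referred to above (the class name is reused from the question; the method is made up):
@Repository
public class YirInfoRepository {

    // container-managed EntityManager, injected by Spring/the JPA provider
    @PersistenceContext
    private EntityManager entityManager;

    public YirInfo findOne(String id) {
        return entityManager.find(YirInfo.class, id);
    }
}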
The first thing I'd check here is to integrate datasource-proxy into your connection management and log how your statements are executed. With this info, you can verify whether the query is actually sent to the DB and the DB is executing it very slowly, or whether you have a network issue between your app and the DB.
If you find out that the query is sent properly to the DB, you will want to analyze the query itself, because most probably it is just executed very slowly and needs some optimization. For this, you can use the Explain plan feature to find out what your execution plan looks like and then make it faster.
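In case it helps, a minimal sketch of wrapping a DataSource with datasource-proxy (assuming net.ttddyy:datasource-proxy is on the classpath; how you wire the wrapped DataSource into Spring is up to you):
import javax.sql.DataSource;

import net.ttddyy.dsproxy.listener.logging.SLF4JLogLevel;
import net.ttddyy.dsproxy.support.ProxyDataSourceBuilder;

public final class DataSourceProxyConfig {

    // Wraps the real DataSource so every SQL statement and its execution time is logged.
    public static DataSource loggingDataSource(DataSource actual) {
        return ProxyDataSourceBuilder
                .create(actual)
                .name("datasource-proxy")
                .logQueryBySlf4j(SLF4JLogLevel.INFO) // log each executed statement via SLF4J
                .countQuery()                        // collect query count metrics
                .build();
    }
}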
I have the following method in a JpaRepository that simply updates the parent_id column for every row with a specific value in that column. It works perfectly in pure SQL, but it fails in Spring Data. I think it's due to some problem with the transaction scope, because the update is done (the first assert passes); it's just that I can't see the changes in the DB. I want to use
@Transactional
@Rollback
because I guess this is best practice. Is there any way I can see what has been done in the DB from my test method?
@Modifying
@Query("update MovementCategory mc set mc.parentId = :newParentId where mc.parentId = :previousParentId and mc.userId = :userId")
int updateChildrenParentId(@Param("userId") long userId, @Param("previousParentId") long previousParentId, @Param("newParentId") long newParentId);
I've created a simple integration test that checks that the changes are correctly applied in the DB, but it doesn't seem to work and I can't see why. I thought it could be because of the transaction scope, but I did a smaller test and ruled that out, so no idea. The following is the integration test.
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = MySpringBootMavenApplication.class)
@WebAppConfiguration
@IntegrationTest("server.port:0")
@Transactional
@Rollback
public class MovementCategoryRepositoryIT {

    private static final long USER_ID = -1L;

    @Autowired MovementCategoryRepository repo;

    @Test
    public void testUpdateChildrenParentId() {
        long newParentId = -9723854;
        long previousParentId = -1239842;

        MovementCategory mc1 = getFilled(null, "DELETE ME");
        mc1.setParentId(previousParentId);
        mc1 = repo.saveAndFlush(mc1);

        int updates = repo.updateChildrenParentId(USER_ID, previousParentId, newParentId);
        // First assert passes, so the update is done
        assertEquals(1, updates);

        MovementCategory children1 = repo.findOneByIdAndUserId(mc1.getId(), USER_ID);
        // Second one fails
        assertEquals(newParentId, children1.getParentId().longValue());
    }
Result:
java.lang.AssertionError: expected:<-9723854> but was:<-1239842>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at com.ldepablo.repository.MovementCategoryRepositoryIT.testUpdateChildrenParentId(MovementCategoryRepositoryIT.java:171)
...
You need to start the transaction for reading explicitly and also roll back the transaction explicitly as the REST calls are in a different session than the test. Use TransactionTemplate as described in this answer.
(Answering too late, of course; but this was the first hit on the search engine)
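To make that concrete, here is a rough sketch of the test rewritten around TransactionTemplate (assuming the class-level @Transactional/@Rollback are removed, so every callback commits for real and the inserted row has to be cleaned up by other means):
@Autowired PlatformTransactionManager txManager;

@Test
public void testUpdateChildrenParentId() {
    TransactionTemplate tx = new TransactionTemplate(txManager);
    long newParentId = -9723854;
    long previousParentId = -1239842;

    // insert and commit in its own transaction
    Long id = tx.execute(status -> {
        MovementCategory mc = getFilled(null, "DELETE ME");
        mc.setParentId(previousParentId);
        return repo.saveAndFlush(mc).getId();
    });

    // run the bulk update in a second, committed transaction
    tx.execute(status -> repo.updateChildrenParentId(USER_ID, previousParentId, newParentId));

    // read back in a fresh transaction / persistence context
    MovementCategory reloaded = tx.execute(status -> repo.findOneByIdAndUserId(id, USER_ID));
    assertEquals(newParentId, reloaded.getParentId().longValue());
}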
I'm working with Spring Boot 1.3.0.M4 and a MySQL database.
I have a problem when using modifying queries: the EntityManager contains outdated entities after the query has executed.
Original JPA Repository:
public interface EmailRepository extends JpaRepository<Email, Long> {

    @Transactional
    @Modifying
    @Query("update Email e set e.active = false where e.active = true and e.expire <= NOW()")
    Integer deactivateByExpired();
}
Suppose we have Email [id=1, active=true, expire=2015/01/01] in DB.
After executing:
emailRepository.save(email);
emailRepository.deactivateByExpired();
System.out.println(emailRepository.findOne(1L).isActive()); // prints true!! it should print false
First approach to solve the problem: add clearAutomatically = true
public interface EmailRepository extends JpaRepository<Email, Long> {

    @Transactional
    @Modifying(clearAutomatically = true)
    @Query("update Email e set e.active = false where e.active = true and e.expire <= NOW()")
    Integer deactivateByExpired();
}
This approach clears the persistence context so it does not hold outdated values, but it drops all non-flushed changes still pending in the EntityManager. As I use only save() and not saveAndFlush(), some changes to other entities are lost :(
Second approach to solve the problem: custom implementation for repository
public interface EmailRepository extends JpaRepository<Email, Long>, EmailRepositoryCustom {
}

public interface EmailRepositoryCustom {
    Integer deactivateByExpired();
}

public class EmailRepositoryImpl implements EmailRepositoryCustom {

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional
    @Override
    public Integer deactivateByExpired() {
        String hsql = "update Email e set e.active = false where e.active = true and e.expire <= NOW()";
        Query query = entityManager.createQuery(hsql);
        entityManager.flush();
        Integer result = query.executeUpdate();
        entityManager.clear();
        return result;
    }
}
This approach works similarly to @Modifying(clearAutomatically = true), but it first forces the EntityManager to flush all changes to the DB before executing the update, and then it clears the persistence context. This way there won't be outdated entities and all changes will be saved in the DB.
I would like to know if there's a better way to execute update statements in JPA without having the issue of the outdated entities and without the manual flush to DB. Perhaps disabling the 2nd level cache? How can I do it in Spring Boot?
Update 2018
Spring Data JPA approved my PR; there's a flushAutomatically option in @Modifying now.
@Modifying(flushAutomatically = true, clearAutomatically = true)
I know this is not a direct answer to your question, since you already have built a fix and started a pull request on Github. Thank you for that!
But I would like to explain the JPA way you can go. So you would like to change all entities which match specific criteria and update a value on each. The normal approach is just to load all the entities needed:
@Query("SELECT e FROM Email e WHERE e.active = true AND e.expire <= NOW()")
List<Email> findExpired();
Then iterate over them and update the values:
for (Email email : findExpired()) {
    email.setActive(false);
}
Now Hibernate knows all changes and will write them to the database when the transaction completes or when you call EntityManager.flush() manually. I know this won't work well if you have a large amount of data, since you load all entities into memory. But this is the best way to keep the Hibernate entity cache, 2nd-level caches and the database in sync.
Does this answer say "the @Modifying annotation is useless"? No! If you ensure the modified entities are not in your local cache, e.g. in a write-only application, this approach is just the way to go.
And just for the record: you don't need @Transactional on your repository methods.
Just for the record v2: the active column looks as if it has a direct dependency on expire. So why not delete active completely and just look at expire in every query?
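To illustrate that last idea, a rough sketch of what the repository could look like if active were derived from expire alone (the method names are made up; CURRENT_TIMESTAMP is the portable JPQL counterpart of NOW()):
public interface EmailRepository extends JpaRepository<Email, Long> {

    // "active" emails are simply those that have not expired yet
    @Query("select e from Email e where e.expire > CURRENT_TIMESTAMP")
    List<Email> findActive();

    // and the expired ones, if you still need them
    @Query("select e from Email e where e.expire <= CURRENT_TIMESTAMP")
    List<Email> findExpired();
}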
As klaus-groenbaek said, you can inject the EntityManager and use its refresh method:
@Inject
EntityManager entityManager;
...
emailRepository.save(email);
emailRepository.deactivateByExpired();
Email email2 = emailRepository.findOne(1L);
entityManager.refresh(email2);
System.out.println(email2.isActive()); // prints false