@Transactional annotation slows down performance compared to HibernateDaoSupport - Spring

I am exporting a report in my code. I am using HibernateDaoSupport and the method is not annotated with @Transactional. When the request comes from the UI, a transaction is created automatically and the report is exported in 2 minutes.
But when I try to use a thread, I have to add the @Transactional annotation, otherwise I get a LazyInitializationException because no transaction is present.
With @Transactional, the same report takes 2.4 minutes. This time keeps increasing, and sometimes it takes double the time it needs without @Transactional.
I am not sure why it takes longer when I add the @Transactional annotation.
The main class:
CommonDAO:
public class CommonDAO extends HibernateDaoSupport {

    private HibernateTransactionManager txnManager;

    public void setTxnManager(HibernateTransactionManager txnManager) {
        this.txnManager = txnManager;
    }

    public List executeSQLQueryPaging(final String hql,
            final Object[] params, final Integer[] pagingParam) throws ServiceException {
        List results = null;
        results = getHibernateTemplate().executeFind(new HibernateCallback() {
            public Object doInHibernate(Session session) {
                SQLQuery query = session.createSQLQuery(hql);
                if (params != null) {
                    for (int i = 0; i < params.length; i++) {
                        query.setParameter(i, params[i]);
                    }
                }
                query.setFirstResult(pagingParam[0]);
                query.setMaxResults(pagingParam[1]);
                return query.list();
            }
        });
        return results;
    }
}
ModuleDAO (extends CommonDAO):
public List getReportData() {
    List list = executeSQLQueryPaging(...);
    ...
    return list;
}
Service:
public List getReportData() {
    .....
    return moduleDAO.getReportData();
}
If I put @Transactional at the service layer, performance deteriorates; without it, when the request is executed from the web, it is faster.
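For reference, this is roughly how the annotation would sit at the service layer in this setup (only a sketch; the ReportService class name and the readOnly hint are my assumptions, not taken from the code above). Marking a query-only reporting transaction as read-only lets Hibernate skip dirty checking and flushing, which can recover part of the overhead:

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ReportService {

    @Autowired
    private ModuleDAO moduleDAO;

    // readOnly = true: no dirty checking / flush for a query-only report
    // (an assumption for illustration, not present in the original code).
    @Transactional(readOnly = true)
    public List getReportData() {
        return moduleDAO.getReportData();
    }
}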

Related

Spring-data JdbcTemplate does not commit

I need to update thousands of records in the database, but I would like to commit after every batch of 5000 records.
@Service
@Transactional(rollbackFor = Throwable.class)
public class AttributeProcessorServiceImpl extends DataLoader implements
        AttributeProcessorService
{
    .....
    private final TransactionTemplate transTemplate;
    private final JdbcTemplate jdbcTemplate;
    @Autowired private PlatformTransactionManager platformTransactionManager;

    @Autowired
    public AttributeProcessorServiceImpl(
            TransactionTemplate transTemplate,
            JdbcTemplate jdbcTemplate,
            .....)
    {
        super();
        this.transTemplate = transTemplate;
        this.jdbcTemplate = jdbcTemplate;
        .....
    }
    @Async
    @Transactional(propagation = Propagation.NOT_SUPPORTED)
    public void reloadAttrs()
    {
        loadAttrs();
        updateAttrs();
    }

    private void loadAttrs()
    {
        // ...some data fetching and processing, finally call db update.
        updateDbInBatches(rowcount, sql);
    }

    private void updateAttrs()
    {
        // ...some data fetching and processing, finally call db update.
        updateDbInBatches(rowcount, sql);
    }

    private void updateDbInBatches(long rowcount, String sql)
    {
        DefaultTransactionDefinition def;
        boolean hasMore = true;
        Integer from;
        Integer to = 0;
        int batchSize = 5000; // gets from property
        while (hasMore)
        {
            from = to + 1;
            to = to + batchSize;
            def = new DefaultTransactionDefinition();
            def.setName("backCommitTx");
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
            TransactionStatus status = platformTransactionManager.getTransaction(def);
            int rows = jdbcTemplate.update(sql, paramValues, paramTypes);
            logger.debug("Loaded [" + rows + "] records.");
            platformTransactionManager.commit(status);
            if (to > rowcount)
            {
                hasMore = false;
                logger.debug("All records [" + rowcount + "] updated.");
            }
        }
    }
}
If I put a breakpoint after loadAttrs(), it shows it loaded a bunch of records to the database and issued a commit(), but the database does not reflect that commit until after the entire public method completes. How do I ensure data is indeed written to the database after each commit? The commit does not give any error either.
I missed an important piece of information that solved the problem.
I had another public method, which is what was called from outside:
public void reloadAttrs(TransDetail trans)
{
    reloadAttrs();
}
The above method was in fact using the default transaction propagation, as I did not specify it explicitly. Since this was the first public method that was called, Spring was ignoring the transaction demarcation on the next public (async) method that was called. I changed the above signature to:
@Transactional(propagation = Propagation.NOT_SUPPORTED)
public void reloadAttrs(TransDetail trans)
{
    reloadAttrs();
}
It then worked. I was able to see changes in the database after every commit.
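Since the class already injects a TransactionTemplate (transTemplate) that is never used, the same per-batch commit can also be written with it instead of driving the PlatformTransactionManager by hand. This is only a sketch under that assumption; paramValues, paramTypes and logger stand for the elided members of the original code:

    // Sketch: each execute() call runs in its own transaction (PROPAGATION_REQUIRED by default)
    // and commits on return, so every batch becomes visible in the database immediately.
    private void updateDbInBatches(long rowcount, String sql)
    {
        int batchSize = 5000; // gets from property, as in the original
        long processed = 0;
        while (processed < rowcount)
        {
            int rows = transTemplate.execute(status -> jdbcTemplate.update(sql, paramValues, paramTypes));
            logger.debug("Loaded [" + rows + "] records.");
            processed += batchSize;
        }
    }

As before, the enclosing method must not already run inside a transaction (e.g. it is marked with propagation = Propagation.NOT_SUPPORTED), otherwise the inner commits would again be deferred to the outer transaction.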

Cannot Write Data to ElasticSearch with AbstractReactiveElasticsearchConfiguration

I am trying to write data to my local Elasticsearch Docker container (7.4.2). For simplicity I used the AbstractReactiveElasticsearchConfiguration provided by Spring, also overriding the entityMapper function. Then I constructed my repository extending ReactiveElasticsearchRepository.
Then, in the end, I used my autowired repository to saveAll() my collection of elements containing the data. However, Elasticsearch doesn't write any data. I also have a REST controller which starts the whole process, basically returning nothing: DeferredResult<ResponseEntity<Void>>.
The REST method coming from my ApiDelegateImpl:
@Override
public DeferredResult<ResponseEntity<Void>> openUsageExporterStartPost() {
    final DeferredResult<ResponseEntity<Void>> deferredResult = new DeferredResult<>();
    ForkJoinPool.commonPool().execute(() -> {
        try {
            openUsageExporterAdapter.startExport();
            deferredResult.setResult(ResponseEntity.accepted().build());
        } catch (Exception e) {
            deferredResult.setErrorResult(e);
        }
    });
    return deferredResult;
}
My Elasticsearch Configuration
@Configuration
public class ElasticSearchConfig extends AbstractReactiveElasticsearchConfiguration {

    @Value("${spring.data.elasticsearch.client.reactive.endpoints}")
    private String elasticSearchEndpoint;

    @Bean
    @Override
    public EntityMapper entityMapper() {
        final ElasticsearchEntityMapper entityMapper = new ElasticsearchEntityMapper(elasticsearchMappingContext(), new DefaultConversionService());
        entityMapper.setConversions(elasticsearchCustomConversions());
        return entityMapper;
    }

    @Override
    public ReactiveElasticsearchClient reactiveElasticsearchClient() {
        ClientConfiguration clientConfiguration = ClientConfiguration.builder()
                .connectedTo(elasticSearchEndpoint)
                .build();
        return ReactiveRestClients.create(clientConfiguration);
    }
}
My Repository
public interface OpenUsageRepository extends ReactiveElasticsearchRepository<OpenUsage, Long> {
}
My DTO
@Data
@Document(indexName = "open_usages", type = "open_usages")
@TypeAlias("OpenUsage")
public class OpenUsage {

    @Field(name = "id")
    @Id
    private Long id;
    ......
}
My Adapter Implementation
@Autowired
private final OpenUsageRepository openUsageRepository;
...transform entity into OpenUsage...
public void doSomething(final List<OpenUsage> openUsages){
openUsageRepository.saveAll(openUsages)
}
And finally my IT test
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
@Testcontainers
@TestPropertySource(locations = {"classpath:application-it.properties"})
@ContextConfiguration(initializers = OpenUsageExporterApplicationIT.Initializer.class)
class OpenUsageExporterApplicationIT {

    @LocalServerPort
    private int port;

    private final static String STARTCALL = "http://localhost:%s/open-usage-exporter/start/";

    @Container
    private static ElasticsearchContainer container = new ElasticsearchContainer("docker.elastic.co/elasticsearch/elasticsearch:6.8.4").withExposedPorts(9200);

    static class Initializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(final ConfigurableApplicationContext configurableApplicationContext) {
            final List<String> pairs = new ArrayList<>();
            pairs.add("spring.data.elasticsearch.client.reactive.endpoints=" + container.getContainerIpAddress() + ":" + container.getFirstMappedPort());
            pairs.add("spring.elasticsearch.rest.uris=http://" + container.getContainerIpAddress() + ":" + container.getFirstMappedPort());
            TestPropertyValues.of(pairs).applyTo(configurableApplicationContext);
        }
    }

    @Test
    void testExportToES() throws IOException, InterruptedException {
        final List<OpenUsageEntity> openUsageEntities = dbPreparator.insertTestData();
        assertTrue(openUsageEntities.size() > 0);
        final String result = executeRestCall(STARTCALL);
        // Awaitility here tells me nothing is in ElasticSearch :(
    }

    private String executeRestCall(final String urlTemplate) throws IOException {
        final String url = String.format(urlTemplate, port);
        final HttpUriRequest request = new HttpPost(url);
        final HttpResponse response = HttpClientBuilder.create().build().execute(request);
        // Get the result.
        return EntityUtils.toString(response.getEntity());
    }
}
public void doSomething(final List<OpenUsage> openUsages){
openUsageRepository.saveAll(openUsages)
}
This lacks a semicolon at the end, so it should not compile.
But I assume this is just a typo, and there is a semicolon in reality.
Anyway, saveAll() returns a Flux. This Flux is just a recipe for saving your data, and it is not 'executed' until subscribe() is called by someone (or something like blockLast()). You just throw that Flux away, so the saving never gets executed.
How to fix this? One option is to add .blockLast() call:
openUsageRepository.saveAll(openUsages).blockLast();
But this will save the data in a blocking way, effectively defeating the reactivity.
Another option, if the code you are calling saveAll() from supports reactivity, is simply to return the Flux returned by saveAll(); but as your doSomething() has a void return type, this is doubtful.
It is not clear how your startExport() connects to doSomething() anyway. But it looks like your calling code does not use any notion of reactivity, so a real solution would be to either rewrite the calling code to use reactivity (obtain a Publisher and subscribe() to it, then wait until the data arrives), or revert to the blocking API (ElasticsearchRepository instead of ReactiveElasticsearchRepository).
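As a minimal sketch of the reactive variant (assuming doSomething() may change its return type; Mono<Void> is my choice, not something from the original code):

public Mono<Void> doSomething(final List<OpenUsage> openUsages) {
    // then() completes once all saves have run; nothing happens until a caller subscribes.
    return openUsageRepository.saveAll(openUsages).then();
}

The caller then has to subscribe to (or block on) the returned Mono for the writes to actually reach Elasticsearch.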

Transactions and relationship entities mapping problems with Neo4j OGM

Versions used: spring-data-neo4j 4.2.0-BUILD-SNAPSHOT / neo4j-ogm 2.0.6-SNAPSHOT
I'm having problems correctly fetching relationship entities.
The following fetch calls don't return consistent results (executed in the same transaction):
session.query("MATCH (:A)-[b:HAS_B]-(:C) RETURN count(b) as count") returns 1
session.query("MATCH (:A)-[b:HAS_B]-(:C) RETURN b") correctly returns the relationship entity as a RelationshipModel object
session.query(B.class, "MATCH (:A)-[b:HAS_B]-(:C) RETURN b") returns null!
Important remark: When all operations (create, fetch) are done in the same transaction, it seems to be fine.
I have been able to implement a workaround by using session.query(String, Map) to query the relationship entity and map it into my POJO myself.
@NodeEntity
public class A {
    public A() {}
    public A(String name) {
        this.name = name;
    }
    @GraphId
    private Long graphId;
    private String name;
    @Relationship(type = "HAS_B", direction = Relationship.OUTGOING)
    private B b;
}

@RelationshipEntity(type = "HAS_B")
public class B {
    public B() {}
    public B(String name, A a, C c) {
        this.name = name;
        this.a = a;
        this.c = c;
    }
    @GraphId
    private Long graphId;
    @StartNode
    private A a;
    @EndNode
    private C c;
    private String name;
}

@NodeEntity
public class C {
    public C() {}
    public C(String name) {
        this.name = name;
    }
    @GraphId
    private Long graphId;
    private String name;
}
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(loader = AnnotationConfigContextLoader.class, classes = {MyTest.TestConfiguration.class})
public class MyTest {

    @Autowired
    private MyBean myBean;

    @Configuration
    @EnableAutoConfiguration
    @EnableTransactionManagement
    @EnableNeo4jRepositories("com.nagra.ml.sp.cpm.core.repositories")
    public static class TestConfiguration {
        @Bean
        public org.neo4j.ogm.config.Configuration configuration() {
            org.neo4j.ogm.config.Configuration config = new org.neo4j.ogm.config.Configuration();
            config.driverConfiguration().setDriverClassName("org.neo4j.ogm.drivers.embedded.driver.EmbeddedDriver");
            return config;
        }
        @Bean
        public SessionFactory sessionFactory() {
            return new SessionFactory(configuration(), "com.nagra.ml.sp.cpm.model");
        }
        @Bean
        public Neo4jTransactionManager transactionManager() {
            return new Neo4jTransactionManager(sessionFactory());
        }
        @Bean
        public MyBean myBean() {
            return new MyBean();
        }
    }

    @Test
    public void alwaysFails() {
        myBean.delete();
        myBean.create("1");
        try { Thread.sleep(2000); } catch (InterruptedException e) {} // useless
        myBean.check("1"); // FAILS HERE !
    }

    @Test
    public void ok() {
        myBean.delete();
        myBean.createAndCheck("2");
    }
}
@Transactional(propagation = Propagation.REQUIRED)
public class MyBean {

    @Autowired
    private Session neo4jSession;

    public void delete() {
        neo4jSession.query("MATCH (n) DETACH DELETE n", new HashMap<>());
    }

    public void create(String suffix) {
        C c = new C("c" + suffix);
        neo4jSession.save(c);
        A a = new A("a" + suffix);
        neo4jSession.save(a);
        B bRel = new B("b" + suffix, a, c);
        neo4jSession.save(bRel);
    }

    public void check(String suffix) {
        //neo4jSession.clear(); //Not working even with this
        Number countBRels = (Number) neo4jSession.query("MATCH (:A)-[b:HAS_B]-(:C) WHERE b.name = 'b" + suffix + "' RETURN count(b) as count", new HashMap<>()).iterator().next().get("count");
        assertEquals(1, countBRels.intValue()); // OK
        Iterable<B> bRels = neo4jSession.query(B.class, "MATCH (:A)-[b:HAS_B]-(:C) WHERE b.name = 'b" + suffix + "' RETURN b", new HashMap<>());
        boolean relationshipFound = bRels.iterator().hasNext();
        assertTrue(relationshipFound); // FAILS HERE !
    }

    public void createAndCheck(String suffix) {
        create(suffix);
        check(suffix);
    }
}
The query session.query(B.class, "MATCH (:A)-[b:HAS_B]-(:C) RETURN b") returns only the relationship but not the start node or end node, so the OGM cannot hydrate it. You always need to return the start and end nodes along with the relationship, e.g. session.query(B.class, "MATCH (a:A)-[b:HAS_B]-(c:C) RETURN a,b,c").
The reason it appears to work when you both create and fetch data in the same transaction is that the session already has a cached copy of a and c, and hence b can be hydrated with the cached start and end nodes.
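Applied to the check() method above, the hydrating query would look roughly like this (a sketch; only the MATCH/RETURN clause differs from the original):

    Iterable<B> bRels = neo4jSession.query(B.class,
            "MATCH (a:A)-[b:HAS_B]-(c:C) WHERE b.name = 'b" + suffix + "' RETURN a, b, c",
            new HashMap<>());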
Firstly, please upgrade from OGM 2.0.6-SNAPSHOT to 2.1.0-SNAPSHOT. I have noticed some odd behaviour in the former which might be one part of the issue.
Now on to your test. There are several things going on here which are worth investigating.
Use of @DirtiesContext: You don't seem to be touching the context, and if you are using it to reset the context between tests so you get a new Session/Transaction, then that's going about it the wrong way. Just use @Transactional instead. The Spring JUnit runner treats this in a special manner (see next point).
Being aware that transactional tests automatically roll back: Jasper is right. Spring integration tests will always roll back by default. If you want to make sure your JUnit test commits, then you will have to @Commit it. A good example of how to set up your test can be seen here.
Knowing how Spring transaction proxies work: On top of all this confusion, you have to make sure you don't simply call from one transactional method to another transactional method in the same class and expect Spring's transactional behaviour to apply. A quick write-up on why can be seen here.
If you address those issues everything should be fine.
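A sketch of a committing transactional test along those lines (the class name is mine; @Commit requires spring-test 4.2+, and TestConfiguration/MyBean are the classes from the question):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.annotation.Commit;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = MyTest.TestConfiguration.class)
@Transactional
@Commit // without this, the test-managed transaction is rolled back at the end
public class CommittingMyTest {

    @Autowired
    private MyBean myBean;

    @Test
    public void createAndCheckInOneTestManagedTransaction() {
        // Both calls share the single test-managed transaction, so the session
        // cache stays consistent, and @Commit persists the result afterwards.
        myBean.create("3");
        myBean.check("3");
    }
}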

How to Execute a Database Call after sending a HTTP Response with Spring

So I basically want to process an HTTP POST with a controller in Spring and send back a result to the user, AND after that I want to make a database call.
So here is my example:
@Controller
public class AngebotController extends WebMvcConfigurerAdapter {

    @Autowired
    private DatabaseUtils dbUtil;

    @Autowired
    private MyMailSender mailSender;

    @RequestMapping(value = REQUEST_PATH, method = RequestMethod.POST)
    public String doPost(@Valid FormInput input, BindingResult bindingResult) {
        // .. some input validations here
        // After the validation is complete I will have access to an object that I just created, like the following.
        // Let's say that this object holds important values for the database query.
        final MyObject validatedInput = new MyObject();
        // Start a new thread to do the remaining work (the database call)
        new Thread() {
            @Override
            public void run() {
                // So I am starting the database query with the information that I just validated above.
                // That database query will return a list of items based on the given MyObject.
                // This query will take a long time and I don't want the user to wait, because this data is not necessary for the user.
                List<Item> items = dbUtil.getStuffByInput(validatedInput);
                for (Item i : items) {
                    // Now I just want to send some information about the item via email; this part works.
                    mailSender.sendMail("mail@mail.mail", i);
                }
            }
        }.start();
        return "viewname";
    }
}

@Service
public class DatabaseUtils {

    @Autowired
    private ItemRepository repository;

    public List<Item> getStuffByInput(MyObject o) {
        List<Item> items = repository.findAllByMyObject(o);
        // Doing some more stuff with those items here ..
        return items;
    }
}

// The implementation will be generated by Spring
public interface ItemRepository extends CrudRepository<Item, Long> {
    // Will select all items by comparing the myObject with each Item.
    // This also works as intended.
    public List<Item> findAllByMyObject(MyObject myObject);
}
So where is my problem?
The only problem I have is that the database query ends up throwing an exception, because the database connection was closed (I guess by Spring).
The exception: Exception in thread "Thread-6" org.hibernate.SessionException: Session is closed!
Any help appreciated.
Thanks!
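One common way to avoid the closed-session problem is to let Spring manage the background work itself, for example with an @Async method on a transactional service instead of a hand-rolled Thread, so the session is bound to a Spring-managed thread. This is only an illustrative sketch (class and method names are assumptions, and @EnableAsync has to be present in the configuration):

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class MailExportService {

    @Autowired
    private DatabaseUtils dbUtil;

    @Autowired
    private MyMailSender mailSender;

    // Runs on Spring's task executor after the controller has returned,
    // with its own transaction (and therefore its own open session).
    @Async
    @Transactional(readOnly = true)
    public void exportAndMail(MyObject validatedInput) {
        List<Item> items = dbUtil.getStuffByInput(validatedInput);
        for (Item i : items) {
            mailSender.sendMail("mail@mail.mail", i);
        }
    }
}

The controller would then simply call exportAndMail(validatedInput) on this injected service and return "viewname" immediately.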

Spring #Transactional propagation effect of REQUIRES_NEW?

I am doing some tests to understand the behaviour of @Transactional in Spring 3, though it is not working as I would expect. If I have one method with Propagation.REQUIRED calling another with Propagation.REQUIRES_NEW, will the second method be able to retrieve from the DB the data inserted by the first method?
EDITED:
I AM seeing uncommitted changes in a @Transactional method; here is my (nasty-looking) code.
@Service
public class FeedManager {

    @Autowired
    JdbcTemplate jdbcTemplate;

    @Transactional(isolation = Isolation.READ_COMMITTED, propagation = Propagation.REQUIRED)
    public boolean createFeed(Feed feed, boolean anonymizeIt) {
        String query = "INSERT INTO feed (name, url, is_active) values (?, ?, ?)";
        int rowsAffected = jdbcTemplate.update(query, feed.getName(), feed.getUrl(), feed.isActive());
        boolean success = (rowsAffected == 1);
        if (anonymizeIt) {
            success = success && this.anonymizeFeedName(feed);
        }
        return success;
    }

    @Transactional(isolation = Isolation.READ_COMMITTED, propagation = Propagation.REQUIRES_NEW)
    public boolean anonymizeFeedName(Feed feed) {
        String query = "UPDATE feed set name = ? where name = ?";
        int rowsAffected = jdbcTemplate.update(query, feed.getName() + (new Date()).toString(), feed.getName());
        boolean success = (rowsAffected == 1);
        return success;
    }
}
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:mrpomario/springcore/jdbc/jdbc-testenv-config.xml")
public class TransactionalTest {

    @Autowired
    FeedManager feedManager;

    Feed feed;

    @Before
    public void setup() {
        feed = new Feed("RSS", "http://www.feedlink.com", true);
    }

    @Test
    public void test_Create() {
        assertTrue(feedManager.createFeed(feed, false));
    }

    @Test
    public void test_Anonymize() {
        assertTrue(feedManager.anonymizeFeedName(feed));
    }

    @Test
    public void test_Create_And_Anonymize() {
        Feed feedo = new Feed("AnotherRSS", "http://www.anotherfeedlink.com", true);
        assertTrue(feedManager.createFeed(feedo, true));
    }
}
It should not be able to see any changes made by the first method (as long as your isolation level is READ_COMMITTED or above).
If you get different results, make sure that @Transactional actually takes effect. In particular, make sure that you don't call another @Transactional method of the same class: due to the limitations of Spring's proxy-based AOP model, the transactional aspect is applied only to calls that come from outside the class.
See also:
7.6.1 Understanding AOP proxies
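One way to make the REQUIRES_NEW demarcation actually apply is to move the inner method into a separate bean, so the call from createFeed() goes through the Spring proxy. A sketch of that split (FeedAnonymizer is an assumed name, not from the original code):

import java.util.Date;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Isolation;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

// Sketch: the REQUIRES_NEW method lives in its own bean, so callers reach it through the proxy.
@Service
public class FeedAnonymizer {

    @Autowired
    JdbcTemplate jdbcTemplate;

    @Transactional(isolation = Isolation.READ_COMMITTED, propagation = Propagation.REQUIRES_NEW)
    public boolean anonymizeFeedName(Feed feed) {
        String query = "UPDATE feed set name = ? where name = ?";
        int rowsAffected = jdbcTemplate.update(query, feed.getName() + (new Date()).toString(), feed.getName());
        return rowsAffected == 1;
    }
}

FeedManager would then inject FeedAnonymizer and call it instead of this.anonymizeFeedName(feed). Note that with READ_COMMITTED the new inner transaction still cannot see the row inserted by the suspended outer transaction until that outer transaction commits, which is exactly the behaviour described above.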
