I need to write some temporary code in my existing Spring Boot 1.2.5 application that will run some complex SQL queries. By complex, I mean a single query joins about 4 different tables, and I have a number of these. We all decided to reuse the existing SQL to reduce the risk of getting new queries wrong, which in this case is a good way to go.
My application uses JPA / Hibernate and maps some entities to tables. From my research it seems like I would have to do a lot of entity mapping.
I tried writing a class that would just get the Hibernate session object and execute a native query, but when it tried to configure the session factory it threw an exception complaining that it could not find the config file.
Could I perhaps do this from one of my existing entities, or at least find a way to get the Hibernate session that already exists?
UPDATE:
Here is the exception, which makes perfect sense since there is no config file to find; it's all configured in the properties file.
org.hibernate.HibernateException: /hibernate.cfg.xml not found
at org.hibernate.internal.util.ConfigHelper.getResourceAsStream(ConfigHelper.java:173)
For what it's worth, the code:
@NamedNativeQuery(name = "verifyEa", query = "select account_nm from per_person where account_nm = :accountName")
public class VerifyEaResult
{
    private SessionFactory sessionFact = null;
    String accountName;

    private void initSessionFactory()
    {
        Configuration config = new Configuration().configure();
        ServiceRegistry serviceRegistry = new ServiceRegistryBuilder().applySettings(config.getProperties()).getBootstrapServiceRegistry();
        sessionFact = config.buildSessionFactory(serviceRegistry);
    }

    public String getAccountName()
    {
        // Quick simple test query
        String sql = "SELECT * FROM PER_ACCOUNT WHERE ACCOUNT_NM = 'lynnedelete'";
        initSessionFactory();
        Session session = sessionFact.getCurrentSession();
        SQLQuery q = session.createSQLQuery(sql);
        List<Object> result = q.list();
        return accountName;
    }
}
You can use plain JDBC data access via JdbcTemplate, for example:
public class Client {

    private final JdbcTemplate jdbcTemplate;

    // Quick simple test query
    private static final String SQL = "SELECT * FROM PER_ACCOUNT WHERE ACCOUNT_NM = ?";

    @Autowired
    public Client(DataSource dataSource) {
        jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public List<Map<String, Object>> getData(String name) {
        return jdbcTemplate.queryForList(SQL, name);
    }
}
For a query with no parameters, the short way is:
jdbcTemplate.queryForList("SELECT 1");
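If you would rather stay on the JPA/Hibernate side and reuse the session that Spring Boot already manages (which is what the question asks about), you can inject the existing EntityManager instead of building a new SessionFactory. A minimal sketch, assuming a Spring-managed bean; the class name and query are placeholders:

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.springframework.stereotype.Repository;

@Repository
public class VerifyEaDao {

    // Reuses the EntityManager that Spring Boot already configured,
    // so no hibernate.cfg.xml or manual SessionFactory is needed.
    @PersistenceContext
    private EntityManager entityManager;

    @SuppressWarnings("unchecked")
    public List<Object[]> findAccount(String accountName) {
        return entityManager
                .createNativeQuery("SELECT * FROM PER_ACCOUNT WHERE ACCOUNT_NM = :accountName")
                .setParameter("accountName", accountName)
                .getResultList();
    }
}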
Related
I have a native query to fetch a sequence of the form:
@Repository
public class GetSequenceRepository {

    @PersistenceContext
    private EntityManager entityManager;

    public String getSequenceUsingNativeQuery() {
        // POSTGRES syntax
        return entityManager.createNativeQuery("SELECT nextval('my_custom_seq')")
                .getSingleResult().toString();
    }
}
Since the syntax is different for Postgres, MySQL, and Oracle, I want to create a query that gets the sequence value in a database-agnostic manner.
I want something like @SequenceGenerator, but at the repository layer. Is there an implementation of @SequenceGenerator at the repository layer?
Note:
I already have a sequence present in the database.
You can use the following:
entityManager.getEntityManagerFactory()
.unwrap(SessionFactoryImplementor.class)
.getIdentifierGenerator("your.hibernate.entity.name")
.generate(entityManager.unwrap(SharedSessionContractImplementor.class), null)
If the sequence is not mapped for an entity, you need to construct an IdentifierGenerator yourself:
IdentifierGeneratorFactory identifierGeneratorFactory = new DefaultIdentifierGeneratorFactory();
identifierGeneratorFactory.setDialect(entityManager.getEntityManagerFactory().unwrap(SessionFactoryImplementor.class).getDialect());
Properties properties = new Properties();
properties.put("sequence_name", "YOUR_NAME");
IdentifierGenerator identifierGenerator = identifierGeneratorFactory.createIdentifierGenerator("sequence", StringType.INSTANCE, properties);
identifierGenerator.generate(entityManager.unwrap(SharedSessionContractImplementor.class), null);
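For reference, here is a rough sketch of how that second snippet might be wrapped in a Spring bean. The repository name and return handling are assumptions, and the exact factory API differs between Hibernate versions, so treat it as a starting point rather than a drop-in implementation:

import java.io.Serializable;
import java.util.Properties;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.hibernate.engine.spi.SessionFactoryImplementor;
import org.hibernate.engine.spi.SharedSessionContractImplementor;
import org.hibernate.id.IdentifierGenerator;
import org.hibernate.id.factory.internal.DefaultIdentifierGeneratorFactory;
import org.hibernate.type.StringType;
import org.springframework.stereotype.Repository;

@Repository
public class SequenceValueRepository {

    @PersistenceContext
    private EntityManager entityManager;

    public Serializable next(String sequenceName) {
        SessionFactoryImplementor sessionFactory =
                entityManager.getEntityManagerFactory().unwrap(SessionFactoryImplementor.class);

        // Build a generator for the dialect Hibernate is already using
        DefaultIdentifierGeneratorFactory generatorFactory = new DefaultIdentifierGeneratorFactory();
        generatorFactory.setDialect(sessionFactory.getDialect());

        Properties properties = new Properties();
        properties.put("sequence_name", sequenceName);

        IdentifierGenerator generator =
                generatorFactory.createIdentifierGenerator("sequence", StringType.INSTANCE, properties);

        return generator.generate(entityManager.unwrap(SharedSessionContractImplementor.class), null);
    }
}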
I have a paginated endpoint which internally uses Hibernate Criteria to fetch certain objects and relations. The FetchMode is set as FetchMode.JOIN.
When I try to hit the endpoint, the request works fine for a few pages but then errors out with:
could not initialize proxy - no Session
The method is as below:
@Override
public Page<Person> findAllNotDeleted(final Pageable pageable)
{
    final var criteria = createCriteria();
    criteria.add(Restrictions.or(Restrictions.isNull(DELETED), Restrictions.eq(DELETED, false)));
    criteria.setFetchMode(PERSON_RELATION, FetchMode.JOIN);
    criteria.setFetchMode(DEPARTMENT_RELATION, FetchMode.JOIN);
    criteria.setFirstResult((int) pageable.getOffset());
    criteria.setMaxResults(pageable.getPageSize());
    criteria.addOrder(asc("id"));

    final var totalResult = getTotalResult();
    return new PageImpl<>(criteria.list(), pageable, totalResult);
}

private int getTotalResult()
{
    final Criteria countCriteria = createCriteria();
    countCriteria.add(Restrictions.or(Restrictions.isNull(DELETED), Restrictions.eq(DELETED, false)));
    return ((Number) countCriteria.setProjection(Projections.rowCount()).uniqueResult()).intValue();
}
Also, the call to findAllNotDeleted is made from a method annotated with @Transactional.
Not sure what is going wrong.
Any help would be highly appreciated.
EDIT
I read that FetchMode.JOIN does not work with Restrictions, so I tried implementing it using CriteriaBuilder, but I am stuck with the same issue again.
@Override
public Page<Driver> findAllNotDeleted(final Pageable pageable)
{
    final var session = getCurrentSession();
    final var builder = session.getCriteriaBuilder();
    final var query = builder.createQuery(Person.class);
    final var root = query.from(Driver.class);

    root.join(PERSON_RELATION, JoinType.INNER)
        .join(DEPARTMENT_RELATION, JoinType.INNER);
    // flow does not reach here.....

    var restrictions_1 = builder.isNull(root.get(DELETED));
    var restrictions_2 = builder.equal(root.get(DELETED), false);
    query.select(root).where(builder.or(restrictions_1, restrictions_2));

    final var result = session.createQuery(query).getResultList();
    return new PageImpl<>(result, pageable, result.size());
}
The flow does not seem to get past the root.join call.
EDIT-2
The relations are as follows:
String PERSON_RELATIONSHIP = "person.address"
String DEPARTMENT_RELATION = "person.department"
and person, address, and department are themselves classes that extend Entity.
I guess the associations you are trying to fetch, i.e. PERSON_RELATION or DEPARTMENT_RELATION, are collections? In that case it is not possible to paginate directly at the entity level with Hibernate. You would have to fetch the ids first and then run a second query to fetch just the entities with the matching ids, as sketched below.
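A minimal sketch of that two-query approach, assuming a Driver entity with an id attribute and person/department associations named roughly as in the question; adjust the JPQL paths to your actual mappings:

// Two-query pagination: page over ids first, then fetch the entities with join fetch.
// Entity and attribute names here (Driver, person, department, deleted) are assumptions.
public Page<Driver> findAllNotDeleted(final Pageable pageable) {
    final var session = getCurrentSession();

    // 1) Page over the ids only; no collection joins, so LIMIT/OFFSET behaves correctly
    final List<Long> ids = session.createQuery(
            "select d.id from Driver d where d.deleted is null or d.deleted = false order by d.id", Long.class)
        .setFirstResult((int) pageable.getOffset())
        .setMaxResults(pageable.getPageSize())
        .getResultList();

    // 2) Fetch the entities for those ids with the associations join-fetched
    final List<Driver> drivers = ids.isEmpty() ? List.of() : session.createQuery(
            "select distinct d from Driver d join fetch d.person p join fetch p.department "
            + "where d.id in (:ids) order by d.id", Driver.class)
        .setParameter("ids", ids)
        .getResultList();

    return new PageImpl<>(drivers, pageable, getTotalResult());
}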
You could use Blaze-Persistence on top of Hibernate though which has a special pagination API that does these tricks for you behind the scenes. Here is the documentation about the pagination: https://persistence.blazebit.com/documentation/core/manual/en_US/index.html#pagination
There is also a Spring Data integration, so you could also use the Spring Data pagination convention along with Blaze-Persistence Entity-Views which are like Spring Data Projections on steroids. You'd use Page<DriverView> findByDeletedFalseOrDeletedNull(Pageable p) with
@EntityView(Driver.class)
interface DriverView {
    Long getId();
    String getName();
    PersonView getPersonRelation();
    DepartmentView getDepartmentRelation();
}

@EntityView(Person.class)
interface PersonView {
    Long getId();
    String getName();
}

@EntityView(Department.class)
interface DepartmentView {
    Long getId();
    String getName();
}
Using entity views will only fetch what you declare, nothing else. You could also use entity graphs though:
@EntityGraph(attributePaths = {"personRelation", "departmentRelation"})
Page<Driver> findByDeletedFalseOrDeletedNull(Pageable p);
I'm performing an update via a method using Hibernate and the EntityManager.
This update method is called multiple times (within a loop).
It seems like when I execute it the first time, it locks the table and does not free it.
When trying to update the table via SQL Developer after having closed the application, I see the table is still locked because the update is hanging.
What do you see as a solution to this problem? If you need more information, let me know.
Class
@Repository
@Transactional(propagation = REQUIRES_NEW)
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(String id) {
        String query = "UPDATE_QUERY";
        Query nativeQuery = entityManager.createNativeQuery(String.format(query, id));
        nativeQuery.executeUpdate();
    }
}
UPDATE
After waiting more than an hour, I launched the application again and it worked fine once, but now it hangs again.
UPDATE 2 -- I'll give a maximum bounty to whoever helps me solve this
In another place I use an application-managed entity manager and it still gives me the same type of error.
public void fillYirInfo() {
    File inputFile = new File("path");
    try (InputStream inputStream = new FileInputStream(inputFile);
         BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream))) {
        bufferedReader.lines().skip(1).limit(20).forEach(line -> {
            String[] data = line.split(",");
            String rnr = data[0];
            String linked = data[1];
            String email = data.length > 2 ? data[2] : "";
            String insuredId = insuredPeopleRepository.getInsuredIdFromNationalId(rnr);
            int modifiedCounter = 0;
            if (!isNullOrEmpty(insuredId)) {
                EntityManager entityManager = emf.createEntityManager();
                EntityTransaction transaction = entityManager.getTransaction();
                Query nativeQuery = entityManager.createNativeQuery(
                        "QUERY"
                );
                transaction.begin();
                nativeQuery.executeUpdate();
                entityManager.flush();
                transaction.commit();
                entityManager.close();
            }
            System.out.println(modifiedCounter + " rows modified");
        });
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Try without an update-query:
@Repository
@Transactional(propagation = REQUIRES_NEW)
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(String id) {
        // guessing your class name and method...
        final YirInfo yirInfo = entityManager.find(YirInfo.class, id);
        yirInfo.setSent();
    }
}
It might not be as fast as a single update query, but it can be made reasonably fast unless the amount of data is huge. This is the preferred way of using Hibernate/JPA: instead of thinking in terms of single values and SQL queries, you work with entities/objects and (sometimes) HQL/JPQL queries. For a batch of rows, the same idea scales as sketched below.
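A minimal sketch of the same entity-based approach for a list of ids, flushing and clearing periodically to keep the persistence context small; the YirInfo entity, its setSent() method, and the batch size of 50 are assumptions:

@Transactional(propagation = REQUIRES_NEW)
public void setSent(List<String> ids) {
    int processed = 0;
    for (String id : ids) {
        // Load the managed entity and change it; Hibernate writes the UPDATE at flush time
        final YirInfo yirInfo = entityManager.find(YirInfo.class, id);
        yirInfo.setSent();
        if (++processed % 50 == 0) {
            entityManager.flush();   // push pending updates
            entityManager.clear();   // detach processed entities to limit memory use
        }
    }
}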
You are using the @Transactional annotation, which means you are using Spring-managed transactions. Then, in your UPDATE 2, you are managing the transaction yourself instead of letting Spring do it (I guess it is another project or class not managed by Spring).
In any case, what I would do is update your records in a single Spring transaction, and I would not put @Transactional on the DAO layer but on the service layer. Something like this:
Service layer:
@Service
public class YirInfoService {

    @Autowired
    YirInfoRepository dao;

    @Transactional(propagation = REQUIRES_NEW)
    public void setSent(List<String> ids) {
        dao.setSents(ids);
    }
}
DAO layer:
@Repository
public class YirInfoRepository {

    @Autowired
    EntityManager entityManager;

    // Here you can update by using an IN statement or by doing a cycle
    // Let's suppose a bulk operation
    public void setSents(List<String> ids) {
        String query = "UPDATE_QUERY";
        for (int i = 0; i < ids.size(); i++) {
            String id = ids.get(i);
            Query nativeQuery = entityManager.createNativeQuery(String.format(query, id));
            nativeQuery.executeUpdate();
            if (i % 20 == 0) {
                entityManager.flush();
                entityManager.clear();
            }
        }
    }
}
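If you go with the IN-statement variant mentioned in the comment above, a single bulk statement can replace the loop. A rough sketch, assuming Hibernate 5.2+ (for org.hibernate.query.NativeQuery) and placeholder table/column names:

// Hypothetical bulk update; the SQL, table, and column names are assumptions.
public void setSents(List<String> ids) {
    entityManager.createNativeQuery("UPDATE yir_info SET sent = 1 WHERE id IN (:ids)")
            .unwrap(org.hibernate.query.NativeQuery.class)
            .setParameterList("ids", ids)
            .executeUpdate();
}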
The first thing you have to understand is that in the first example you are using a native query to update rows in the DB. In this case you are completely bypassing Hibernate; it does nothing for you here.
In your second example you have the same thing: you are updating via an update query. You don't need to flush the entity manager, since flushing only pushes pending changes made to the entity objects managed by that entity manager.
Also, I don't see how your example works, as you are autowiring the entity manager instead of using the @PersistenceContext annotation. Make sure you use that properly, because you might have misconfigured the application. There is also no need to manually create the entity manager when using Spring, as the second example does. Just use @PersistenceContext to get an entity manager in your app.
You are also mixing up transaction management. In the first example, it is enough to put the @Transactional annotation on either the method or the class.
For the other example, you are doing manual transaction management which makes no sense in this case. If you are using Spring, you can simply rely on declarative transaction management.
The first thing I'd check here is to integrate datasource-proxy into your connection management and log how your statements are executed. With this information you can determine whether the query is actually sent to the DB and the DB is executing it very slowly, or whether you have a network issue between your app and the DB. A rough configuration sketch follows below.
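A rough sketch of wiring in datasource-proxy, assuming the net.ttddyy datasource-proxy library is on the classpath; the bean name and the hypothetical buildActualDataSource() helper stand in for however you create your DataSource today:

// Wrap the real DataSource in a logging proxy so executed statements (with parameters) are logged.
@Bean
public DataSource dataSource() {
    DataSource actual = buildActualDataSource();   // hypothetical helper: your existing DataSource setup
    return ProxyDataSourceBuilder
            .create(actual)
            .name("logging-ds")
            .logQueryBySlf4j(SLF4JLogLevel.INFO)
            .build();
}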
If you find out that the query is sent properly to the DB, you will want to analyze the query itself, because most probably it is just executed very slowly and needs some optimization. For this you can use the database's EXPLAIN plan feature to see what the execution plan looks like and then make the query faster.
I know this is going to be a repetitive question, but I feel my question is a bit different.
I have JdbcDAO classes like
@Component
public class JdbcUserDAO implements UserDAO {

    @Autowired
    MyJdbc myJdbc;
}
I have defined the MyJdbc class as follows :
@Component
public class MyJdbc {

    @Autowired
    protected JdbcTemplate jdbc;
}
In the MyJdbc class I define inserts and batch updates and call them through the jdbc field.
Will this create too-many-connections exceptions?
I have defined the jdbc parameters in application.properties file :
spring.datasource.url=#databaseurl
spring.datasource.username=#username
spring.datasource.password=#password
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.test-on-borrow=true
spring.datasource.max-active=100
spring.datasource.max-wait=10000
spring.datasource.min-idle=10
spring.datasource.validation-query=SELECT 1
spring.datasource.time-between-eviction-runs-millis= 5000
spring.datasource.min-evictable-idle-time-millis=30000
spring.datasource.test-while-idle=true
spring.datasource.test-on-borrow=true
spring.datasource.test-on-return=false
I am getting the exception :
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Too many connections
I have made many changes to the various constants in the application.properties file, but it didn't work. My DB is hosted on AWS RDS.
But for updating the blob image values I do :
blob= myJdbc.jdbc.getDataSource().getConnection().createBlob();
blob.setBytes(1, str.getBytes());
pstmt = myJdbc.jdbc.getDataSource().getConnection().prepareStatement("update user_profile set profileImage=? where user_profile.id in ( select id from user_login where email=?)");
The problem is with your code. That code opens 2 additional connections to the database without closing them. If you open connections yourself, you should also close them. However, it is better to use a ConnectionCallback in cases like this.
myJdbc.jdbc.execute(new ConnectionCallback<Object>() {
    public Object doInConnection(Connection con) throws SQLException, DataAccessException {
        // The connection is managed by JdbcTemplate, so it is released properly afterwards
        Blob blob = con.createBlob();
        blob.setBytes(1, str.getBytes());
        PreparedStatement pstmt = con.prepareStatement(
                "update user_profile set profileImage=? where user_profile.id in ( select id from user_login where email=?)");
        pstmt.setBlob(1, blob);
        pstmt.setString(2, email);
        pstmt.executeUpdate();
        return null;
    }
});
However, it is even easier to use Spring JDBC's Blob support (see the reference guide). That way you don't need to mess around with connections and blobs yourself.
final String query = "update user_profile set profileImage=? where user_profile.id in ( select id from user_login where email=?)";
myJdbc.jdbc.execute(query, new AbstractLobCreatingPreparedStatementCallback(lobHandler) { 1
protected void setValues(PreparedStatement ps, LobCreator lobCreator) throws SQLException {
byte[] bytes = str.getBytes();
ps.setString(2, email);
lobCreator.setBlobAsBinaryStream(ps, 1, str.getBytes());
}
});
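The lobHandler referenced above is not shown in the snippet; a plain DefaultLobHandler (from org.springframework.jdbc.support.lob) is usually sufficient, for example declared as a bean:

// DefaultLobHandler works with most JDBC drivers, including MySQL.
@Bean
public LobHandler lobHandler() {
    return new DefaultLobHandler();
}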
What I am trying to perform is the following:
I have some complex SQL (with SUM(distance) distanceSum as the identifier for the returned column) that returns some values that should be parsed into a class (containing just the values needed for these columns).
However, I only need the result in memory and not as entity.
I already tried creating a repository to execute the SQL with a @Query annotation with nativeQuery = true. However, the repository can't be autowired, probably because repositories are only meant for entities.
So is there some way to tweak a repository for non-entities, or is there an approach other than repositories that would let me execute SQL and parse the result automatically into an object?
Basically, as @dunni said, you can use the JdbcTemplate with your own mapper to convert the SQL result into a Java POJO:
public CustomResult getCustomResult() {
    final String complexSql = "SELECT SUM(distance) as distanceSum....";
    final CustomResult customResult = jdbcTemplate.queryForObject(complexSql, new CustomResultRowMapper());
    return customResult;
}

public class CustomResultRowMapper implements RowMapper<CustomResult> {
    public CustomResult mapRow(ResultSet rs, int rowNum) throws SQLException {
        CustomResult customResult = new CustomResult();
        customResult.setDistanceSum(rs.getInt("distanceSum"));
        ...
        return customResult;
    }
}
Also, in Spring Boot you don't need to configure anything; just inject the JdbcTemplate into your DAO class:
@Autowired
private JdbcTemplate jdbcTemplate;
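As an alternative to a hand-written RowMapper, Spring's BeanPropertyRowMapper can map result columns to same-named bean properties automatically. A small sketch, assuming CustomResult has a distanceSum property with a setter:

// BeanPropertyRowMapper matches column labels (e.g. distanceSum) to bean properties by name.
public CustomResult getCustomResult() {
    final String complexSql = "SELECT SUM(distance) as distanceSum....";
    return jdbcTemplate.queryForObject(complexSql, new BeanPropertyRowMapper<>(CustomResult.class));
}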