DataNucleus: Class ... for query has not been resolved. Check the query and any imports specification - spring

I have a problem, hopefully someone can give me some hints.
Environment:
Maven project with two modules:
one module is the 'model', and has DataNucleus 3.1, HSQLDB and Spring 3 dependencies. HSQLDB runs embedded, in memory, configured from Spring's applicationContext.xml
the other module is the 'web' and has GWT dependencies
The application is built using some Spring Roo generated code as basis, later modified and extended.
The issue is that, when starting the app and trying to load the data, I receive the exception:
Class Document for query has not been resolved. Check the query and any imports specification; nested exception is javax.persistence.PersistenceException: Class Document for query has not been resolved. Check the query and any imports specification
The weirdest thing is that the sample Roo-generated application used as a basis, with exactly the same dependencies but a different modularization, works like a charm, without this symptom, so I am puzzled now...
Please also note that I tried to replace the 'Document' with the explicit qualification 'com.myvdm.server.domain.Document' in the query, with no positive result:
return entityManager().createQuery("SELECT COUNT(o) FROM Document o", Long.class).getSingleResult();
Another thing, although it might not be relevant, on every request, this exception is thrown:
DEBUG org.springframework.orm.jpa.EntityManagerFactoryUtils - Unexpected exception on closing JPA EntityManager
java.lang.IllegalStateException: EntityManager is managed by a container (JEE) and so cannot be closed by calling the EM.close() method. Please read JPA2 spec 3.1.1 for the close() method.
The last exception is thrown by DataNucleus. It's also confusing, since I do not run in a Java EE container, but in GWT development mode.
Here's the document entity:
@RooJavaBean
@RooToString
@RooJpaActiveRecord
public class Document {

    @NotNull
    private String name;

    @ManyToOne
    private DocumentType type;

    @OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL)
    private Set<Field> fields;
}
The annotation @RooJpaActiveRecord adds EntityManager operations, but these are declared in a separate file as an ITD (inter-type declaration).
Any suggestions, please?
Thanks a lot in advance.
----------- EDIT --------------
privileged aspect Document_Roo_Jpa_ActiveRecord {

    @PersistenceContext
    transient EntityManager Document.entityManager;

    public static final EntityManager Document.entityManager() {
        EntityManager em = new Document().entityManager;
        if (em == null) throw new IllegalStateException("Entity manager has not been injected (is the Spring Aspects JAR configured as an AJC/AJDT aspects library?)");
        return em;
    }

    public static long Document.countDocuments() {
        return entityManager().createQuery("SELECT COUNT(o) FROM Document o", Long.class).getSingleResult();
    }

    public static List<Document> Document.findAllDocuments() {
        return entityManager().createQuery("SELECT o FROM Document o", Document.class).getResultList();
    }

    public static Document Document.findDocument(Long id) {
        if (id == null) return null;
        return entityManager().find(Document.class, id);
    }

    public static List<Document> Document.findDocumentEntries(int firstResult, int maxResults) {
        return entityManager().createQuery("SELECT o FROM Document o", Document.class).setFirstResult(firstResult).setMaxResults(maxResults).getResultList();
    }

    @Transactional
    public void Document.persist() {
        if (this.entityManager == null) this.entityManager = entityManager();
        this.entityManager.persist(this);
    }

    @Transactional
    public void Document.remove() {
        if (this.entityManager == null) this.entityManager = entityManager();
        if (this.entityManager.contains(this)) {
            this.entityManager.remove(this);
        } else {
            Document attached = Document.findDocument(this.id);
            this.entityManager.remove(attached);
        }
    }

    @Transactional
    public void Document.flush() {
        if (this.entityManager == null) this.entityManager = entityManager();
        this.entityManager.flush();
    }

    @Transactional
    public void Document.clear() {
        if (this.entityManager == null) this.entityManager = entityManager();
        this.entityManager.clear();
    }

    @Transactional
    public Document Document.merge() {
        if (this.entityManager == null) this.entityManager = entityManager();
        Document merged = this.entityManager.merge(this);
        this.entityManager.flush();
        return merged;
    }
}
And the @Entity declaration aspect:
privileged aspect Document_Roo_Jpa_Entity {

    declare @type: Document: @Entity;

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Column(name = "id")
    private Long Document.id;

    @Version
    @Column(name = "version")
    private Integer Document.version;

    public Long Document.getId() {
        return this.id;
    }

    public void Document.setId(Long id) {
        this.id = id;
    }

    public Integer Document.getVersion() {
        return this.version;
    }

    public void Document.setVersion(Integer version) {
        this.version = version;
    }
}

Ok, I found a fix for this problem.
As I posted earlier, using Spring's applicationContext.xml and persistence.xml files as the basic configuration, I could not make it work. I deleted persistence.xml and used this configuration instead (please note the usage of the packagesToScan property and the passing of DataNucleus properties - basically all the information that was traditionally inside persistence.xml):
<bean class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
      id="entityManagerFactory">
    <property name="persistenceUnitName" value="persistenceUnit"/>
    <property name="packagesToScan" value="com.myvdm.server.domain"/>
    <property name="persistenceProviderClass" value="org.datanucleus.api.jpa.PersistenceProviderImpl"/>
    <property name="jpaPropertyMap">
        <map>
            <entry key="datanucleus.ConnectionDriverName" value="org.hsqldb.jdbc.JDBCDriver"/>
            <entry key="datanucleus.storeManagerType" value="rdbms"/>
            <entry key="datanucleus.ConnectionURL" value="jdbc:hsqldb:mem:myvdm"/>
            <entry key="datanucleus.ConnectionUserName" value="sa"/>
            <entry key="datanucleus.ConnectionPassword" value=""/>
            <entry key="datanucleus.autoCreateSchema" value="true"/>
            <entry key="datanucleus.autoCreateTables" value="true"/>
            <entry key="datanucleus.autoCreateColumns" value="false"/>
            <entry key="datanucleus.autoCreateConstraints" value="false"/>
            <entry key="datanucleus.validateTables" value="false"/>
            <entry key="datanucleus.validateConstraints" value="false"/>
            <entry key="datanucleus.jpa.addClassTransformer" value="false"/>
        </map>
    </property>
    <property name="dataSource" ref="dataSource"/>
</bean>
So this is the only way I could make it work. Could this be a Spring bug?
And about that second (minor) issue, obviously the exception will be thrown since I am using Spring's LocalContainerEntityManagerFactoryBean :)
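For reference, here is a minimal Java-config sketch of the same entityManagerFactory setup (an illustration only, not the configuration used above; it assumes the same dataSource bean and shows just a few of the DataNucleus properties):

import java.util.HashMap;
import java.util.Map;
import javax.sql.DataSource;
import org.datanucleus.api.jpa.PersistenceProviderImpl;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;

@Configuration
public class PersistenceConfig {

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setPersistenceUnitName("persistenceUnit");
        // packagesToScan replaces the <class> entries of persistence.xml
        emf.setPackagesToScan("com.myvdm.server.domain");
        emf.setPersistenceProviderClass(PersistenceProviderImpl.class);
        Map<String, Object> props = new HashMap<String, Object>();
        props.put("datanucleus.autoCreateSchema", "true");
        props.put("datanucleus.autoCreateTables", "true");
        props.put("datanucleus.jpa.addClassTransformer", "false");
        // ... the remaining DataNucleus properties from the XML above
        emf.setJpaPropertyMap(props);
        emf.setDataSource(dataSource);
        return emf;
    }
}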

It is not a Spring bug; it is simply a "Debug"-level message. To suppress that message, change the log level:
<logger name="org.springframework.orm.jpa.EntityManagerFactoryUtils" additivity="false" level="error"/>
See my ORM demo here: https://github.com/gordonad/core-spring-demos/tree/master/demos/orms

Related

Spring Data Solr multiple cores and repository

I have Apache Solr with multiple cores, e.g. currency, country, etc. Using Spring Data Solr I can retrieve information from one core. The XML configuration I have right now queries against the 'currency' core. If I wanted to query against the 'country' core, how can I set this up?
<!-- Enable Solr repositories and configure repository base package -->
<solr:repositories base-package="com.acme.repository" solr-template-ref="solrCurrencyTemplate"/>
<solr:solr-server id="solrCurrencyServer" url="http://localhost:8983/solr/currency"/>
<bean id="solrCurrencyTemplate" class="org.springframework.data.solr.core.SolrTemplate">
    <constructor-arg ref="solrCurrencyServer" />
</bean>
and have the repository defined as
@Repository
public interface CurrencyRepository extends SolrCrudRepository<Currency, String> {
}
and from my service I can do this
@Override
public List<Currency> getCurrencies() {
    Page<Currency> currencies = (Page<Currency>) currencyRepository.findAll();
    return currencies.getContent();
}
I have also tried using @SolrDocument(solrCoreName = "currency") but this didn't work.
@SolrDocument(solrCoreName = "currency")
public class Currency {

    public static final String FIELD_CURRENCY_NAME = "currency_name";
    public static final String FIELD_CURRENCY_CODE = "currency_code";
    public static final String FIELD_DECIMALS = "decimals";

    @Id
    @Field(value = FIELD_CURRENCY_CODE)
    private String currencyCode;

    // currency_name, decimals
    @Field(value = FIELD_CURRENCY_NAME)
    private String currencyName;

    @Field(value = FIELD_DECIMALS)
    private String decimals;

    ...
}
I need help on this ASAP... otherwise I will have to go back to the RestTemplate solution :-(
Hope someone can help.
Thanks
GM
Thought I would share: we spent a lot of time recently configuring multiple cores. We did it in Java, not XML.
As part of your Spring @Configuration class, add the following.
@Bean(name = "solrCore1Template")
public SolrTemplate solrCore1Template() throws Exception {
    EmbeddedSolrServer embeddedSolrServer = new EmbeddedSolrServer(getCoreContainer(), "core1");
    return new SolrTemplate(embeddedSolrServer);
}

@Bean(name = "solrCore2Template")
public SolrTemplate solrCore2Template() throws Exception {
    EmbeddedSolrServer embeddedSolrServer = new EmbeddedSolrServer(getCoreContainer(), "core2");
    return new SolrTemplate(embeddedSolrServer);
}

@Bean
@Scope
public CoreContainer getCoreContainer() throws FileNotFoundException {
    String dir = <path_to_solr_home>;
    System.setProperty("solr.solr.home", dir);
    CoreContainer.Initializer initializer = new CoreContainer.Initializer();
    return initializer.initialize();
}
And to use each template, inject it like below in your service classes:
@Resource
private SolrTemplate solrCore1Template;
The embedded server can be replaced with an HTTP one using the code below:
HttpSolrServer httpSolrServer = new HttpSolrServer(getSolrURL());
return new SolrTemplate(httpSolrServer, "core1");
Hope this helps. I know it's a very late reply for the question asked.
Multicore support via namespace config is unfortunately an open issue. You'll need to have a separate SolrTemplate for each core and create the repositories manually.
@Autowired
@Qualifier("solrCurrencyTemplate")
private SolrTemplate solrCurrencyTemplate;

@Autowired
@Qualifier("solrCountryTemplate")
private SolrTemplate solrCountryTemplate;

//...

CurrencyRepository currencyRepo = new SolrRepositoryFactory(this.solrCurrencyTemplate)
    .getRepository(CurrencyRepository.class);
CountryRepository countryRepo = new SolrRepositoryFactory(this.solrCountryTemplate)
    .getRepository(CountryRepository.class);
Spring Data now supports multiple cores with their respective repositories.
The multicoreSupport flag needs to be set to true in the @EnableSolrRepositories annotation, and each document class needs to be told which core it belongs to. Like:
@SolrDocument(solrCoreName = "currency")
public class Currency {
    // attributes
}

the other class should be

@SolrDocument(solrCoreName = "country")
public class Country {
    // attributes
}
The respective repositories should know which POJO they are working with.
public interface CurrencyRepository extends SolrCrudRepository<Currency, String> {
}

and

public interface CountryRepository extends SolrCrudRepository<Country, String> {
}
and the configuration should be
@Configuration
@EnableSolrRepositories(value = "com.package.name", multicoreSupport = true)
public class SolrConfig {

    @Bean
    public SolrServer solrServer() throws Exception {
        HttpSolrServerFactoryBean f = new HttpSolrServerFactoryBean();
        f.setUrl("http://localhost:8983/solr");
        f.afterPropertiesSet();
        return f.getSolrServer();
    }

    @Bean
    public SolrTemplate solrTemplate(SolrServer solrServer) throws Exception {
        return new SolrTemplate(solrServer);
    }
}
With Spring Data Solr 1.1.0.RC1, multiple cores work as described by Christoph Strobl with @EnableSolrRepositories. It also works with an XML configuration by setting multicore-support="true".
<solr:repositories base-package="your.solr.repo.package" repository-impl-postfix="Impl" multicore-support="true"/>
<solr:solr-server id="solrServer" url="${solr.server.base.connection.url}" />
<bean id="solrTemplate" class="org.springframework.data.solr.core.SolrTemplate">
    <constructor-arg index="0" ref="solrServer" />
</bean>

<solr:solr-server id="solrServer" timeout="1000" maxConnections="1000" url="${solr.server.1},${solr.server.2}"/>
<bean id="solrServerFactory" class="org.springframework.data.solr.server.support.MulticoreSolrServerFactory">
    <constructor-arg ref="solrServer" />
    <constructor-arg name="cores">
        <list>
            <value>${solr.index.customer}</value>
            <value>${solr.index.task}</value>
        </list>
    </constructor-arg>
</bean>
<bean id="solrTemplate" class="org.springframework.data.solr.core.SolrTemplate">
    <constructor-arg ref="solrServerFactory" />
</bean>
<solr:repositories base-package="com.deve.pig.solr" multicore-support="true" solr-template-ref="solrTemplate" />
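As a usage sketch (assuming the CurrencyRepository and CountryRepository interfaces from the earlier answers), the repositories for the individual cores can then simply be injected; with multicore support enabled, each repository is bound to the core named in its document's @SolrDocument annotation:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class LookupService {

    @Autowired
    private CurrencyRepository currencyRepository; // backed by the "currency" core

    @Autowired
    private CountryRepository countryRepository;   // backed by the "country" core

    public long totalDocuments() {
        // count() comes from CrudRepository, which SolrCrudRepository extends
        return currencyRepository.count() + countryRepository.count();
    }
}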

HibernateTemplate save performs inserts but not updates

I have a typical Spring / Hibernate setup. Here's my spring config:
<context:annotation-config />
<context:component-scan base-package="com.myco.myapp.modules" />

<tx:annotation-driven transaction-manager="transactionManager"/>

<bean id="sessionFactory"
    ...
</bean>

<bean id="transactionManager"
      class="org.springframework.orm.hibernate3.HibernateTransactionManager">
    <property name="sessionFactory" ref="sessionFactory" />
</bean>
I have a BaseRepository:
@Transactional(propagation = Propagation.MANDATORY)
public final T save(final T entity) throws RepositoryException {
    try {
        getHibernateTemplate().save(entity);
        return entity;
    } catch (DataAccessException e) {
        throw new EntityCouldNotBeSavedException(getPersistentClass(), e);
    }
}
And a Repository class that extends it:
@Repository
public class PersonRepositoryImpl extends BaseRepositoryImpl<Person, String>
And a Service:
@Service
public class PersonServiceImpl {

    @Autowired
    private PersonRepository _personRespository;
I call the following method, saveSomeStuff(), and when I insert using BaseRepository.save() it works perfectly. But when I try to update, it doesn't make the change:
@Override
@Transactional
public void saveSomeStuff() {
    try {
        Person existingPerson = _personRespository.findById("1");
        existingPerson.setName("John");
        _personRespository.save(existingPerson);

        Person dbExistingPerson = _personRespository.findById("1");
        // This prints "John".
        System.out.println(dbExistingPerson.getName());

        Person newPerson = new Person();
        newPerson.setName("Jack");
        _personRespository.save(newPerson);
    } catch (RepositoryException e) {
        e.printStackTrace();
    }
}
I thought I might have a transactionality problem, but as I said, upon leaving the service method the new Person is persisted in the database. In the log I see:
insert into person ...
However, the update I made is not persisted, and there is no error and no 'update' SQL statement in the log. I thought the HibernateTemplate.save() method might be the problem, but from within the saveSomeStuff() method, after loading the Person from the database, I do a System.out, and the Person loaded from the database has the updated name.
What am I missing here?
There is a separate method, saveOrUpdate(entity). You can use it if you don't want Hibernate to generate an id while saving.
The save method persists an entity: it will assign an identifier if one doesn't exist, and if one does, it's essentially doing an update. It returns the generated ID of the entity.
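If you want to keep the repository pattern from the question, a saveOrUpdate variant of the BaseRepository method might look like this (a sketch reusing the question's types; the method name is illustrative):

@Transactional(propagation = Propagation.MANDATORY)
public final T saveOrUpdate(final T entity) throws RepositoryException {
    try {
        // Unlike save(), saveOrUpdate() performs an update when the entity
        // already has an identifier instead of always inserting.
        getHibernateTemplate().saveOrUpdate(entity);
        return entity;
    } catch (DataAccessException e) {
        throw new EntityCouldNotBeSavedException(getPersistentClass(), e);
    }
}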
Figured out the problem. If I had included my Entity class, someone probably would have seen it sooner than me.
@Entity
@Cache(usage = CacheConcurrencyStrategy.READ_ONLY)
@Immutable
@Table(name = "PEOPLE")
public class Person {
    ...
}
Initially I was getting a cache error:
java.lang.UnsupportedOperationException: Can't write to a readonly object
The quick solution? Add the @Immutable annotation. But if you read the docs for it:
An immutable entity may not be updated by the application.
Updates to an immutable entity will be ignored, but no exception is thrown.
Which explains why 1) updates were being ignored and 2) no exceptions were being thrown.
So I got rid of the @Immutable annotation and changed Cache to:
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
And now everything works fine.
In summary: rtfm.
I had stumbled upon the same problem. The entity was getting inserted into the database, but while updating, some of the columns were not getting updated and there were no errors in the log. After going through the entity class, I figured out that I had annotated some of my fields as below:
@Column(name = "CREATED_DT", updatable = false)
private Date createdOn;
After removing the updatable attribute from the annotation, the update was working fine.

Programmatic access to properties created by property-placeholder

I'm reading a properties file using context:property-placeholder. How can I access the properties programmatically (@Value doesn't work because I don't know the property names at development time)?
The main problem is that I can't change the applicationContext.xml file, because it's set up by a "parent" framework.
P.S. It's strange, but Environment.getProperty returns null.
No, you can't. PropertyPlaceholderConfigurer is a BeanFactoryPostProcessor; it is only "alive" during bean creation. When it encounters a ${property} notation, it tries to resolve it against its internal properties, but it does not make these properties available to the container.
That said: similar questions have appeared again and again, and the proposed solution is usually to subclass PropertyPlaceholderConfigurer and make the Properties available to the context manually (a sketch follows below), or to use a PropertiesFactoryBean.
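A minimal sketch of that subclassing approach (the class name is illustrative; it captures the resolved Properties so other beans can look them up later):

import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;

public class ExposingPropertyPlaceholderConfigurer extends PropertyPlaceholderConfigurer {

    private final Map<String, String> resolvedProperties = new HashMap<String, String>();

    @Override
    protected void processProperties(ConfigurableListableBeanFactory beanFactory, Properties props)
            throws BeansException {
        super.processProperties(beanFactory, props);
        // Keep a copy of every property the configurer saw so it can be read later
        for (String name : props.stringPropertyNames()) {
            resolvedProperties.put(name, props.getProperty(name));
        }
    }

    public String getProperty(String name) {
        return resolvedProperties.get(name);
    }
}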
We use the following approach to access properties for our applications
<util:properties id="appProperties" location="classpath:app-config.properties" />
<context:property-placeholder properties-ref="appProperties"/>
Then you have the luxury of just autowiring properties into beans using a qualifier.
@Component
public class PropertyAccessBean {

    private Properties properties;

    @Autowired
    @Qualifier("appProperties")
    public void setProperties(Properties properties) {
        this.properties = properties;
    }

    public void doSomething() {
        String property = properties.getProperty("code.version");
    }
}
If you have more complex properties you can still use ignore-resource-not-found and ignore-unresolvable. We use this approach to externalise some of our application settings.
<util:properties id="appProperties" ignore-resource-not-found="true"
                 location="classpath:build.properties,classpath:application.properties,
                           file:/data/override.properties"/>
<context:property-placeholder ignore-unresolvable="true" properties-ref="appProperties"/>
The @Value annotation works on newer releases of Spring (tested on v3.2.2). Here is how it is done:
Map your properties file in the Spring configuration file:
<!-- Import Info:
     xmlns:context="http://www.springframework.org/schema/context"
     http://www.springframework.org/schema/context
     http://www.springframework.org/schema/context/spring-context-3.2.xsd -->
<context:property-placeholder location="classpath:/app-config.properties" />
Create app-config.properties in the root of your source folder:
my.property=test
my.property2=test2
Create a controller class:
@Controller
public class XRDSBuilder {

    @Value("${my.property}")
    private String myProperty;

    public String getMyProperty() { return myProperty; }
}
Spring will automatically map the content of my.property to your variable inside the controller.
Mapping to a list
Property value:
my.list.property=test,test2,test3
Controller class configuration:
@Value("#{'${my.list.property}'.split(',')}")
private List<String> myListProperty;
Advanced mapping
@Component("PropertySplitter")
public class PropertySplitter {

    /**
     * Example: one.example.property = KEY1:VALUE1,KEY2:VALUE2
     */
    public Map<String, String> map(String property) {
        return this.map(property, ",");
    }

    /**
     * Example: one.example.property = KEY1:VALUE1.1,VALUE1.2;KEY2:VALUE2.1,VALUE2.2
     */
    public Map<String, List<String>> mapOfList(String property) {
        Map<String, String> map = this.map(property, ";");
        Map<String, List<String>> mapOfList = new HashMap<>();
        for (Entry<String, String> entry : map.entrySet()) {
            mapOfList.put(entry.getKey(), this.list(entry.getValue()));
        }
        return mapOfList;
    }

    /**
     * Example: one.example.property = VALUE1,VALUE2,VALUE3,VALUE4
     */
    public List<String> list(String property) {
        return this.list(property, ",");
    }

    /**
     * Example: one.example.property = VALUE1.1,VALUE1.2;VALUE2.1,VALUE2.2
     */
    public List<List<String>> groupedList(String property) {
        List<String> unGroupedList = this.list(property, ";");
        List<List<String>> groupedList = new ArrayList<>();
        for (String group : unGroupedList) {
            groupedList.add(this.list(group));
        }
        return groupedList;
    }

    private List<String> list(String property, String splitter) {
        return Splitter.on(splitter).omitEmptyStrings().trimResults().splitToList(property);
    }

    private Map<String, String> map(String property, String splitter) {
        return Splitter.on(splitter).omitEmptyStrings().trimResults().withKeyValueSeparator(":").split(property);
    }
}
Property value:
my.complex.property=test1:value1,test2:value2
Controller class:
@Value("#{PropertySplitter.map('${my.complex.property}')}")
Map<String, String> myComplexProperty;
Spring follows the Inversion of Control approach, which means that we can simply inject a particular property into a POJO. But there are some cases when you would like to access a property by name directly from your code. Some might see it as an anti-pattern, and that is palpably true, but let's concentrate on how to do it.
The PropertiesAccessor below provides access to properties loaded by the property placeholder and encapsulates the container-specific stuff. It also caches found properties, because a call to AbstractBeanFactory#resolveEmbeddedValue(String) is not cheap.
@Named
public class PropertiesAccessor {

    private final AbstractBeanFactory beanFactory;
    private final Map<String, String> cache = new ConcurrentHashMap<>();

    @Inject
    protected PropertiesAccessor(AbstractBeanFactory beanFactory) {
        this.beanFactory = beanFactory;
    }

    public String getProperty(String key) {
        if (cache.containsKey(key)) {
            return cache.get(key);
        }
        String foundProp = null;
        try {
            foundProp = beanFactory.resolveEmbeddedValue("${" + key.trim() + "}");
            cache.put(key, foundProp);
        } catch (IllegalArgumentException ex) {
            // ok - property was not found
        }
        return foundProp;
    }
}
Found the answer at the site below:
http://forum.spring.io/forum/spring-projects/container/106180-programmatic-access-to-properties-defined-for-the-propertyplaceholderconfigurer
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer" id="propertyConfigurer">
    <property name="properties" ref="props" />
</bean>

<bean id="props" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
    <property name="location" value="file:C:/CONFIG/settings.properties"/>
</bean>
<util:properties id="prop" location="location of prop file" />
This returns a java.util.Properties object. In Java code:
Properties prop = (Properties) context.getBean("prop");
Now you can access:
prop.getProperty("key");
This works if you need to scan multiple locations for your properties ...
<bean id="yourProperties" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
    <property name="locations">
        <array value-type="org.springframework.core.io.Resource">
            <value>classpath:yourProperties.properties</value>
            <value>file:../conf/yourProperties.properties</value>
            <value>file:conf/yourProperties.properties</value>
            <value>file:yourProperties.properties</value>
        </array>
    </property>
    <property name="ignoreResourceNotFound" value="true" />
</bean>
<context:property-placeholder properties-ref="yourProperties" ignore-unresolvable="true"/>
And then in your actual classes ...
@Autowired
Properties yourProperties;
Tested using Spring 5.1.4
Create beans for your properties before putting them in the property-placeholder to make the properties easy to access in code.
Ex:
<bean id="configProperties" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
    <property name="location" value="classpath:META-INF/spring/config.properties" />
</bean>
<context:property-placeholder properties-ref="configProperties" ignore-unresolvable="true"/>
Code:
@Autowired
private Properties configProperties;
You can also use @Resource(name = "configProperties"). (The PropertiesFactoryBean produces a java.util.Properties instance, which is what gets injected.)
Let's assume that you have the properties file defined in that "parent" framework:
<bean id="applicationProperties" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
    <property name="location" value="classpath:main.properties" />
</bean>
You can use the @Value annotation in this way:
@Value(value = "#{applicationProperties['my.app.property']}")
private String myProperty;

Dozer custom converter ID mapping: Object to Long and Long to Object via DozerConverter getParameter

I need help configuring my Dozer mapping file.
Mainly I would like to know how to get the User user object to convert to Long userId.
Hence map: user >> userId
But I have multiple objects, such as comment >> commentId or address >> addressId,
therefore I'd like to have something more elegant than just writing a mapping for each of the fields. All of the objects implement the Loadable interface.
The code below is now functioning thanks to the getParameter() DozerConverter method, but if you know a better way than the converter I wrote, please let me know.
// dozer.xml
<?xml version="1.0" encoding="UTF-8"?>
<mappings xmlns="http://dozer.sourceforge.net"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://dozer.sourceforge.net http://dozer.sourceforge.net/schema/beanmapping.xsd">
    <configuration>
        <custom-converters>
            <converter type="project.shared.domain.dto.dozer.LoadableIdConverter" >
                <class-a>project.shared.domain.Loadable</class-a>
                <class-b>java.lang.Long</class-b>
            </converter>
        </custom-converters>
    </configuration>
    <mapping>
        <class-a>project.shared.domain.Suggestion</class-a>
        <class-b>project.shared.domain.dto.DTOSuggestion</class-b>
        <field custom-converter-param="User">
            <a>user</a>
            <b>userId</b>
        </field>
    </mapping>
</mappings>
// Spring application context
<bean id="loadableIdConverter" class="project.shared.domain.dto.dozer.LoadableIdConverter">
    <property name="userService" ref="userService"/>
    <property name="commentService" ref="commentService"/>
    <property name="addressService" ref="addressService"/>
</bean>
<bean id="gwtMapper" class="org.dozer.DozerBeanMapper">
    <property name="mappingFiles">
        <list>
            <value>classpath:/dozer.xml</value>
        </list>
    </property>
    <property name="customConverters">
        <list>
            <ref bean="loadableIdConverter"/>
        </list>
    </property>
</bean>
// Standard Hibernate object
public class Suggestion implements Serializable, Loadable {
    private long id = -1;
    private Date dateCreated;
    private User user; // trying to use Dozer to convert this bad boy to Long userId
    //...
}

// DTO object
public class DTOSuggestion implements IsSerializable {
    private long id = -1;
    private Date dateCreated;
    private Long userId; // trying to get this ID via the Dozer converter
    //...
}

// Loadable interface
public interface Loadable extends Serializable {
    public long getId();
    public void setId(long id);
}
// Dozer converter
public class LoadableIdConverter extends DozerConverter<Loadable, Long> {

    private UserService userService;       // configured in the application context
    private AddressService addressService; // configured in the application context
    private CommentService commentService; // configured in the application context

    public LoadableIdConverter() {
        super(Loadable.class, Long.class);
    }

    public Long convertTo(Loadable object, Long id) {
        return object.getId();
    }

    public Loadable convertFrom(Long id, Loadable object) {
        if (id < 0) return null;
        String loadable = getParameter();
        if (loadable.equalsIgnoreCase("User"))
            return userService.get(User.class, id);
        if (loadable.equalsIgnoreCase("Address"))
            return addressService.get(Address.class, id);
        if (loadable.equalsIgnoreCase("Comment"))
            return commentService.get(Comment.class, id);
        return null;
    }
}
There is one trick you could use to avoid converter parameters. If you fall back to the older custom converter approach in Dozer, which is implementing the CustomConverter interface, you will get two additional parameters: existingDestinationValue and destinationClass.
convert(Object existingDestinationFieldValue, Object sourceFieldValue, Class<?> destinationClass, Class<?> sourceClass)
By using these values you could introspect your destination field via reflection and know what the expected concrete implementation of the Loadable interface is. This works only if you define the fields with concrete types, of course, but you already have that in your example, so this should not be a problem. A CustomConverter implementation will be more verbose, as you need to determine the direction of the mapping manually, but it gives you full control of what is going on during the mapping process.
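A sketch of that CustomConverter approach, reusing the services from the question (the direction is detected from the source value's type, and destinationClass selects the concrete Loadable; setters for Spring injection are omitted, as in the original converter):

import org.dozer.CustomConverter;
import org.dozer.MappingException;

public class LoadableIdCustomConverter implements CustomConverter {

    private UserService userService;       // wired up in the application context
    private AddressService addressService; // wired up in the application context
    private CommentService commentService; // wired up in the application context

    @Override
    public Object convert(Object existingDestinationFieldValue, Object sourceFieldValue,
                          Class<?> destinationClass, Class<?> sourceClass) {
        if (sourceFieldValue == null) return null;
        if (sourceFieldValue instanceof Loadable) {
            // Object -> Long direction
            return ((Loadable) sourceFieldValue).getId();
        }
        if (sourceFieldValue instanceof Long) {
            // Long -> Object direction: destinationClass is the concrete
            // Loadable implementation declared on the destination field
            long id = (Long) sourceFieldValue;
            if (id < 0) return null;
            if (destinationClass == User.class) return userService.get(User.class, id);
            if (destinationClass == Address.class) return addressService.get(Address.class, id);
            if (destinationClass == Comment.class) return commentService.get(Comment.class, id);
            return null;
        }
        throw new MappingException("Unsupported source type: " + sourceClass);
    }
}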

Spring #Transactional wrapping 2 methods

I'm a Spring newbie. I use the @Transactional annotation on my DAO methods:
@Transactional
public Person getById(long id) {
    return new Person(jdbcTemplate.queryForMap(...));
}

@Transactional
public void save(Person person) {
    jdbcTemplate.update(...);
}
and I've set up the transaction manager like this:
<tx:annotation-driven transaction-manager="txManager" />
<bean id="txManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource" />
</bean>
The problem is that when my client code calls dao.save(..) and then dao.getById(4), these happen in two separate transactions. How is it possible to wrap those two calls in the same database transaction? Ideally without doing it in a programmatic way.
thanks
It is bad practice to put transactional attributes in the DAO layer. Also, I am not sure why you require a transaction for the getById method. Even if you do want to use transactions, you can combine the two calls in a single method with propagation REQUIRES_NEW, while the individual methods use REQUIRED:
@Transactional(propagation = Propagation.REQUIRES_NEW, readOnly = false)
public Person saveAndGetById(Person person, long id) {
    save(person);
    return getById(id);
}

@Transactional(propagation = Propagation.REQUIRED)
public Person getById(long id) {
    return new Person(jdbcTemplate.queryForMap(...));
}

@Transactional(propagation = Propagation.REQUIRED, readOnly = false)
public void save(Person person) {
    jdbcTemplate.update(...);
}
However, the best thing would be to have the "save" method return an ID, because it is hard to know beforehand which ID the Person will have once persisted.
Good practice in this case would be marking the service method which invokes both of these DAO methods as @Transactional; see the sketch below. The case was clearly discussed here.
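For example (a sketch, assuming the DAO from the question is exposed as a bean; the PersonDao type and method name are illustrative):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class PersonService {

    @Autowired
    private PersonDao personDao;

    // One transaction is started here and both DAO calls participate in it,
    // so the DAO methods themselves no longer need @Transactional.
    @Transactional
    public Person saveAndReload(Person person, long id) {
        personDao.save(person);
        return personDao.getById(id);
    }
}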
