I have a nested map:
Map<String, Map<String, Long>> domains;
When I access it in the following manner, I get an InvalidReferenceException:
<#list somelist as item>
<#list domains[item]?keys as key>
${key}
</#list>
</#list>
Expression domains[item] is undefined on line...
I tried using BeansWrapper as well as DefaultObjectWrapper. I have also checked the contents of the map, and none of the values are null.
Works for me with both DefaultObjectWrapper and BeansWrapper, so I guess somelist contains some bad keys.
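If somelist can contain a key that's missing from domains, you can guard the lookup in the template; a minimal sketch using FreeMarker's missing-value test operator (??):
<#list somelist as item>
  <#if domains[item]??>
    <#list domains[item]?keys as key>
      ${key}
    </#list>
  </#if>
</#list>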
Here's the test code I used; put the template into test.ftl in the same directory as the class below:
package adhoc;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import freemarker.ext.beans.BeansWrapper;
import freemarker.template.Configuration;
import freemarker.template.TemplateException;
public class TestTemplateRunner {
public static void main(String[] args) throws FileNotFoundException, IOException, TemplateException {
Configuration cfg = createConfiguration(false); // set to true to test with BeansWrapper
cfg.getTemplate("test.ftl").process(
createDataModel(),
new OutputStreamWriter(System.out));
}
private static Configuration createConfiguration(boolean useBeansWrapper) {
Configuration cfg = new Configuration();
cfg.setClassForTemplateLoading(TestTemplateRunner.class, "");
if (useBeansWrapper) {
BeansWrapper bw = new BeansWrapper();
bw.setSimpleMapWrapper(true);
cfg.setObjectWrapper(bw);
}
return cfg;
}
private static Map<String, Object> createDataModel() {
Map<String, Object> dataModel = new HashMap<String, Object>();
Map<String, Map<String, Long>> domains = new HashMap<String, Map<String, Long>>();
dataModel.put("domains", domains);
Map<String, Long> domain1Hits = new HashMap<String, Long>();
domain1Hits.put("/index", 1234L);
domain1Hits.put("/about", 123L);
domains.put("site1.com", domain1Hits);
Map<String, Long> domain2Hits = new HashMap<String, Long>();
domain2Hits.put("/", 2234L);
domain2Hits.put("/contact", 223L);
domains.put("site2.com", domain2Hits);
List<String> somelist = new ArrayList<String>();
somelist.add("site1.com");
somelist.add("site2.com");
dataModel.put("somelist", somelist);
return dataModel;
}
}
Spring Boot 2.x, kie-ci 7.59.0.Final
package com.collyer.rest.config;
import org.kie.api.KieServices;
import org.kie.api.builder.*;
import org.kie.api.runtime.KieContainer;
import org.kie.internal.io.ResourceFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.core.io.support.ResourcePatternResolver;
import java.io.IOException;
@Configuration
public class DroolsConfig {
private static final String RULES_PATH = "rule/";
@Bean
public KieFileSystem kieFileSystem() throws IOException {
KieFileSystem kieFileSystem = getKieServices().newKieFileSystem();
for (Resource file : getRuleFiles()) {
kieFileSystem.write(ResourceFactory.newClassPathResource(RULES_PATH + file.getFilename(), "UTF-8"));
}
return kieFileSystem;
}
private Resource[] getRuleFiles() throws IOException {
ResourcePatternResolver resourcePatternResolver = new PathMatchingResourcePatternResolver();
return resourcePatternResolver.getResources("classpath*:" + RULES_PATH + "**/*.*");
}
@Bean
public KieContainer kieContainer() throws IOException {
final KieRepository kieRepository = getKieServices().getRepository();
kieRepository.addKieModule(new KieModule() {
public ReleaseId getReleaseId() {
return kieRepository.getDefaultReleaseId();
}
});
KieBuilder kieBuilder = getKieServices().newKieBuilder(kieFileSystem());
kieBuilder.buildAll();
return getKieServices().newKieContainer(kieRepository.getDefaultReleaseId());
}
private KieServices getKieServices() {
return KieServices.Factory.get();
}
}
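One hedged addition to the kieContainer() bean above: surfacing DRL compilation errors at build time makes rule problems visible at startup instead of at first fire. A sketch using the KieBuilder results API (Results and Message are from org.kie.api.builder):
KieBuilder kieBuilder = getKieServices().newKieBuilder(kieFileSystem());
kieBuilder.buildAll();
// Fail fast if any .drl file did not compile cleanly
Results results = kieBuilder.getResults();
if (results.hasMessages(Message.Level.ERROR)) {
    throw new IllegalStateException("Rule compilation errors: " + results.getMessages());
}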
I have one d.drl file under src/main/resources/rule; it looks like:
package com.my.rest.service;
//set import and global
rule "groupA_A_0"
agenda-group "groupA"
activation-group "groupA_A"
when
$j : JSONObject(aSize == 1)
then
System.out.println("####A-0");
....
end
The code to fire the rules is:
@Autowired
private KieContainer kieContainer;
public void testA() {
KieSession kSession = kieContainer.newKieSession();
//...set global
JSONObject json = new JSONObject();
json.put("aSize", 1);
kSession.getAgenda().getAgendaGroup("groupA").setFocus();
kSession.insert(json);
kSession.fireAllRules();
}
It works fine.
However, if I add another file, a.drl, whose content is:
package com.my.rest.service;
//set import and global
rule "groupB_A_0"
agenda-group "groupB"
activation-group "groupB_A"
when
$j : JSONObject(bSize == 1)
then
System.out.println("####A-0");
....
end
The corresponding code to fire it is:
public void testB() {
KieSession kSession = kieContainer.newKieSession();
//...set global
JSONObject json = new JSONObject();
json.put("bSize", 2);
kSession.getAgenda().getAgendaGroup("groupB").setFocus();
kSession.insert(json);
kSession.fireAllRules();
}
This error is thrown when executing testB(), which fires the rules of a.drl:
org.drools.mvel.ConstraintEvaluationException: Error evaluating constraint 'aSize == 1' in [Rule "groupA_A_0" in rule/d.drl]
JSONObject is net.sf.json.JSONObject (like a Map); aSize is put into the JSONObject when firing the rules of d.drl, while bSize is put when firing the rules of a.drl.
I just want to fire the rules of a.drl, so why does it report an error from d.drl? (And at this point, firing the rules of d.drl still succeeds.)
How can I fix it?
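A hedged note on the mechanism: Drools evaluates the constraints of every rule in the KieBase when a fact is inserted; agenda groups gate firing, not pattern matching, so inserting a JSONObject without an aSize key still evaluates 'aSize == 1'. A sketch of a defensive constraint, assuming net.sf.json.JSONObject's Map interface and Drools' inline map access (not verified against this exact setup):
rule "groupA_A_0"
agenda-group "groupA"
activation-group "groupA_A"
when
    // containsKey guards the lookup, so the comparison is skipped for
    // facts that do not carry the key at all
    $j : JSONObject(containsKey("aSize"), this["aSize"] == 1)
then
    System.out.println("####A-0");
end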
I don't know if it's possible, but this is my question:
I have a batch job developed using spring-boot and spring-batch, and I have to call another microservice using Feign...
...help!
This is my Reader class:
package it.batch.step;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.NonTransientResourceException;
import org.springframework.batch.item.ParseException;
import org.springframework.batch.item.UnexpectedInputException;
import org.springframework.beans.factory.annotation.Autowired;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import it.client.feign.EmailAccountClient;
import it.dto.feign.mail.account.AccountOutDto;
import it.dto.feign.mail.account.SearchAccountFilterDto;
import it.dto.feign.mail.account.SearchAccountResponseDto;
public class Reader implements ItemReader<String> {
private static final Logger LOGGER = LoggerFactory.getLogger(Reader.class);
private int count = 0;
@Autowired
private EmailAccountClient emailAccountClient;
@Override
public String read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
LOGGER.info("read - begin ");
SearchAccountResponseDto clientResponse = emailAccountClient.searchAccount(getFilter());
if (count < clientResponse.getAccounts().size()) {
return convertToJsonString(clientResponse.getAccounts().get(count++));
} else {
count = 0;
}
return null;
}
private static SearchAccountFilterDto getFilter() {
SearchAccountFilterDto filter = new SearchAccountFilterDto();
return filter;
}
private String convertToJsonString(AccountOutDto account) {
ObjectMapper mapper = new ObjectMapper();
String jsonString = "";
try {
jsonString = mapper.writeValueAsString(account);
} catch (JsonProcessingException e) {
e.printStackTrace();
}
LOGGER.info("Contenuto JSON: " + jsonString);
return jsonString;
}
}
...
When I launch the batch, I get this error:
java.lang.NullPointerException: null
at it.batch.step.Reader.read(Reader.java:32) ~[classes/:?]
where line 32 is:
SearchAccountResponseDto clientResponse = emailAccountClient.searchAccount(getFilter());
EmailAccountClient is null.
Your client is null because your reader is not a @Component, so Spring can't autowire the client. You must use a workaround, such as passing the autowired client through the constructor when you instantiate the reader, like this:
private EmailAccountClient client;

public Reader(EmailAccountClient client) {
    this.client = client;
}
In the other class:
@Autowired
private EmailAccountClient client;

@Bean
public ItemReader<String> reader() {
    return new Reader(client);
}
I am facing an issue while testing Kafka with Camel. I used embedded Kafka with Camel, and here's what I tried.
I followed this example, which shows how to test Kafka using embedded Kafka:
https://codenotfound.com/spring-kafka-embedded-unit-test-example.html
package com.codenotfound.kafka.producer;
import static org.assertj.core.api.Assertions.assertThat;
import static org.junit.Assert.assertThat;
import static org.springframework.kafka.test.assertj.KafkaConditions.key;
import static org.springframework.kafka.test.hamcrest.KafkaMatchers.hasValue;
import java.util.Map;
import java.util.Map.Entry;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.junit.After;
import org.junit.Before;
import org.junit.ClassRule;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;
import org.springframework.kafka.test.rule.EmbeddedKafkaRule;
import org.springframework.kafka.test.utils.ContainerTestUtils;
import org.springframework.kafka.test.utils.KafkaTestUtils;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit4.SpringRunner;
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
public class SpringKafkaSenderTest {
private static final Logger LOGGER = LoggerFactory.getLogger(SpringKafkaSenderTest.class);
private static String SENDER_TOPIC = "sender.t";
#Autowired
private Sender sender;
private KafkaMessageListenerContainer<String, String> container;
private BlockingQueue<ConsumerRecord<String, String>> records;
Object groupId;
Object bootstrapServers;
@ClassRule
public static EmbeddedKafkaRule embeddedKafka = new EmbeddedKafkaRule(1, true, SENDER_TOPIC);
@Before
public void setUp() throws Exception {
// set up the Kafka consumer properties
Map<String, Object> consumerProperties = KafkaTestUtils.consumerProps("sender", "false",
embeddedKafka.getEmbeddedKafka());
for (Entry<String, Object> entry : consumerProperties.entrySet()) {
System.out.println("Key = " + entry.getKey() + ", Value = " + entry.getValue());
if (entry.getKey().equals("group.id")) {
groupId = entry.getValue();
} else if (entry.getKey().equals("bootstrap.servers")) {
bootstrapServers = entry.getValue();
}
}
// create a Kafka consumer factory
DefaultKafkaConsumerFactory<String, String> consumerFactory = new DefaultKafkaConsumerFactory<String, String>(
consumerProperties);
// set the topic that needs to be consumed
ContainerProperties containerProperties = new ContainerProperties(SENDER_TOPIC);
// create a Kafka MessageListenerContainer
container = new KafkaMessageListenerContainer<>(consumerFactory, containerProperties);
// create a thread safe queue to store the received message
records = new LinkedBlockingQueue<>();
// setup a Kafka message listener
container.setupMessageListener(new MessageListener<String, String>() {
@Override
public void onMessage(ConsumerRecord<String, String> record) {
LOGGER.debug("test-listener received message='{}'", record.toString());
records.add(record);
}
});
// start the container and underlying message listener
container.start();
// wait until the container has the required number of assigned partitions
ContainerTestUtils.waitForAssignment(container, embeddedKafka.getEmbeddedKafka().getPartitionsPerTopic());
}
@After
public void tearDown() {
// stop the container
container.stop();
}
@Test
public void testCamelWithKafka() throws Exception {
String topicName = "topic=javainuse-topic";
String kafkaServer = "kafka:localhost:9092";
String zooKeeperHost = "zookeeperHost=localhost&zookeeperPort=2181";
String serializerClass = "serializerClass=kafka.serializer.StringEncoder";
String toKafka = new StringBuilder().append(kafkaServer).append("?").append(topicName).append("&")
.append(zooKeeperHost).append("&").append(serializerClass).toString();
String embedded = new StringBuilder().append(bootstrapServers).append("?").append(topicName).append("&")
// .append(embeddedKafka.getEmbeddedKafka().getZookeeperConnectionString())
.append(zooKeeperHost).append("&").append(serializerClass).toString();
Main main = new Main();
main.addRouteBuilder(new RouteBuilder() {
#Override
public void configure() throws Exception {
from("file:D://inbox//?noop=true").split().tokenize("\n").to("direct:embedded");
}
});
main.run();
ConsumerRecord<String, String> received =
records.poll(10, TimeUnit.SECONDS);
// assertThat(received, hasValue(greeting));
// AssertJ Condition to check the key
// assertThat(received).has(key(null));
// System.out.println(received);
}
}
Camel should be able to read from a file and move the data to Kafka, and the consumer should be able to read it.
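Note that, as written, nothing consumes direct:embedded, so the split lines never reach Kafka. A hedged sketch of the missing leg, assuming the Camel Kafka component's brokers URI option (the broker address is a placeholder; in the test it would come from bootstrapServers):
from("direct:embedded")
    // placeholder broker address; use the embedded broker in the test
    .to("kafka:javainuse-topic?brokers=localhost:9092");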
You can use @RunWith(CamelSpringBootRunner.class) to run the test case.
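A minimal sketch of that setup, reusing the class from the question (CamelSpringBootRunner comes from the camel-test-spring module):
import org.apache.camel.test.spring.CamelSpringBootRunner;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.DirtiesContext;

@RunWith(CamelSpringBootRunner.class)
@SpringBootTest
@DirtiesContext
public class SpringKafkaSenderTest {
    // ... test body as in the question
}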
We are just starting on AWS and have a requirement to use AWS ElastiCache (Redis) with Jedis and Spring.
Spring-data-redis 1.8.8.RELEASE
aws-java-sdk 1.11.228
Spring 4.2.9.RELEASE
jedis 2.9.0
I was able to connect to and cache data in a local Redis with the code below. I have tried making the code changes from https://github.com/fishercoder1534/AmazonElastiCacheExample/tree/master/src/main/java, but have not been successful. I would really appreciate some guidance and help with sample code.
AWS ElastiCache is currently configured as option 1, but we will also need option 2 soon.
1. Non-replicated cluster: Redis cluster mode disabled, with no replicas
2. Replicated cluster: Redis cluster mode enabled, and Redis cluster mode disabled with read replicas
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.CachingConfigurerSupport;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.data.redis.cache.RedisCacheManager;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.serializer.JdkSerializationRedisSerializer;
import org.springframework.data.redis.serializer.StringRedisSerializer;
import redis.clients.jedis.Jedis;
import org.springframework.cache.interceptor.KeyGenerator;
import java.lang.reflect.Method;
import java.util.List;
@Configuration
@EnableCaching
// @PropertySource("classpath:/redis.properties")
public class CacheConfig extends CachingConfigurerSupport {
// private @Value("${redis.host}") String redisHost;
// private @Value("${redis.port}") int redisPort;
//@Bean
public KeyGenerator keyGenerator() {
return new KeyGenerator() {
@Override
public Object generate(Object o, Method method, Object... objects) {
// This will generate a unique key of the class name, the method name, and all method parameters appended.
StringBuilder sb = new StringBuilder();
sb.append(o.getClass().getName());
sb.append(method.getName());
for (Object obj : objects) {
sb.append(obj.toString());
}
return sb.toString();
}
};
}
@Bean
public JedisConnectionFactory redisConnectionFactory() {
JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
// Defaults for redis running on Local Docker
redisConnectionFactory.setHostName("192.168.99.100");
redisConnectionFactory.setPort(6379);
return redisConnectionFactory;
}
@Bean
public RedisTemplate<String, String> redisTemplate(RedisConnectionFactory cf) {
RedisTemplate<String, String> redisTemplate = new RedisTemplate<String, String>();
redisTemplate.setConnectionFactory(cf);
redisTemplate.setDefaultSerializer(new JdkSerializationRedisSerializer());
return redisTemplate;
}
@Bean
public CacheManager cacheManager(RedisTemplate<String, String> redisTemplate) {
RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate);
// Number of seconds before expiration. Defaults to unlimited (0)
cacheManager.setDefaultExpiration(1200);
cacheManager.getCacheNames().forEach(System.out::println);
return cacheManager;
}
}
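For the non-replicated case, one option (a sketch; the hostname is a placeholder, not a real endpoint) is to keep the JedisConnectionFactory above and point it at the ElastiCache primary endpoint instead of the local Docker address, since a cluster-mode-disabled ElastiCache accepts plain Redis connections:
@Bean
public JedisConnectionFactory redisConnectionFactory() {
    JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
    // Placeholder primary endpoint from the ElastiCache console
    redisConnectionFactory.setHostName("my-redis.xxxxxx.0001.use1.cache.amazonaws.com");
    redisConnectionFactory.setPort(6379);
    return redisConnectionFactory;
}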
Implemented caching with AWS ElastiCache + Lettuce (Redis Java client) + spring-data-redis: 3 masters with 2 slaves and SSL, using Spring's @Cacheable and @CacheEvict annotations. Please provide any input if you see an issue or if it can be done in a better way.
Spring 4.3.12.RELEASE
Spring-data-redis 1.8.8.RELEASE
aws-java-sdk 1.11.228
Lettuce (Redis Java client) 4.4.2.Final
@Configuration
@EnableCaching
public class CacheConfig extends CachingConfigurerSupport {
long expirationDate = 1200;
static AWSCredentials credentials = null;
static {
try {
//credentials = new ProfileCredentialsProvider("default").getCredentials();
credentials = new SystemPropertiesCredentialsProvider().getCredentials();
} catch (Exception e) {
System.out.println("Got exception..........");
throw new AmazonClientException("Cannot load the credentials from the credential profiles file. "
+ "Please make sure that your credentials file is at the correct "
+ "location (/Users/USERNAME/.aws/credentials), and is in valid format.", e);
}
}
@Bean
public LettuceConnectionFactory redisConnectionFactory() {
AmazonElastiCache elasticacheClient = AmazonElastiCacheClientBuilder.standard().withCredentials(new AWSStaticCredentialsProvider(credentials)).withRegion(Regions.US_EAST_1).build();
DescribeCacheClustersRequest dccRequest = new DescribeCacheClustersRequest();
dccRequest.setShowCacheNodeInfo(true);
DescribeCacheClustersResult clusterResult = elasticacheClient.describeCacheClusters(dccRequest);
List<CacheCluster> cacheClusters = clusterResult.getCacheClusters();
List<String> clusterNodes = new ArrayList <String> ();
try {
for (CacheCluster cacheCluster : cacheClusters) {
for (CacheNode cacheNode : cacheCluster.getCacheNodes()) {
String addr = cacheNode.getEndpoint().getAddress();
int port = cacheNode.getEndpoint().getPort();
String url = addr + ":" + port;
if(<CLUSTER NAME>.equalsIgnoreCase(cacheCluster.getReplicationGroupId()))
clusterNodes.add(url);
}
}
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
LettuceConnectionFactory redisConnectionFactory = new LettuceConnectionFactory(new RedisClusterConfiguration(clusterNodes));
redisConnectionFactory.setUseSsl(true);
redisConnectionFactory.afterPropertiesSet();
return redisConnectionFactory;
}
@Bean
public RedisTemplate<String, String> redisTemplate(RedisConnectionFactory cf) {
RedisTemplate<String, String> redisTemplate = new RedisTemplate<String, String>();
redisTemplate.setConnectionFactory(cf);
redisTemplate.setDefaultSerializer(new JdkSerializationRedisSerializer());
return redisTemplate;
}
@Bean
public CacheManager cacheManager(RedisTemplate<String, String> redisTemplate) {
RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate);
// Number of seconds before expiration. Defaults to unlimited (0)
cacheManager.setDefaultExpiration(expirationDate);
cacheManager.setLoadRemoteCachesOnStartup(true);
return cacheManager;
}
}
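For reference, a minimal sketch of how the annotations are used against this setup; the service, cache name, and domain type are illustrative, not part of the configuration above:
import java.io.Serializable;

import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

// Placeholder domain type; Serializable because the template above uses
// JdkSerializationRedisSerializer
class Account implements Serializable {
    private final String id;
    Account(String id) { this.id = id; }
}

@Service
public class AccountService {

    @Cacheable(value = "accounts", key = "#id")
    public Account findById(String id) {
        // Runs only on a cache miss; the result is stored in Redis
        return new Account(id); // placeholder lookup
    }

    @CacheEvict(value = "accounts", key = "#id")
    public void delete(String id) {
        // Removes the cached entry when the account is deleted
    }
}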
I'm studying the QueryDSL library and implementing it in my DAL project.
What is clear to me is that I need to instantiate a HibernateQuery object and use the QueryDSL methods to define the source of data (the from() clause) and the conditions (where() with BooleanExpressions).
For example, consider a User entity with a name field, and suppose we want to test whether a user named "Richie" exists in the DB. I would write the following code:
public boolean richieExists()
{
QUser qUser = QUser.user;
HibernateQuery query = new HibernateQuery(session); // I need a session instance here!
User richie = query.from(qUser).where(qUser.name.eq("Richie")).uniqueResult(qUser);
return (richie!=null);
}
The problem is that the above code should be a method of a Spring Service object, which uses a Repository to execute CRUD operations. This means I would need to retrieve the Session from the EntityManager instance used in the application context to instantiate the HibernateQuery object, and this is a problem because the Service object has no way to get hold of the EntityManager in use.
What is the right way/place to write QueryDSL queries?
Here is my DALConfig.java class with the Spring configuration (it defines the EntityManagerFactoryBean that Spring uses for the Repository operations):
package my.dal.service.dal.config;
import java.util.Properties;
import javax.annotation.Resource;
import javax.sql.DataSource;
import org.apache.commons.dbcp2.BasicDataSource;
import org.apache.commons.dbcp2.BasicDataSourceFactory;
import org.hibernate.jpa.HibernatePersistenceProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.core.env.Environment;
import org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.hibernate4.HibernateExceptionTranslator;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
@Configuration
@ComponentScan(basePackages = { "my.dal" })
@PropertySource("classpath:dbconnection.properties")
@EnableJpaRepositories("my.dal.repository")
@EnableTransactionManagement
public class DALConfig {
private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver_class";
private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
private static final String PROPERTY_NAME_DATABASE_USERNAME = "db.username";
private static final String PROPERTY_NAME_POOL_INITIAL_SIZE = "pool.initialsize";
private static final String PROPERTY_NAME_POOL_MAX_IDLE = "pool.maxidle";
private static final String PROPERTY_NAME_DAL_CLASSES_PACKAGE = "entities.packages_to_scan";
private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.showsql";
private static final String PROPERTY_NAME_HIBERNATE_FORMAT_SQL = "hibernate.format_sql";
@Resource
private Environment environment;
@Bean
public DataSource dataSource()
{
Properties props = new Properties();
props.put("driverClassName", environment.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
props.put("url", environment.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
props.put("username", environment.getRequiredProperty(PROPERTY_NAME_DATABASE_USERNAME));
props.put("password", environment.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
props.put("initialSize", environment.getRequiredProperty(PROPERTY_NAME_POOL_INITIAL_SIZE));
props.put("maxIdle", environment.getRequiredProperty(PROPERTY_NAME_POOL_MAX_IDLE));
BasicDataSource bds = null;
try {
bds = BasicDataSourceFactory.createDataSource(props);
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
return bds;
}
@Bean
public PersistenceExceptionTranslationPostProcessor persistenceExceptionTranslationPostProcessor()
{
PersistenceExceptionTranslationPostProcessor b = new PersistenceExceptionTranslationPostProcessor();
return b;
}
@Bean
public HibernateExceptionTranslator hibernateExceptionTranslator(){
return new HibernateExceptionTranslator();
}
@Bean
public PlatformTransactionManager transactionManager() throws ClassNotFoundException {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
return transactionManager;
}
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() throws ClassNotFoundException {
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource(dataSource());
entityManagerFactoryBean.setPackagesToScan(environment.getRequiredProperty(PROPERTY_NAME_DAL_CLASSES_PACKAGE));
entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistenceProvider.class);
Properties jpaProperties = new Properties();
jpaProperties.put(PROPERTY_NAME_HIBERNATE_DIALECT, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
jpaProperties.put(PROPERTY_NAME_HIBERNATE_FORMAT_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_FORMAT_SQL));
jpaProperties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
entityManagerFactoryBean.setJpaProperties(jpaProperties);
return entityManagerFactoryBean;
}
}
This is my repository interface:
package my.dal.repository;
import my.domain.dal.User;
import org.springframework.data.repository.CrudRepository;
import org.springframework.stereotype.Repository;
@Repository
public interface IUserRepository extends CrudRepository<User, String> {
}
This is the UserService class in which I have to implement the richieExists query method:
package my.dal.service;
import my.dal.repository.IUserRepository;
import my.domain.dal.QUser;
import my.domain.dal.User;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.dao.DataRetrievalFailureException;
import org.springframework.dao.DuplicateKeyException;
import org.springframework.stereotype.Service;
import com.mysema.query.jpa.hibernate.HibernateQuery;
@Service
public class UserService {
@Autowired
private IUserRepository repository;
public User find(String username) throws DataRetrievalFailureException
{
User user = null;
user= repository.findOne(username);
if (user == null)
throw new DataRetrievalFailureException("User with username = \"" + username + "\" not found");
else
return user;
}
public User insert(User user) throws DuplicateKeyException
{
if (repository.findOne(user.getUsername()) != null)
throw new DuplicateKeyException("User with username = \"" + user.getUsername() + "\" already exists");
return repository.save(user);
}
public void delete(String username) throws DataRetrievalFailureException
{
if (repository.findOne(username) == null)
throw new DataRetrievalFailureException("User with username =\"" + username + "\" not found");
repository.delete(username);
}
public User update(User user) throws DataRetrievalFailureException
{
if (repository.findOne(user.getUsername()) == null)
throw new DataRetrievalFailureException("User with username = \"" + user.getUsername() + "\" not found");
return repository.save(user);
}
public boolean richieExists()
{
QUser qUser = QUser.user;
HibernateQuery query = new HibernateQuery(session); // I need a session instance here!
User richie = query.from(qUser).where(qUser.username.eq("richie")).uniqueResult(qUser);
return (richie!=null);
}
}
Thank you
I solved it this way:
I got the current EntityManager from the persistence context by means of the proper annotation:
@PersistenceContext
EntityManager em;
Then I used the JPAQuery class (not the HibernateQuery one) to build the queries:
JPQLQuery query = new JPAQuery(em); // Now just use the query object
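Putting it together, a minimal sketch of richieExists() rewritten on top of the injected EntityManager (same QUser metamodel and QueryDSL 3.x uniqueResult API as in the question):
@PersistenceContext
private EntityManager em;

public boolean richieExists() {
    QUser qUser = QUser.user;
    // JPAQuery works directly against the EntityManager, so no
    // Hibernate Session is needed
    JPQLQuery query = new JPAQuery(em);
    User richie = query.from(qUser)
            .where(qUser.username.eq("richie"))
            .uniqueResult(qUser);
    return richie != null;
}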
Hope this helps