I have a project built with Spring Boot + FreeMarker that worked fine until tonight; I don't think I had changed anything, yet it failed. Below is my FreeMarker configuration class:
@Configuration
@Slf4j
public class FreemarkerConfiguration extends FreeMarkerAutoConfiguration.FreeMarkerWebConfiguration {

    /**
     * autowire all implementations of freemarker.template.TemplateDirectiveModel
     */
    @Autowired
    Map<String, TemplateDirectiveModel> directiveModelMap;

    /**
     * autowire all implementations of freemarker.template.TemplateMethodModelEx
     */
    @Autowired
    Map<String, TemplateMethodModelEx> methodModelExMap;

    private static final String CUSTOM_DIRECTIVE_SUFFIX = "Directive";
    private static final String CUSTOM_METHOD_SUFFIX = "Method";

    @Override
    public FreeMarkerConfigurer freeMarkerConfigurer() {
        FreeMarkerConfigurer configurer = super.freeMarkerConfigurer();
        Map<String, Object> sharedVariables = new HashMap<String, Object>();
        if (!CollectionUtils.isEmpty(directiveModelMap)) {
            Map<String, Object> map = new HashMap<String, Object>();
            for (Map.Entry<String, TemplateDirectiveModel> entry : directiveModelMap.entrySet()) {
                map.put(StringUtils.uncapitalize(entry.getKey()).replaceAll(CUSTOM_DIRECTIVE_SUFFIX, ""), entry.getValue());
            }
            sharedVariables.putAll(map);
        }
        if (!CollectionUtils.isEmpty(this.methodModelExMap)) {
            Map<String, Object> map = new HashMap<String, Object>();
            for (Map.Entry<String, TemplateMethodModelEx> entry : this.methodModelExMap.entrySet()) {
                map.put(StringUtils.uncapitalize(entry.getKey()).replaceAll(CUSTOM_METHOD_SUFFIX, ""), entry.getValue());
            }
            sharedVariables.putAll(map);
        }
        BeansWrapper beansWrapper = new BeansWrapperBuilder(freemarker.template.Configuration.DEFAULT_INCOMPATIBLE_IMPROVEMENTS).build();
        sharedVariables.put("enums", beansWrapper.getEnumModels());
        configurer.setFreemarkerVariables(sharedVariables);
        return configurer;
    }
}
The problem is with these two fields:

@Autowired
Map<String, TemplateDirectiveModel> directiveModelMap;

@Autowired
Map<String, TemplateMethodModelEx> methodModelExMap;

I want to inject all implementations of TemplateDirectiveModel and TemplateMethodModelEx, but both maps are null. The implementations are, of course, annotated with @Component. I don't know why; I compared the diffs but found no answers. Why are the maps populated only after

@Override
public FreeMarkerConfigurer freeMarkerConfigurer() { .... }

has already run?
Here's my boot application
@Configuration
@SpringBootApplication
@EntityScan("com.hmxx.entity")
@EnableAspectJAutoProxy
@EnableTransactionManagement
@EnableJpaRepositories(value = {"com.hmxx.service"})
public class Application implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(new Object[]{Application.class});
        app.setWebEnvironment(true);
        //app.setBannerMode(Banner.Mode.CONSOLE);
        ConfigurableApplicationContext ctx = app.run(args);
        Map<String, TemplateDirectiveModel> directiveModelMap = ctx.getBeansOfType(TemplateDirectiveModel.class);
        Map<String, TemplateMethodModelEx> methodModelExMap = ctx.getBeansOfType(TemplateMethodModelEx.class);
    }

    @Autowired
    DataInitService dataInitService;

    @Override
    public void run(String... args) throws Exception {
        // dataInitService.initAdminUser();
    }
}
And obviously both Map<String, TemplateDirectiveModel> and Map<String, TemplateMethodModelEx> are non-null there.
I want to know why the injected maps are null and how to resolve it.
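For reference, here is a sketch of one common workaround (my assumption, not from the original post): replacing field injection with constructor injection, which Spring resolves when it creates the configuration instance, so the maps cannot be null inside freeMarkerConfigurer():

@Configuration
public class FreemarkerConfiguration extends FreeMarkerAutoConfiguration.FreeMarkerWebConfiguration {

    private final Map<String, TemplateDirectiveModel> directiveModelMap;
    private final Map<String, TemplateMethodModelEx> methodModelExMap;

    // Constructor injection: both maps are resolved before any
    // @Bean/@Override method of this class can be invoked.
    @Autowired
    public FreemarkerConfiguration(Map<String, TemplateDirectiveModel> directiveModelMap,
                                   Map<String, TemplateMethodModelEx> methodModelExMap) {
        this.directiveModelMap = directiveModelMap;
        this.methodModelExMap = methodModelExMap;
    }
}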
Related
I have a service that sends a message:
@Service
class ExportTaskService {

    @Autowired
    private KafkaTemplate<String, Object> template;

    public void exportNewTask(ImportTaskRequest req) {
        template.send("my-topic-name", req);
    }
}
I configured the beans consumerFactory, producerFactory and kafkaTemplate (in src/main/java).
If I run the application and execute the method, all is OK and the message arrives at the real message broker.
Then I need a Spring test that uses ExportTaskService.exportNewTask(request) and waits for the message from the same topic.
My code, but it's not working (I can't receive the message):
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
@TestPropertySource(locations = "classpath:test.properties")
@EnableKafka
@EmbeddedKafka(
        topics = "new-bitrix-leads", ports = 9092
)
public class ExportingLeadTests {

    @Autowired
    private EmbeddedKafkaBroker embeddedKafkaBroker;

    @Autowired
    ExportTaskService exportTaskService;

    @Autowired
    ConsumerFactory<String, Object> consumerFactory;

    @Test
    public void test() throws InterruptedException {
        assert (embeddedKafkaBroker != null);
        assert (exportTaskService != null);
        Consumer<String, Object> consumer = consumerFactory.createConsumer();
        consumer.subscribe(Collections.singletonList("new-bitrix-leads"));
        exportTaskService.exportNewTask(ImportTaskRequest.builder()
                .description("descr")
                .title("title")
                .build());
        ConsumerRecords<String, Object> records = consumer.poll(Duration.ofSeconds(3));
        assert (records.count() == 1);
    }
}
How can I read this message? What do I need to do? I have no ideas...
Big thanks! :)
My simple solution is:
@SpringBootTest
@DirtiesContext
@TestPropertySource(locations = "classpath:test.properties")
@EnableKafka
@EmbeddedKafka(
        topics = "new-bitrix-leads", ports = 9092
)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class ExportingLeadTests {

    private BlockingQueue<ConsumerRecord<String, Object>> records;
    private KafkaMessageListenerContainer<String, String> container;

    @Autowired
    private EmbeddedKafkaBroker embeddedKafkaBroker;

    @Autowired
    ExportTaskService exportTaskService;

    @BeforeAll
    void setUp() {
        DefaultKafkaConsumerFactory<String, Object> consumerFactory = new DefaultKafkaConsumerFactory<>(getConsumerProperties());
        ContainerProperties containerProperties = new ContainerProperties("new-bitrix-leads");
        container = new KafkaMessageListenerContainer<>(consumerFactory, containerProperties);
        records = new LinkedBlockingQueue<>();
        container.setupMessageListener((MessageListener<String, Object>) e -> records.add(e));
        container.start();
        ContainerTestUtils.waitForAssignment(container, embeddedKafkaBroker.getPartitionsPerTopic());
    }

    @AfterAll
    void tearDown() {
        container.stop();
    }

    private Map<String, Object> getConsumerProperties() {
        Map<String, Object> map = new HashMap<>();
        map.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, embeddedKafkaBroker.getBrokersAsString());
        map.put(ConsumerConfig.GROUP_ID_CONFIG, "consumer");
        map.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        map.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "10");
        map.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "60000");
        map.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        map.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        map.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return map;
    }

    @Test
    public void test() throws InterruptedException {
        exportTaskService.exportNewTask(ImportTaskRequest.builder()
                .description("descr")
                .title("title")
                .gclid("gclid")
                .id("id")
                .name("my name")
                .website("http://website.com")
                .yclid("yclid")
                .source("Source")
                .build());
        ConsumerRecord<String, Object> record = records.poll(5, TimeUnit.SECONDS);
        assert (record != null);
        assertThat(record.value().toString(), containsString("gclid"));
    }
}
It really works. Nice.
Edit FYI: a working GitHub example
I was searching the internet and couldn't find a working and simple example of an embedded Kafka test.
My setup is:
Spring Boot
Multiple @KafkaListener methods with different topics in one class
Embedded Kafka for the test, which starts fine
A test with KafkaTemplate which sends to the topic, but the @KafkaListener methods receive nothing, even after a huge sleep time
No warnings or errors are shown, only info spam from Kafka in the logs
Please help me. Most examples out there are over-configured or over-engineered. I am sure it can be done simply.
Thanks, guys!
@Controller
public class KafkaController {

    private static final Logger LOG = getLogger(KafkaController.class);

    @KafkaListener(topics = "test.kafka.topic")
    public void receiveDunningHead(final String payload) {
        LOG.debug("Receiving event with payload [{}]", payload);
        // I will do database stuff here which I could check in the db for testing
    }
}
private static String SENDER_TOPIC = "test.kafka.topic";

@ClassRule
public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, SENDER_TOPIC);

@Test
public void testSend() throws InterruptedException, ExecutionException {
    Map<String, Object> senderProps = KafkaTestUtils.producerProps(embeddedKafka);
    KafkaProducer<Integer, String> producer = new KafkaProducer<>(senderProps);
    producer.send(new ProducerRecord<>(SENDER_TOPIC, 0, 0, "message00")).get();
    producer.send(new ProducerRecord<>(SENDER_TOPIC, 0, 1, "message01")).get();
    producer.send(new ProducerRecord<>(SENDER_TOPIC, 1, 0, "message10")).get();
    Thread.sleep(10000);
}
Embedded Kafka tests work for me with the configs below.
Annotations on the test class:
@EnableKafka
@SpringBootTest(classes = {KafkaController.class}) // Specify the @KafkaListener class if it is not the same class, or not loaded with the test config
@EmbeddedKafka(
        partitions = 1,
        controlledShutdown = false,
        brokerProperties = {
                "listeners=PLAINTEXT://localhost:3333",
                "port=3333"
        })
public class KafkaConsumerTest {

    @Autowired
    KafkaEmbedded kafkaEmbedded;

    @Autowired
    KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;
A @Before-annotated setup method:
@Before
public void setUp() throws Exception {
    for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
        ContainerTestUtils.waitForAssignment(messageListenerContainer,
                kafkaEmbedded.getPartitionsPerTopic());
    }
}
Note: I am not using @ClassRule to create the embedded Kafka; instead I am autowiring it with @Autowired.
@Test
public void testReceive() throws Exception {
    kafkaTemplate.send(topic, data);
}
Hope this helps!
Edit: Test configuration class, marked with @TestConfiguration:
@TestConfiguration
public class TestConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(KafkaTestUtils.producerProps(kafkaEmbedded));
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        KafkaTemplate<String, String> kafkaTemplate = new KafkaTemplate<>(producerFactory());
        kafkaTemplate.setDefaultTopic(topic);
        return kafkaTemplate;
    }
}
Now the @Test method will autowire the KafkaTemplate and use it to send a message:
kafkaTemplate.send(topic, data);
I updated the answer's code block with the above line.
Since the accepted answer doesn't compile or work for me, I found another solution, based on https://blog.mimacom.com/testing-apache-kafka-with-spring-boot/, which I would like to share with you.
The dependency is 'spring-kafka-test' version '2.2.7.RELEASE'.
@RunWith(SpringRunner.class)
@EmbeddedKafka(partitions = 1, topics = { "testTopic" })
@SpringBootTest
public class SimpleKafkaTest {

    private static final String TEST_TOPIC = "testTopic";

    @Autowired
    EmbeddedKafkaBroker embeddedKafkaBroker;

    @Test
    public void testReceivingKafkaEvents() {
        Consumer<Integer, String> consumer = configureConsumer();
        Producer<Integer, String> producer = configureProducer();

        producer.send(new ProducerRecord<>(TEST_TOPIC, 123, "my-test-value"));

        ConsumerRecord<Integer, String> singleRecord = KafkaTestUtils.getSingleRecord(consumer, TEST_TOPIC);
        assertThat(singleRecord).isNotNull();
        assertThat(singleRecord.key()).isEqualTo(123);
        assertThat(singleRecord.value()).isEqualTo("my-test-value");

        consumer.close();
        producer.close();
    }

    private Consumer<Integer, String> configureConsumer() {
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("testGroup", "true", embeddedKafkaBroker);
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        Consumer<Integer, String> consumer = new DefaultKafkaConsumerFactory<Integer, String>(consumerProps)
                .createConsumer();
        consumer.subscribe(Collections.singleton(TEST_TOPIC));
        return consumer;
    }

    private Producer<Integer, String> configureProducer() {
        Map<String, Object> producerProps = new HashMap<>(KafkaTestUtils.producerProps(embeddedKafkaBroker));
        return new DefaultKafkaProducerFactory<Integer, String>(producerProps).createProducer();
    }
}
I solved the issue now:

@BeforeClass
public static void setUpBeforeClass() {
    System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
}
While debugging, I saw that the embedded Kafka server takes a random port.
I couldn't find the configuration for it, so I am setting the Kafka config to match the server's. It still looks a bit ugly to me.
I would love to have just the line @Mayur mentioned:

@EmbeddedKafka(partitions = 1, controlledShutdown = false, brokerProperties = {"listeners=PLAINTEXT://localhost:9092", "port=9092"})

but I can't find the right dependency on the internet.
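For what it's worth, a hedged alternative (assuming Spring Boot's Kafka auto-configuration is in use): spring-kafka-test publishes the embedded broker's random address under the spring.embedded.kafka.brokers property, so the test properties can forward it instead of fixing a port or calling System.setProperty, as the answer further below also does:

# src/test/resources/test.properties (path assumed)
spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}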
In integration testing, fixed ports like 9092 are not recommended, because multiple tests should have the flexibility to open their own ports for their embedded instances. So the following implementation is something like that.
NB: this implementation is based on JUnit 5 (Jupiter 5.7.0) and Spring Boot 2.3.4.RELEASE.
TestClass:
@EnableKafka
@SpringBootTest(classes = {ConsumerTest.Config.class, Consumer.class})
@EmbeddedKafka(
        partitions = 1,
        controlledShutdown = false)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class ConsumerTest {

    @Autowired
    private EmbeddedKafkaBroker kafkaEmbedded;

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @BeforeAll
    public void setUp() throws Exception {
        for (final MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
            ContainerTestUtils.waitForAssignment(messageListenerContainer,
                    kafkaEmbedded.getPartitionsPerTopic());
        }
    }

    @Value("${topic.name}")
    private String topicName;

    @Autowired
    private KafkaTemplate<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestKafkaTemplate;

    @Test
    public void consume_success() {
        requestKafkaTemplate.send(topicName, load);
    }

    @Configuration
    @Import({
            KafkaListenerConfig.class,
            TopicConfig.class
    })
    public static class Config {

        @Value(value = "${spring.kafka.bootstrap-servers}")
        private String bootstrapAddress;

        @Bean
        public ProducerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestProducerFactory() {
            final Map<String, Object> configProps = new HashMap<>();
            configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
            configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
            return new DefaultKafkaProducerFactory<>(configProps);
        }

        @Bean
        public KafkaTemplate<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> requestKafkaTemplate() {
            return new KafkaTemplate<>(requestProducerFactory());
        }
    }
}
Listener Class:
@Component
public class Consumer {

    @KafkaListener(
            topics = "${topic.name}",
            containerFactory = "listenerContainerFactory"
    )
    @Override
    public void listener(
            final ConsumerRecord<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> consumerRecord,
            final @Payload Optional<Map<String, List<ImmutablePair<String, String>>>> payload
    ) {
    }
}
Listener config:
@Configuration
public class KafkaListenerConfig {

    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Value(value = "${topic.name}")
    private String resolvedTreeQueueName;

    @Bean
    public ConsumerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> resolvedTreeConsumerFactory() {
        final Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        props.put(ConsumerConfig.GROUP_ID_CONFIG, resolvedTreeQueueName);
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new CustomDeserializer());
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> resolvedTreeListenerContainerFactory() {
        final ConcurrentKafkaListenerContainerFactory<String, Optional<Map<String, List<ImmutablePair<String, String>>>>> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(resolvedTreeConsumerFactory());
        return factory;
    }
}
TopicConfig:
@Configuration
public class TopicConfig {

    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Value(value = "${topic.name}")
    private String requestQueue;

    @Bean
    public KafkaAdmin kafkaAdmin() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        return new KafkaAdmin(configs);
    }

    @Bean
    public NewTopic requestTopic() {
        return new NewTopic(requestQueue, 1, (short) 1);
    }
}
application.properties:
spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}
This assignment is the most important one: it binds the embedded instance's port to the KafkaTemplate and the @KafkaListeners.
Following the above implementation, you can open dynamic ports per test class, which is more convenient.
I am trying to register a DataSource instance as a bean in Java code (a Spring Boot project).
Here is what I wrote (this code is not working):
@Configuration
public class DatabaseConfig {

    private Logger logger = Logger.getLogger(DatabaseConfig.class);

    @Autowired
    ApplicationContext context;

    private Map<String, Map<String, String>> dsMap;

    private Map<String, String> getTestDataSourceInfo() {
        Map<String, String> ds = new HashMap<String, String>();
        ds.put("driverClassName", "com.mysql.jdbc.Driver");
        ds.put("url", "jdbc:mysql://123.456.78.912:3306/test");
        ds.put("username", "testuser");
        ds.put("password", "testuser");
        return ds;
    }

    public DatabaseConfig() {
        this.dsMap = new HashMap<String, Map<String, String>>();
        dsMap.put("sampleDs", getTestDataSourceInfo());
    }

    @PostConstruct
    public void loadDataSource() {
        logger.info("DS ================================ :: " + String.valueOf(this.dsMap));
        this.dsMap.forEach((k, v) -> {
            logger.info("value ========================== :: " + String.valueOf(v));
            DataSource aSource = DataSourceBuilder.create()
                    .driverClassName(v.get("driverClassName"))
                    .url(v.get("url"))
                    .username(v.get("username"))
                    .password(v.get("password"))
                    .build();
            // PROBLEM STARTS ..............
            // Add datasource instance with name to context
            context.getAutowireCapableBeanFactory().autowireBean(aSource);
        });
    }
}
Is there a proper way to register a bean from an existing instance?
I could not find any good samples for this.
FYI, what I expected from the above code is:
The Spring Boot application reads the class above as a configuration.
It creates a Java instance in its constructor.
It adds the instance as a bean to the application context in the loadDataSource method.
However, it is not working, so I am curious how to add a Java instance as a bean to the current Spring Boot application context.
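For reference, a minimal sketch of one direct route (my assumption, not part of the original post): an already-built instance can be registered through the ConfigurableApplicationContext's bean factory, though singletons registered this way skip Spring's lifecycle callbacks and autowiring:

// Sketch: register an existing object as a named singleton bean.
// Assumes 'context' is (or can be cast to) a ConfigurableApplicationContext
// and 'aSource' is the DataSource built in loadDataSource() above.
ConfigurableListableBeanFactory beanFactory =
        ((ConfigurableApplicationContext) context).getBeanFactory();
beanFactory.registerSingleton("sampleDs", aSource);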
Actually, it would have been easier if I had not done this with a DataSource.
Since Spring Boot automatically configures a DataSource, I had to disable that setting first.
Here is what I did to achieve the goal.
In the @Configuration class...
@Configuration
public class DatabaseConfig {

    private Logger logger = Logger.getLogger(DatabaseConfig.class);

    @Autowired
    ApplicationContext context;

    private Map<String, Map<String, String>> dsMap;

    private Map<String, String> getTestDataSourceInfo() {
        Map<String, String> ds = new HashMap<String, String>();
        ds.put("driverClassName", "${ your driverClassName }");
        ds.put("url", "${ your url }");
        ds.put("username", "${ your user }");
        ds.put("password", "${ your password }");
        return ds;
    }

    public DatabaseConfig() {
        this.dsMap = new HashMap<String, Map<String, String>>();
        dsMap.put("sampleDs1", getTestDataSourceInfo());
        dsMap.put("sampleDs2", getTestDataSourceInfo());
    }

    @PostConstruct
    public void loadDataSource() {
        BeanDefinitionRegistry registry = (BeanDefinitionRegistry) context.getAutowireCapableBeanFactory();
        this.dsMap.forEach((k, v) -> {
            BeanDefinitionBuilder builder = BeanDefinitionBuilder.rootBeanDefinition(BasicDataSource.class);
            v.forEach((dsKey, dsVal) -> builder.addPropertyValue(dsKey, dsVal));
            BeanDefinition def = builder.getBeanDefinition();
            if (!registry.containsBeanDefinition(k)) {
                registry.registerBeanDefinition(k, def);
            }
        });
    }
}
In the above class, I could register Java instances as Spring beans with BeanDefinitionRegistry and BeanDefinitionBuilder.
If this were just an ordinary bean, it would end here; but since what we are adding is a DataSource bean, there is some extra work to do.
Since Boot automatically configures a DataSource, we have to disable that auto-configuration in order to register a customized one.
In your @SpringBootApplication class, add @EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class}).
Now you are able to use those beans in other beans via @Autowired and @Qualifier.
Thanks.
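A short usage sketch (bean names taken from the example above; the service class itself is hypothetical):

@Service
public class ReportService {

    // Selects one of the registered data sources by bean name.
    @Autowired
    @Qualifier("sampleDs1")
    private DataSource dataSource;
}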
Can I create a singleton property in Spring Boot?
When I use this:
public class MessengerPlatformCallbackHandler {

    @Scope(value = "singleton")
    private Map<String, Object> conversationID = new HashMap<>();

I got the error: @Scope is not applicable to a field.
Thanks.
You need to create it this way.
@Configuration
public class ConversationIDConfig {

    @Bean
    @Scope(value = "singleton")
    public Map<String, Object> conversationId() {
        return new HashMap<>();
    }
}
And later you can inject it wherever you want, as below.

public class MessengerPlatformCallbackHandler {

    @Autowired
    private Map<String, Object> conversationID;
}
You need to create it this way.
@Configuration
public class ConversationIDConfig {

    @Bean
    public Map<String, Object> conversationId() {
        return new HashMap<>();
    }
}
And later you can inject it wherever you want, as below.

public class MessengerPlatformCallbackHandler {

    @Autowired
    private Map<String, Object> conversationId;
}
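One caveat worth noting (my addition, not from either answer): @Autowired on a Map<String, Object> asks Spring to collect every bean assignable to Object into a map keyed by bean name, rather than injecting the single map bean defined above. Injecting by bean name, e.g. with @Resource, is unambiguous; a sketch assuming the bean name conversationId from the config above:

@Component
public class MessengerPlatformCallbackHandler {

    // Injects the bean named "conversationId" by name, not by type,
    // avoiding the collection-injection semantics of @Autowired maps.
    @Resource(name = "conversationId")
    private Map<String, Object> conversationId;
}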
When I call a service directly in my main() I can query the database and things work fine. But when a Jersey request comes in and maps the JSON to NewJobRequest, I can't use my service because the @Autowired injection failed.
My app:
public class Main {

    public static final URI BASE_URI = getBaseURI();

    private static URI getBaseURI() {
        return UriBuilder.fromUri("http://localhost/").port(9998).build();
    }

    protected static HttpServer startServer() throws IOException {
        ResourceConfig rc = new PackagesResourceConfig("com.production.api.resources");
        rc.getFeatures()
                .put(JSONConfiguration.FEATURE_POJO_MAPPING, true);
        return GrizzlyServerFactory.createHttpServer(BASE_URI, rc);
    }

    public static void main(String[] args) throws IOException {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Config.class);
        //if this is uncommented, it'll successfully query the database
        //VendorService vendorService = (VendorService) ctx.getBean("vendorService");
        //Vendor vendor = vendorService.findByUUID("asdf");
        HttpServer httpServer = startServer();
        System.out.println(String.format("Jersey app started with WADL available at " + "%sapplication.wadl\nTry out %shelloworld\nHit enter to stop it...", BASE_URI, BASE_URI));
        System.in.read();
        httpServer.stop();
    }
}
My Resource (controller):
@Component
@Path("/job")
public class JobResource extends GenericResource {

    @Path("/new")
    @POST
    public String New(NewJobRequest request) {
        return "done";
    }
}
Jersey is mapping the JSON post to:
@Component
public class NewJobRequest {

    @Autowired
    private VendorService vendorService;

    @JsonCreator
    public NewJobRequest(Map<String, Object> request) {
        //uh oh, can't do anything here because @Autowired failed and vendorService is null
    }
}
VendorService:
@Service
public class VendorService extends GenericService<VendorDao> {

    public Vendor findByUUID(String uuid) {
        try {
            return (Vendor) em.createNamedQuery("Vendor.findByUUID")
                    .setParameter("UUID", uuid)
                    .getSingleResult();
        } catch (Exception ex) {
            return null;
        }
    }
}
@Service
public class GenericService<T extends GenericDao> {

    private static Logger logger = Logger.getLogger(Logger.class.getName());

    @PersistenceContext(unitName = "unit")
    public EntityManager em;

    protected T dao;

    @Transactional
    public void save(T entity) {
        dao.save(entity);
    }
}
My service config:
@Configuration
public class Config {

    @Bean
    public VendorService vendorService() {
        return new VendorService();
    }
}
My main config:
@Configuration
@ComponentScan(basePackages = {
        "com.production.api",
        "com.production.api.dao",
        "com.production.api.models",
        "com.production.api.requests",
        "com.production.api.requests.job",
        "com.production.api.resources",
        "com.production.api.services"
})
@Import({
        com.production.api.services.Config.class,
        com.production.api.dao.Config.class,
        com.production.api.requests.Config.class
})
@PropertySource(value = "classpath:/META-INF/application.properties")
@EnableTransactionManagement
public class Config {

    private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
    private static final String PROPERTY_NAME_DATABASE_USER = "db.user";
    private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
    private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String PROPERTY_NAME_HIBERNATE_FORMAT_SQL = "hibernate.format_sql";
    private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
    private static final String PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN = "entitymanager.packages.to.scan";

    @Resource
    Environment environment;

    @Bean
    public DataSource dataSource() {
        MysqlDataSource dataSource = new MysqlDataSource();
        dataSource.setUrl(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
        dataSource.setUser(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_USER));
        dataSource.setPassword(environment.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
        return dataSource;
    }

    @Bean
    public JpaTransactionManager transactionManager() throws ClassNotFoundException {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactoryBean().getObject());
        return transactionManager;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactoryBean() throws ClassNotFoundException {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource());
        entityManagerFactoryBean.setPersistenceUnitName("unit");
        entityManagerFactoryBean.setPackagesToScan(environment.getRequiredProperty(PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN));
        entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistence.class);

        Properties jpaProperties = new Properties();
        jpaProperties.put(PROPERTY_NAME_HIBERNATE_DIALECT, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
        jpaProperties.put(PROPERTY_NAME_HIBERNATE_FORMAT_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_FORMAT_SQL));
        jpaProperties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, environment.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
        entityManagerFactoryBean.setJpaProperties(jpaProperties);

        return entityManagerFactoryBean;
    }
}
The @Path and @POST annotations are JAX-RS, not Spring. So the container is instantiating your endpoints on its own, without any knowledge of Spring beans. You are most likely not getting any Spring logging because Spring is not being used at all.
I've figured out the issue and blogged about it here: http://blog.benkuhl.com/2013/02/how-to-access-a-service-layer-on-a-jersey-json-object/
In the meantime, I'm also going to post the solution here.
I needed to tap into the bean that Spring had already created, so I used Spring's ApplicationContextAware:
public class ApplicationContextProvider implements ApplicationContextAware {

    private static ApplicationContext applicationContext;

    public static ApplicationContext getApplicationContext() {
        return applicationContext;
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }
}
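Note that the provider only receives the context if Spring itself instantiates it; a minimal registration sketch (an assumption on my part; it may already be picked up by component scanning in your setup):

@Configuration
public class ProviderConfig {

    // Registering the provider as a bean lets Spring call
    // setApplicationContext(...) and populate the static reference.
    @Bean
    public ApplicationContextProvider applicationContextProvider() {
        return new ApplicationContextProvider();
    }
}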
And then I used that static context reference within the object being mapped, so I can perform lookups against the service:
public class NewJobRequest {

    private VendorService vendorService;

    public NewJobRequest() {
        vendorService = (VendorService) ApplicationContextProvider.getApplicationContext().getBean("vendorService");
    }

    @JsonCreator
    public NewJobRequest(Map<String, Object> request) {
        this(); // constructors do not chain automatically; fetch the bean first
        setVendor(vendorService.findById(request.get("vendorId")));
    }

    ....
}