I found an acceptable solution. Look below.
I want to use Kundera as my persistence provider to store my JPA objects in HBase. I do everything in an annotation-driven way, i.e., I do not have and do not want a persistence.xml.
So far I did the following:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    final LocalContainerEntityManagerFactoryBean bean = new LocalContainerEntityManagerFactoryBean();
    bean.setPersistenceProviderClass(KunderaPersistence.class);

    final Properties props = new Properties();
    props.setProperty("kundera.dialect", "hbase");
    props.setProperty("kundera.client.lookup.class", HBaseClientFactory.class.getName());
    props.setProperty("kundera.cache.provider.class", EhCacheProvider.class.getName());
    props.setProperty("kundera.cache.config.resource", "/ehcache.xml");
    props.setProperty("kundera.ddl.auto.prepare", "update");
    props.setProperty("kundera.nodes", "myhbase.example.com");
    props.setProperty("kundera.port", "2182");
    props.setProperty("kundera.keyspace", "foo");
    bean.setJpaProperties(props);
    return bean;
}
I have an ehcache.xml with a default configuration in my src/main/resources.
When I try to run this, it bails out; the stack trace boils down to:
... many more ...
Caused by: java.lang.IllegalStateException: No persistence units parsed from {classpath*:META-INF/persistence.xml}
at org.springframework.orm.jpa.persistenceunit.DefaultPersistenceUnitManager.obtainDefaultPersistenceUnitInfo(DefaultPersistenceUnitManager.java:655)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.determinePersistenceUnitInfo(LocalContainerEntityManagerFactoryBean.java:358)
at org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.createNativeEntityManagerFactory(LocalContainerEntityManagerFactoryBean.java:307)
at org.springframework.orm.jpa.AbstractEntityManagerFactoryBean.afterPropertiesSet(AbstractEntityManagerFactoryBean.java:318)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1613)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1550)
... 152 more
When I set a persistence unit name via bean.setPersistenceUnitName("foo"), I get: No persistence unit with name 'foo' found.
What am I doing wrong? Do I actually NEED a persistence.xml? I do not want one because I want to be able to set the configuration via command-line switches. These are inserted into the Environment, which I can use as a parameter to my @Bean method and put into my Properties object this way.
--- EDIT ---
I now created a persistence.xml with the basic settings, and just override the custom ones in my Properties object. This works, but now I have another problem: connecting to HBase locks up the Java process so badly I have to kill -9 it. Another question follows.
As I stated in my original edit, I found an acceptable solution:
Create a persistence.xml with the default settings
Override the custom settings in the @Bean method
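For reference, a minimal persistence.xml along those lines might look like the sketch below. The unit name "foo" and the single default property are assumptions based on the setup above; verify the Kundera provider class against your Kundera version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <persistence-unit name="foo">
        <!-- assumed Kundera provider class; check it against your Kundera version -->
        <provider>com.impetus.kundera.KunderaPersistence</provider>
        <properties>
            <!-- defaults only; anything set via setJpaProperties in the @Bean method wins -->
            <property name="kundera.dialect" value="hbase"/>
        </properties>
    </persistence-unit>
</persistence>
```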
Related
I have got a Spring Boot project with two data sources, one DB2 and one Postgres. I configured that, but have a problem:
The auto-detection for the database type does not work on the DB2 (in any project) unless I specify the database dialect using spring.jpa.database-platform = org.hibernate.dialect.DB2390Dialect.
But how do I specify that for only one of the database connections? Or how do I specify the other one independently?
To give you more info on my project structure: I separated the databases roughly according to this tutorial, although I do not use the ChainedTransactionManager: https://medium.com/preplaced/distributed-transaction-management-for-multiple-databases-with-springboot-jpa-and-hibernate-cde4e1b298e4
I use the same basic project structure and almost unchanged configuration files.
Ok, I found the answer myself and want to post it in case anyone else has the same question.
The answer lies in the config file for each database, i.e. the DB2Config.java file from the tutorial linked in the question.
While I'm at it, I'll incidentally also answer the question "how do I manipulate any of the spring.jpa properties for several databases independently".
In the example, the following method gets called:
@Bean
public LocalContainerEntityManagerFactoryBean db2EntityManagerFactory(
        @Qualifier("db2DataSource") DataSource db2DataSource,
        EntityManagerFactoryBuilder builder
) {
    return builder
            .dataSource(db2DataSource)
            .packages("com.preplaced.multidbconnectionconfig.model.db2")
            .persistenceUnit("db2")
            .build();
}
While configuring the datasource and the package our model lies in, we can also inject additional configuration.
After calling .packages(...), we can set a propertiesMap that can contain everything we would normally set via spring.jpa in the application.properties file.
If we want to set the DB2390Dialect, the method could now look like this (with the possibility to easily add further configuration):
@Bean
public LocalContainerEntityManagerFactoryBean db2EntityManagerFactory(
        @Qualifier("db2DataSource") DataSource db2DataSource,
        EntityManagerFactoryBuilder builder
) {
    HashMap<String, String> propertiesMap = new HashMap<String, String>();
    propertiesMap.put("hibernate.dialect", "org.hibernate.dialect.DB2390Dialect");
    return builder
            .dataSource(db2DataSource)
            .packages("com.preplaced.multidbconnectionconfig.model.db2")
            .properties(propertiesMap)
            .persistenceUnit("db2")
            .build();
}
Note that "hibernate.dialect" seems to work (instead of "database-platform").
I am using Spring Boot 2.3.0. I have two data sources, one for Oracle and one for H2, defined in application.properties.
I have two @Configuration classes for the data source configurations. Both classes define beans for:
DataSource
PlatformTransactionManager
LocalContainerEntityManagerFactoryBean
In LocalContainerEntityManagerFactoryBean I set up:
setDataSource
setPackagesToScan
setJpaVendorAdapter
The application starts up properly, and I can even do .findAll on the table in the H2 database. However, as soon as I start executing custom methods in the repository implementation, such as this:
@Transactional(readOnly = true)
private Optional<List<Foo>> findFooByState(Optional<Integer> id, Foo.State state) {
    CriteriaBuilder cp = em.getCriteriaBuilder();
    CriteriaQuery<Foo> cqFoo = cp.createQuery(Foo.class);
    Root<Foo> fooRoot = cqFoo.from(Foo.class);
    [...]
Spring throws an exception such as:
Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception
[Request processing failed; nested exception is org.springframework.dao.InvalidDataAccessApiUsageException: Not an entity: class foo.Foo;
nested exception is java.lang.IllegalArgumentException: Not an entity: class foo.Foo] with root cause
Package foo is added in setPackagesToScan as I wrote earlier.
I have tried various things with @Transactional, e.g. removing it, adding the name of the transaction manager specified in the DataSource, moving the @Transactional to the @GetMapping, but none of it helped.
Does anybody have any clue what I am doing wrong?
Thanks,
I had a similar problem. Most probably you haven't configured the JPA repositories' base packages to pick up different entities for different data sources. You can have a look at my guide on how to configure two data sources in a Spring Boot application. Hope it will help!
Ok, I was lame. I had to:
setPersistenceUnitName in the LocalContainerEntityManagerFactoryBean instantiation method
I had to use the proper @PersistenceContext with the proper unitName
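As a sketch of those two fixes together (the unit name "h2" and the factory method shape are illustrative, not the poster's actual code):

```java
// 1. Name the persistence unit when building the factory:
@Bean
public LocalContainerEntityManagerFactoryBean h2EntityManagerFactory(DataSource h2DataSource) {
    LocalContainerEntityManagerFactoryBean bean = new LocalContainerEntityManagerFactoryBean();
    bean.setPersistenceUnitName("h2");
    bean.setDataSource(h2DataSource);
    bean.setPackagesToScan("foo");
    // setJpaVendorAdapter etc. as before
    return bean;
}

// 2. Reference the same unit name where the EntityManager is injected:
@PersistenceContext(unitName = "h2")
private EntityManager em;
```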
I am just starting to use Kafka with Spring Boot and want to send and consume JSON objects.
I am getting the following error when I attempt to consume a message from the Kafka topic:
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition dev.orders-0 at offset 9903. If needed, please seek past the record to continue consumption.
Caused by: java.lang.IllegalArgumentException: The class 'co.orders.feedme.feed.domain.OrderItem' is not in the trusted packages: [java.util, java.lang]. If you believe this class is safe to deserialize, please provide its name. If the serialization is only done by a trusted source, you can also enable trust all (*).
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.getClassIdType(DefaultJackson2JavaTypeMapper.java:139) ~[spring-kafka-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.kafka.support.converter.DefaultJackson2JavaTypeMapper.toJavaType(DefaultJackson2JavaTypeMapper.java:113) ~[spring-kafka-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:218) ~[spring-kafka-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.apache.kafka.clients.consumer.internals.Fetcher.parseRecord(Fetcher.java:923) ~[kafka-clients-1.0.1.jar:na]
at org.apache.kafka.clients.consumer.internals.Fetcher.access$2600(Fetcher.java:93) ~[kafka-clients-1.0.1.jar:na]
I have attempted to add my package to the list of trusted packages by defining the following property in application.properties:
spring.kafka.consumer.properties.spring.json.trusted.packages = co.orders.feedme.feed.domain
This doesn't appear to make any differences. What is the correct way to add my package to the list of trusted packages for Spring's Kafka JsonDeserializer?
Since you have the trusted package issue solved, for your next problem you could take advantage of the overloaded
DefaultKafkaConsumerFactory(Map<String, Object> configs,
Deserializer<K> keyDeserializer,
Deserializer<V> valueDeserializer)
and the JsonDeserializer "wrapper" of Spring Kafka:
JsonDeserializer(Class<T> targetType, ObjectMapper objectMapper)
Combining the above, for Java I have:
new DefaultKafkaConsumerFactory<>(properties,
        new IntegerDeserializer(),
        new JsonDeserializer<>(Foo.class,
                new ObjectMapper()
                        .registerModules(new KotlinModule(), new JavaTimeModule())
                        .setSerializationInclusion(JsonInclude.Include.NON_NULL)
                        .setDateFormat(new ISO8601DateFormat())
                        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)));
Essentially, you can tell the factory to use your own Deserializers and for the Json one, provide your own ObjectMapper. There you can register the Kotlin Module as well as customize date formats and other stuff.
Ok, I have read the documentation in a bit more detail and have found an answer to my question. I am using Kotlin, so the creation of my consumer looks like this:
@Bean
fun consumerFactory(): ConsumerFactory<String, FeedItem> {
    val configProps = HashMap<String, Any>()
    configProps[ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG] = bootstrapServers
    configProps[ConsumerConfig.GROUP_ID_CONFIG] = "feedme"
    configProps[ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG] = StringDeserializer::class.java
    configProps[ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG] = JsonDeserializer::class.java
    configProps[JsonDeserializer.TRUSTED_PACKAGES] = "co.orders.feedme.feed.domain"
    return DefaultKafkaConsumerFactory(configProps)
}
Now I just need a way to override the creation of the Jackson ObjectMapper in the JsonDeserializer so that it can work with my Kotlin data classes that don't have a zero-argument constructor :)
I am using the tomcat-jdbc pool in the default spring-boot setup. I would like to run some custom Java code each time a new JDBC connection is established in the pool, before it is used for the first time. How do I do that, and if there are several possibilities, which one is the best?
To extend the already accepted answer, you can use Spring AOP without full AspectJ if you use a pointcut like this one:
@AfterReturning(pointcut = "execution(* org.apache.tomcat.jdbc.pool.DataSourceProxy.getConnection())")
public void afterConnectionEstablished() {
    ...
}
Well, I can think of two options:
Create your own wrapper class, either by extending Tomcat's DataSource class or by implementing Java's DataSource interface and delegating to the wrapped DataSource. Add the logic you want to the desired methods, then register a bean in a @Configuration class by manually instantiating your tomcat-jdbc DataSource (for examples on how to do so, refer to the DataSourceConfiguration.Tomcat class) and wrapping it with your class.
Create an aspect and use Spring's AOP support to intercept calls to getConnection. Since the DataSource class is in the javax package, I think you'll have to use AspectJ; for some examples, refer to this link
My suggestion would be to go with the first option; it should give you fewer headaches. Here's a small example of how you'd define your wrapper bean:
@Bean
public DataSource dataSource(DataSourceProperties properties) {
    return new MyDataSourceWrapper(tomcatDataSourceFrom(properties));
}

private org.apache.tomcat.jdbc.pool.DataSource tomcatDataSourceFrom(DataSourceProperties properties) {
    // manual instantiation like in DataSourceConfiguration.Tomcat class
}
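And a sketch of what the delegating wrapper itself could look like. The class name and the init hook are illustrative; javax.sql.DataSource ships with the JDK, so nothing here is Tomcat-specific except what you put in the hook:

```java
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.logging.Logger;
import javax.sql.DataSource;

// Hypothetical delegating wrapper: runs custom logic each time the wrapped
// pool hands out a connection, before the caller gets to use it.
class MyDataSourceWrapper implements DataSource {

    private final DataSource delegate;

    MyDataSourceWrapper(DataSource delegate) {
        this.delegate = delegate;
    }

    // Hook: put your per-connection initialization here
    // (e.g. run a SET statement, log, register the connection).
    protected void onConnectionObtained(Connection connection) {
    }

    @Override
    public Connection getConnection() throws SQLException {
        Connection connection = delegate.getConnection();
        onConnectionObtained(connection);
        return connection;
    }

    @Override
    public Connection getConnection(String username, String password) throws SQLException {
        Connection connection = delegate.getConnection(username, password);
        onConnectionObtained(connection);
        return connection;
    }

    // The remaining DataSource methods simply delegate.
    @Override public PrintWriter getLogWriter() throws SQLException { return delegate.getLogWriter(); }
    @Override public void setLogWriter(PrintWriter out) throws SQLException { delegate.setLogWriter(out); }
    @Override public void setLoginTimeout(int seconds) throws SQLException { delegate.setLoginTimeout(seconds); }
    @Override public int getLoginTimeout() throws SQLException { return delegate.getLoginTimeout(); }
    @Override public Logger getParentLogger() throws SQLFeatureNotSupportedException { return delegate.getParentLogger(); }
    @Override public <T> T unwrap(Class<T> iface) throws SQLException { return delegate.unwrap(iface); }
    @Override public boolean isWrapperFor(Class<?> iface) throws SQLException { return delegate.isWrapperFor(iface); }
}
```

The pool never sees the wrapper; only consumers of the DataSource bean do, which is exactly where "before first use" matters.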
I have a Boot application, and in one of my facades I try to autowire the ConversionService like this:
@Autowired
private ConversionService conversionService;
as a result I get this:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.core.convert.ConversionService] is defined: expected single matching bean but found 3: mvcConversionService,defaultConversionService,integrationConversionService
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1061)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:949)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:533)
... 16 more
To overcome this, I have added a @Qualifier like this:
@Autowired
@Qualifier("mvcConversionService")
private ConversionService c;
and this all works. However, all my custom converters are automatically added to the mvcConversionService. Now I want to extend the ConversionService and add another method to it, but my converters are again added to the mvcConversionService. Is there a way to tell spring-boot which ConversionService to use so that my converters are registered there automatically? I don't want to manually add all the converters to the new ConversionService.
I was experiencing a similar issue. The problem seems to be that you need to define which conversion service you want to use. You can do it in XML or with a Spring Boot configuration.
I'm copying here part of answer to a very similar question (the one that worked for me) and marking this question as possible duplicate.
A look into the Spring documentation reveals that you should declare a ConversionService bean. In the XML configuration it would look like this:
<bean id="conversionService" class="org.springframework.context.support.ConversionServiceFactoryBean">
    <property name="converters">
        <set>
            <bean class="example.MyCustomConverter"/>
        </set>
    </property>
</bean>
If it's a list of converters you're after and they all implement the same interface, you can autowire in a list, e.g.:
@Service
public class MyService {

    @Autowired
    private List<MyConverter> myConverters;

    public void myMethod(MyRawThing thing) {
        myConverters.forEach(converter -> converter.doStuff(thing));
    }
}
and then iterate through them as required.
A nice feature of this is that as new implementations are added, you do not need to update the code that uses them.
Explanation
Spring will attempt to auto-create a ConversionService (in your case mvcConversionService) for the application. It has no qualifiers for which Converters will be registered, so if you create a component/bean that implements one of the following types, it will be auto-registered with the service:
Converter
GenericConverter
Printer
Parser
Solution #1
Ignore it. If there is no harm done by registering these converters to the mvcConversionService, then it might not be worth your time to work around this restriction. Your converter classes will likely go unused.
Solution #2
Reproduce your own interfaces. You can copy the interfaces for ConversionService and Converter and name them how you like. Sure, the functionality is intended to be the same, but this will prevent your Converter classes from being injected into any other ConversionService instances.
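A minimal sketch of what "your own interfaces" could look like (all names here are hypothetical; the point is only that Spring Boot's auto-registration ignores types it does not know):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical look-alikes of Spring's Converter / ConversionService.
// Because these are your own types, no framework auto-registration applies.
interface MyConverter<S, T> {
    T convert(S source);
}

class MyConversionService {
    // one converter per source type, for brevity
    private final Map<Class<?>, MyConverter<?, ?>> converters = new HashMap<>();

    <S, T> void register(Class<S> sourceType, MyConverter<S, T> converter) {
        converters.put(sourceType, converter);
    }

    @SuppressWarnings("unchecked")
    <S, T> T convert(S source, Class<T> targetType) {
        MyConverter<S, T> converter = (MyConverter<S, T>) converters.get(source.getClass());
        if (converter == null) {
            throw new IllegalArgumentException("No converter registered for " + source.getClass());
        }
        return converter.convert(source);
    }
}
```

You would then expose MyConversionService as a bean yourself and register only the converters you choose, keeping them out of mvcConversionService entirely.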
Bumped into this because I was having this issue with Spring Integration as well.
I'm not so satisfied with this solution, but it's a quick one.
@Configuration
public class ConversionServiceConfiguration {

    @Bean
    @Primary
    public ConversionService rewardBoxConversionService(@Qualifier("mvcConversionService") final ConversionService conversionService) {
        return conversionService;
    }
}
A problem will arise if you need another ConversionService implementation for some of your logic, but that can be solved by using the @Qualifier annotation.