In a Hibernate 4 / Spring 4 setup it was possible to generate DDL using a SchemaExport object:
LocalSessionFactoryBean sfb = (LocalSessionFactoryBean) context.getBean("&sessionFactory");
SchemaExport schema = new SchemaExport(sfb.getConfiguration());
But Hibernate 5 replaces the SchemaExport(Configuration configuration) constructor with SchemaExport(MetadataImplementor metadataImplementor).
MetadataImplementor is not readily available from
org.springframework.orm.hibernate5.LocalSessionFactoryBean or org.springframework.orm.hibernate5.LocalSessionFactoryBuilder.
I hacked it like this:
MetadataSources metadataSources = (MetadataSources) FieldUtils.readField(configuration, "metadataSources", true);
Metadata metadata = metadataSources
.getMetadataBuilder(configuration.getStandardServiceRegistryBuilder().build())
.applyPhysicalNamingStrategy(new MyPhysicialNamingStrategy())
.applyImplicitNamingStrategy(ImplicitNamingStrategyJpaCompliantImpl.INSTANCE)
.build();
MetadataImplementor metadataImpl = (MetadataImplementor) metadata;
SchemaExport schema = new SchemaExport(metadataImpl);
But it would be nice to have a better way. Also, Bean Validation annotations (@NotNull, @Size) are not used for DDL generation, and I don't know if this is a bug in Hibernate 5 or in this setup.
I am using Hibernate 5.0.0.CR4 and Spring 4.2.0.RELEASE.
You need to implement org.hibernate.integrator.spi.Integrator, where you can store the required data in some holder.
You can find a working example here: https://github.com/valery-barysok/spring4-hibernate5-stackoverflow-34612019
Register it as a service in the META-INF/services/org.hibernate.integrator.spi.Integrator file:
public class Integrator implements org.hibernate.integrator.spi.Integrator {
@Override
public void integrate(Metadata metadata, SessionFactoryImplementor sessionFactory, SessionFactoryServiceRegistry serviceRegistry) {
HibernateInfoHolder.setMetadata(metadata);
HibernateInfoHolder.setSessionFactory(sessionFactory);
HibernateInfoHolder.setServiceRegistry(serviceRegistry);
}
@Override
public void disintegrate(SessionFactoryImplementor sessionFactory, SessionFactoryServiceRegistry serviceRegistry) {
}
}
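The services file itself contains just the fully qualified class name of your implementation, one per line (the package below is only an example, use whatever package your Integrator lives in):
# content of META-INF/services/org.hibernate.integrator.spi.Integrator
com.example.hibernate.Integrator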
Use it
new SchemaExport((MetadataImplementor) HibernateInfoHolder.getMetadata()).create(true, true);
new SchemaUpdate(HibernateInfoHolder.getServiceRegistry(), (MetadataImplementor) HibernateInfoHolder.getMetadata()).execute(true, true);
You can find additional info here: Programmatic SchemaExport / SchemaUpdate with Hibernate 5 and Spring 4
The Java Persistence API follows the convention-over-configuration principle, but the Validation API is intended for validation purposes only. Validation is not absolute: you can put different validation rules on the same field.
For example, if you have
@Size(max = 50)
@NotNull(groups = DefaultGroup.class)
@Null(groups = SecondGroup.class)
private String shortTitle;
then it is interpreted as
@Size(max = 50)
@NotNull(groups = DefaultGroup.class)
@Null(groups = SecondGroup.class)
@Column(length = 255, nullable = true)
private String shortTitle;
See more details here:
Why does Hibernate Tools hbm2ddl generation not take into account Bean Validation annotations?
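If you do want the generated DDL to reflect such limits, the usual workaround is to repeat them as JPA column metadata, which schema generation always honours. A minimal sketch reusing the field above (the limits are just the ones from the example):
@Size(max = 50)
@NotNull
@Column(length = 50, nullable = false) // mirrors the validation rules so hbm2ddl picks them up
private String shortTitle;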
For Hibernate 5.2.7 (in my case) I wrote a method to export the schema based on a package scan:
static void exportSchema(
DataSource dataSource,
Class<? extends Dialect> dialect,
String... packagesToScan) {
// DATASOURCE and DIALECT are static imports of org.hibernate.cfg.AvailableSettings
StandardServiceRegistryBuilder registryBuilder = new StandardServiceRegistryBuilder()
.applySetting(DATASOURCE, dataSource)
.applySetting(DIALECT, dialect); // the dialect can be omitted
MetadataSources metadataSources = new MetadataSources(registryBuilder.build());
PathMatchingResourcePatternResolver resourceLoader = new PathMatchingResourcePatternResolver();
new LocalSessionFactoryBuilder(null, resourceLoader, metadataSources)
.scanPackages(packagesToScan);
Metadata metadata = metadataSources.buildMetadata();
new SchemaExport()
.setFormat(true)
// STDOUT and DATABASE are static imports of org.hibernate.tool.schema.TargetType
.create(EnumSet.of(STDOUT, DATABASE), metadata);
}
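Calling it might look like this (the dialect and package name are placeholders for your own setup):
exportSchema(dataSource, org.hibernate.dialect.H2Dialect.class, "com.example.domain");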
Related
I upgraded a project from Spring 4.3.9.RELEASE + Hibernate 4.3.11.Final to Spring Boot 2.1.4.RELEASE and Hibernate 5.3.9.Final. The queries still work fine, but I'm getting LazyInitializationException for some @OneToMany class members.
First I retrieve the object, which has a reference to a @OneToMany list, from the @Transactional service. The collection is returned to the controller, and from there it goes back to Spring to be serialized into JSON. The controller is annotated with @RestController, so it knows what to do.
In Spring 4.3.9.RELEASE + Hibernate 4.3.11.Final everything was fine, even though OpenEntityManagerInView wasn't enabled in the configuration and the collection wasn't loaded in EAGER mode. But in Spring Boot 2.1.4.RELEASE and Hibernate 5.3.9.Final the same thing doesn't work anymore. I've tried enabling OEMIV by setting spring.jpa.open-in-view=true, but even this doesn't seem to work, or it's being overridden somewhere.
If I enable EAGER loading mode for that collection, everything works fine.
pom.xml
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
@Entity
@JsonSerialize(using = TemplateSerializer.class)
public class Template implements Serializable {
@Id
@GeneratedValue
private Long id;
private String name;
@ManyToOne
private ObjFormat objFormat;
@OneToOne
@JoinColumn(name = "event_id")
@OnDelete(action = OnDeleteAction.CASCADE)
private Event event;
@OneToMany
@JoinColumn(name = "category_id")
private List<Category> linkToCategories;
The problem is caused by the field linkToCategories. If I configure @OneToMany(fetch = FetchType.EAGER), everything works fine.
Application configuration:
@Bean
public LocalSessionFactoryBean sessionFactory(DataSource dataSource) throws ClassNotFoundException {
LocalSessionFactoryBean localSessionFactoryBean = new LocalSessionFactoryBean();
localSessionFactoryBean.setDataSource(dataSource);
localSessionFactoryBean.setPackagesToScan("com.project.backend.model",
"com.project.backend.hibernate.converters");
return localSessionFactoryBean;
}
@Bean
public HibernateTransactionManager transactionManager(SessionFactory sessionFactory) {
return new HibernateTransactionManager(sessionFactory);
}
Later edit:
After a lot of debugging, the difference between the old and the new Hibernate functionality is in the HibernateTransactionManager. In the method doGetTransaction(), in Hibernate 4 it finds the SessionHolder object when calling
TransactionSynchronizationManager.getResource(getSessionFactory())
while in Hibernate 5 it doesn't.
SessionHolder sessionHolder =
(SessionHolder) TransactionSynchronizationManager.getResource(getSessionFactory());
if (sessionHolder != null) {
if (logger.isDebugEnabled()) {
logger.debug("Found thread-bound Session [" + sessionHolder.getSession() + "] for Hibernate transaction");
}
txObject.setSessionHolder(sessionHolder);
}
else if (this.hibernateManagedSession) {
try {
Session session = this.sessionFactory.getCurrentSession();
if (logger.isDebugEnabled()) {
logger.debug("Found Hibernate-managed Session [" + session + "] for Spring-managed transaction");
}
txObject.setExistingSession(session);
}
catch (HibernateException ex) {
throw new DataAccessResourceFailureException(
"Could not obtain Hibernate-managed Session for Spring-managed transaction", ex);
}
}
In the method doBegin, a new session is created and set on the txObject for every request.
if (txObject.getSessionHolder() == null || txObject.getSessionHolder().isSynchronizedWithTransaction()) {
Interceptor entityInterceptor = getEntityInterceptor();
Session newSession = (entityInterceptor != null ?
getSessionFactory().withOptions().interceptor(entityInterceptor).openSession() :
getSessionFactory().openSession());
if (logger.isDebugEnabled()) {
logger.debug("Opened new Session [" + newSession + "] for Hibernate transaction");
}
txObject.setSession(newSession);
}
My experience with Hibernate is fairly limited, so I'm stuck here. It's probably a configuration thing, but I can't find it.
As M. Deinum was saying, the Spring 4.3.9.RELEASE + Hibernate 4.3.11.Final configuration was loading OpenSessionInViewFilter, which explains why all the queries were going through successfully. After configuring the same filter in Spring Boot, everything is back to normal. Add the following bean to register the filter:
@Bean
public FilterRegistrationBean<OpenSessionInViewFilter> registerOpenSessionInViewFilterBean() {
FilterRegistrationBean<OpenSessionInViewFilter> registrationBean = new FilterRegistrationBean<>();
OpenSessionInViewFilter filter = new OpenSessionInViewFilter(); // org.springframework.orm.hibernate5.support.OpenSessionInViewFilter
registrationBean.setFilter(filter);
return registrationBean;
}
The next step is to replace plain Hibernate with JPA, and OpenSessionInViewFilter with OpenEntityManagerInViewFilter.
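For reference, the JPA counterpart can be registered the same way once that migration is done; a sketch, assuming the filter class org.springframework.orm.jpa.support.OpenEntityManagerInViewFilter and an EntityManagerFactory in the context:
@Bean
public FilterRegistrationBean<OpenEntityManagerInViewFilter> registerOpenEntityManagerInViewFilterBean() {
    FilterRegistrationBean<OpenEntityManagerInViewFilter> registrationBean = new FilterRegistrationBean<>();
    registrationBean.setFilter(new OpenEntityManagerInViewFilter());
    return registrationBean;
}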
Thanks M. Deinum.
The @xxxToMany annotations indicate that the fetch type is LAZY by default, which means you need to initialize the collection your entity refers to.
E.g.:
@Entity
public class Book {
@OneToMany
public List<Author> authors;
}
There are a few ways to resolve this. You can modify the @OneToMany annotation:
@OneToMany(fetch = FetchType.EAGER)
Or create a method where you initialize the authors, e.g.:
public void initializeAuthors(Book book) {
Book b = em.find(Book.class, book.getId());
List<Author> authors = new ArrayList<>(b.getAuthors());
book.setAuthors(authors);
}
If you have @NamedQueries on your entities, you can also do it by adding a LEFT JOIN FETCH on your collections, as in the sketch below.
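For the Book example above, a fetching named query might look like this (the query name and parameter are illustrative):
@NamedQuery(name = "Book.findByIdWithAuthors",
    query = "SELECT b FROM Book b LEFT JOIN FETCH b.authors WHERE b.id = :id")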
My Spring Boot 1.3.1 based application relies on an Oracle 11.2 database and I want to tune the fetching of SELECT statement results.
JdbcTemplate offers public void setFetchSize(int fetchSize) to tune the fetch size, which for Oracle is preset to 10 by the driver:
Set the fetch size for this JdbcTemplate. This is important for
processing large result sets: Setting this higher than the default
value will increase processing speed at the cost of memory
consumption; setting this lower can avoid transferring row data that
will never be read by the application. Default is -1, indicating to
use the JDBC driver's default (i.e. to not pass a specific fetch size
setting on the driver).
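Per template, that tuning is just a setter call, for example (the value 100 is arbitrary):
JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
jdbcTemplate.setFetchSize(100); // overrides the Oracle driver's default of 10 for this template only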
The Oracle JDBC driver (I use ojdbc7.jar because it is backward compatible) offers a defaultRowPrefetch parameter to increase the fetch size for the whole database connection.
According to the docs this parameter could be set this way:
java.util.Properties info = new java.util.Properties();
info.put ("user", "scott");
info.put ("password","tiger");
info.put ("defaultRowPrefetch","15");
getConnection ("jdbc:oracle:oci8:#",info);
But my application is configured using application.yml:
datasource:
  url: jdbc:oracle:thin:@xyz:1521:abc
  username: ${name}
  password: ${password}
  driver-class-name: oracle.jdbc.driver.OracleDriver
  ...
And even if I wanted to change that configuration to use spring.datasource.url=jdbc:... instead, there is no way to set the fetch size globally, according to this post.
Is there a more "Spring Boot"-style approach, or do I need to configure each template manually?
A BeanPostProcessor processes all the beans in the ApplicationContext, so you can add additional configuration to a bean, or even replace it entirely if you like.
You could create a BeanPostProcessor that adds the properties to the configured DataSource. The sample below assumes the use of commons-dbcp 1 or 2; if you use a different DataSource, modify accordingly.
public class DataSourceConfiguringBeanPostProcessor implements BeanPostProcessor {
private final Map<String, String> properties = new HashMap<>();
public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
if (bean instanceof BasicDataSource) {
for (Map.Entry<String, String> prop : properties.entrySet()) {
((BasicDataSource) bean).addConnectionProperty(prop.getKey(), prop.getValue());
}
}
return bean;
}
public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
return bean;
}
public void setProperties(Map<String, String> properties) {
this.properties.putAll(properties);
}
}
Now you can add this to your configuration and it will add the properties to DataSource beans.
@Bean
public BeanPostProcessor dataSourcePostProcessor() {
DataSourceConfiguringBeanPostProcessor processor = new DataSourceConfiguringBeanPostProcessor();
Map<String, String> properties = new HashMap<>();
properties.put("defaultRowPrefetch", "15");
properties.put("defaultBatchValue", "25");
processor.setProperties(properties);
return processor;
}
That should do the trick for configuring the datasource.
We currently have an application which uses multiple databases with the same schema. At the moment we're using a custom solution for switching between them based on the user's session. This works via
public final class DataSourceProxy extends BasicDataSource {
...
Authentication auth = SecurityContextHolder.getContext().getAuthentication();
if (auth != null && auth.getDetails() instanceof Map) {
Map<String, String> details = (Map<String, String>) auth.getDetails();
String targetUrl = details.get("database");
Connection c = super.getConnection();
Statement s = c.createStatement();
s.execute("USE " + targetUrl + ";");
s.close();
return c;
} else {
return super.getConnection();
}
}
Now we want to build a solution using AbstractRoutingDataSource. The problem is:
@Component
public class CustomRoutingDataSource extends AbstractRoutingDataSource {
@Autowired
Environment env;
@Autowired
DbDetailsRepository repo;
public CustomRoutingDataSource() {
Map<Object, Object> targetDataSources = new HashMap<Object, Object>();
for (DBDetails dbd : repo.findAll()) {
// create DataSource and put it into the map
}
setTargetDataSources(targetDataSources);
}
@Override
protected Object determineCurrentLookupKey() {
Authentication auth = SecurityContextHolder.getContext().getAuthentication();
if (auth != null && auth.getDetails() instanceof Map) {
Map<String, String> details = (Map<String, String>) auth.getDetails();
return details.get("database");
}
return null;
}
}
Inside the constructor (or even via @PostConstruct) we have to fill the targetDataSources map. But for this we need the connection details, which are stored in another database that has its own DataSource and entity manager.
It seems like Spring can't determine the order of bean construction, or maybe I'm just missing something. It always gives a NullPointerException when accessing the repository (which, by the way, is a JpaRepository).
We're using Spring 3.2.3, Spring Data, and Hibernate 4.2, with complete annotation and Java-based configuration of Spring and Spring Security.
Please help us!
Spring of course has to call the constructor before it can populate the properties. But that's not a Spring thing, that's basic Java, and one of the many downsides of using field injection.
To avoid this, simply add your dependencies to the constructor:
@Component
class CustomRoutingDataSource extends AbstractRoutingDataSource {
@Autowired
public CustomRoutingDataSource(DbDetailsRepository repo, Environment environment) {
…
}
…
}
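A fuller sketch of what that can look like. The DBDetails accessors (getName(), getUrl(), getUsername(), getPassword()) are assumptions made only to keep the example compilable, and DriverManagerDataSource is used for brevity; a pooled DataSource would be used in practice:
import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.datasource.DriverManagerDataSource;
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
import org.springframework.security.core.Authentication;
import org.springframework.security.core.context.SecurityContextHolder;
import org.springframework.stereotype.Component;

@Component
public class CustomRoutingDataSource extends AbstractRoutingDataSource {

    @Autowired
    public CustomRoutingDataSource(DbDetailsRepository repo) {
        Map<Object, Object> targetDataSources = new HashMap<>();
        for (DBDetails dbd : repo.findAll()) {
            // one entry per configured database; a pooled DataSource is preferable in real code
            targetDataSources.put(dbd.getName(),
                    new DriverManagerDataSource(dbd.getUrl(), dbd.getUsername(), dbd.getPassword()));
        }
        setTargetDataSources(targetDataSources);
        // optionally: setDefaultTargetDataSource(...) for requests without a lookup key
    }

    @Override
    @SuppressWarnings("unchecked")
    protected Object determineCurrentLookupKey() {
        // unchanged from the question: resolve the key from the security context
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        if (auth != null && auth.getDetails() instanceof Map) {
            return ((Map<String, String>) auth.getDetails()).get("database");
        }
        return null;
    }
}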
I want to search for some annotations in a Spring-based web application, like @Entity. I need the same functionality Spring uses when the server starts up and it looks for all classes annotated with @Component. In my case I don't create singletons; it's just important for me to collect all the classes annotated with @Entity.
Is there any way to use existing Spring tools for that? I want to search exactly the same namespace Spring searches for the @Component annotations.
Sure, look at the parse() method in org.springframework.context.annotation.ComponentScanBeanDefinitionParser. This method is called when Spring encounters <context:component-scan/> in the XML configuration. You can probably strip it down a bit to better suit your needs, but it should serve as a comprehensive example.
The class you should be particularly interested in is org.springframework.context.annotation.ClassPathBeanDefinitionScanner. From the JavaDoc:
Candidate classes are detected through configurable type filters. The default filters include classes that are annotated with Spring's #Component, #Repository, #Service, or #Controller stereotype.
BTW, if you need a less general solution, maybe your persistence provider has an API to fetch all entity classes?
Spring's built-in classpath scanning infrastructure (ClassPathBeanDefinitionScanner / ComponentScanBeanDefinitionParser) is geared toward registering classes as BeanDefinitions within a Spring application context.
If you're just looking to obtain a list of classes annotated with a given annotation (rather than actually registering them in Spring as bean definitions), take a look at the Google Reflections library.
Reflections allows you to scan your classpath using various filters, including an annotation filter.
Reflections reflections = new Reflections("my.project.prefix");
Set<Class<? extends SomeClassOrInterface>> subTypes = reflections.getSubTypesOf(SomeClassOrInterface.class);
Set<Class<?>> annotated = reflections.getTypesAnnotatedWith(SomeAnnotation.class);
Spring-based solution
Use Spring's AnnotationTypeFilter and pass Entity.class as the annotation type
Using a ResourcePatternResolver, load all resources (.class files) under the given packages
Use SimpleMetadataReaderFactory to get a MetadataReader
For each resource, call match on the AnnotationTypeFilter using the MetadataReader
metadataReader.getAnnotationMetadata().getClassName() will provide the fully qualified name of the class
Usage:
AnnotatedClassFinder entityScanner = new AnnotatedClassFinder(Entity.class);
entityScanner.setPackages(Arrays.asList("org.myapp.domain"));
Collection<Class<?>> entities = entityScanner.findMarkedClassOfType();
public class AnnotatedClassFinder {
private static final String CLASS_RESOURCE_PATTERN = "**/*.class";
private List<String> packages;
private final ResourceLoader resourceLoader = new DefaultResourceLoader();
private final ResourcePatternResolver resourcePatternResolver = ResourcePatternUtils
.getResourcePatternResolver(resourceLoader);
private final MetadataReaderFactory metadataReaderFactory = new SimpleMetadataReaderFactory();
private final TypeFilter annotationFilter;
public AnnotatedClassFinder(final Class<? extends Annotation> annotationToScanFor) {
annotationFilter = new AnnotationTypeFilter(annotationToScanFor);
}
public Set<Class<?>> findMarkedClassOfType() {
if (packages == null) {
return new HashSet<Class<?>>();
}
final Set<Class<?>> annotatedClasses = new HashSet<Class<?>>();
try {
for (final String p : packages) {
final String packageSearchPath = ResourcePatternResolver.CLASSPATH_ALL_URL_PREFIX
+ ClassUtils.convertClassNameToResourcePath(SystemPropertyUtils.resolvePlaceholders(p)) + "/"
+ CLASS_RESOURCE_PATTERN;
final Resource[] resources = resourcePatternResolver.getResources(packageSearchPath);
for (final Resource resource : resources) {
if (resource.isReadable()) {
final MetadataReader metadataReader = this.metadataReaderFactory.getMetadataReader(resource);
if (annotationFilter.match(metadataReader, metadataReaderFactory)) {
annotatedClasses.add(Class.forName(metadataReader.getAnnotationMetadata().getClassName()));
}
}
}
}
return annotatedClasses;
} catch (final IOException ex) {
throw new RuntimeException("I/O failure during classpath scanning", ex);
} catch (final ClassNotFoundException ex) {
throw new RuntimeException("Class loading failure during classpath scanning", ex);
}
}
public void setPackages(final List<String> packages) {
this.packages = packages;
}
}
I am using a properties file to store some configuration properties, which are accessed this way:
#Value("#{configuration.path_file}")
private String pathFile;
Is it possible (with Spring 3) to use the same @Value annotation, but load the properties from a database instead of a file?
Assuming you have a table in your database storing key/value pairs:
Define a new bean "applicationProperties"; pseudo-code follows...
public class ApplicationProperties {
@Autowired
private DataSource datasource;
public String getPropertyValue(String key) {
// transact on your datasource here to fetch value for key
// SNIPPED
}
}
Inject this bean where required in your application. If you already have a DAO/service layer, then you would just make use of that. A usage sketch follows.
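If you want to keep the @Value style from the question, SpEL can call this bean directly (the bean name and key are just the ones used in this example):
@Value("#{applicationProperties.getPropertyValue('path_file')}")
private String pathFile;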
Yes, you can keep your @Value annotation and use the database source with the help of an EnvironmentPostProcessor.
As of Spring Boot 1.3, we're able to use the EnvironmentPostProcessor to customize the application's Environment before the application context is refreshed.
For example, create a class which implements EnvironmentPostProcessor:
public class ReadDbPropertiesPostProcessor implements EnvironmentPostProcessor {
private static final String PROPERTY_SOURCE_NAME = "databaseProperties";
private String[] CONFIGS = {
"app.version"
// list your properties here
};
@Override
public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) {
Map<String, Object> propertySource = new HashMap<>();
try {
// the following db connections properties must be defined in application.properties
DataSource ds = DataSourceBuilder
.create()
.username(environment.getProperty("spring.datasource.username"))
.password(environment.getProperty("spring.datasource.password"))
.url(environment.getProperty("spring.datasource.url"))
.driverClassName("com.mysql.jdbc.Driver")
.build();
try (Connection connection = ds.getConnection();
// suppose you have a config table storing the properties name/value pair
PreparedStatement preparedStatement = connection.prepareStatement("SELECT value FROM config WHERE name = ?")) {
for (int i = 0; i < CONFIGS.length; i++) {
String configName = CONFIGS[i];
preparedStatement.setString(1, configName);
ResultSet rs = preparedStatement.executeQuery();
while (rs.next()) {
propertySource.put(configName, rs.getString("value"));
}
// rs.close();
preparedStatement.clearParameters();
}
}
environment.getPropertySources().addFirst(new MapPropertySource(PROPERTY_SOURCE_NAME, propertySource));
} catch (Throwable e) {
throw new RuntimeException(e);
}
}
}
Finally, don't forget to register the post-processor in META-INF/spring.factories, under the org.springframework.boot.env.EnvironmentPostProcessor key, using your class's fully qualified name. An example:
org.springframework.boot.env.EnvironmentPostProcessor=\
com.baeldung.environmentpostprocessor.ReadDbPropertiesPostProcessor
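Once the post-processor has run, the database-backed values resolve like any other property. For instance, for the app.version key listed in CONFIGS above:
@Value("${app.version}")
private String appVersion;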
Although I haven't used Spring 3, I'd assume you can, if you make a bean that reads the properties from the database and exposes them with getters.