I want to upgrade my Hibernate version from 4.x.x to 5.x.x, but after upgrading I cannot find the getTableMappings method in Hibernate 5's Configuration class. I need it before building the SessionFactory, and it was available in earlier Hibernate versions. What is the right solution for this?
I guess you can achieve what you want only by moving away from the legacy bootstrapping:
Configuration is semi-deprecated but still available for use, in a limited form that eliminates these drawbacks. "Under the covers", Configuration uses the new bootstrapping code, so the things available there are also available here in terms of auto-discovery.
As described here, you can bootstrap a Hibernate SessionFactory in the following way:
import org.hibernate.SessionFactory;
import org.hibernate.boot.Metadata;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.mapping.PersistentClass;
// ...
StandardServiceRegistry serviceRegistry = new StandardServiceRegistryBuilder()
        .configure("hibernate.cfg.xml")
        .build();
MetadataSources metadataSources = new MetadataSources(serviceRegistry);
Metadata meta = metadataSources.buildMetadata();
// Retrieves the PersistentClass entity metadata representation for all known entities.
for (PersistentClass pClass : meta.getEntityBindings()) {
    // ...
}
SessionFactory sessionFactory = meta.buildSessionFactory();
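Since the original question was about getTableMappings(), here is a hedged sketch of how equivalent table information could be gathered from the Metadata built above; getEntityName() and getTable() are standard PersistentClass accessors, and tablesByEntity is just an illustrative name:
import java.util.LinkedHashMap;
import java.util.Map;
import org.hibernate.mapping.Table;
// ...
// Rough stand-in for the old Configuration#getTableMappings(): collect the
// mapped table of every entity binding (sketch only, adjust to your needs).
Map<String, Table> tablesByEntity = new LinkedHashMap<>();
for (PersistentClass pClass : meta.getEntityBindings()) {
    tablesByEntity.put(pClass.getEntityName(), pClass.getTable());
}
// Recent 5.x versions also expose meta.collectTableMappings(), if yours has it.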
I'm upgrading an application from Spring Boot 1.5 to 2.0 and my Spring Data JPA repositories are broken. I turned on query logging in my PostgreSQL 9.6 database to see what was different in the queries before and after the application upgrade, and observed that, as of 2.0, query parameters are being wrapped in double quotes, which is unnecessary and breaks the queries. Here's what I'm seeing in the query log:
Spring Boot 1.5.22
LOG: execute <unnamed>: select siteentity0_.site_id as site_id1_14_, siteentity0_.description as descript2_14_, siteentity0_.directory as director3_14_, siteentity0_.ip_address as ip_addre4_14_, siteentity0_.name as name5_14_, siteentity0_.server as server6_14_, siteentity0_.status as status7_14_, siteentity0_.type as type8_14_ from site siteentity0_ where siteentity0_.ip_address=$1
DETAIL: parameters: $1 = '127.0.0.1'
Spring Boot 2.0.9
LOG: execute <unnamed>: select siteentity0_.site_id as site_id1_14_, siteentity0_.description as descript2_14_, siteentity0_.directory as director3_14_, siteentity0_.ip_address as ip_addre4_14_, siteentity0_.name as name5_14_, siteentity0_.server as server6_14_, siteentity0_.status as status7_14_, siteentity0_.type as type8_14_ from site siteentity0_ where siteentity0_.ip_address=$1
DETAIL: parameters: $1 = '"127.0.0.1"'
I've checked all the release notes and migration guide and can't find anything that would explain this, nor can I find any similar reports. Any ideas?
EDIT:
The repository:
import java.util.Collection;
import org.springframework.data.jpa.repository.JpaRepository;
public interface SiteRepository extends JpaRepository<SiteEntity, Integer> {
SiteEntity findByName(String siteName);
Collection<SiteEntity> findByIpAddress(String ipAddress);
Collection<SiteEntity> findByStatus(String status);
Collection<SiteEntity> findByType(String type);
}
As a workaround, you can strip the double quotes from the parameter (replace them with an empty string) before it reaches the query.
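A minimal sketch of that workaround, assuming siteRepository is the injected SiteRepository and ipAddress is the raw incoming value:
// Strip any double quotes from the parameter before handing it to the repository.
String cleanedIp = ipAddress.replace("\"", "");
Collection<SiteEntity> sites = siteRepository.findByIpAddress(cleanedIp);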
I figured it out. I had an attribute converter for converting JSON to string with autoApply set to true. I changed it to false and now it works. I suppose autoApply is more permissive in 2.x.
@Converter(autoApply = false)
public class JpaConverterJson implements AttributeConverter<Object, String> {
    // convertToDatabaseColumn / convertToEntityAttribute implementations as before
}
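With autoApply switched off, the converter then has to be applied explicitly on the attributes that actually hold JSON. A minimal sketch, assuming a hypothetical SomeEntity with a payload column:
import javax.persistence.Column;
import javax.persistence.Convert;
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class SomeEntity {

    @Id
    private Integer id;

    // Apply the JSON converter only where it is actually wanted.
    @Convert(converter = JpaConverterJson.class)
    @Column(name = "payload")
    private Object payload;

    // getters/setters omitted
}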
I'm trying to set up a project with two data sources, one is MongoDB and the other is Postgres. I have repositories for each data source in different packages and I annotated my main class as follows:
@Import({MongoDBConfiguration.class, PostgresDBConfiguration.class})
@SpringBootApplication(exclude = {
        MongoRepositoriesAutoConfiguration.class,
        JpaRepositoriesAutoConfiguration.class
})
public class TemporaryRunner implements CommandLineRunner {
    ...
}
MongoDBConfiguration:
@Configuration
@EnableMongoRepositories(basePackages = {
        "com.example.datastore.mongo",
        "com.atlassian.connect.spring"})
public class MongoDBConfiguration {
    ...
}
PostgresDBConfiguration:
@Configuration
@EnableJpaRepositories(basePackages = {
        "com.example.datastore.postgres"
})
public class PostgresDBConfiguration {
    ...
}
And even though I specified the base packages as described in the documentation, I still get these messages in the console:
13:10:44.238 [main] [] INFO o.s.d.r.c.RepositoryConfigurationDelegate - Multiple Spring Data modules found, entering strict repository configuration mode!
13:10:44.266 [main] [] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data MongoDB - Could not safely identify store assignment for repository candidate interface com.atlassian.connect.spring.AtlassianHostRepository.
I managed to solve this issue for all my own repositories by using MongoRepository and JpaRepository, but AtlassianHostRepository comes from an external lib and is a regular CrudRepository (which totally makes sense, because the consumer of the lib can decide what type of DB to use). Anyway, it looks like the basePackages I specified are completely ignored: even though I listed the com.atlassian.connect.spring package only in @EnableMongoRepositories, Spring Data somehow can't figure out which data module should be used.
Am I doing something wrong? Is there any other way I could tell Spring Data to use Mongo for AtlassianHostRepository without changing the AtlassianHostRepository class itself?
The only working solution I found was to let Spring Data ignore AtlassianHostRepository (because it couldn't figure out which data source to use) and then create a separate configuration for it, building the repository by hand:
@Configuration
@Import({MongoDBConfiguration.class})
public class AtlassianHostRepositoryConfiguration {

    private final MongoTemplate mongoTemplate;

    @Autowired
    public AtlassianHostRepositoryConfiguration(final MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Bean
    public AtlassianHostRepository atlassianHostRepository() {
        RepositoryFactorySupport factory = new MongoRepositoryFactory(mongoTemplate);
        return factory.getRepository(AtlassianHostRepository.class);
    }
}
This solution works fine for a small number of repositories coming from a library; it would be rather cumbersome to create all of them by hand when there are more. After reading the spring-data source code, though, I see no way to make it work with basePackages as stated in the documentation (I may be wrong).
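If several such external repositories had to be wired by hand, the same factory could at least be shared between them. A sketch, placed inside a @Configuration class like the one above, where SomeOtherRepository stands in for a hypothetical second external repository interface:
@Bean
public MongoRepositoryFactory mongoRepositoryFactory(MongoTemplate mongoTemplate) {
    // One factory bean, reused by every hand-built repository below.
    return new MongoRepositoryFactory(mongoTemplate);
}

@Bean
public AtlassianHostRepository atlassianHostRepository(MongoRepositoryFactory factory) {
    return factory.getRepository(AtlassianHostRepository.class);
}

@Bean
public SomeOtherRepository someOtherRepository(MongoRepositoryFactory factory) {
    // SomeOtherRepository is a hypothetical second repository from the library.
    return factory.getRepository(SomeOtherRepository.class);
}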
We use Flyway for DB migration in our Spring Boot based app, and now we have a requirement to introduce multi-tenancy support using a multiple-data-sources strategy. As part of that we also need to support migration of multiple data sources. All data sources share the same structure, so the same migration scripts should be used for all of them. Migrations should also occur at application startup (as opposed to build time, where the Maven plugin can apparently be configured to migrate multiple data sources). What is the best approach to achieve this? The app already has data source beans defined, but Flyway executes the migration only for the primary data source.
To make @Roger Thomas's answer more the Spring Boot way:
The easiest solution is to annotate your primary data source with @Primary (which you already did) and just let Spring Boot migrate your primary data source the 'normal' way.
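For reference, a minimal sketch of such a primary data source bean; the property prefix and the DataSourceBuilder import location vary by Boot version, so treat the details as assumptions:
import javax.sql.DataSource;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder; // org.springframework.boot.autoconfigure.jdbc in Boot 1.x
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class PrimaryDataSourceConfiguration {

    // Boot's Flyway auto-configuration migrates this @Primary data source on startup.
    @Primary
    @Bean
    @ConfigurationProperties("spring.datasource")
    public DataSource primaryDataSource() {
        return DataSourceBuilder.create().build();
    }
}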
For the other data sources, migrate them by hand:
import javax.annotation.PostConstruct;
import javax.sql.DataSource;
import org.flywaydb.core.Flyway;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywaySlaveInitializer {

    // In a real setup you would typically add @Qualifier to pick each data source.
    @Autowired private DataSource dataSource2;
    @Autowired private DataSource dataSource3;
    // other data sources

    @PostConstruct
    public void migrateFlyway() {
        Flyway flyway = new Flyway();
        // if the default config is not sufficient, call setters here
        // source 2
        flyway.setDataSource(dataSource2);
        flyway.setLocations("db/migration_source_2");
        flyway.migrate();
        // source 3
        flyway.setDataSource(dataSource3);
        flyway.setLocations("db/migration_source_3");
        flyway.migrate();
    }
}
Flyway supports migrations coded within Java and so you can start Flyway during your application startup.
https://flywaydb.org/documentation/migration/java
I am not sure how you would configure Flyway to target a number of data sources via its config files. My own development is based around using Java to call Flyway once per data source I need to work against. Spring Boot supports the autowiring of beans marked as @FlywayDataSource, but I have not looked into how this could be used.
For an in-Java solution the code can be as simple as:
Flyway flyway = new Flyway();
// Set the data source
flyway.setDataSource(dataSource);
// Where to search for classes to be executed or SQL scripts to be found
flyway.setLocations("net.somewhere.flyway");
flyway.setTarget(MigrationVersion.LATEST);
flyway.migrate();
Having the same problem, I looked into the spring-boot-autoconfigure artifact for version 2.2.4, in the org.springframework.boot.autoconfigure.flyway package, and found the FlywayDataSource annotation.
Annotating ANY datasource you want to be used by Flyway should do the trick.
Something like this:
@FlywayDataSource
@Bean(name = "someDatasource")
public DataSource someDatasource(...) {
    // build and return your data source
}
I found an easy solution for that: I added the migration step during the creation of my EntityManagerFactory:
@Qualifier(EMF2)
@Bean(name = EMF2)
public LocalContainerEntityManagerFactoryBean entityManagerFactory2(
        final EntityManagerFactoryBuilder builder
) {
    final DataSource dataSource = dataSource2();
    Flyway.configure()
            .dataSource(dataSource)
            .locations("db/migration/ds2")
            .load()
            .migrate();
    return builder
            .dataSource(dataSource)
            .packages(Role.class)
            .properties(jpaProperties2().getProperties())
            .persistenceUnit("domain2")
            .build();
}
I disabled spring.flyway.enabled for that.
SQL files live in resources/db/migration/ds1/... and resources/db/migration/ds2/...
This worked for me.
import javax.annotation.PostConstruct;
import org.flywaydb.core.Flyway;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywaySlaveInitializer {

    @Value("${firstDatasource.db.url}")
    String firstDatasourceUrl;
    @Value("${firstDatasource.db.user}")
    String firstDatasourceUser;
    @Value("${firstDatasource.db.password}")
    String firstDatasourcePassword;

    @Value("${secondDatasource.db.url}")
    String secondDatasourceUrl;
    @Value("${secondDatasource.db.user}")
    String secondDatasourceUser;
    @Value("${secondDatasource.db.password}")
    String secondDatasourcePassword;

    @PostConstruct
    public void migrateFlyway() {
        Flyway flywayIntegration = Flyway.configure()
                .dataSource(firstDatasourceUrl, firstDatasourceUser, firstDatasourcePassword)
                .locations("filesystem:./src/main/resources/migration.first")
                .load();
        Flyway flywayPhenom = Flyway.configure()
                .dataSource(secondDatasourceUrl, secondDatasourceUser, secondDatasourcePassword)
                .locations("filesystem:./src/main/resources/migration.second")
                .load();
        flywayIntegration.migrate();
        flywayPhenom.migrate();
    }
}
And in my application.yml this property:
spring:
  flyway:
    enabled: false
I'm creating a Spring Starter project and need to get all classes that are marked with a custom annotation. The annotated classes are not Spring beans.
My current solution is to use the ClassPathScanningCandidateComponentProvider to find the required classes.
ClassPathScanningCandidateComponentProvider scanner =
        new ClassPathScanningCandidateComponentProvider(false);
scanner.addIncludeFilter(new AnnotationTypeFilter(CustomAnnotation.class));
candidates = scanner.findCandidateComponents("THE MISSING PACKAGE NAME");
The problem is that I currently provide an empty package String, so all packages and classes are scanned, which slows startup down.
I need to access the packages that are scanned by Spring, to avoid scanning all packages and classes.
Is there a way to programmatically retrieve all packages that Spring scans, or is there an alternative way to retrieve custom-annotated classes that are not Spring beans?
Greets
One solution without the need to make a full classpath scan is to use the AutowiredAnnotationBeanPostProcessor:
private List<Class<?>> candidates = new ArrayList<>();

@Override
public Object postProcessBeforeInstantiation(Class<?> beanClass, String beanName) throws BeansException {
    if (beanClass.isAnnotationPresent(YourAnnotation.class)) {
        candidates.add(beanClass);
        System.out.println(beanClass);
        return new Object();
    }
    return null;
}

@Bean
public CandidateHolder candidates() {
    return new CandidateHolder(candidates);
}
You check whether the bean class that is about to be instantiated has the required annotation. If it does, you add the class to a property so you can expose it later as a bean. Instead of returning null you have to return a new Object instance; the returned object could be used to wrap the class in a proxy, but since I don't need an instance I simply return a plain new Object. It is maybe a dirty hack, but it works.
I have to use this kind of hack because instantiating the needed object directly would result in a runtime error, since it has to be instantiated by the framework I use.
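CandidateHolder is not shown in the answer; a minimal sketch of such a holder, assuming it only needs to expose the collected classes, could look like this:
import java.util.Collections;
import java.util.List;

public class CandidateHolder {

    private final List<Class<?>> candidates;

    public CandidateHolder(List<Class<?>> candidates) {
        this.candidates = Collections.unmodifiableList(candidates);
    }

    // Expose the classes found with the custom annotation.
    public List<Class<?>> getCandidates() {
        return candidates;
    }
}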
How can I refresh Spring configuration file without restarting my servlet container?
I am looking for a solution other than JRebel.
For those stumbling on this more recently: the current, modern way to solve this problem is to use Spring Cloud Config.
Just add the @RefreshScope annotation to your refreshable beans and @EnableConfigServer to your main/configuration class.
So, for example, this Controller class:
@RefreshScope
@RestController
class MessageRestController {

    @Value("${message}")
    private String message;

    @RequestMapping("/message")
    String getMessage() {
        return this.message;
    }
}
It will return the new value of your message String property for the /message endpoint when a refresh is invoked through Spring Boot Actuator (via its HTTP endpoint or JMX).
See the official Spring Guide for Centralized Configuration example for more implementation details.
Well, it can be useful to perform such a context reload while testing your app.
You can try the refresh method of one of the AbstractRefreshableApplicationContext subclasses: it won't refresh your previously instantiated beans, but the next calls on the context will return refreshed beans.
import java.io.File;
import java.io.IOException;
import org.apache.commons.io.FileUtils;
import org.springframework.context.support.FileSystemXmlApplicationContext;

public class ReloadSpringContext {

    final static String header = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" +
            "<!DOCTYPE beans PUBLIC \"-//SPRING//DTD BEAN//EN\"\n" +
            " \t\"http://www.springframework.org/dtd/spring-beans.dtd\">\n";

    final static String contextA =
            "<beans><bean id=\"test\" class=\"java.lang.String\">\n" +
            "\t\t<constructor-arg value=\"fromContextA\"/>\n" +
            "</bean></beans>";

    final static String contextB =
            "<beans><bean id=\"test\" class=\"java.lang.String\">\n" +
            "\t\t<constructor-arg value=\"fromContextB\"/>\n" +
            "</bean></beans>";

    public static void main(String[] args) throws IOException {
        // create a single context file
        final File contextFile = File.createTempFile("testSpringContext", ".xml");
        // write the first context into it
        FileUtils.writeStringToFile(contextFile, header + contextA);
        // create a spring context
        FileSystemXmlApplicationContext context = new FileSystemXmlApplicationContext(
                new String[]{contextFile.getPath()}
        );
        // echo the bean 'test' on stdout
        System.out.println(context.getBean("test"));
        // write the second context into it
        FileUtils.writeStringToFile(contextFile, header + contextB);
        // refresh the context
        context.refresh();
        // echo the bean 'test' on stdout
        System.out.println(context.getBean("test"));
    }
}
And you get this result:
fromContextA
fromContextB
Another way to achieve this (and maybe a simpler one) is to use the refreshable-beans feature of Spring 2.5+.
With a dynamic language (Groovy, etc.) and Spring you can even change your bean behaviour. Have a look at the Spring reference for dynamic language support:
24.3.1.2. Refreshable beans
One of the (if not the) most compelling value adds of the dynamic language support in Spring is the 'refreshable bean' feature.
A refreshable bean is a dynamic-language-backed bean. With a small amount of configuration, a dynamic-language-backed bean can monitor changes in its underlying source file resource, and then reload itself when the dynamic language source file is changed (for example when a developer edits and saves changes to the file on the filesystem).
I wouldn't recommend doing that.
What do you expect to happen to singleton beans whose configuration was modified? Do you expect all singletons to be reloaded? But some objects may hold references to those singletons.
See this post as well: Automatic configuration reinitialization in Spring
You can take a look at http://www.wuenschenswert.net/wunschdenken/archives/138, where once you change anything in the properties file and save it, the beans are reloaded with the new values.