Retrieve JMS ConnectionFactory from JNDI using Spring Boot auto-configuration

I want to use Spring Boot's JMS auto-configuration to connect to a remote JNDI server and retrieve the ConnectionFactory by its name, configured through the spring.jms.jndi-name property in application.properties.
I noticed that Spring Boot's auto-configuration relies on the JndiConnectionFactoryAutoConfiguration class for this, which in turn uses JndiTemplate to do the lookup. The problem is that the environment attribute of the JndiTemplate is null, so the InitialContext cannot be created.
In fact, the JndiTemplate is instantiated with its no-argument constructor at application startup, before the configuration defined in JndiConnectionFactoryAutoConfiguration is loaded.
My question: how can I instantiate the JndiTemplate with a set of properties (Context.INITIAL_CONTEXT_FACTORY, Context.PROVIDER_URL, ...), given that JndiTemplate has a constructor that takes a Properties object?
For information: my application is a plain jar and does not run on an application server at the moment.

For those interested in the answer, you must use VM options to pass required JNDI properties.
Here is an example that works with ActiveMQ:
VM options:
-Djava.naming.provider.url=tcp://hostname:61616
-Djava.naming.factory.initial=org.apache.activemq.jndi.ActiveMQInitialContextFactory
And spring properties file (application.properties) must contain the JNDI name of the connection factory:
spring.jms.jndi-name=ConnectionFactory
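As an alternative to VM options, the same java.naming.* values can be set as system properties before the Spring context starts, since the no-argument InitialContext falls back to system properties. A minimal sketch, assuming an ActiveMQ broker as above; the Application class name is illustrative:
import javax.naming.Context;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        // Equivalent to the -D VM options above; must run before the context is created.
        System.setProperty(Context.PROVIDER_URL, "tcp://hostname:61616");
        System.setProperty(Context.INITIAL_CONTEXT_FACTORY,
                "org.apache.activemq.jndi.ActiveMQInitialContextFactory");
        SpringApplication.run(Application.class, args);
    }
}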
Much better, you can use configuration to find your connection factory from JNDI. In my project, we ended up creating our own JMS starter that we reuse in all microservices.
Properties class:
import lombok.*;
import org.springframework.boot.context.properties.ConfigurationProperties;
@Getter
@Setter
@ToString
@NoArgsConstructor
@EqualsAndHashCode
@ConfigurationProperties( prefix = "custom.jms" )
public class CustomJmsProperties {
private String jndiName;
private String contextFactoryClass;
private String providerUrl;
private String username;
private String password;
}
Configuration class:
import org.apache.commons.lang3.StringUtils;
import org.springframework.boot.autoconfigure.AutoConfigureAfter;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.boot.autoconfigure.jms.JndiConnectionFactoryAutoConfiguration;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.connection.UserCredentialsConnectionFactoryAdapter;
import org.springframework.jndi.JndiLocatorDelegate;
import javax.jms.ConnectionFactory;
import javax.naming.Context;
import javax.naming.NamingException;
import java.util.Properties;
@Configuration
@ConditionalOnProperty( "custom.jms.jndi-name" )
@ConditionalOnMissingBean( ConnectionFactory.class )
@EnableConfigurationProperties( { CustomJmsProperties.class } )
@AutoConfigureAfter( { JndiConnectionFactoryAutoConfiguration.class } )
public class CustomJndiConnectionFactoryAutoConfiguration {
@Bean
public ConnectionFactory connectionFactory( CustomJmsProperties customJmsProperties ) throws NamingException {
ConnectionFactory connectionFactory = lookupForConnectionFactory( customJmsProperties );
return getEnhancedUserCredentialsConnectionFactory( customJmsProperties, connectionFactory );
}
private ConnectionFactory lookupForConnectionFactory( final CustomJmsProperties customJmsProperties ) throws NamingException {
JndiLocatorDelegate jndiLocatorDelegate = new JndiLocatorDelegate();
Properties jndiProperties = getJndiProperties( customJmsProperties );
jndiLocatorDelegate.setJndiEnvironment( jndiProperties );
return jndiLocatorDelegate.lookup( customJmsProperties.getJndiName(), ConnectionFactory.class );
}
private Properties getJndiProperties( final CustomJmsProperties customJmsProperties ) {
Properties jndiProperties = new Properties();
jndiProperties.setProperty( Context.PROVIDER_URL, customJmsProperties.getProviderUrl() );
jndiProperties.setProperty( Context.INITIAL_CONTEXT_FACTORY, customJmsProperties.getContextFactoryClass() );
if ( StringUtils.isNotEmpty( customJmsProperties.getUsername() ) ) {
jndiProperties.setProperty( Context.SECURITY_PRINCIPAL, customJmsProperties.getUsername() );
}
if ( StringUtils.isNotEmpty( customJmsProperties.getPassword() ) ) {
jndiProperties.setProperty( Context.SECURITY_CREDENTIALS, customJmsProperties.getPassword() );
}
return jndiProperties;
}
private UserCredentialsConnectionFactoryAdapter getEnhancedUserCredentialsConnectionFactory( final CustomJmsProperties customJmsProperties,
final ConnectionFactory connectionFactory ) {
UserCredentialsConnectionFactoryAdapter enhancedConnectionFactory = new UserCredentialsConnectionFactoryAdapter();
enhancedConnectionFactory.setTargetConnectionFactory( connectionFactory );
enhancedConnectionFactory.setUsername( customJmsProperties.getUsername() );
enhancedConnectionFactory.setPassword( customJmsProperties.getPassword() );
enhancedConnectionFactory.afterPropertiesSet();
return enhancedConnectionFactory;
}
}
Properties file of your project:
custom.jms.provider-url=tcp://hostname:61616
custom.jms.context-factory-class=org.apache.activemq.jndi.ActiveMQInitialContextFactory
custom.jms.jndi-name=ConnectionFactory
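With the starter on the classpath and these properties set, the looked-up ConnectionFactory can be consumed like any other bean, for example through a JmsTemplate. A minimal sketch; the OrderSender class and the orders.queue destination are made up for illustration:
import javax.jms.ConnectionFactory;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.stereotype.Component;

@Component
public class OrderSender {

    private final JmsTemplate jmsTemplate;

    // The ConnectionFactory resolved from JNDI backs this JmsTemplate.
    public OrderSender(ConnectionFactory connectionFactory) {
        this.jmsTemplate = new JmsTemplate(connectionFactory);
    }

    public void send(String payload) {
        // "orders.queue" is a placeholder destination name.
        jmsTemplate.convertAndSend("orders.queue", payload);
    }
}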

Hibernate single table inheritance creating columns for subclasses in springboot [duplicate]

This code works with plain Spring but not with Spring Boot (v1.3.3). Is there something I'm missing? It was imported from a Spring app that works. The code below is from the Spring Boot app.
@Entity
@Table(name="project")
public class Project implements Serializable{
private static final long serialVersionUID = 1L;
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
@Column(name="id")
private int id;
@Column(name="teamId")
private int teamId;
//private String Rentabiliteit;
@Column
//@Index(name="IProject_status",columnNames="Status")
private String status;
@Column
//@Index(name="IProject_naam",columnNames="Naam")
private String naam;
//public Prototype m_Prototype;
//public Team m_Team;
}
SQL
CREATE TABLE IF NOT EXISTS `project` (
`id` int(11) NOT NULL,
`teamId` int(11) DEFAULT NULL,
`status` varchar(255) DEFAULT NULL,
`naam` varchar(255) DEFAULT NULL
) ENGINE=InnoDB AUTO_INCREMENT=43 DEFAULT CHARSET=latin1;
ERROR
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException:
Unknown column 'project0_.team_id' in 'field list'
Edited: Application.yml
spring:
  mvc:
    view:
      prefix: /WEB-INF/jsp/
      suffix: .jsp
  datasource:
    url: jdbc:mysql://localhost:3306/oxyplast
    username: oxyplastuser
    password: oxyplastuserpw
  jpa:
    properties:
      hibernate:
        current_session_context_class: org.springframework.orm.hibernate4.SpringSessionContext
        namingStrategy: org.hibernate.cfg.DefaultNamingStrategy
SINCE SPRING-BOOT 1.4
Starting from 1.4, because of the switch to Hibernate 5, the default naming strategy has been changed to SpringPhysicalNamingStrategy, which should be very close to the 1.3 defaults.
See also:
Spring's naming strategy
PREVIOUS VERSION
Spring Boot provides the ImprovedNamingStrategy as default naming strategy, which makes Hibernate search for a team_id column (inferred from the int teamId field). As this column doesn't exist in your table, that's the cause of the error. From the Hibernate docs:
An improved naming strategy that prefers embedded underscores to mixed case names
You've got two options:
Provide the column name explicitly with @Column(name="teamId"). There used to be a bug with this in early Boot versions, but not anymore.
Change the naming strategy in the Spring Boot properties and tell it to use the EJB3NamingStrategy, which doesn't convert camelCase to snake_case but keeps the names as they are.
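For Spring Boot 1.3.x, the second option can be expressed as a single property. A sketch, assuming the spring.jpa.hibernate.naming-strategy key of Boot 1.x; verify the exact key against your Boot version:
spring.jpa.hibernate.naming-strategy=org.hibernate.cfg.EJB3NamingStrategy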
If you are using Spring Boot 2.0.2 and Hibernate 5.3.4 then setting the following property will fix the issue.
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
Below strategy worked for me
spring.jpa.hibernate.naming-strategy = org.hibernate.cfg.DefaultComponentSafeNamingStrategy
With more recent versions:
spring-boot-starter-data-jpa 1.5.2.RELEASE
hibernate-core 5.0.12.Final
the PhysicalNamingStrategyStandardImpl class
needs to be extended and registered in the Hibernate properties.
Here is a full working version:
public class PhysicalNamingStrategyImpl extends PhysicalNamingStrategyStandardImpl implements Serializable {
public static final PhysicalNamingStrategyImpl INSTANCE = new PhysicalNamingStrategyImpl();
@Override
public Identifier toPhysicalTableName(Identifier name, JdbcEnvironment context) {
// Do whatever you want with the name modification
String nameModified = name.getText();
return new Identifier(nameModified, name.isQuoted());
}
@Override
public Identifier toPhysicalColumnName(Identifier name, JdbcEnvironment context) {
// Do whatever you want with the name modification
String nameModified = name.getText();
return new Identifier(nameModified, name.isQuoted());
}
}
Linking it to hibernate should be done like this when configuring the datasource.
properties.put("hibernate.physical_naming_strategy", "my.Package.PhysicalNamingStrategyImpl");
Here is a full working version of the datasource config:
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;
import java.util.HashMap;
import java.util.Map;
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactory",
basePackages = { "com.xxxxxx.repository" }
)
public class SharedDataSourceConfig {
@Value("${startup.ddl-auto}")
String hbm2ddl;
@Primary
@Bean(name = "dataSource")
@ConfigurationProperties("spring.datasource.shared")
public DataSource customerDataSource() {
return DataSourceBuilder.create().build();
}
@Primary
@Bean(name = "entityManagerFactory")
public LocalContainerEntityManagerFactoryBean entityManagerFactory(
EntityManagerFactoryBuilder builder,
#Qualifier("dataSource") DataSource dataSource) {
Map<String, Object> properties = new HashMap<String, Object>();
properties.put("hibernate.hbm2ddl.auto", hbm2ddl);
properties.put("hibernate.physical_naming_strategy", "my.package.PhysicalNamingStrategyImpl");
return builder
.dataSource(dataSource)
.packages(PackageScannerHelper.getPackagesToScan())
.persistenceUnit("shared")
.properties(properties)
.build();
}
@Primary
@Bean(name = "transactionManager")
public PlatformTransactionManager transactionManager(
#Qualifier("entityManagerFactory") EntityManagerFactory
entityManagerFactory
) {
return new JpaTransactionManager(entityManagerFactory);
}
}
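For completeness, a sketch of the properties this configuration would expect, inferred from the @ConfigurationProperties prefix and the @Value placeholder above; the concrete keys and values are assumptions for a MySQL setup:
startup.ddl-auto=validate
spring.datasource.shared.url=jdbc:mysql://localhost:3306/mydb
spring.datasource.shared.username=dbuser
spring.datasource.shared.password=dbpass
spring.datasource.shared.driver-class-name=com.mysql.jdbc.Driver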
This worked for me with spring boot 1.4.0 and hibernate entitymanager 4.3.8.Final
application.properties
spring.jpa.hibernate.naming.implicit-strategy=org.hibernate.boot.model.naming.ImplicitNamingStrategyLegacyJpaImpl
spring.jpa.hibernate.naming.physical-strategy=org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl
application.properties
spring.jpa.hibernate.naming-strategy = org.hibernate.cfg.DefaultComponentSafeNamingStrategy
The above properties worked for me with:
hibernate 4.3.11.Final
spring boot 1.4.2.RELEASE

spring-boot 1.5.2+.RELEASE: Use more than one datasource with the same connection string

We have a Spring Boot application using HikariCP as the connection pool and multiple data sources to handle long-running job processing (basically it uses the same connection string with longer timeouts and a reduced number of connections).
Since the 1.5.2.RELEASE upgrade, only the first datasource, annotated with @Primary, has its housekeeping and pool threads started. The secondary one is simply discarded, although the debug log shows its initialization being executed.
With 1.5.1.RELEASE this worked properly; both groups of threads would start.
Here are our datasource definitions:
import java.util.Properties;
import javax.sql.DataSource;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;
import org.springframework.validation.annotation.Validated;
import lombok.AllArgsConstructor;
#Profile("hikari")
#Validated
#AllArgsConstructor(onConstructor = #__(#Autowired))
#Component
class HikariDataSourceConfig
{
private final DataSourceConfig dataSourceConfig;
protected Properties setDataSourceProperties()
{
final Properties dataSourceProperties = new Properties();
dataSourceProperties.put("driverType", "thin");
dataSourceProperties.put("user", this.dataSourceConfig.getUsername());
dataSourceProperties.put("password", this.dataSourceConfig.getPassword());
dataSourceProperties.put("serverName", this.dataSourceConfig.getServer());
dataSourceProperties.put("portNumber", String.valueOf(this.dataSourceConfig.getPort()));
dataSourceProperties.put("databaseName", this.dataSourceConfig.getSid());
return dataSourceProperties;
}
protected HikariDataSource initializeDataSource(final int maximumPoolSize)
{
final HikariDataSource dataSource = new HikariDataSource();
dataSource.setMaximumPoolSize(maximumPoolSize);
dataSource.setValidationTimeout(this.dataSourceConfig.getValidationTimeout() * 1000L);
dataSource.setDataSourceClassName("oracle.jdbc.pool.OracleDataSource");
dataSource.setConnectionTimeout(this.dataSourceConfig.getConnectionTimeout() * 1000L);
dataSource.setMaxLifetime(this.dataSourceConfig.getMaxLifetime() * 1000L);
dataSource.setIdleTimeout(this.dataSourceConfig.getIdleTimeout() * 1000L);
dataSource.setAutoCommit(false);
return dataSource;
}
@Bean(destroyMethod = "close")
@Primary
public DataSource dataSource()
{
final HikariDataSource dataSource = initializeDataSource(this.dataSourceConfig.getMaximumPoolSize());
dataSource.setPoolName(DataSourceConfig.CONNECTION_POOL_DEFAULT_NAME);
dataSource.setLeakDetectionThreshold(this.dataSourceConfig.getLeakDetectionThreshold() * 1000L);
dataSource.setDataSourceProperties(setDataSourceProperties());
return dataSource;
}
@Bean(name = DataSourceConfig.DATASOURCE_ALT_NAME, destroyMethod = "close")
public DataSource slowDataSource()
{
final HikariDataSource dataSource = initializeDataSource(this.dataSourceConfig.getWorkingQueueSize());
dataSource.setMinimumIdle(1);
dataSource.setPoolName(DataSourceConfig.CONNECTION_POOL_ALT_NAME);
dataSource.setLeakDetectionThreshold(this.dataSourceConfig.getSlowJobLeakDetectionThreshold() * 1000L);
dataSource.setDataSourceProperties(setDataSourceProperties());
return dataSource;
}
}
Is there a way to configure spring to allow cloned datasources again?
Versions used:
HikariCP: 2.6.1
Spring-boot: 1.5.3.RELEASE
Java: 1.8.121
-- Edit
It fails for both 1.5.2.RELEASE and 1.5.3.RELEASE
-- Edit 2
I tried binding the second datasource to a different database, but the same issue occurs: the secondary datasource pool does not start.

Spring beans are not injected in flyway java based migration

I'm trying to inject a configuration-properties component into Flyway Java-based migration code, but it is always null.
I'm using spring boot with Flyway.
@Component
@ConfigurationProperties(prefix = "code")
public class CodesProp {
private String codePath;
}
Then, inside the Flyway migration code, I try to autowire this component as follows:
public class V1_4__Migrate_codes_metadata implements SpringJdbcMigration {
@Autowired
private CodesProp codesProp;
public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
codesProp.getCodePath();
}
}
Here, codesProp is always null.
Is there any way to inject spring beans inside flyway or make it initialized before flyway bean?
Thank You.
Flyway doesn't support dependency injection into SpringJdbcMigration implementations. It simply looks for classes on the classpath that implement SpringJdbcMigration and creates a new instance using the default constructor. This is performed in SpringJdbcMigrationResolver. When the migration is executed, SpringJdbcMigrationExecutor creates a new JdbcTemplate and then calls your migration implementation's migrate method.
If you really need dependencies to be injected into your Java-based migrations, I think you'll have to implement your own MigrationResolver that retrieves beans of a particular type from the application context and creates and returns a ResolvedMigration instance for each.
If, like me, you don't want to wait for Flyway 4.1, you can use Flyway 4.0 and add the following to your Spring Boot application:
1) Create an ApplicationContextAwareSpringJdbcMigrationResolver class in your project:
import org.flywaydb.core.api.FlywayException;
import org.flywaydb.core.api.MigrationType;
import org.flywaydb.core.api.MigrationVersion;
import org.flywaydb.core.api.configuration.FlywayConfiguration;
import org.flywaydb.core.api.migration.MigrationChecksumProvider;
import org.flywaydb.core.api.migration.MigrationInfoProvider;
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.flywaydb.core.api.resolver.ResolvedMigration;
import org.flywaydb.core.internal.resolver.MigrationInfoHelper;
import org.flywaydb.core.internal.resolver.ResolvedMigrationComparator;
import org.flywaydb.core.internal.resolver.ResolvedMigrationImpl;
import org.flywaydb.core.internal.resolver.spring.SpringJdbcMigrationExecutor;
import org.flywaydb.core.internal.resolver.spring.SpringJdbcMigrationResolver;
import org.flywaydb.core.internal.util.ClassUtils;
import org.flywaydb.core.internal.util.Location;
import org.flywaydb.core.internal.util.Pair;
import org.flywaydb.core.internal.util.StringUtils;
import org.flywaydb.core.internal.util.scanner.Scanner;
import org.springframework.context.ApplicationContext;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
/**
* Migration resolver for {@link SpringJdbcMigration}s which are registered in the given {@link ApplicationContext}.
* This resolver provides the ability to use other beans registered in the {@link ApplicationContext} and reference
* them via Spring's dependency injection facility inside the {@link SpringJdbcMigration}s.
*/
public class ApplicationContextAwareSpringJdbcMigrationResolver extends SpringJdbcMigrationResolver {
private final ApplicationContext applicationContext;
public ApplicationContextAwareSpringJdbcMigrationResolver(Scanner scanner, Location location, FlywayConfiguration configuration, ApplicationContext applicationContext) {
super(scanner, location, configuration);
this.applicationContext = applicationContext;
}
@SuppressWarnings("unchecked")
@Override
public Collection<ResolvedMigration> resolveMigrations() {
// get all beans of type SpringJdbcMigration from the application context
Map<String, SpringJdbcMigration> springJdbcMigrationBeans =
(Map<String, SpringJdbcMigration>) this.applicationContext.getBeansOfType(SpringJdbcMigration.class);
ArrayList<ResolvedMigration> resolvedMigrations = new ArrayList<ResolvedMigration>();
// resolve the migration and populate it with the migration info
for (SpringJdbcMigration springJdbcMigrationBean : springJdbcMigrationBeans.values()) {
ResolvedMigrationImpl resolvedMigration = extractMigrationInfo(springJdbcMigrationBean);
resolvedMigration.setPhysicalLocation(ClassUtils.getLocationOnDisk(springJdbcMigrationBean.getClass()));
resolvedMigration.setExecutor(new SpringJdbcMigrationExecutor(springJdbcMigrationBean));
resolvedMigrations.add(resolvedMigration);
}
Collections.sort(resolvedMigrations, new ResolvedMigrationComparator());
return resolvedMigrations;
}
ResolvedMigrationImpl extractMigrationInfo(SpringJdbcMigration springJdbcMigration) {
Integer checksum = null;
if (springJdbcMigration instanceof MigrationChecksumProvider) {
MigrationChecksumProvider version = (MigrationChecksumProvider) springJdbcMigration;
checksum = version.getChecksum();
}
String description;
MigrationVersion version1;
if (springJdbcMigration instanceof MigrationInfoProvider) {
MigrationInfoProvider resolvedMigration = (MigrationInfoProvider) springJdbcMigration;
version1 = resolvedMigration.getVersion();
description = resolvedMigration.getDescription();
if (!StringUtils.hasText(description)) {
throw new FlywayException("Missing description for migration " + version1);
}
} else {
String resolvedMigration1 = ClassUtils.getShortName(springJdbcMigration.getClass());
if (!resolvedMigration1.startsWith("V") && !resolvedMigration1.startsWith("R")) {
throw new FlywayException("Invalid Jdbc migration class name: " + springJdbcMigration.getClass()
.getName() + " => ensure it starts with V or R," + " or implement org.flywaydb.core.api.migration.MigrationInfoProvider for non-default naming");
}
String prefix = resolvedMigration1.substring(0, 1);
Pair info = MigrationInfoHelper.extractVersionAndDescription(resolvedMigration1, prefix, "__", "");
version1 = (MigrationVersion) info.getLeft();
description = (String) info.getRight();
}
ResolvedMigrationImpl resolvedMigration2 = new ResolvedMigrationImpl();
resolvedMigration2.setVersion(version1);
resolvedMigration2.setDescription(description);
resolvedMigration2.setScript(springJdbcMigration.getClass().getName());
resolvedMigration2.setChecksum(checksum);
resolvedMigration2.setType(MigrationType.SPRING_JDBC);
return resolvedMigration2;
}
}
2) Add a new configuration class to post-process the Flyway instance generated by Spring Boot:
import org.flywaydb.core.Flyway;
import org.flywaydb.core.internal.dbsupport.DbSupport;
import org.flywaydb.core.internal.dbsupport.h2.H2DbSupport;
import org.flywaydb.core.internal.dbsupport.mysql.MySQLDbSupport;
import com.pegusapps.zebra.infrastructure.repository.flyway.ApplicationContextAwareSpringJdbcMigrationResolver;
import org.flywaydb.core.internal.resolver.sql.SqlMigrationResolver;
import org.flywaydb.core.internal.util.Location;
import org.flywaydb.core.internal.util.PlaceholderReplacer;
import org.flywaydb.core.internal.util.scanner.Scanner;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import javax.sql.DataSource;
import java.sql.SQLException;
@Configuration
@ComponentScan("db.migration")
public class FlywayConfiguration {
@Bean
public BeanPostProcessor postProcessFlyway(ApplicationContext context) {
return new BeanPostProcessor() {
@Override
public Object postProcessBeforeInitialization(Object o, String s) throws BeansException {
return o;
}
@Override
public Object postProcessAfterInitialization(Object o, String s) throws BeansException {
if (o instanceof Flyway) {
Flyway flyway = (Flyway) o;
flyway.setSkipDefaultResolvers(true);
ApplicationContextAwareSpringJdbcMigrationResolver resolver = new ApplicationContextAwareSpringJdbcMigrationResolver(
new Scanner(Thread.currentThread().getContextClassLoader()),
new Location("classpath:db/migration"),
context.getBean(org.flywaydb.core.api.configuration.FlywayConfiguration.class),
context);
SqlMigrationResolver sqlMigrationResolver = null;
try {
sqlMigrationResolver = new SqlMigrationResolver(
getDbSupport(),
new Scanner(Thread.currentThread().getContextClassLoader()),
new Location("classpath:db/migration"),
PlaceholderReplacer.NO_PLACEHOLDERS,
"UTF-8",
"V",
"R",
"__",
".sql");
} catch (SQLException e) {
e.printStackTrace();
}
flyway.setResolvers(sqlMigrationResolver, resolver);
}
return o;
}
private DbSupport getDbSupport() throws SQLException {
DataSource dataSource = context.getBean(DataSource.class);
if( ((org.apache.tomcat.jdbc.pool.DataSource)dataSource).getDriverClassName().equals("org.h2.Driver"))
{
return new H2DbSupport(dataSource.getConnection());
}
else
{
return new MySQLDbSupport(dataSource.getConnection());
}
}
};
}
}
Note that I have some hardcoded dependencies on the Tomcat JDBC pool, H2 and MySQL. If you are using something else, you will need to change the code there (if anybody knows how to avoid this, please comment!).
Also note that the @ComponentScan package needs to match where you put the Java migration classes.
Also note that I had to add the SqlMigrationResolver back in, since I want to support both the SQL and the Java flavor of migrations.
3) Create a Java class in the db.migration package that does the actual migration:
@Component
public class V2__add_default_surveys implements SpringJdbcMigration {
private final SurveyRepository surveyRepository;
@Autowired
public V2__add_default_surveys(SurveyRepository surveyRepository) {
this.surveyRepository = surveyRepository;
}
@Override
public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
surveyRepository.save(...);
}
}
Note that you need to make the class a @Component and it needs to implement SpringJdbcMigration. In this class you can use Spring constructor injection for any Spring bean from your context that you might need to do the migration(s).
Note: Be sure to disable ddl validation of Hibernate, because the validation seems to run before Flyway runs:
spring.jpa.hibernate.ddl-auto=none
In short: do not autowire beans in your DB migrations, and do not even reference classes from your application!
If you refactor, delete or change classes referenced in a migration, it may no longer compile or, worse, your migrations may be corrupted.
Avoiding the plain JdbcTemplate saves little effort and is not worth that risk.
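To illustrate, a migration that sticks to the plain JdbcTemplate and references nothing from the application; the table and SQL are hypothetical:
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.springframework.jdbc.core.JdbcTemplate;

public class V1_4__Migrate_codes_metadata implements SpringJdbcMigration {

    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        // Plain SQL only: no repositories or configuration beans referenced.
        jdbcTemplate.update(
                "UPDATE codes SET code_path = ? WHERE code_path IS NULL",
                "/default/path");
    }
}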
If you are using deltaspike you can use BeanProvider to get a reference to your Class. Here is a DAO example, but it should work fine with your class too.
Change your DAO code:
public static UserDao getInstance() {
return BeanProvider.getContextualReference(UserDao.class, false, new DaoLiteral());
}
Then in your migration method:
UserDao userdao = UserDao.getInstance();
And there you've got your reference.
(referenced from: Flyway Migration with java)

Issue with Spring Data JPA - BeanEntityManagerFactory

I recently started learning Spring. I am trying a simple example with Spring Data JPA in a Spring MVC project. I am getting the following error while deploying the WAR file to Tomcat.
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name '(inner bean)#584d15f2': Cannot resolve reference to bean 'entityManagerFactory' while setting constructor argument; nested exception is org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'entityManagerFactory': Requested bean is currently in creation: Is there an unresolvable circular reference?
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:359)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:108)
at org.springframework.beans.factory.support.ConstructorResolver.resolveConstructorArguments(ConstructorResolver.java:634)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:444)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1119)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1014)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:504)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveInnerBean(BeanDefinitionValueResolver.java:299)
... 92 more
Caused by: org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'entityManagerFactory': Requested bean is currently in creation: Is there an unresolvable circular reference?
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.beforeSingletonCreation(DefaultSingletonBeanRegistry.java:347)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:223)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:351)
... 100 more
27-Sep-2015 17:56:45.304 INFO [http-apr-8080-exec-35] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive D:\ApacheTomcat\apache-tomcat-8.0.26\webapps\springTest.war has finished in 6,124 ms
My controller code is as follows:
package com.demo.repo;
import com.demo.model.Customer;
import java.text.DateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.Properties;
import javax.activation.DataSource;
import javax.persistence.EntityManagerFactory;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;
import org.springframework.stereotype.Controller;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.ui.Model;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
/**
* Handles requests for the application home page.
*/
@Controller
@Configuration
@EnableJpaRepositories("com.demo.repo")
@EnableTransactionManagement
public class HomeController {
@Autowired
customerRepository repository;
private static final Logger logger = LoggerFactory.getLogger(HomeController.class);
/**
* Simply selects the home view to render by returning its name.
*/
@Bean(destroyMethod = "close")
DataSource dataSource(Environment env) {
HikariConfig dataSourceConfig = new HikariConfig();
dataSourceConfig.setDriverClassName(env.getRequiredProperty("db.driver"));
dataSourceConfig.setJdbcUrl(env.getRequiredProperty("db.url"));
dataSourceConfig.setUsername(env.getRequiredProperty("db.username"));
dataSourceConfig.setPassword(env.getRequiredProperty("db.password"));
return (DataSource) new HikariDataSource(dataSourceConfig);
}
@Bean
LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource,
Environment env) {
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource((javax.sql.DataSource) dataSource);
entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
entityManagerFactoryBean.setPackagesToScan("com.demo.repo");
Properties jpaProperties = new Properties();
//Configures the used database dialect. This allows Hibernate to create SQL
//that is optimized for the used database.
jpaProperties.put("hibernate.dialect", env.getRequiredProperty("hibernate.dialect"));
//Specifies the action that is invoked to the database when the Hibernate
//SessionFactory is created or closed.
jpaProperties.put("hibernate.hbm2ddl.auto",
env.getRequiredProperty("hibernate.hbm2ddl.auto")
);
//Configures the naming strategy that is used when Hibernate creates
//new database objects and schema elements
jpaProperties.put("hibernate.ejb.naming_strategy",
env.getRequiredProperty("hibernate.ejb.naming_strategy")
);
//If the value of this property is true, Hibernate writes all SQL
//statements to the console.
jpaProperties.put("hibernate.show_sql",
env.getRequiredProperty("hibernate.show_sql")
);
//If the value of this property is true, Hibernate will format the SQL
//that is written to the console.
jpaProperties.put("hibernate.format_sql",
env.getRequiredProperty("hibernate.format_sql")
);
entityManagerFactoryBean.setJpaProperties(jpaProperties);
return entityManagerFactoryBean;
}
@Bean
JpaTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
JpaTransactionManager transactionManager = new JpaTransactionManager();
transactionManager.setEntityManagerFactory(entityManagerFactory);
return transactionManager;
}
@RequestMapping(value = "/", method = RequestMethod.GET)
public String home(Locale locale, Model model) {
logger.info("Welcome home! The client locale is {}.", locale);
Date date = new Date();
DateFormat dateFormat = DateFormat.getDateTimeInstance(DateFormat.LONG, DateFormat.LONG, locale);
String formattedDate = dateFormat.format(date);
model.addAttribute("serverTime", formattedDate );
repository.save(new Customer("Jack", "Bauer"));
repository.save(new Customer("Chloe", "O'Brian"));
repository.save(new Customer("Kim", "Bauer"));
repository.save(new Customer("David", "Palmer"));
repository.save(new Customer("Michelle", "Dessler"));
for(Customer customer : repository.findAll())
{
System.out.println("Log Results :: "+customer.toString());
}
return "myhome";
}
}
Can anyone suggest what is wrong with my code and how to resolve it?
It seems that your entityManagerFactory requires the dataSource that is defined in the same configuration file.
Try moving the definition of dataSource to another configuration class, or, instead of passing the dataSource as a parameter, just call the dataSource() method where you need it in entityManagerFactory:
@Autowired
Environment env;
@Bean
LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean entityManagerFactoryBean =
new LocalContainerEntityManagerFactoryBean();
entityManagerFactoryBean.setDataSource((javax.sql.DataSource) dataSource());
....
}
TIP: Don't mix @Controller and @Configuration. Create a separate class for each of them.
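A rough sketch of that split, moving the DataSource bean into its own configuration class so the controller no longer doubles as configuration; the class name is illustrative:
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.Environment;
import javax.sql.DataSource;

@Configuration
public class DataSourceConfig {

    // The DataSource lives in its own @Configuration class, separate from @Controller code.
    @Bean(destroyMethod = "close")
    DataSource dataSource(Environment env) {
        HikariConfig config = new HikariConfig();
        config.setDriverClassName(env.getRequiredProperty("db.driver"));
        config.setJdbcUrl(env.getRequiredProperty("db.url"));
        config.setUsername(env.getRequiredProperty("db.username"));
        config.setPassword(env.getRequiredProperty("db.password"));
        return new HikariDataSource(config);
    }
}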

Exporting Spring Boot Actuator Metrics (& Dropwizard Metrics) to Statsd

I'm trying to export all of the metrics which are visible at the endpoint /metrics to a StatsdMetricWriter.
I've got the following configuration class so far:
package com.tonyghita.metricsdriven.service.config;
import com.codahale.metrics.MetricRegistry;
import com.ryantenney.metrics.spring.config.annotation.EnableMetrics;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.actuate.autoconfigure.ExportMetricReader;
import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.metrics.reader.MetricReader;
import org.springframework.boot.actuate.metrics.reader.MetricRegistryMetricReader;
import org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter;
import org.springframework.boot.actuate.metrics.writer.MetricWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableMetrics(proxyTargetClass = true)
public class MetricsConfig {
private static final Logger LOGGER = LoggerFactory.getLogger(MetricsConfig.class);
#Value("${statsd.host:localhost}")
private String host = "localhost";
#Value("${statsd.port:8125}")
private int port;
#Autowired
private MetricRegistry metricRegistry;
#Bean
#ExportMetricReader
public MetricReader metricReader() {
return new MetricRegistryMetricReader(metricRegistry);
}
@Bean
@ExportMetricWriter
public MetricWriter metricWriter() {
LOGGER.info("Configuring StatsdMetricWriter to export to {}:{}", host, port);
return new StatsdMetricWriter(host, port);
}
}
This writes all of the metrics I've added to Statsd, but I'd also like to send the system/JVM metrics that are visible on the /metrics endpoint.
What am I missing?
I had the same problem and found a solution here: https://github.com/tzolov/export-metrics-example
Just add a MetricsEndpointMetricReader to your config and everything available at the /metrics endpoint will be published to the StatsdMetricWriter.
Here is a complete example config for Spring Boot 1.3.x and Dropwizard metrics-jvm 3.1.x:
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.jvm.GarbageCollectorMetricSet;
import com.codahale.metrics.jvm.MemoryUsageGaugeSet;
import com.codahale.metrics.jvm.ThreadStatesGaugeSet;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.actuate.autoconfigure.ExportMetricWriter;
import org.springframework.boot.actuate.endpoint.MetricsEndpoint;
import org.springframework.boot.actuate.endpoint.MetricsEndpointMetricReader;
import org.springframework.boot.actuate.metrics.Metric;
import org.springframework.boot.actuate.metrics.statsd.StatsdMetricWriter;
import org.springframework.boot.actuate.metrics.writer.Delta;
import org.springframework.boot.actuate.metrics.writer.MetricWriter;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
public class MetricsConfiguration {
@Bean
public MetricRegistry metricRegistry() {
final MetricRegistry metricRegistry = new MetricRegistry();
metricRegistry.register("jvm.memory",new MemoryUsageGaugeSet());
metricRegistry.register("jvm.thread-states",new ThreadStatesGaugeSet());
metricRegistry.register("jvm.garbage-collector",new GarbageCollectorMetricSet());
return metricRegistry;
}
/*
* Reading all metrics that appear on the /metrics endpoint to expose them to metrics writer beans.
*/
@Bean
public MetricsEndpointMetricReader metricsEndpointMetricReader(final MetricsEndpoint metricsEndpoint) {
return new MetricsEndpointMetricReader(metricsEndpoint);
}
@Bean
@ConditionalOnProperty(prefix = "statsd", name = {"prefix", "host", "port"})
@ExportMetricWriter
public MetricWriter statsdMetricWriter(@Value("${statsd.prefix}") String statsdPrefix,
@Value("${statsd.host}") String statsdHost,
@Value("${statsd.port}") int statsdPort) {
return new StatsdMetricWriter(statsdPrefix, statsdHost, statsdPort);
}
}
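The @ConditionalOnProperty and @Value placeholders above expect properties along these lines (the values here are examples):
statsd.prefix=myapp
statsd.host=localhost
statsd.port=8125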
From what I've seen in spring-boot code, only calls to CounterService and GaugeService implementations are forwarded to dropwizard's MetricRegistry.
Therefore, as you already observed, only counter.* and gauge.* metrics from the /metrics endpoint will end up in Statsd.
System and JVM metrics are exposed through the custom SystemPublicMetrics class, which doesn't use the counter or gauge services.
I'm not sure if there is a simpler solution (maybe someone from Spring team will comment), but one way to do it (not spring-boot specific) would be to use a scheduled task that periodically writes system stats to the MetricRegistry.
To register JVM metrics you can use the JVM-related MetricSets supplied by the com.codahale.metrics.jvm library. You can just add the whole set without specifying whether they are gauges or counters.
Here is my example code where I am registering jvm related metrics:
@Configuration
@EnableMetrics(proxyTargetClass = true)
public class MetricsConfig {
@Autowired
private StatsdProperties statsdProperties;
@Autowired
private MetricsEndpoint metricsEndpoint;
@Autowired
private DataSourcePublicMetrics dataSourcePublicMetrics;
@Bean
@ExportMetricReader
public MetricReader metricReader() {
return new MetricRegistryMetricReader(metricRegistry());
}
public MetricRegistry metricRegistry() {
final MetricRegistry metricRegistry = new MetricRegistry();
//jvm metrics
metricRegistry.register("jvm.gc",new GarbageCollectorMetricSet());
metricRegistry.register("jvm.mem",new MemoryUsageGaugeSet());
metricRegistry.register("jvm.thread-states",new ThreadStatesGaugeSet());
return metricRegistry;
}
@Bean
@ConditionalOnProperty(prefix = "metrics.writer.statsd", name = {"host", "port"})
@ExportMetricWriter
public MetricWriter statsdMetricWriter() {
return new StatsdMetricWriter(
statsdProperties.getPrefix(),
statsdProperties.getHost(),
statsdProperties.getPort()
);
}
}
Note: I am using spring boot version 1.3.0.M4
Enjoy! (see the public metrics logged in console as dropwizard metrics)
@Configuration
@EnableMetrics
@EnableScheduling
public class MetricsReporter extends MetricsConfigurerAdapter {
@Autowired private SystemPublicMetrics systemPublicMetrics;
private MetricRegistry metricRegistry;
@Scheduled(fixedDelay = 5000)
void exportPublicMetrics() {
for (Metric<?> metric : systemPublicMetrics.metrics()) {
Counter counter = metricRegistry.counter(metric.getName());
counter.dec(counter.getCount());
counter.inc(Double.valueOf(metric.getValue().toString()).longValue());
}
}
@Override
public void configureReporters(MetricRegistry metricRegistry) {
this.metricRegistry = metricRegistry;
ConsoleReporter.forRegistry(metricRegistry).build().start(10, TimeUnit.SECONDS);
}
}
