Spring MVC installation

I would like to create an installation process for the first run of my Spring MVC app, something like WordPress has (when you first run it, you need to specify the DB connection, your first admin account, etc.).
I have tried it with Spring, but the application won't start: when the DataSource bean cannot connect, startup simply fails. It always fails while the transaction manager bean is being created:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'txManager' defined in SpringWebConfig: Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Property 'sessionFactory' is required
Is there any way to start the Spring/Hibernate app without the transaction manager and then load it on the fly once the user configures their DB access details? I know there is a way to do it with application.properties, but I want to create a simple install process just like WordPress's, so it is as convenient as possible for non-technical users.
EDIT 1: My current code:
@EnableWebMvc
@Configuration
@ComponentScan(basePackages = {"test"}) // package scan
public class SpringWebConfig implements WebMvcConfigurer {

    private BasicDataSource ds;

    @Bean(name = "dataSource")
    public BasicDataSource dataSource() {
        if (Config.getInstance().isInstalled()) {
            BasicDataSource ds = new BasicDataSource();
            ds.setDriverClassName("com.mysql.cj.jdbc.Driver");
            ds.setUrl("jdbc:mysql://" + Config.getInstance().getDburl() + "/" + Config.getInstance().getDbname());
            ds.setUsername(Config.getInstance().getDbuser());
            ds.setPassword(Config.getInstance().getDbpass());
            this.ds = ds;
        }
        return this.ds;
    }

    @Bean
    public HibernateTransactionManager txManager() {
        if (sessionFactory() == null) {
            return new HibernateTransactionManager();
        }
        return new HibernateTransactionManager(sessionFactory());
    }

    @Bean
    public SessionFactory sessionFactory() {
        try {
            LocalSessionFactoryBuilder builder = new LocalSessionFactoryBuilder(dataSource());
            builder.scanPackages(dbEntity).addProperties(getHibernateProperties());
            return builder.buildSessionFactory();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }
}

Adding @Lazy to all three @Bean methods solves the issue: Hibernate is no longer required at the startup of Spring MVC (configured by the class implementing WebMvcConfigurer), which makes it possible to show the install page so the user can define the DB access details.
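A minimal sketch of that change, inside the same SpringWebConfig class shown above (only @Lazy from org.springframework.context.annotation is new; the bean bodies stay essentially as in the question):

    // Sketch: @Lazy defers creation of each bean until it is first requested,
    // so the web context can start even before the database details exist.
    @Bean(name = "dataSource")
    @Lazy
    public BasicDataSource dataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.mysql.cj.jdbc.Driver");
        ds.setUrl("jdbc:mysql://" + Config.getInstance().getDburl() + "/" + Config.getInstance().getDbname());
        ds.setUsername(Config.getInstance().getDbuser());
        ds.setPassword(Config.getInstance().getDbpass());
        return ds;
    }

    @Bean
    @Lazy
    public SessionFactory sessionFactory() {
        return new LocalSessionFactoryBuilder(dataSource())
                .scanPackages("test") // entity package; adjust to the real one
                .buildSessionFactory();
    }

    @Bean
    @Lazy
    public HibernateTransactionManager txManager() {
        return new HibernateTransactionManager(sessionFactory());
    }

With this in place the beans are only built the first time something asks for them, i.e. after the install step has written the DB details.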

Related

SpringBoot app. accessing 2 DataSources using JdbcTemplate

I have a Spring Boot app that has to access two different datasources in order to export data from one to the other (one local datasource and one remote datasource).
This is what my persistence config looks like:
public class PersistenceConfig {

    @Bean
    public JdbcTemplate localJdbcTemplate() {
        return new JdbcTemplate(localDataSource());
    }

    @Bean
    public JdbcTemplate remoteJdbcTemplate() {
        return new JdbcTemplate(remoteDataSource());
    }

    @Bean
    public DataSource localDataSource() {
        HikariConfig config = new HikariConfig();
        config.setMaximumPoolSize(getLocalDbPoolSize());
        config.setMinimumIdle(5);
        config.setDriverClassName(getLocalDbDriverClassName());
        config.setJdbcUrl(getLocalDbJdbcUrl());
        config.addDataSourceProperty("user", getLocalDbUser());
        config.addDataSourceProperty("password", getLocalDbPwd());
        return new HikariDataSource(config);
    }

    @Bean
    public DataSource remoteDataSource() {
        HikariConfig config = new HikariConfig();
        config.setMaximumPoolSize(getRemoteDbPoolSize());
        config.setMinimumIdle(5);
        config.setDriverClassName(getRemoteDbDriverClassName());
        config.setJdbcUrl(getRemoteDbJdbcUrl());
        config.addDataSourceProperty("user", getRemoteDbUser());
        config.addDataSourceProperty("password", getRemoteDbPwd());
        return new HikariDataSource(config);
    }
}
But when I start my app I get this error:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type 'javax.sql.DataSource' available: expected single matching bean but found 2: localDataSource,remoteDataSource
I also tried to use named beans, as follows:
#Bean(name = "localJdbcTemplate")
public JdbcTemplate localJdbcTemplate() {
return new JdbcTemplate(localDataSource());
}
#Bean(name = "remoteJdbcTemplate")
public JdbcTemplate remoteJdbcTemplate() {
return new JdbcTemplate(remoteDataSource());
}
#Bean(name = "localDataSource")
public DataSource localDataSource(){
HikariConfig config = new HikariConfig();
config.setMaximumPoolSize(getLocalDbPoolSize());
config.setMinimumIdle(5);
config.setDriverClassName(getLocalDbDriverClassName());
config.setJdbcUrl(getLocalDbJdbcUrl());
config.addDataSourceProperty("user", getLocalDbUser());
config.addDataSourceProperty("password", getLocalDbPwd());
return new HikariDataSource(config);
}
#Bean(name = "remoteDataSource")
public DataSource remoteDataSource(){
HikariConfig config = new HikariConfig();
config.setMaximumPoolSize(getRemoteDbPoolSize());
config.setMinimumIdle(5);
config.setDriverClassName(getRemoteDbDriverClassName());
config.setJdbcUrl(getRemoteDbJdbcUrl());
config.addDataSourceProperty("user", getRemoteDbUser());
config.addDataSourceProperty("password", getRemoteDbPwd());
return new HikariDataSource(config);
}
but then I got this other error:
A component required a bean of type 'org.springframework.transaction.PlatformTransactionManager' that could not be found.
- Bean method 'transactionManager' not loaded because @ConditionalOnSingleCandidate (types: javax.sql.DataSource; SearchStrategy: all) did not find a primary bean from beans 'remoteDataSource', 'localDataSource'
I also tried
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
@EnableAutoConfiguration(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
but then I got
A component required a bean of type 'org.springframework.transaction.PlatformTransactionManager' that could not be found.
You can use bean names to qualify them:
#Bean(name = "localDataSource")
public DataSource localDataSource() {
...
}
#Bean(name = "remoteDataSource")
public DataSource remoteDataSource() {
...
}
Please note: you have to do the same for your JdbcTemplate beans - just give them a name and it will work.
See the Spring JavaDoc on @Bean for more information:
#Bean(name = "localJdbcTemplate")
public JdbcTemplate localJdbcTemplate() {
return new JdbcTemplate(localDataSource());
}
When you use your JdbcTemplate beans within your export service implementation via autowiring (@Autowired), you need @Qualifier to pick the right one:
@Autowired
@Qualifier("localJdbcTemplate")
private JdbcTemplate localJdbcTemplate;

@Autowired
@Qualifier("remoteJdbcTemplate")
private JdbcTemplate remoteJdbcTemplate;
A bean gets its name from the method name; providing the name attribute just makes it explicit (keeping the name the same as the method name). The overall suggestion about @Bean(name = "...") and @Qualifier didn't fix the error for me.
I set up a sample project with two embedded databases and got the same error as the author. Spring's suggestion was to annotate one of the DataSource beans as @Primary, and in fact this fixes the error. This usually happens when some other part of the application expects to see only one (or one primary) DataSource when several are present.
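A minimal sketch of that fix, reusing the localDataSource bean from the question (@Primary comes from org.springframework.context.annotation; which of the two datasources should be primary depends on the application):

    // Sketch: mark the local DataSource as primary so single-DataSource consumers
    // (e.g. Boot's transaction manager auto-configuration) can resolve it.
    @Primary
    @Bean(name = "localDataSource")
    public DataSource localDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl(getLocalDbJdbcUrl()); // same properties as in the question
        config.addDataSourceProperty("user", getLocalDbUser());
        config.addDataSourceProperty("password", getLocalDbPwd());
        return new HikariDataSource(config);
    }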
What seems to be a better solution is to disable the auto-configuration that isn't needed, keeping the rest of the code as it is:
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
or:
@EnableAutoConfiguration(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
depending on which annotations are in use.
If the author doesn't use any JPA provider and operates directly with JdbcTemplate, this may be a suitable solution.
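If something in the application still requires transactions (as the "PlatformTransactionManager ... could not be found" message suggests), one option is to declare a transaction manager explicitly against the datasource that needs it. A minimal sketch, added to the same config class and assuming the localDataSource bean from above (DataSourceTransactionManager lives in org.springframework.jdbc.datasource):

    // Sketch: an explicit transaction manager bound to one of the two datasources,
    // since excluding DataSourceTransactionManagerAutoConfiguration means Boot
    // will not create one automatically.
    @Bean
    public PlatformTransactionManager localTransactionManager(
            @Qualifier("localDataSource") DataSource localDataSource) {
        return new DataSourceTransactionManager(localDataSource);
    }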

NoUniqueBeanDefinitionException in Spring annotation driven configuration

I am getting the following error when trying to autowire two beans using annotation-driven configuration:
No qualifying bean of type [javax.jms.ConnectionFactory] is defined:
expected single matching bean but found 2: aConnectionFactory, bConnectionFactory
Description:
Parameter 1 of method jmsListenerContainerFactory in org.springframework.boot.autoconfigure.jms.JmsAnnotationDrivenConfiguration required a single bean, but 2 were found:
- aConnectionFactory: defined by method 'aConnectionFactory' in package.Application
- bConnectionFactory: defined by method 'bConnectionFactory' in package.Application
Action:
Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed
I have this annotation driven configuration:
@SpringBootApplication
@EnableIntegration
@IntegrationComponentScan
public class Application extends SpringBootServletInitializer implements
        WebApplicationInitializer {

    @Resource(name = "aConnectionFactory")
    private ConnectionFactory aConnectionFactory;

    @Resource(name = "bConnectionFactory")
    private ConnectionFactory bConnectionFactory;

    @Bean
    public IntegrationFlow jmsInboundFlow() {
        return IntegrationFlows
                .from(
                        Jms.inboundAdapter(aConnectionFactory)
                                .destination(aQueue),
                        e -> e.poller(Pollers.fixedRate(100,
                                TimeUnit.MILLISECONDS).maxMessagesPerPoll(100))
                ).channel("entrypoint")
                .get();
    }

    @Bean
    public IntegrationFlow jmsInboundFlowB() {
        return IntegrationFlows
                .from(
                        Jms.inboundAdapter(bConnectionFactory)
                                .destination(bQueue),
                        e -> e.poller(Pollers.fixedRate(100,
                                TimeUnit.MILLISECONDS).maxMessagesPerPoll(100))
                ).channel("entrypoint")
                .get();
    }

    @Bean(name = "aConnectionFactory")
    @Profile({"weblogic"})
    public ConnectionFactory aConnectionFactory() {
        ConnectionFactory factory = null;
        JndiTemplate jndi = new JndiTemplate();
        try {
            factory = (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");
        } catch (NamingException e) {
            logger.error("NamingException for jms/ConnectionFactory", e);
        }
        return factory;
    }

    @Bean(name = "bConnectionFactory")
    @Profile({"weblogic"})
    public ConnectionFactory bConnectionFactory() {
        ConnectionFactory factory = null;
        JndiTemplate jndi = new JndiTemplate();
        try {
            factory = (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");
        } catch (NamingException e) {
            logger.error("NamingException for jms/ConnectionFactory", e);
        }
        return factory;
    }
}
Any ideas what's wrong in this code? It seems straightforward, but specifying the qualifier doesn't work; I have also tried to use @Resource. What am I missing here?
Any help appreciated.
There is nothing wrong with your code.
It is just JmsAnnotationDrivenConfiguration from Spring Boot that doesn't like your two ConnectionFactory beans, because it requires exactly one.
Why not just follow the report's recommendation and mark one of them with @Primary?
It looks like you don't use the Spring Boot JMS auto-configuration feature, so it would be straightforward to disable JmsAnnotationDrivenConfiguration: http://docs.spring.io/spring-boot/docs/1.4.1.RELEASE/reference/htmlsingle/#using-boot-disabling-specific-auto-configuration
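A minimal sketch of the @Primary option, as a method inside the Application class from the question (which factory should be primary depends on which one the auto-configured JMS infrastructure is supposed to use; @Primary comes from org.springframework.context.annotation):

    // Sketch: mark one of the two factories as primary so the auto-configured JMS
    // infrastructure has an unambiguous candidate; the JNDI lookup is the same as
    // in the question.
    @Primary
    @Bean(name = "aConnectionFactory")
    @Profile({"weblogic"})
    public ConnectionFactory aConnectionFactory() throws NamingException {
        JndiTemplate jndi = new JndiTemplate();
        return (ConnectionFactory) jndi.lookup("jms/ConnectionFactory");
    }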
The problem is that only one javax.jms.ConnectionFactory bean is expected; the consumer needs a single object of that type.
Possible solutions for your problem:
If you need two beans that produce objects extending ConnectionFactory, change their scope as needed; try @Scope("singleton") or @Scope("prototype").
If you still get the error, keep one of them as @Scope("singleton") and reconfigure the other class that is already using and setting the second factory.

Spring Boot 1.2.5.RELEASE & Spring Security 4.0.2.RELEASE - How to load configuration file before security context initialization?

I'm facing a problem with Spring: I'm migrating from Spring Security 3.2.7.RELEASE to 4.0.2.RELEASE. Everything was working fine in the older version; however, a problem occurred when it came to loading the DataSource.
Let me describe the architecture:
Application is secured with both SAML and LDAP mechanisms (SAML configuration is pretty similar to config given here: https://github.com/vdenotaris/spring-boot-security-saml-sample/blob/master/src/main/java/com/vdenotaris/spring/boot/security/saml/web/config/WebSecurityConfig.java).
They both need to connect to the database in order to get some required data. We use MyBatis with mybatis-spring to fetch the needed data. That's where the problem begins.
My DAO configuration class looks like this:
@Configuration
@EnableConfigurationProperties
@MapperScan(basePackages = { "pl.myapp" })
public class DaoConfiguration {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @Primary
    public JdbcTemplate jdbcTemplate() {
        return new JdbcTemplate(dataSource());
    }

    @Bean
    @Primary
    public SqlSessionFactoryBean sqlSessionFactoryBean() {
        SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setDataSource(dataSource());
        // some stuff happens here
        return sqlSessionFactoryBean;
    }

    @Bean
    @Primary
    public DataSourceTransactionManager transactionManager() {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    @ConfigurationProperties(prefix = "liquibase.datasource")
    @ConditionalOnProperty(name = "liquibase.enabled")
    public DataSource liquibaseDataSource() {
        DataSource liquiDataSource = DataSourceBuilder.create().build();
        return liquiDataSource;
    }
}
In the previous version it worked like a charm, but now it has a problem loading the mappers, resulting in a bean creation exception on the FactoryBean type check: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'someMapper' defined in file [<filename>]: Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Property 'sqlSessionFactory' or 'sqlSessionTemplate' are required
over and over again (it's not my problem, it's a known Spring/MyBatis bug).
I did some debugging and discovered something interesting: it looks like DaoConfiguration is not treated as a configuration here! I mean: if I add
@Bean
public SqlSessionFactory sqlSessionFactory() throws Exception {
    return sqlSessionFactoryBean().getObject();
}
to this config, then a "normal" call of the @Bean annotated method should result in the proper interceptor being invoked; here that functionality is missing.
My prediction is this: the config class has not been properly wrapped yet, and Spring Security already needs the beans it produces.
Is there any solution to properly load this configuration before Spring Security is initialized? Or am I just wrong and missing something (maybe not so) obvious?
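For context, the @Configuration proxying behaviour the question refers to can be illustrated with a small sketch (class name, bean names and the URL below are made up): in a properly processed @Configuration class, every call to a @Bean method goes through the CGLIB interceptor and returns the same singleton.

    import javax.sql.DataSource;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.datasource.DriverManagerDataSource;

    // Illustrative sketch: in "full" @Configuration mode Spring subclasses this class
    // with CGLIB, so every call to dataSource() returns the same singleton bean.
    // If the class were only picked up in "lite" mode, each call would build a new
    // instance, which matches the kind of behaviour described above.
    @Configuration
    public class ProxyCheckConfig {

        @Bean
        public DataSource dataSource() {
            return new DriverManagerDataSource("jdbc:h2:mem:proxycheck"); // hypothetical URL
        }

        @Bean
        public JdbcTemplate jdbcTemplate() {
            return new JdbcTemplate(dataSource()); // same instance as any other caller of dataSource()
        }
    }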

spring-boot-starter-jta-atomikos and spring-boot-starter-batch

Is it possible to use both these starters in a single application?
I want to load records from a CSV file into a database table. The Spring Batch tables are stored in a different database, so I assume I need to use JTA to handle the transaction.
Whenever I add @EnableBatchProcessing to my @Configuration class it configures a PlatformTransactionManager, which stops this being auto-configured by Atomikos.
Are there any spring boot + batch + jta samples out there that show how to do this?
Many Thanks,
James
I just went through this and I found something that seems to work. As you note, @EnableBatchProcessing causes a DataSourceTransactionManager to be created, which messes up everything. I'm using modular=true in @EnableBatchProcessing, so the ModularBatchConfiguration class is activated.
What I did was to stop using @EnableBatchProcessing and instead copy the entire ModularBatchConfiguration class into my project. Then I commented out the transactionManager() method, since the Atomikos configuration creates the JtaTransactionManager. I also had to override the jobRepository() method, because that was hardcoded to use the DataSourceTransactionManager created inside DefaultBatchConfiguration.
I also had to explicitly import the JtaAutoConfiguration class. This wires everything up correctly (according to the Actuator's "beans" endpoint - thank god for that). But when you run it, the transaction manager throws an exception because something somewhere sets an explicit transaction isolation level. So I also wrote a BeanPostProcessor to find the transaction manager and call txnMgr.setAllowCustomIsolationLevels(true);
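A minimal sketch of such a post-processor (the class name is made up; it just flips that flag on whatever JtaTransactionManager it finds):

    import org.springframework.beans.BeansException;
    import org.springframework.beans.factory.config.BeanPostProcessor;
    import org.springframework.stereotype.Component;
    import org.springframework.transaction.jta.JtaTransactionManager;

    // Sketch: relax the isolation-level restriction on the JTA transaction manager,
    // as described above.
    @Component
    public class IsolationLevelRelaxingPostProcessor implements BeanPostProcessor {

        @Override
        public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
            return bean;
        }

        @Override
        public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
            if (bean instanceof JtaTransactionManager) {
                ((JtaTransactionManager) bean).setAllowCustomIsolationLevels(true);
            }
            return bean;
        }
    }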
Now everything works, but while the job is running I cannot fetch the current data from the batch_step_execution table using JdbcTemplate, even though I can see the data in SQLyog. This must have something to do with transaction isolation, but I haven't been able to understand it yet.
Here is what I have for my configuration class, copied from Spring and modified as noted above. PS: I have my DataSource that points to the database with the batch tables annotated as @Primary. Also, I changed my DataSource beans to be instances of org.apache.tomcat.jdbc.pool.XADataSource; I'm not sure if that's necessary.
@Configuration
@Import(ScopeConfiguration.class)
public class ModularJtaBatchConfiguration implements ImportAware
{
    @Autowired(required = false)
    private Collection<DataSource> dataSources;

    private BatchConfigurer configurer;

    @Autowired
    private ApplicationContext context;

    @Autowired(required = false)
    private Collection<BatchConfigurer> configurers;

    private AutomaticJobRegistrar registrar = new AutomaticJobRegistrar();

    @Bean
    public JobRepository jobRepository(DataSource batchDataSource, JtaTransactionManager jtaTransactionManager) throws Exception
    {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource);
        factory.setTransactionManager(jtaTransactionManager);
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        return getConfigurer(configurers).getJobLauncher();
    }

    // @Bean
    // public PlatformTransactionManager transactionManager() throws Exception {
    //     return getConfigurer(configurers).getTransactionManager();
    // }

    @Bean
    public JobExplorer jobExplorer() throws Exception {
        return getConfigurer(configurers).getJobExplorer();
    }

    @Bean
    public AutomaticJobRegistrar jobRegistrar() throws Exception {
        registrar.setJobLoader(new DefaultJobLoader(jobRegistry()));
        for (ApplicationContextFactory factory : context.getBeansOfType(ApplicationContextFactory.class).values()) {
            registrar.addApplicationContextFactory(factory);
        }
        return registrar;
    }

    @Bean
    public JobBuilderFactory jobBuilders(JobRepository jobRepository) throws Exception {
        return new JobBuilderFactory(jobRepository);
    }

    @Bean
    // hopefully this will autowire the Atomikos JTA txn manager
    public StepBuilderFactory stepBuilders(JobRepository jobRepository, JtaTransactionManager ptm) throws Exception {
        return new StepBuilderFactory(jobRepository, ptm);
    }

    @Bean
    public JobRegistry jobRegistry() throws Exception {
        return new MapJobRegistry();
    }

    @Override
    public void setImportMetadata(AnnotationMetadata importMetadata) {
        AnnotationAttributes enabled = AnnotationAttributes.fromMap(importMetadata.getAnnotationAttributes(
                EnableBatchProcessing.class.getName(), false));
        Assert.notNull(enabled,
                "@EnableBatchProcessing is not present on importing class " + importMetadata.getClassName());
    }

    protected BatchConfigurer getConfigurer(Collection<BatchConfigurer> configurers) throws Exception {
        if (this.configurer != null) {
            return this.configurer;
        }
        if (configurers == null || configurers.isEmpty()) {
            if (dataSources == null || dataSources.isEmpty()) {
                throw new UnsupportedOperationException("You are screwed");
            } else if (dataSources != null && dataSources.size() == 1) {
                DataSource dataSource = dataSources.iterator().next();
                DefaultBatchConfigurer configurer = new DefaultBatchConfigurer(dataSource);
                configurer.initialize();
                this.configurer = configurer;
                return configurer;
            } else {
                throw new IllegalStateException("To use the default BatchConfigurer the context must contain no more than" +
                        "one DataSource, found " + dataSources.size());
            }
        }
        if (configurers.size() > 1) {
            throw new IllegalStateException(
                    "To use a custom BatchConfigurer the context must contain precisely one, found "
                            + configurers.size());
        }
        this.configurer = configurers.iterator().next();
        return this.configurer;
    }
}

@Configuration
class ScopeConfiguration {

    private StepScope stepScope = new StepScope();

    private JobScope jobScope = new JobScope();

    @Bean
    public StepScope stepScope() {
        stepScope.setAutoProxy(false);
        return stepScope;
    }

    @Bean
    public JobScope jobScope() {
        jobScope.setAutoProxy(false);
        return jobScope;
    }
}
I found a solution where I was able to keep @EnableBatchProcessing but had to implement BatchConfigurer and the Atomikos beans myself; see my full answer in this SO answer.
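A minimal sketch of that direction (names are illustrative; it assumes the batch DataSource and the Atomikos-backed JtaTransactionManager are already available as beans):

    import javax.sql.DataSource;
    import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.explore.support.JobExplorerFactoryBean;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.batch.core.launch.support.SimpleJobLauncher;
    import org.springframework.batch.core.repository.JobRepository;
    import org.springframework.batch.core.repository.support.JobRepositoryFactoryBean;
    import org.springframework.stereotype.Component;
    import org.springframework.transaction.PlatformTransactionManager;
    import org.springframework.transaction.jta.JtaTransactionManager;

    // Sketch of a custom BatchConfigurer that hands the JTA transaction manager to
    // Spring Batch instead of the DataSourceTransactionManager that
    // @EnableBatchProcessing would otherwise create.
    @Component
    public class JtaBatchConfigurer implements BatchConfigurer {

        private final DataSource batchDataSource;
        private final JtaTransactionManager jtaTransactionManager;

        public JtaBatchConfigurer(DataSource batchDataSource, JtaTransactionManager jtaTransactionManager) {
            this.batchDataSource = batchDataSource;
            this.jtaTransactionManager = jtaTransactionManager;
        }

        @Override
        public JobRepository getJobRepository() throws Exception {
            JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
            factory.setDataSource(batchDataSource);
            factory.setTransactionManager(jtaTransactionManager);
            factory.afterPropertiesSet();
            return factory.getObject();
        }

        @Override
        public PlatformTransactionManager getTransactionManager() {
            return jtaTransactionManager;
        }

        @Override
        public JobLauncher getJobLauncher() throws Exception {
            SimpleJobLauncher launcher = new SimpleJobLauncher();
            // for brevity the repository is rebuilt here; a real implementation would cache it
            launcher.setJobRepository(getJobRepository());
            launcher.afterPropertiesSet();
            return launcher;
        }

        @Override
        public JobExplorer getJobExplorer() throws Exception {
            JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
            factory.setDataSource(batchDataSource);
            factory.afterPropertiesSet();
            return factory.getObject();
        }
    }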

Spring profile, code based config and JNDI

I am moving to the new Spring 3.x code-based @Configuration style. I have done well so far, but when I tried to add a @Profile("production") to select a JNDI data source instead of the C3P0 connection pool @Profile("standalone"), I got:
Error creating bean with name 'dataSource': Requested bean is currently in creation: Is there an unresolvable circular reference?
...
Of course, there are a lot of details (I included some below for reference), but my question is: would someone kindly post or point to a working example?
Thanks,
Details:
The same configuration, as XML, works fine,
web.xml points to a root context configured in AppConfig.class and to a servlet context configured in WebConfig.class,
AppConfig has an @Import(SomeConfig.class) pulling a config from a service-layer project - dependencies are wired by Maven,
The SomeConfig class has the @Profile("standalone") on the C3P0 datasource,
The AppConfig class has the @Profile("production") on the JNDI datasource,
web.xml defines spring.profiles.default=production.
Edit: Solved
I moved the profile to the right place. Now they are in the same file and in the same project:
...
other bean definitions
/**
 * Stand-alone mode of operation.
 */
@Configuration
@Profile("standalone")
static class StandaloneProfile {

    @Autowired
    private Environment env;

    /**
     * Data source.
     */
    @Bean
    public DataSource dataSource() {
        try {
            ComboPooledDataSource ds = new ComboPooledDataSource();
            ds.setDriverClass(env.getRequiredProperty("helianto.jdbc.driverClassName"));
            ds.setJdbcUrl(env.getRequiredProperty("helianto.jdbc.url"));
            ds.setUser(env.getRequiredProperty("helianto.jdbc.username"));
            ds.setPassword(env.getRequiredProperty("helianto.jdbc.password"));
            ds.setAcquireIncrement(5);
            ds.setIdleConnectionTestPeriod(60);
            ds.setMaxPoolSize(100);
            ds.setMaxStatements(50);
            ds.setMinPoolSize(10);
            return ds;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}

/**
 * Production mode of operation.
 */
@Configuration
@Profile("production")
static class ProductionProfile {

    @Autowired
    private Environment env;

    /**
     * JNDI data source.
     */
    @Bean
    public DataSource dataSource() {
        try {
            JndiObjectFactoryBean jndiFactory = new JndiObjectFactoryBean();
            jndiFactory.setJndiName("java:comp/env/jdbc/heliantoDB");
            jndiFactory.afterPropertiesSet(); // edited: do not forget this!
            return (DataSource) jndiFactory.getObject();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
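As a usage note, the selected profile still has to be activated somewhere. Besides the spring.profiles.default entry in web.xml mentioned above, the same default can be set in code, for example (a sketch; the initializer class name is made up):

    import javax.servlet.ServletContext;
    import javax.servlet.ServletException;
    import org.springframework.web.WebApplicationInitializer;

    // Sketch: programmatic equivalent of setting spring.profiles.default in web.xml.
    public class ProfileInitializer implements WebApplicationInitializer {

        @Override
        public void onStartup(ServletContext servletContext) throws ServletException {
            servletContext.setInitParameter("spring.profiles.default", "production");
        }
    }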
