In my application, we are using Spring's
org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.
The target data sources are configured and chosen based on the request's domain URL.
E.g.:
qa.example.com ==> target datasource = DB1
qa-test.example.com ==> target datasource = DB2
The following is the configuration for this:
@Bean(name = "dataSource")
public DataSource dataSource() throws PropertyVetoException, ConfigurationException {
    EERoutingDatabase routingDB = new EERoutingDatabase();
    Map<Object, Object> targetDataSources = datasourceList();
    routingDB.setTargetDataSources(targetDataSources);
    return routingDB;
}
public class EERoutingDatabase extends AbstractRoutingDataSource {
    @Override
    protected Object determineCurrentLookupKey() {
        // This is derived from the request's URL/domain
        return SessionUtil.getDataSourceHolder();
    }
}
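For context, datasourceList() and SessionUtil are not shown here; a minimal sketch of how they are assumed to work is below (the names and details are assumptions): a ThreadLocal holds the lookup key, and a servlet filter or interceptor sets it per request from the request's host name.

public final class SessionUtil {

    private static final ThreadLocal<String> DATA_SOURCE_KEY = new ThreadLocal<>();

    // Called by a servlet filter/interceptor once the request's domain is known
    public static void setDataSourceHolder(String key) {
        DATA_SOURCE_KEY.set(key);
    }

    // Used by EERoutingDatabase.determineCurrentLookupKey(); returns null when no
    // request is active (e.g. during application startup)
    public static String getDataSourceHolder() {
        return DATA_SOURCE_KEY.get();
    }

    public static void clear() {
        DATA_SOURCE_KEY.remove();
    }
}

The important detail for what follows is that outside of a request (for example while the application context is still starting) the lookup key is null.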
The task now is to use the Quartz JDBCJobStore to store the Quartz jobs/triggers.
The preferred option is JobStoreCMT.
We used the following config:
@Configuration
public class QuartzConfig {

    private static final Logger LOG = LoggerFactory.getLogger(QuartzConfig.class);
    private static final String QUARTZ_CONFIG_FILE = "ee-quartz.properties";

    @Autowired
    private DataSource dataSource;

    @Autowired
    private PlatformTransactionManager transactionManager;

    @Autowired
    private ApplicationContext applicationContext;

    /**
     * Spring wrapper over the Quartz Scheduler bean
     */
    @Bean(name = "quartzRealTimeScheduler")
    SchedulerFactoryBean schedulerFactoryBean() {
        LOG.info("Creating QUARTZ Scheduler for real time Job invocation");
        SchedulerFactoryBean factory = new SchedulerFactoryBean();
        factory.setConfigLocation(new ClassPathResource(QUARTZ_CONFIG_FILE));
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setJobFactory(springBeanJobFactory());
        factory.setWaitForJobsToCompleteOnShutdown(true);
        factory.setApplicationContextSchedulerContextKey("applicationContext");
        return factory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        jobFactory.setIgnoredUnknownProperties("applicationContext");
        return jobFactory;
    }
}
and the following is the config in the Quartz properties file (ee-quartz.properties):
org.quartz.scheduler.instanceId=AUTO
org.quartz.jobStore.useProperties=false
org.quartz.jobStore.misfireThreshold: 60000
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
On starting the application, the following exception occurs:
Caused by: java.lang.IllegalStateException: Cannot determine target DataSource for lookup key [null]
at org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.determineTargetDataSource(AbstractRoutingDataSource.java:202) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at com.expertly.config.EERoutingDatabase.determineTargetDataSource(EERoutingDatabase.java:60) ~[EERoutingDatabase.class:na]
at org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource.getConnection(AbstractRoutingDataSource.java:164) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:111) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:77) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:289) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:329) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
at org.springframework.scheduling.quartz.LocalDataSourceJobStore.initialize(LocalDataSourceJobStore.java:149) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.quartz.impl.StdSchedulerFactory.instantiate(StdSchedulerFactory.java:1321) ~[quartz-2.2.2.jar:na]
at org.quartz.impl.StdSchedulerFactory.getScheduler(StdSchedulerFactory.java:1525) ~[quartz-2.2.2.jar:na]
at org.springframework.scheduling.quartz.SchedulerFactoryBean.createScheduler(SchedulerFactoryBean.java:599) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.scheduling.quartz.SchedulerFactoryBean.afterPropertiesSet(SchedulerFactoryBean.java:482) ~[spring-context-support-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1612) ~[spring-beans-4.0.1.RELEASE.jar:4.0.1.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1549) ~[spring-beans-4.0.1.RELEASE.jar:4.0.1.RELEASE]
It seems that:
Quartz is trying to create connections with my data source upfront.
Since my data source isn't a concrete one (it's a routing data source) and, in addition, doesn't know which target DB to connect to at config time, it fails.
Is there any provision for using Quartz with a routing DataSource? If not, what would be the next best thing?
Ideally you can try making the SchedulerFactoryBean @Lazy.
However, it seems lazy initialization will not work (there is a bug), but a workaround is listed in its comments: create the schedulerFactory bean dynamically after the ContextRefreshedEvent is received on the root context.
Let us know if this works.
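For illustration, a rough sketch of that ContextRefreshedEvent workaround, assuming the routing DataSource and transaction manager beans from the question (the class name QuartzSchedulerStarter is made up, and you will likely still need a resolvable lookup key or a default target DataSource at this point, e.g. via AbstractRoutingDataSource#setDefaultTargetDataSource):

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationListener;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.core.io.ClassPathResource;
import org.springframework.scheduling.quartz.SchedulerFactoryBean;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

@Component
public class QuartzSchedulerStarter implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private DataSource dataSource;                        // the routing DataSource

    @Autowired
    private PlatformTransactionManager transactionManager;

    private SchedulerFactoryBean factory;

    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        // Only react to the root context, and only once
        if (event.getApplicationContext().getParent() != null || factory != null) {
            return;
        }
        try {
            factory = new SchedulerFactoryBean();
            factory.setConfigLocation(new ClassPathResource("ee-quartz.properties"));
            factory.setDataSource(dataSource);
            factory.setTransactionManager(transactionManager);
            factory.setApplicationContext(event.getApplicationContext());
            factory.afterPropertiesSet();                 // initializes the Scheduler
            factory.start();                              // starts job execution
        } catch (Exception e) {
            throw new IllegalStateException("Could not start Quartz scheduler", e);
        }
    }
}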
I have a code base which is used by two different applications. Some of my Spring service classes have the @Transactional annotation. On server start I would like to disable @Transactional based on some configuration.
Below is my configuration class.
@Configuration
@EnableTransactionManagement
@PropertySource("classpath:application.properties")
public class WebAppConfig {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";

    @Resource
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        // url, userId and password are resolved elsewhere in the class (not shown here)
        dataSource.setUrl(url);
        dataSource.setUsername(userId);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager txManager() {
        DefaultTransactionDefinition def = new DefaultTransactionDefinition();
        def.setIsolationLevel(TransactionDefinition.ISOLATION_DEFAULT);
        // appName is resolved elsewhere in the class (not shown here)
        if (appName.equals("ABC")) {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_NEVER);
        } else {
            def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
        }
        CustomDataSourceTransactionManager txM = new CustomDataSourceTransactionManager(def);
        txM.setDataSource(dataSource());
        return txM;
    }

    @Bean
    public JdbcTemplate jdbcTemplate() {
        JdbcTemplate jdbcTemplate = new JdbcTemplate();
        jdbcTemplate.setDataSource(dataSource());
        return jdbcTemplate;
    }
}
I am trying to override methods in DataSourceTransactionManager to achieve this, but it still tries to commit/roll back the transaction at the end of each transaction. Since there is no database connection available, it throws an exception.
If I use @Transactional(propagation=Propagation.NEVER), everything works perfectly, but I cannot modify it because another app using the same code base needs transactions.
I would like to know if there is a way to fully disable transactions from configuration without modifying the @Transactional annotations.
I'm not sure if it will work, but you can try implementing a custom TransactionInterceptor and overriding the method that wraps an invocation in a transaction, removing the transactional handling. Something like this:
public class NoOpTransactionInterceptor extends TransactionInterceptor {

    @Override
    protected Object invokeWithinTransaction(
            Method method,
            Class<?> targetClass,
            InvocationCallback invocation
    ) throws Throwable {
        // Simply invoke the original unwrapped code
        return invocation.proceedWithInvocation();
    }
}
Then you declare a conditional bean in one of the @Configuration classes:
// assuming this property is stored in the Spring application properties file
@ConditionalOnProperty(name = "turnOffTransactions", havingValue = "true")
@Bean
@Role(BeanDefinition.ROLE_INFRASTRUCTURE)
public TransactionInterceptor transactionInterceptor(
        /* the default bean would be injected here */
        TransactionAttributeSource transactionAttributeSource
) {
    TransactionInterceptor interceptor = new NoOpTransactionInterceptor();
    interceptor.setTransactionAttributeSource(transactionAttributeSource);
    return interceptor;
}
You will probably need some additional configuration; I can't verify that right now.
I am trying to pull records from a SQL Server database and persist them into MySQL using Spring Boot and Spring Batch (JpaPagingItemReader and JpaItemWriter).
I have configured multiple data sources.
However, I am facing the error below:
org.springframework.batch.item.ItemStreamException: Error while closing item reader
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.close(AbstractItemCountingItemStreamItemReader.java:138)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.DisposableBeanAdapter.invokeCustomDestroyMethod(DisposableBeanAdapter.java:337)
at org.springframework.beans.factory.support.DisposableBeanAdapter.destroy(DisposableBeanAdapter.java:271)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroyBean(DefaultSingletonBeanRegistry.java:571)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingleton(DefaultSingletonBeanRegistry.java:543)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingleton(DefaultListableBeanFactory.java:957)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.destroySingletons(DefaultSingletonBeanRegistry.java:504)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.destroySingletons(DefaultListableBeanFactory.java:964)
at org.springframework.context.support.AbstractApplicationContext.destroyBeans(AbstractApplicationContext.java:1041)
at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1017)
at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:967)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.start(CommandLineJobRunner.java:377)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.main(CommandLineJobRunner.java:597)
at net.com.org.batch.MyApplication.main(MyApplication.java:15)
Caused by: java.lang.IllegalStateException: Session/EntityManager is closed
at org.hibernate.internal.AbstractSharedSessionContract.checkOpen(AbstractSharedSessionContract.java:344)
at org.hibernate.engine.spi.SharedSessionContractImplementor.checkOpen(SharedSessionContractImplementor.java:137)
at org.hibernate.internal.AbstractSharedSessionContract.checkOpenOrWaitingForAutoClose(AbstractSharedSessionContract.java:350)
at org.hibernate.internal.SessionImpl.close(SessionImpl.java:413)
at org.springframework.batch.item.database.JpaPagingItemReader.doClose(JpaPagingItemReader.java:232)
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.close(AbstractItemCountingItemStreamItemReader.java:135)
... 17 common frames omitted
20:10:55.875 [main] DEBUG org.springframework.beans.factory.support.DefaultListableBeanFactory - Retrieved dependent beans for bean 'jpaMappingContext': [mySqljobRepository, sqlServerLogsRepository]
Below is my batch/step configuration:
@Autowired
private StepBuilderFactory stepBuilderFactory;

@Autowired
@Qualifier("mysqlEntityManager")
private LocalContainerEntityManagerFactoryBean mysqlLocalContainerEntityManagerFactoryBean;

@Autowired
@Qualifier("secondarySqlEntityManager")
private LocalContainerEntityManagerFactoryBean localContainerEntityManagerFactoryBean;

@Autowired
@Qualifier("mysqlTransactionManager")
private PlatformTransactionManager mySqlplatformTransactionManager;

@Autowired
@Qualifier("secondaryTransactionManager")
private PlatformTransactionManager secondaryTransactionManager;

@Autowired
private JobBuilderFactory jobBuilderFactory;

@Bean
public JpaPagingItemReader itemReader(PlatformTransactionManager secondaryTransactionManager) {
    JpaPagingItemReader<SqlServerJobLogs> serverJobLogsJpaPagingItemReader = new JpaPagingItemReader<>();
    serverJobLogsJpaPagingItemReader.setMaxItemCount(1000);
    serverJobLogsJpaPagingItemReader.setPageSize(100);
    serverJobLogsJpaPagingItemReader.setEntityManagerFactory(localContainerEntityManagerFactoryBean.getNativeEntityManagerFactory());
    serverJobLogsJpaPagingItemReader.setQueryString("select p from SqlServerJobLogs p");
    return serverJobLogsJpaPagingItemReader;
}

@Bean
public ItemProcessor itemProcessor() {
    return new DataItemProcessor();
}

@Bean
public ItemWriter itemWriter(PlatformTransactionManager mySqlplatformTransactionManager) {
    DataWriter dataWriter = new DataWriter();
    return dataWriter;
}

@Bean
public Step step() {
    return stepBuilderFactory.get("myJob")
            .chunk(100)
            .reader(itemReader(secondaryTransactionManager))
            .processor(itemProcessor())
            .writer(itemWriter(mySqlplatformTransactionManager))
            .build();
}

@Bean(name = "myJob")
public Job myJob() throws Exception {
    return jobBuilderFactory.get("myJob").start(step()).build();
}

@Bean
public ResourcelessTransactionManager resourcelessTransactionManager() {
    return new ResourcelessTransactionManager();
}

@Bean
public JobRepository jobRepository() throws Exception {
    MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean(resourcelessTransactionManager());
    return mapJobRepositoryFactoryBean.getObject();
}

@Bean
public SimpleJobLauncher jobLauncher() throws Exception {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepository());
    return simpleJobLauncher;
}
I have tried to configure a BatchConfigurer, but no luck.
Please let me know if I need to configure anything else apart from the details mentioned above.
Thanks in advance.
I would like to answer the question. A big thanks to @MahmoudBenHassine.
Providing a custom transaction manager by overriding DefaultBatchConfigurer#getTransactionManager works only with Spring Batch v4.1+, as said in the comments. We need to use Spring Boot v2.1 to have Spring Batch 4.1.
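For illustration, a hedged sketch of that override on Spring Batch 4.1+ (the class name CustomBatchConfigurer is made up; the qualifier name comes from the question's configuration):

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;

@Component
public class CustomBatchConfigurer extends DefaultBatchConfigurer {

    private final PlatformTransactionManager secondaryTransactionManager;

    public CustomBatchConfigurer(
            @Qualifier("secondaryTransactionManager") PlatformTransactionManager secondaryTransactionManager) {
        this.secondaryTransactionManager = secondaryTransactionManager;
    }

    // The transaction manager returned here is the one Spring Batch uses for its steps
    @Override
    public PlatformTransactionManager getTransactionManager() {
        return this.secondaryTransactionManager;
    }
}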
I have a Spring Boot app that has to access different data sources to export data from one to another (one local data source and one remote data source).
This is what my persistence config looks like:
public class PersistenceConfig {

    @Bean
    public JdbcTemplate localJdbcTemplate() {
        return new JdbcTemplate(localDataSource());
    }

    @Bean
    public JdbcTemplate remoteJdbcTemplate() {
        return new JdbcTemplate(remoteDataSource());
    }

    @Bean
    public DataSource localDataSource() {
        HikariConfig config = new HikariConfig();
        config.setMaximumPoolSize(getLocalDbPoolSize());
        config.setMinimumIdle(5);
        config.setDriverClassName(getLocalDbDriverClassName());
        config.setJdbcUrl(getLocalDbJdbcUrl());
        config.addDataSourceProperty("user", getLocalDbUser());
        config.addDataSourceProperty("password", getLocalDbPwd());
        return new HikariDataSource(config);
    }

    @Bean
    public DataSource remoteDataSource() {
        HikariConfig config = new HikariConfig();
        config.setMaximumPoolSize(getRemoteDbPoolSize());
        config.setMinimumIdle(5);
        config.setDriverClassName(getRemoteDbDriverClassName());
        config.setJdbcUrl(getRemoteDbJdbcUrl());
        config.addDataSourceProperty("user", getRemoteDbUser());
        config.addDataSourceProperty("password", getRemoteDbPwd());
        return new HikariDataSource(config);
    }
}
But when I start my app I get this error:
Caused by: org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type 'javax.sql.DataSource' available: expected single matching bean but found 2: localDataSource,remoteDataSource
I also tried to use qualified beans, as follows:
@Bean(name = "localJdbcTemplate")
public JdbcTemplate localJdbcTemplate() {
    return new JdbcTemplate(localDataSource());
}

@Bean(name = "remoteJdbcTemplate")
public JdbcTemplate remoteJdbcTemplate() {
    return new JdbcTemplate(remoteDataSource());
}

@Bean(name = "localDataSource")
public DataSource localDataSource() {
    HikariConfig config = new HikariConfig();
    config.setMaximumPoolSize(getLocalDbPoolSize());
    config.setMinimumIdle(5);
    config.setDriverClassName(getLocalDbDriverClassName());
    config.setJdbcUrl(getLocalDbJdbcUrl());
    config.addDataSourceProperty("user", getLocalDbUser());
    config.addDataSourceProperty("password", getLocalDbPwd());
    return new HikariDataSource(config);
}

@Bean(name = "remoteDataSource")
public DataSource remoteDataSource() {
    HikariConfig config = new HikariConfig();
    config.setMaximumPoolSize(getRemoteDbPoolSize());
    config.setMinimumIdle(5);
    config.setDriverClassName(getRemoteDbDriverClassName());
    config.setJdbcUrl(getRemoteDbJdbcUrl());
    config.addDataSourceProperty("user", getRemoteDbUser());
    config.addDataSourceProperty("password", getRemoteDbPwd());
    return new HikariDataSource(config);
}
but then I got this other error:
A component required a bean of type 'org.springframework.transaction.PlatformTransactionManager' that could not be found.
- Bean method 'transactionManager' not loaded because #ConditionalOnSingleCandidate (types: javax.sql.DataSource; SearchStrategy: all) did not find a primary bean from beans 'remoteDataSource', 'localDataSource'
I also tried
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
@EnableAutoConfiguration(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
but then I got
A component required a bean of type 'org.springframework.transaction.PlatformTransactionManager' that could not be found.
You can use bean names to qualify them:
@Bean(name = "localDataSource")
public DataSource localDataSource() {
    ...
}

@Bean(name = "remoteDataSource")
public DataSource remoteDataSource() {
    ...
}
Please note: You have to do the same for your JdbcTemplate beans - just give them a name and it will work.
See the Spring JavaDoc for more information: @Bean
@Bean(name = "localJdbcTemplate")
public JdbcTemplate localJdbcTemplate() {
    return new JdbcTemplate(localDataSource());
}
When you use your JdbcTemplate beans within your export service implementation via autowiring (@Autowired), you need to use @Qualifier to qualify them (note that the two fields also need distinct names):
@Autowired
@Qualifier("localJdbcTemplate")
private JdbcTemplate localJdbcTemplate;

@Autowired
@Qualifier("remoteJdbcTemplate")
private JdbcTemplate remoteJdbcTemplate;
A bean gets its name from the method name; providing the name attribute just makes it explicit (keeping the name the same as the method name). Overall, the suggestion about @Bean(name = "...") and @Qualifier didn't fix the error for me.
I set up a sample project with two embedded databases and got the same error as the author. Spring's suggestion was to annotate one of the DataSource beans as @Primary and, in fact, this fixes the error. This usually happens when some other part of the application wants to see only one (or one primary) DataSource when several are present; a minimal sketch of this approach follows.
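A minimal sketch of the @Primary approach, assuming the same HikariCP setup as in the question (the JDBC URLs below are placeholders):

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

@Configuration
public class PersistenceConfig {

    @Bean(name = "localDataSource")
    @Primary    // picked whenever a single DataSource candidate is required
    public DataSource localDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://localhost:5432/localdb");    // placeholder
        return new HikariDataSource(config);
    }

    @Bean(name = "remoteDataSource")
    public DataSource remoteDataSource() {
        HikariConfig config = new HikariConfig();
        config.setJdbcUrl("jdbc:postgresql://remote-host:5432/remotedb"); // placeholder
        return new HikariDataSource(config);
    }
}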
What seems to be a better solution is to disable the unneeded auto-configuration beans, keeping the rest of the code as it is:
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
or:
@EnableAutoConfiguration(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
depending on which annotations are in use.
If the author doesn't use any JPA provider and operates directly with JdbcTemplate, this may be a suitable solution.
I just recently started looking into Spring and specifically its latest features, like Java config, etc.
I have this somewhat strange issue:
Java config snippet:
@Configuration
@ImportResource({"classpath*:application-context.xml", "classpath:ApplicationContext_Output.xml"})
@Import(SpringJavaConfig.class)
@ComponentScan(excludeFilters = {@ComponentScan.Filter(org.springframework.stereotype.Controller.class)}, basePackages = " com.xx.xx.x2.beans")
public class ApplicationContextConfig extends WebMvcConfigurationSupport {

    private static final Log log = LogFactory.getLog(ApplicationContextConfig.class);

    @Autowired
    private Environment env;

    @Autowired
    private IExtendedDataSourceConfig dsconfig;

    @PostConstruct
    public void initApp() {
        ...
    }

    @Bean(name = "transactionManagerOracle")
    @Lazy
    public DataSourceTransactionManager transactionManagerOracle() {
        return new DataSourceTransactionManager(dsconfig.oracleDataSource());
    }
}
IExtendedDataSourceConfig has two implementations; based on the active Spring profile, one or the other is instantiated. For this example, let's say this is the implementation:
@Configuration
@PropertySources(value = {
        @PropertySource("classpath:MYUI.properties")})
@Profile("dev")
public class MYDataSourceConfig implements IExtendedDataSourceConfig {

    private static final Log log = LogFactory.getLog(MYDataSourceConfig.class);

    @Resource
    @Autowired
    private Environment env;

    public MYDataSourceConfig() {
        log.info("creating dev datasource");
    }

    @Bean
    public DataSource oracleDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName("oracle.jdbc.driver.OracleDriver");
        dataSource.setUrl(env.getProperty("oracle.url"));
        dataSource.setUsername(env.getProperty("oracle.user"));
        dataSource.setPassword(env.getProperty("oracle.pass"));
        return dataSource;
    }
}
The problem is that when the transactionManagerOracle bean is created (even if I try to mark it as lazy), the dsconfig variable appears to be null.
I guess the @Bean methods are processed first and then all the autowiring happens; is there a fix for this? How do I either tell Spring to inject the dsconfig variable before creating the beans, or somehow create the @Bean-defined beans after dsconfig is injected?
You can just specify DataSource as a method parameter for the transaction manager bean. Spring will then automatically inject the data source that is configured in the active profile:
@Bean(name = "transactionManagerOracle")
@Lazy
public DataSourceTransactionManager transactionManagerOracle(DataSource dataSource) {
    return new DataSourceTransactionManager(dataSource);
}
If you still want to do this through the configuration class, specify it as a parameter:
public DataSourceTransactionManager transactionManagerOracle(IExtendedDataSourceConfig dsconfig) {}
Either way you declare a direct dependency on another bean, and Spring will make sure that the dependency exists and is injected.
I have a console module / app (not a webapp), which I'd like to use with a service module / app built using spring-data-neo4j.
Console App ---> Uses Spring Data Neo4j module
I'm using what I think is the standard way to configure my session, session factory and server (code pasted below), inheriting from Neo4jConfiguration.
When the console app tries to use the OGM session in the spring-data-neo4j service module, I get this error message:
Caused by: java.lang.IllegalStateException: No Scope registered for scope 'session'
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:336)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:196)
at org.springframework.aop.target.SimpleBeanTargetSource.getTarget(SimpleBeanTargetSource.java:35)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:187)
at com.sun.proxy.$Proxy51.loadAll(Unknown Source)
at nz.co.thescene.core.member.MemberAccountService.loadMemberByEmailAddressPasswordAccount(MemberAccountService.java:95)
at nz.co.thescene.console.menu.Menu.login(Menu.java:36)
at nz.co.thescene.console.menu.Menu.login(Menu.java:98)
at nz.co.thescene.console.menu.MainMenu.processUserInput(MainMenu.java:107)
at nz.co.thescene.console.menu.Menu.processUserInput(Menu.java:82)
at nz.co.thescene.console.ConsoleUI.run(ConsoleUI.java:64)
at org.springframework.boot.SpringApplication.runCommandLineRunners(SpringApplication.java:672)
... 9 more
My config is below:
@Configuration
@ComponentScan("nz.co.*****")
@EnableTransactionManagement
@EnableNeo4jRepositories(basePackages = "nz.co.*****")
@EnableConfigurationProperties(Neo4jProperties.class)
public class Neo4jConfig extends Neo4jConfiguration {

    private static final Logger log = LoggerFactory.getLogger(Neo4jConfig.class);

    @Inject
    private Neo4jProperties properties;

    @PostConstruct
    public void init() {
        log.debug("Initializing Neo4jConfig...");
    }

    @Bean
    @Override
    public Neo4jServer neo4jServer() {
        log.info("Initialising server connection");
        return new RemoteServer(properties.getUrl(), properties.getUsername(), properties.getPassword());
        //return new InProcessServer();
    }

    @Bean
    @Override
    public SessionFactory getSessionFactory() {
        log.info("Initialising Session Factory");
        return new SessionFactory("nz.co.*****");
    }

    @Bean
    @Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
    @Override
    public Session getSession() throws Exception {
        log.info("Initialising session-scoped Session Bean");
        return super.getSession();
    }
}
What do I need to do to get this to work?
Since there's no concept of a session scope in a console app, dropping @Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS) from getSession() should do it. Then you don't even need to override getSession() at all.
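For illustration, a sketch of the getSession() override with the session scope dropped (keeping this override at all is optional, since Neo4jConfiguration already provides getSession()):

@Bean
@Override
public Session getSession() throws Exception {
    // No @Scope("session") here: the Session is exposed as a plain singleton-scoped bean
    log.info("Initialising Session bean");
    return super.getSession();
}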