I'm trying to use @EnableMongoRepositories to set up two separate Mongo repositories, like this:
@Configuration
@EnableMongoRepositories(mongoTemplateRef = "mongoBOTemplate", basePackages = "sandbox.dao.bo")
public class BOMongoConfig {

    @Value("#{mongo.hostBO}")
    private String hostBO;

    @Value("#{mongo.databaseBO}")
    private String databaseBO;

    @Bean
    public MongoDbFactory mongoBODbFactory() throws Exception {
        return new SimpleMongoDbFactory(new MongoClient(hostBO), databaseBO);
    }

    @Bean
    public MongoTemplate mongoBOTemplate() throws Exception {
        return new MongoTemplate(mongoBODbFactory());
    }
}
and
@Configuration
@EnableMongoRepositories(mongoTemplateRef = "mongoTemplate", basePackages = "sandbox.dao.sandbox")
public class SandboxMongoConfig {

    @Value("#{mongo.host}")
    private String host;

    @Value("#{mongo.database}")
    private String database;

    @Bean
    public MongoDbFactory mongoDbFactory() throws Exception {
        return new SimpleMongoDbFactory(new MongoClient(host), database);
    }

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongoDbFactory());
    }
}
but I'm confused by this error:
710 [RMI TCP Connection(2)-127.0.0.1] ERROR org.springframework.web.servlet.DispatcherServlet - Context initialization failed
java.lang.IllegalArgumentException: Environment must not be null!
at org.springframework.util.Assert.notNull(Assert.java:112)
at org.springframework.data.repository.config.RepositoryConfigurationSourceSupport.<init>(RepositoryConfigurationSourceSupport.java:50)
at org.springframework.data.repository.config.AnnotationRepositoryConfigurationSource.<init>(AnnotationRepositoryConfigurationSource.java:74)
at org.springframework.data.repository.config.RepositoryBeanDefinitionRegistrarSupport.registerBeanDefinitions(RepositoryBeanDefinitionRegistrarSupport.java:74)
at org.springframework.context.annotation.ConfigurationClassParser.processImport(ConfigurationClassParser.java:340)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:233)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:154)
at org.springframework.context.annotation.ConfigurationClassParser.processImport(ConfigurationClassParser.java:349)
at org.springframework.context.annotation.ConfigurationClassParser.doProcessConfigurationClass(ConfigurationClassParser.java:233)
at org.springframework.context.annotation.ConfigurationClassParser.processConfigurationClass(ConfigurationClassParser.java:154)
at org.springframework.context.annotation.ConfigurationClassParser.parse(ConfigurationClassParser.java:140)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:282)
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:223)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:630)
As I understand it, the only option to fix this is to use @Profile. I'm using Maven for profile management and I'm not sure why I need to hardcode profiles in code...
Could anyone help me clear up this misunderstanding?
Thanks.
Well, you have to somehow tell Spring which of those configurations to use in a particular case. Otherwise, how would it be possible to decide which MongoDbFactory instance to create? So yes, use @Profile on both @Configuration classes.
Also, please note that Maven profiles are not Spring profiles. It may be that you don't have to mix Maven into this at all (if the Maven profile is only used to set the Spring one). In that case you can add -Dspring.profiles.active=profile when running your app.
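A minimal sketch of what that could look like, assuming hypothetical profile names "bo" and "sandbox":

@Configuration
@Profile("bo") // active only when the "bo" Spring profile is enabled
@EnableMongoRepositories(mongoTemplateRef = "mongoBOTemplate", basePackages = "sandbox.dao.bo")
public class BOMongoConfig {
    // mongoBODbFactory() and mongoBOTemplate() bean definitions as in the question
}

@Configuration
@Profile("sandbox") // active only when the "sandbox" Spring profile is enabled
@EnableMongoRepositories(mongoTemplateRef = "mongoTemplate", basePackages = "sandbox.dao.sandbox")
public class SandboxMongoConfig {
    // mongoDbFactory() and mongoTemplate() bean definitions as in the question
}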
This is a follow-up to the question "Making spring-data-mongodb multi-tenant".
Oliver Gierke explained options for setting up multi-tenancy in a Spring Data MongoDB application. I followed his "collection approach" and was quite successful, so far. Problems arise when I want to customise the MongoTemplate that is used. Have a look at this example:
@SpringBootApplication
public class MultiTenantMongoApplication {

    public static void main(String[] args) {
        SpringApplication.run(MultiTenantMongoApplication.class, args);
    }

    @Bean
    public MongoTemplate mongoTemplate(Mongo mongo, @Value("${random.name}") String randomName) throws Exception {
        String dbname = "db_" + randomName;
        MongoTemplate mongoTemplate = new MongoTemplate(mongo, dbname) {
            @SuppressWarnings("unused")
            public void shutdown() {
                mongo.dropDatabase(dbname);
            }
        };
        return mongoTemplate;
    }
}
@Document(collection = "#{tenantProvider.getTenantCollectionName('Metric')}")
public class Metric {
}

@Repository
public interface MetricRepository extends MongoRepository<Metric, ObjectId> {}

@Component
public class TenantProvider {
    public String getTenantCollectionName(String collectionName) {
        ...
    }
}
This yields the following error:
SpelEvaluationException: EL1007E: Property or field 'tenantProvider'
cannot be found on null
When I remove the definition of the MongoTemplate bean in the application class, everything is fine and runs as desired.
Obviously the tenantProvider property is not configured appropriately when the MongoTemplate is customised. Why is this happening? And what can I do to get the property in place?
I think the above error is because of the SpEL expression. You can try accessing the TenantProvider class using the SpEL expression below:
#{T(TenantProvider).getTenantCollectionName('Metric')}
Or you can use the fully qualified class name for TenantProvider in the above expression.
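For illustration, a sketch of what that might look like with a fully qualified class name (the package com.example.tenant is hypothetical; also note that T() resolves the class itself, so getTenantCollectionName would have to be static for this variant to work):

// Hypothetical package name, for illustration only.
@Document(collection = "#{T(com.example.tenant.TenantProvider).getTenantCollectionName('Metric')}")
public class Metric {
}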
My goal is to have integration tests that ensure there aren't too many database queries happening during lookups. (This helps us catch N+1 queries due to incorrect JPA configuration.)
I know that the database connection is correct because there are no configuration problems during the test run whenever MyDataSourceWrapperConfiguration is not included in the test. However, once it is added, the circular dependency appears (see the error below). I believe @Primary is necessary in order for the JPA/JDBC code to use the correct DataSource instance.
MyDataSourceWrapper is a custom class that tracks the number of queries that have happened for a given transaction, but it delegates the real database work to the DataSource passed in via its constructor.
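For context, here is a minimal sketch of what such a delegating wrapper might look like; the real MyDataSourceWrapper is the poster's own class, and this simplified version only counts connections via Spring's DelegatingDataSource rather than individual queries:

import java.sql.Connection;
import java.sql.SQLException;
import java.util.concurrent.atomic.AtomicInteger;

import javax.sql.DataSource;

import org.springframework.jdbc.datasource.DelegatingDataSource;

public class MyDataSourceWrapper extends DelegatingDataSource {

    private final AtomicInteger connectionCount = new AtomicInteger();

    public MyDataSourceWrapper(DataSource delegate) {
        super(delegate); // all real database work is delegated to the wrapped DataSource
    }

    @Override
    public Connection getConnection() throws SQLException {
        // Simplification: counting connections here; counting actual queries would
        // require wrapping the Statement objects as well.
        connectionCount.incrementAndGet();
        return super.getConnection();
    }

    public int getConnectionCount() {
        return connectionCount.get();
    }
}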
Error:
The dependencies of some of the beans in the application context form a cycle:
org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration
┌─────┐
| databaseQueryCounterProxyDataSource defined in me.testsupport.database.MyDataSourceWrapperConfiguration
↑ ↓
| dataSource defined in org.springframework.boot.autoconfigure.jdbc.DataSourceConfiguration$Tomcat
↑ ↓
| dataSourceInitializer
└─────┘
My Configuration:
@Configuration
public class MyDataSourceWrapperConfiguration {

    @Primary
    @Bean
    DataSource databaseQueryCounterProxyDataSource(final DataSource delegate) {
        return new MyDataSourceWrapper(delegate);
    }
}
My Test:
@ActiveProfiles({ "it" })
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration({ DatabaseConnectionConfiguration.class, DatabaseQueryCounterConfiguration.class })
@EnableAutoConfiguration
public class EngApplicationRepositoryIT {

    @Rule
    public MyDatabaseQueryCounter databaseQueryCounter = new MyDatabaseQueryCounter();

    @Rule
    public ErrorCollector errorCollector = new ErrorCollector();

    @Autowired
    MyRepository repository;

    @Test
    public void test() {
        this.repository.loadData();
        this.errorCollector.checkThat(this.databaseQueryCounter.getSelectCounts(), is(lessThan(10)));
    }
}
UPDATE: This original question was for Spring Boot 1.5. The accepted answer reflects that; however, the answer from @rajadilipkolli works for Spring Boot 2.x.
In your case you will get two DataSource instances, which is probably not what you want. Instead, use a BeanPostProcessor, which is the component actually designed for this. See also the Spring Reference Guide.
Create and register a BeanPostProcessor which does the wrapping:
public class DataSourceWrapper implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (bean instanceof DataSource) {
            return new MyDataSourceWrapper((DataSource) bean);
        }
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }
}
Then just register that as a @Bean instead of your MyDataSourceWrapper.
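A minimal sketch of that registration might look like this (the configuration class name is illustrative; declaring the bean static is an assumption on my part, to make sure the post-processor is created before the beans it wraps):

@Configuration
public class DataSourceWrapperConfiguration {

    // static so the BeanPostProcessor is instantiated early, before the
    // regular beans it needs to post-process.
    @Bean
    public static DataSourceWrapper dataSourceWrapper() {
        return new DataSourceWrapper();
    }
}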
Tip: instead of rolling your own wrapping DataSource, you might be interested in datasource-proxy combined with datasource-assert, which already has counter support etc. (saving you from maintaining your own components).
Starting from Spring Boot 2.0.0.M3, using a BeanPostProcessor won't work.
As a workaround, create your own bean like below:
@Bean
public DataSource customDataSource(DataSourceProperties properties) {
    log.info("Inside Proxy Creation");
    final HikariDataSource dataSource = (HikariDataSource) properties
            .initializeDataSourceBuilder().type(HikariDataSource.class).build();
    if (properties.getName() != null) {
        dataSource.setPoolName(properties.getName());
    }
    return ProxyDataSourceBuilder.create(dataSource).countQuery().name("MyDS")
            .logSlowQueryToSysOut(1, TimeUnit.MINUTES).build();
}
Another way is to use the datasource-proxy version of the datasource-decorator starter.
The following solution works for me using Spring Boot 2.0.6.
It uses explicit binding instead of the annotation @ConfigurationProperties(prefix = "spring.datasource.hikari").
@Configuration
public class DataSourceConfig {

    private final Environment env;

    @Autowired
    public DataSourceConfig(Environment env) {
        this.env = env;
    }

    @Primary
    @Bean
    public MyDataSourceWrapper primaryDataSource(DataSourceProperties properties) {
        DataSource dataSource = properties.initializeDataSourceBuilder().build();
        Binder binder = Binder.get(env);
        binder.bind("spring.datasource.hikari", Bindable.ofInstance(dataSource).withExistingValue(dataSource));
        return new MyDataSourceWrapper(dataSource);
    }
}
You can actually still use a BeanPostProcessor in Spring Boot 2, but it needs to return the correct type (the actual type of the declared bean). To do this you need to create a proxy of the correct type which redirects DataSource methods to your interceptor and all other methods to the original bean.
For example code, see the Spring Boot issue and discussion at https://github.com/spring-projects/spring-boot/issues/12592.
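A rough sketch of that approach, using Spring's ProxyFactory with an AOP Alliance MethodInterceptor; the class name is illustrative and this is not the exact code from the linked issue:

import javax.sql.DataSource;

import org.aopalliance.intercept.MethodInterceptor;
import org.springframework.aop.framework.ProxyFactory;
import org.springframework.beans.factory.config.BeanPostProcessor;

public class DataSourceWrappingPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        if (!(bean instanceof DataSource)) {
            return bean;
        }
        DataSource wrapper = new MyDataSourceWrapper((DataSource) bean);
        // Proxy the original class (e.g. HikariDataSource) so the declared bean type
        // is preserved, but route DataSource interface methods through the wrapper.
        ProxyFactory proxyFactory = new ProxyFactory(bean);
        proxyFactory.setProxyTargetClass(true);
        proxyFactory.addAdvice((MethodInterceptor) invocation -> {
            if (invocation.getMethod().getDeclaringClass() == DataSource.class) {
                return invocation.getMethod().invoke(wrapper, invocation.getArguments());
            }
            return invocation.proceed();
        });
        return proxyFactory.getProxy();
    }
}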
When configuring MongoDB in Spring, the reference documentation says to register MongoDB like this:
@Configuration
public class AppConfig {

    /*
     * Use the standard Mongo driver API to create a com.mongodb.Mongo instance.
     */
    public @Bean Mongo mongo() throws UnknownHostException {
        return new Mongo("localhost");
    }
}
but this pollutes the code with the UnknownHostException checked exception. The use of the checked exception is not desirable, as Java-based bean metadata uses methods as a means to set object dependencies, making the calling code cluttered.
So Spring proposes:
@Configuration
public class AppConfig {

    /*
     * Factory bean that creates the com.mongodb.Mongo instance
     */
    public @Bean MongoFactoryBean mongo() {
        MongoFactoryBean mongo = new MongoFactoryBean();
        mongo.setHost("localhost");
        return mongo;
    }
}
But unfortunately, since Spring Data MongoDB 1.7, MongoFactoryBean has been deprecated and replaced by MongoClientFactoryBean.
So:
@Bean
public MongoClientFactoryBean mongoClientFactoryBean() {
    MongoClientFactoryBean factoryBean = new MongoClientFactoryBean();
    factoryBean.setHost("localhost");
    return factoryBean;
}
Then it's time to configure MongoDbFactory, which has only one implementation, SimpleMongoDbFactory. SimpleMongoDbFactory has only two constructors that are not deprecated, one of which is SimpleMongoDbFactory(MongoClient, databaseName).
But MongoClientFactoryBean can only return the type Mongo instead of MongoClient.
So, am I missing something to make this pure Spring configuration work?
Yes, it returns a Mongo :-(
But as MongoClient extends Mongo, that'll be OK anyway; just @Autowire the bean as a Mongo:
@Autowired
private Mongo mongo;
Then use it
MongoOperations mongoOps = new MongoTemplate(mongo, "databaseName");
Do you really need the SimpleMongoDbFactory? See this post.
In my case, I'm using the following code to create the MongoTemplate. I'm using MongoRepository, and as it only needs a MongoTemplate, I need to create the MongoTemplate bean only.
@Bean
public MongoTemplate mongoTemplate() throws Exception {
    MongoClient mongoClient = new MongoClient("localhost");
    MongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongoClient, "kyc_reader_test");
    return new MongoTemplate(mongoDbFactory);
}
In my configuration file, I've added:
@EnableMongoRepositories(basePackages = "mongo.repository.package.name")
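Putting the two pieces together, a minimal configuration class might look roughly like this (the class name MongoConfig is illustrative; the package and database names are the placeholders used above):

@Configuration
@EnableMongoRepositories(basePackages = "mongo.repository.package.name")
public class MongoConfig {

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        MongoClient mongoClient = new MongoClient("localhost");
        MongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongoClient, "kyc_reader_test");
        return new MongoTemplate(mongoDbFactory);
    }
}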
I am using Spring for IoC and transaction management, and am planning to use Apache Shiro as the security library.
Whenever I want to check a user's permissions, I call subject.isPermitted("right"), whereupon Shiro checks for the permission using a datastore. Within these calls a database connection is established, and I have annotated the method with @Transactional. However, I always receive an error that there is no Hibernate session bound to the thread whenever I execute the permission check.
The method is in the Realm class. I defined a custom Shiro Realm class:
@Component
public class MainRealm extends AuthorizingRealm {

    @Autowired
    protected SysUserDao userDao;

    @Transactional
    protected AuthenticationInfo doGetAuthenticationInfo(AuthenticationToken token)
            throws AuthenticationException {
        ...
        final SysUser user = this.userDao.findByUsername(un);
        ...
        return authInfo;
    }

    @Transactional
    protected AuthorizationInfo doGetAuthorizationInfo(PrincipalCollection principals) {
        ...
        permissions = this.userDao.getAccessRights(un);
        ...
        return authInfo;
    }
}
Apache Shiro uses a Servlet Filter, so I have the following defined in web.xml:
<filter>
    <filter-name>shiroFilter</filter-name>
    <filter-class>org.springframework.web.filter.DelegatingFilterProxy</filter-class>
    <init-param>
        <param-name>targetFilterLifecycle</param-name>
        <param-value>true</param-value>
    </init-param>
</filter>
I am using programmatic configuration for Spring. Here is my App Config class:
@Configuration // Replaces Spring XML configuration
@ComponentScan(basePackages = "com.mycompany")
@EnableTransactionManagement // Enables declarative transaction annotations
public class SpringAppConfig {

    @Bean
    public DataSource sqlServerDataSource() throws Exception {...}

    @Bean
    @Autowired
    public PlatformTransactionManager transactionManager(SessionFactory sessionFactory) {...}

    @Bean
    public AnnotationSessionFactoryBean getSessionFactory() throws Exception {...}

    @Bean
    public static PersistenceExceptionTranslationPostProcessor exceptionTranslation() {...}

    @Bean
    @Autowired
    public DefaultWebSecurityManager securityManager(MainRealm mainRealm) {
        final HashedCredentialsMatcher hcm = new HashedCredentialsMatcher(shiroHash);
        hcm.setHashIterations(shiroIter);
        hcm.setStoredCredentialsHexEncoded(shiroHexEncoded);
        mainRealm.setCredentialsMatcher(hcm);
        final DefaultWebSecurityManager sm = new DefaultWebSecurityManager();
        sm.setRealm(mainRealm);
        return sm;
    }

    @Bean
    @Autowired
    public ShiroFilterFactoryBean shiroFilter(DefaultWebSecurityManager securityManager) {
        final ShiroFilterFactoryBean filter = new ShiroFilterFactoryBean();
        filter.setSecurityManager(securityManager);
        return filter;
    }

    /**
     * This method needs to be static due to issues defined here:<br>
     * https://issues.apache.org/jira/browse/SHIRO-222
     */
    @Bean
    public static LifecycleBeanPostProcessor lifecycleBeanPostProcessor() {
        LifecycleBeanPostProcessor lbpp = new LifecycleBeanPostProcessor();
        return lbpp;
    }

    @Bean
    @DependsOn("lifecycleBeanPostProcessor")
    public static DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    @Autowired
    public AuthorizationAttributeSourceAdvisor authorizationAttributeSourceAdvisor(DefaultWebSecurityManager secMan) {
        AuthorizationAttributeSourceAdvisor advBean = new AuthorizationAttributeSourceAdvisor();
        advBean.setSecurityManager(secMan);
        return advBean;
    }
}
To summarize the issue, I believe my MainRealm class is being wired properly (it has an @Autowired dependency on a DAO object and I verified that it is not null), with the exception of the @Transactional annotation. Because of this, I cannot directly call user.isPermitted("") as it produces the error: No Hibernate Session bound to thread, and configuration does not allow creation of non-transactional one here.
I'd like to ask for help in checking whether I missed anything in my Spring configuration.
In the meantime I have worked around this issue by calling the user.isPermitted("") function within a method in my service class that is correctly bound by @Transactional.
EDIT: When I checked the logs for Spring initialization, I can see this:
Bean 'mainRealm' of type [class com.x.security.MainRealm] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
According to this SO answer, it means MainRealm isn't being post-processed by the transaction manager, therefore any @Transactional annotations get ignored. If so, how can I correct this?
EDIT 2: According to this SO question: "In other words, if I write my own BeanPostProcessor, and that class directly references other beans in the context, then those referenced beans will not be eligible for auto-proxying, and a message is logged to that effect." I just checked ShiroFilterFactoryBean and it is in fact a BeanPostProcessor. The problem is that it requires a SecurityManager instance, which in turn requires a MainRealm instance. So both beans are autowired and are thus ineligible for proxying. I feel like I'm closer to the solution but I still can't resolve it.
The root cause of the issue is in fact due to the following:
All BeanPostProcessors and their directly referenced beans will be instantiated on startup... Since AOP auto-proxying is implemented as a BeanPostProcessor itself, no BeanPostProcessors or directly referenced beans are eligible for auto-proxying (and thus will not have aspects 'woven' into them.
Referenced SO question is here.
I have resolved this issue by decoupling the Realm bean creation from the SecurityManager bean creation.
The relevant change is from the following code:
@Bean
@Autowired
public DefaultWebSecurityManager securityManager(MainRealm mainRealm) {
    final HashedCredentialsMatcher hcm = new HashedCredentialsMatcher(shiroHash);
    hcm.setHashIterations(shiroIter);
    hcm.setStoredCredentialsHexEncoded(shiroHexEncoded);
    mainRealm.setCredentialsMatcher(hcm);
    final DefaultWebSecurityManager sm = new DefaultWebSecurityManager();
    sm.setRealm(mainRealm);
    return sm;
}
to the following code:
@Bean
public DefaultWebSecurityManager securityManager() {
    final DefaultWebSecurityManager sm = new DefaultWebSecurityManager();
    //sm.setRealm(mainRealm); -> set this AFTER Spring initialization so there are no dependencies
    return sm;
}
Then I use a ServletContextListener which listens for when the Spring context initialization completes, at which point I have both the MainRealm and SecurityManager beans. Then I just plug one bean into the other.
@Override
public void contextInitialized(ServletContextEvent sce) {
    try {
        // Initialize realms
        final MainRealm mainRealm = (MainRealm) ctx.getBean("mainRealm");
        final DefaultWebSecurityManager sm = (DefaultWebSecurityManager) ctx.getBean("securityManager");
        sm.setRealm(mainRealm);
    } catch (Exception e) {
        System.out.println("Error loading: " + e.getMessage());
        throw new Error("Critical system error", e);
    }
}
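(In the listener above, ctx is not shown being obtained. Assuming the listener runs inside the same web application, one way to look it up is via WebApplicationContextUtils:)

// org.springframework.web.context.support.WebApplicationContextUtils
ApplicationContext ctx =
        WebApplicationContextUtils.getRequiredWebApplicationContext(sce.getServletContext());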
The @Transactional annotation can be used on:
An interface definition
An interface method
A class definition
A PUBLIC method of a class
as explained in the documentation.
The fact that your method is protected must be the reason for your problem; your service method was probably declared as public, which would explain why it worked in that case.
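For illustration, a rough sketch of that idea: move the data access behind a public @Transactional method on a separate Spring-managed service and have the realm call that service instead of the DAO directly (the service name and the Set<String> return type are assumptions, not from the original code):

@Service
public class AccessRightsService {

    @Autowired
    private SysUserDao userDao;

    // Public method on a Spring-managed bean, so the transactional proxy applies.
    // The Set<String> return type is assumed for the sake of the example.
    @Transactional(readOnly = true)
    public Set<String> loadAccessRights(String username) {
        return userDao.getAccessRights(username);
    }
}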
I'm trying to build a Spring 3.1 PropertySource which reads its values from Zookeeper nodes. For connecting to Zookeeper I am using Curator from Netflix.
For that I've built a custom property source which reads the value of a property from Zookeeper and returns it. This works fine when I resolve the property like this:
ZookeeperPropertySource zkPropertySource = new ZookeeperPropertySource(zkClient);
ctx.getEnvironment().getPropertySources().addLast(zkPropertySource);
ctx.getEnvironment().getProperty("foo"); // returns 'from zookeeper'
However, when I try to instantiate a bean which has a field with an @Value annotation, this fails:
@Component
public class MyBean {
    @Value("${foo}") public String foo;
}

MyBean b = ctx.getBean(MyBean.class); // fails with BeanCreationException
This problem most likely has nothing to do with Zookeeper, but rather with the way I'm registering the property sources and creating the beans.
Any insight is highly appreciated.
Update 1:
I'm creating the app context from an XML file like this:
public class Main {
    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
        ctx.registerShutdownHook();
    }
}
The class which connects to Zookeeper is a @Component:
@Component
public class Server {

    CuratorFramework zkClient;

    public void connectToZookeeper() {
        zkClient = ... (curator magic) ...
    }

    public void registerPropertySource() {
        ZookeeperPropertySource zkPropertySource = new ZookeeperPropertySource(zkClient);
        ctx.getEnvironment().getPropertySources().addLast(zkPropertySource);
        ctx.getEnvironment().getProperty("foo"); // returns 'from zookeeper'
    }

    @PostConstruct
    public void start() {
        connectToZookeeper();
        registerPropertySource();
        MyBean b = ctx.getBean(MyBean.class);
    }
}
Update 2
This seems to work when I'm using XML-less configuration, i.e. @Configuration, @ComponentScan and @PropertySource in combination with an AnnotationConfigApplicationContext. Why doesn't it work with a ClassPathXmlApplicationContext?
@Configuration
@ComponentScan("com.goleft")
@PropertySource({"classpath:config.properties", "classpath:version.properties"})
public class AppConfig {

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
Answering your Update 2: this does not work with your original configuration (registering a PropertySource using @PostConstruct) because the PropertySource is being registered too late; by this time your target bean has already been constructed and initialized.
Typically the injection of placeholders happens via a BeanFactoryPostProcessor, which runs very early in the Spring lifecycle (beans have not been created at this stage), and if a PropertySource is registered at that stage, then placeholders can be resolved.
The best approach, though, is to use an ApplicationContextInitializer, get a handle on the application context, and register the PropertySource there:
public class CustomInitializer implements ApplicationContextInitializer<ConfigurableWebApplicationContext> {

    @Override
    public void initialize(ConfigurableWebApplicationContext ctx) {
        ZookeeperPropertySource zkPropertySource = new ZookeeperPropertySource(zkClient);
        ctx.getEnvironment().getPropertySources().addFirst(zkPropertySource);
    }
}
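For a standalone ClassPathXmlApplicationContext like the one in the question's Main class, the same "register before refresh" idea can be applied directly. A rough sketch, where the Zookeeper connection string and retry policy are illustrative and ZookeeperPropertySource is the class from the question:

public class Main {
    public static void main(String[] args) throws Exception {
        // Connect to Zookeeper first (connection string and retry policy are illustrative).
        CuratorFramework zkClient = CuratorFrameworkFactory.newClient(
                "localhost:2181", new ExponentialBackoffRetry(1000, 3));
        zkClient.start();

        // Create the context without refreshing it, so the property source is registered
        // before any beans (and their placeholders) are processed.
        ClassPathXmlApplicationContext ctx =
                new ClassPathXmlApplicationContext(new String[] { "applicationContext.xml" }, false);
        ctx.getEnvironment().getPropertySources().addFirst(new ZookeeperPropertySource(zkClient));
        ctx.refresh();
        ctx.registerShutdownHook();
    }
}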