I'm trying to configure an Apache Ignite instance for Hibernate JCache in a Spring Boot application, and I'm trying to access Spring properties (bootstrap.yml) in order to configure it externally.
Whenever Apache Ignite starts up using org.apache.ignite.cache.CachingProvider with ignite.xml as its configuration, it ends up creating its own Spring application context that has no relationship with any other application context.
The ApplicationContext creation happens inside the loadConfigurations method of the IgniteSpringHelperImpl class.
This makes it impossible for me to use properties during the configuration phase, since those properties are only accessible from the default Spring Boot application context.
Here is my bootstrap.yml configuration that triggers the startup of Hibernate with JCache and Ignite as the implementation:
hibernate:
  format_sql: false
  generate_statistics: true
  globally_quoted_identifiers: true
  globally_quoted_identifiers_skip_column_definitions: true
  cache:
    region:
      factory_class: jcache
    use_second_level_cache: true
    use_query_cache: true
  javax:
    cache:
      provider: org.apache.ignite.cache.CachingProvider
      uri: classpath:ignite.xml
  dialect:
    org.hibernate.dialect.MySQL5Dialect # Allows Hibernate to generate SQL optimized for a particular DBMS
And here is the part of my ignite.xml configuration where I try to inject properties.
The kubernetesConnectionConfiguration bean is where I'm trying to inject the properties (into namespace and serviceName), but I'm really doing so inside the ReplicatedCacheIgniteConfiguration class, where the same bean is declared as follows (I left those lines commented out in the XML since that approach is not working).
<!-- <context:annotation-config/>-->
<!-- <bean class="com.xxxxx.xxxxxx.infrastructure.cache.ReplicatedCacheIgniteConfiguration"/>-->
<bean id="kubernetesConnectionConfiguration" class="org.apache.ignite.kubernetes.configuration.KubernetesConnectionConfiguration">
<property name="namespace" value="default"/>
<property name="serviceName" value="xxxxxx"/>
</bean>
import org.apache.ignite.kubernetes.configuration.KubernetesConnectionConfiguration;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ReplicatedCacheIgniteConfiguration {

    @Value("${ignite.k8s.namespace:default}")
    private String namespace;

    @Value("${ignite.k8s.serviceName:xxxxxx}")
    private String serviceName;

    @Bean
    public KubernetesConnectionConfiguration getKubernetesConnectionConfiguration() {
        return new KubernetesConnectionConfiguration()
                .setNamespace(namespace)
                .setServiceName(serviceName);
    }
}
Any idea how to solve this?
Update:
I managed to read bootstrap.yml from the Ignite application context as follows:
<bean id="yamlProperties" class="org.springframework.beans.factory.config.YamlPropertiesFactoryBean">
<property name="resources" value="classpath:bootstrap.yml"/>
</bean>
<context:property-placeholder properties-ref="yamlProperties"/>
<context:annotation-config/>
<bean class="com.playspace.mc_minigames.infrastructure.cache.ReplicatedCacheIgniteConfiguration"/>
So, inside ReplicatedCacheIgniteConfiguration I can now read properties from the yamlProperties bean.
The remaining problem is that bootstrap.yml is supposed to be updated by a Kubernetes ConfigMap, and by the time Apache Ignite starts up, that hasn't happened yet.
I have modified my original question slightly to better reflect my intent. I have a non-Spring Boot application that I would like to have working with the Spring Cloud Config Server. I have searched around online and tried many things, but it seems the crux of my issue is that the server only works within a Spring Boot context. Although ActiveMQ is already a Spring application, it seems non-trivial to convert it into a Spring Boot one.
I would like to have an ActiveMQ broker that is configured from Spring Cloud Config. My local settings in application.properties should be replaced by those that come from the server. This works in other servers I work on; now I need it to work for my Broker Filter plugins.
I added the following to activemq.xml:
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="locations">
<list>
<value>classpath:application.properties</value>
<value>file:${activemq.conf}/credentials.properties</value>
</list>
</property>
</bean>
NOTE: Several base packages omitted here but are similar to:
<context:component-scan base-package="org.springframework.cloud.bootstrap"/>
<!-- enables annotation based configuration -->
<context:annotation-config />
After doing so I was able to get various @Value annotations to work with settings coming from my application.properties, but the Spring Cloud Config Server still does not replace my local application.properties settings. Other Spring Boot applications I work on do this, so I know the server is fine.
I have added the following jars to the apache-activemq-5.12.0\lib\extra directory:
spring-aop-4.1.8.RELEASE.jar
spring-beans-4.1.8.RELEASE.jar
spring-boot-1.2.7.RELEASE.jar
spring-boot-actuator-1.2.7.RELEASE.jar
spring-boot-autoconfigure-1.2.7.RELEASE.jar
spring-boot-starter-1.2.7.RELEASE.jar
spring-boot-starter-actuator-1.2.7.RELEASE.jar
spring-boot-starter-data-mongodb-1.2.7.RELEASE.jar
spring-boot-starter-logging-1.2.7.RELEASE.jar
spring-cloud-commons-1.0.1.RELEASE.jar
spring-cloud-config-client-1.0.1.RELEASE.jar
spring-cloud-context-1.0.1.RELEASE.jar
spring-cloud-starter-1.0.1.RELEASE.jar
spring-cloud-starter-config-1.0.1.RELEASE.jar
spring-context-4.1.8.RELEASE.jar
spring-context-support-4.1.8.RELEASE.jar
spring-core-4.1.8.RELEASE.jar
spring-data-commons-1.11.0.RELEASE.jar
spring-data-mongodb-1.8.0.RELEASE.jar
spring-expression-4.1.8.RELEASE.jar
spring-jms-4.1.8.RELEASE.jar
spring-security-crypto-3.2.8.RELEASE.jar
spring-test-4.1.8.RELEASE.jar
spring-tx-4.1.8.RELEASE.jar
spring-web-4.1.8.RELEASE.jar
refreshendpoint is not necessarily initialized when the constructor is called. You need to add a method annotated with @PostConstruct (or implement InitializingBean and its afterPropertiesSet method) in which you perform refreshendpoint.refresh(), e.g.:
@PostConstruct
void init() {
    refreshendpoint.refresh();
}
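For reference, a minimal sketch of the InitializingBean variant mentioned above (the concrete type of refreshendpoint is not shown in the answer, so RefreshEndpoint from spring-cloud-context is an assumption here, as are the class and bean names):
import org.springframework.beans.factory.InitializingBean;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.endpoint.RefreshEndpoint;
import org.springframework.stereotype.Component;

@Component
public class RefreshOnStartup implements InitializingBean {

    // Assumed type: the answer only shows the field name refreshendpoint.
    @Autowired
    private RefreshEndpoint refreshendpoint;

    @Override
    public void afterPropertiesSet() {
        // Runs after dependency injection is complete, so the endpoint is available here.
        refreshendpoint.refresh();
    }
}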
I am installing Activiti 5.17.0's Activiti Explorer and would like to use a JNDI-based datasource configuration to connect to an Oracle DB. The documentation I found here: http://www.activiti.org/userguide/#jndiDatasourceConfig is very explicit about making this change, but unfortunately the docs seem to be obsolete.
In particular, I found no activiti-standalone-context.xml and no activiti-context.xml at the mentioned places. I assume it got changed to activiti-custom-context.xml, but the whole content of this Spring configuration is commented out (which makes me wonder where the actual Spring config might come from).
I tried to configure the datasource in this file anyway using this approach:
<jee:jndi-lookup id="dataSource"
jndi-name="jdbc/activiti-ds"
expected-type="javax.sql.DataSource" />
and this approach as well:
<bean id="dataSource" class="org.springframework.jndi.JndiObjectFactoryBean">
<property name="proxyInterface" value="javax.sql.DataSource"/>
<property name="jndiName"><value>jdbc/activiti-ds</value></property>
</bean>
but both my attempts ended in the same ClassCastException, claiming that the generated proxy class is not an instance of javax.sql.DataSource:
java.lang.ClassCastException: org.springframework.jndi.JndiObjectFactoryBean$$EnhancerBySpringCGLIB$$69ba43af cannot be cast to javax.sql.DataSource
    at org.activiti.explorer.conf.ActivitiEngineConfiguration$$EnhancerBySpringCGLIB$$5db7207e.dataSource(<generated>)
    at org.activiti.explorer.conf.ActivitiEngineConfiguration.processEngineConfiguration(ActivitiEngineConfiguration.java:91)
Any hints on how to accomplish this task? Maybe a pointer to up-to-date documentation?
For further reference, I solved the problem by editing the Spring JavaConfig in ActivitiEngineConfiguration.java and replacing the dataSource bean creation there with the following code:
// Imports needed in ActivitiEngineConfiguration:
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;

@Bean
public DataSource dataSource() {
    final JndiDataSourceLookup dsLookup = new JndiDataSourceLookup();
    dsLookup.setResourceRef(true);
    DataSource dataSource = dsLookup.getDataSource("jdbc/activiti-ds");
    return dataSource;
}
After recompiling the module and deploying, it seems to work flawlessly.
Thanks a lot to Greg Harley above, whose questions and comments helped solve the problem!
The Activiti user guide includes updated instructions on how to configure a JNDI datasource here: http://www.activiti.org/userguide/#jndiDatasourceConfig
You will need to configure a datasource bean in the ActivitiEngineConfiguration class of your web application and update the following line of code to reference your new datasource:
processEngineConfiguration.setDataSource(dataSource());
If you want to continue to use the Spring XML configuration, you can still define your custom beans in the activiti-custom-context.xml.
How do I properly configure Flyway when integrating with Spring? I see there is a configure method that takes Properties, but from the Spring XML side it would take a setter method to provide a way to inject a Properties instance.
I could write my own POJO to delegate the configuration to the Flyway instance, but it somehow feels like I have missed something.
Here is my configuration:
<bean id="flyway"
      class="com.googlecode.flyway.core.Flyway"
      init-method="migrate"
      lazy-init="false"
      depends-on="dataSource">
    <property name="dataSource" ref="dataSource" />
    <property name="locations" value="classpath:/META-INF/migrations" />
</bean>
I would like to provide a dedicated property file for the migration configuration as documented here:
https://github.com/flyway/flyway/blob/master/flyway-commandline/src/main/assembly/flyway.properties
From the Javadoc I see that I can set most of the properties. I could work with Spring ${} property placeholders and load the property file with the built-in mechanisms, but this would make those properties available to all beans, and I would have to add each property I need one by one.
My wrapper would provide a setter so I could add the following to my Spring XML config:
<property name="configLocations" value="classpath:/META-INF/flyway.properties" />
Any thoughts appreciated.
Spring's MethodInvokingFactoryBean should do what you want.
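For illustration, here is a minimal sketch of that idea, shown as Java config for brevity (the bean names, the flyway.properties location, and the assumption that the Flyway bean does not also declare init-method="migrate" are mine, not part of this answer):
import java.io.IOException;
import java.util.Properties;

import org.springframework.beans.factory.config.MethodInvokingFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.DependsOn;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.support.PropertiesLoaderUtils;

import com.googlecode.flyway.core.Flyway;

@Configuration
public class FlywayInvokerConfig {

    // Loads the dedicated property file mentioned in the question.
    @Bean
    public Properties flywayProperties() throws IOException {
        return PropertiesLoaderUtils.loadProperties(
                new ClassPathResource("META-INF/flyway.properties"));
    }

    // Calls flyway.configure(Properties) through MethodInvokingFactoryBean.
    @Bean
    public MethodInvokingFactoryBean flywayConfigure(Flyway flyway) throws IOException {
        MethodInvokingFactoryBean invoker = new MethodInvokingFactoryBean();
        invoker.setTargetObject(flyway);
        invoker.setTargetMethod("configure");
        invoker.setArguments(new Object[] { flywayProperties() });
        return invoker;
    }

    // Runs the migration only after configure() has been applied.
    @Bean
    @DependsOn("flywayConfigure")
    public MethodInvokingFactoryBean flywayMigrate(Flyway flyway) {
        MethodInvokingFactoryBean invoker = new MethodInvokingFactoryBean();
        invoker.setTargetObject(flyway);
        invoker.setTargetMethod("migrate");
        return invoker;
    }
}
The same three bean definitions can be expressed in XML; the important part is that the configure invoker runs before migrate, which is what the depends-on relationship enforces.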
Alternatively, you can create a migration based on JdbcTemplate using Flyway's SpringJdbcMigration. The following example is copied from the Flyway documentation:
import com.googlecode.flyway.core.api.migration.spring.SpringJdbcMigration;
import org.springframework.jdbc.core.JdbcTemplate;

public class V1_2__Another_user implements SpringJdbcMigration {

    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        jdbcTemplate.execute("INSERT INTO test_user (name) VALUES ('Obelix')");
    }
}
You should use Spring annotations and wrap the Flyway class, then do whatever you want there, for instance configure the Flyway properties. This blog post may give you an example of how to do it: http://esofthead.com/migrate-database-highly-change-environment-multiple-versions-management/
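As a rough sketch of that wrapping idea (the class name, the @PostConstruct trigger, and the flyway.properties path are assumptions of mine, not taken from the linked post):
import java.io.IOException;
import java.util.Properties;

import javax.annotation.PostConstruct;
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ClassPathResource;
import org.springframework.core.io.support.PropertiesLoaderUtils;
import org.springframework.stereotype.Component;

import com.googlecode.flyway.core.Flyway;

@Component
public class FlywayMigrator {

    @Autowired
    private DataSource dataSource;

    @PostConstruct
    public void migrate() throws IOException {
        Flyway flyway = new Flyway();
        flyway.setDataSource(dataSource);
        // Apply the dedicated property file instead of wiring every setting individually.
        Properties props = PropertiesLoaderUtils.loadProperties(
                new ClassPathResource("META-INF/flyway.properties"));
        flyway.configure(props);
        flyway.migrate();
    }
}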
We have a large application written in Spring 3. I need to write a JUnit test checking the behavior of some service. It is not a unit but part of a system: there are several services and repositories working together inside it, so there are lots of injected beans. The app also uses aspects.
My question is: how do I manage configs and beans in this kind of test? I need to use the beans defined in the app configs and, in the tests, redefine only the beans that use persistence so they work with an embedded DB. So I need to use the beans from src as they are defined and override only the ones causing trouble (persistence beans, beans using web services, ...).
In the test package I made a config class defining beans for persistence, using a datasource for HSQL. But I don't know what to do next. I tried to annotate the test config class with:
@Configuration
@EnableAspectJAutoProxy
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ, proxyTargetClass = true)
@ComponentScan(basePackages = "com.example.our.app")
public class MyTestConfig implements TransactionManagementConfigurer {
to scan the whole application and use the bean configuration from the src folder. But this also picks up configs from other tests, causing problems. Is this a good strategy at all? What now - use excludeFilters to remove the other test configs? Or is this strategy bad altogether?
thanks
You can selectively overwrite beans with the context merging functionality supplied by the @ContextHierarchy annotation.
In order to get this working for your use case you will have to create a base context that scans your app for Spring beans:
@Configuration
@ComponentScan({"com.example.our.app"})
public class MyTestConfig implements TransactionManagementConfigurer {
Then create a base class that utilizes this context and gives it a name (this won't work with unnamed contexts):
@RunWith(SpringJUnit4ClassRunner.class)
@ContextHierarchy({
    @ContextConfiguration(name = "testContext", classes = MyTestConfig.class),
})
public class BaseTest {
And finally write a unit test that extends the base class and defines a new context under the same name to overwrite individual beans with a test specific configuration:
@ContextHierarchy(@ContextConfiguration(name = "testContext", classes = OneUnitTest.Config.class))
public class OneUnitTest extends BaseTest {

    @Configuration
    static class Config {
        ..
    }
}
I think the best approach here is to use Spring profiles.
Check here for how to use H2 in tests with profiles.
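As a rough sketch of the profile approach (the names are illustrative; the question mentions HSQL, which works the same way via EmbeddedDatabaseType.HSQL):
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
@Profile("test")
public class TestPersistenceConfig {

    // Replaces the production dataSource bean when the "test" profile is active.
    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.H2)
                .build();
    }
}
The test class then activates it with @ActiveProfiles("test") and includes this class in its @ContextConfiguration alongside the application config.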
You can also override beans with another import:
<beans>
    <import resource="classpath*:applicationContext.xml" />
    <bean id="dataSourceFactory" class="com.demo.MyNewClass"/>
</beans>
And in your class, if you retrieve the bean:
this.applicationContext.getBean("dataSourceFactory");
you will see the instance of the new class.
Further
<bean id="dataSourceFactory" class="org.springframework.jdbc.datasource.DriverManagerDataSource" >
<property name="driverClassName" value="${jdbc.driverClassName}"/>
<property name="url" value="${jdbc.url}"/>
<property name="username" value="${jdbc.username}"/>
<property name="password" value="${jdbc.password}"/>
</bean>
So there are different ways you can override the default behaviour.
I want to inject a bean which will persist the map entries in Hazelcast.
<map name="storethiselements-map">
<backup-count>1</backup-count>
<map-store enabled="true">
<class-name>name.of.MapPersistenceObject</class-name>
<write-delay-seconds>0</write-delay-seconds>
</map-store>
</map>
These are constructor args for the Hazelcast instance.
In MapPersistenceObject there is a service which is responsible for persisting the entries. I have marked MapPersistenceObject as a component and autowired the service object so that Spring will inject the right service bean with the right datasource.
I have tried this, but I get a NullPointerException where the service should be injected. It seems to me that Spring can't autowire the service into MapPersistenceObject. It looks like this:
@Component
public class MapPersistenceObject implements
        MapLoader<Long, DeviceWakeupAction>, MapStore<Long, DeviceWakeupAction> {

    @Autowired
    StoreMapEntries storeMapEntriesService;

    [...]
Maybe somebody knows a solution to the problem?
regards && tia
noircc
You should use Spring configuration, not Hazelcast XML configuration.
<hz:hazelcast id="hazelcast">
    <hz:config>
        ...
        <hz:map name="storethiselements-map" backup-count="1">
            <hz:map-store enabled="true" implementation="mapPersistenceObject" write-delay-seconds="0"/>
        </hz:map>
        ...
    </hz:config>
</hz:hazelcast>
<bean id="mapPersistenceObject" class="name.of.MapPersistenceObject"/>
See Hazelcast Spring integration.