Spring, autowire @Value from a database - spring

I am using a properties File to store some configuration properties, that are accessed this way:
#Value("#{configuration.path_file}")
private String pathFile;
Is it possible (with Spring 3) to use the same @Value annotation, but loading the properties from a database instead of a file?

Assuming you have a table in your database storing key/value pairs:
Define a new bean "applicationProperties" - pseudo-code follows...
public class ApplicationProperties {

    @Autowired
    private DataSource dataSource;

    public String getPropertyValue(String key) {
        // transact on your dataSource here to fetch the value for the key
        // SNIPPED
    }
}
Inject this bean where required in your application. If you already have a dao/service layer then you would just make use of that.
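For illustration, a minimal sketch of consuming it from a service (the class name is hypothetical; the property key mirrors the question's path_file example):
@Service
public class PathFileConsumer {

    @Autowired
    private ApplicationProperties applicationProperties;

    public String resolvePathFile() {
        // the same lookup a @Value/SpEL expression would trigger, just done explicitly
        return applicationProperties.getPropertyValue("path_file");
    }
}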

Yes, you can keep your @Value annotation and use the database source with the help of an EnvironmentPostProcessor.
As of Spring Boot 1.3, we're able to use the EnvironmentPostProcessor to customize the application's Environment before the application context is refreshed.
For example, create a class which implements EnvironmentPostProcessor:
public class ReadDbPropertiesPostProcessor implements EnvironmentPostProcessor {

    private static final String PROPERTY_SOURCE_NAME = "databaseProperties";

    private static final String[] CONFIGS = {
        "app.version"
        // list your properties here
    };

    @Override
    public void postProcessEnvironment(ConfigurableEnvironment environment, SpringApplication application) {
        Map<String, Object> propertySource = new HashMap<>();
        try {
            // the following db connection properties must be defined in application.properties
            DataSource ds = DataSourceBuilder
                    .create()
                    .username(environment.getProperty("spring.datasource.username"))
                    .password(environment.getProperty("spring.datasource.password"))
                    .url(environment.getProperty("spring.datasource.url"))
                    .driverClassName("com.mysql.jdbc.Driver")
                    .build();

            // suppose you have a config table storing the properties as name/value pairs
            try (Connection connection = ds.getConnection();
                 PreparedStatement preparedStatement = connection.prepareStatement("SELECT value FROM config WHERE name = ?")) {
                for (String configName : CONFIGS) {
                    preparedStatement.setString(1, configName);
                    try (ResultSet rs = preparedStatement.executeQuery()) {
                        while (rs.next()) {
                            propertySource.put(configName, rs.getString("value"));
                        }
                    }
                    preparedStatement.clearParameters();
                }
            }
            environment.getPropertySources().addFirst(new MapPropertySource(PROPERTY_SOURCE_NAME, propertySource));
        } catch (Throwable e) {
            throw new RuntimeException(e);
        }
    }
}
Finally, don't forget to register the post-processor in META-INF/spring.factories under the EnvironmentPostProcessor key (adjust the fully qualified class name to your own package):
org.springframework.boot.env.EnvironmentPostProcessor=\
com.example.config.ReadDbPropertiesPostProcessor
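Once the post-processor has run, the database-backed values behave like any other environment property, so the @Value annotation keeps working unchanged. A small sketch of consuming the "app.version" entry listed above (the controller and mapping are made up for illustration):
@RestController
public class AppVersionController {

    // resolved from the "databaseProperties" property source registered by the post-processor
    @Value("${app.version}")
    private String appVersion;

    @RequestMapping("/version")
    public String version() {
        return appVersion;
    }
}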

Although not having used Spring 3, I'd assume you can, if you make a bean that reads the properties from the database and exposes them with getters.
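A rough sketch of that idea, assuming a key/value table and keeping the question's SpEL style (the table, column, and bean names are assumptions):
@Component("configuration")
public class DatabaseConfiguration {

    private final Map<String, String> values = new HashMap<String, String>();

    @Autowired
    public DatabaseConfiguration(JdbcTemplate jdbcTemplate) {
        // load all key/value rows once at startup
        for (Map<String, Object> row : jdbcTemplate.queryForList("SELECT name, value FROM config")) {
            values.put((String) row.get("name"), (String) row.get("value"));
        }
    }

    public String getPathFile() {
        return values.get("path_file");
    }
}
With that in place, the injection from the question becomes @Value("#{configuration.pathFile}"), which calls the getter on the bean named "configuration".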

Related

Spring Boot Gemfire Server Configuration

I am trying to understand how to host a Spring Boot Gemfire server process.
I found this example Spring Gemfire Server
The problem I am having is that the server I am trying to add to the cluster is not showing up in the cluster after I start the process.
Here are the steps I am taking:
Start a new locator locally (default port): gfsh>start locator --name=loc-one
I want to add this SpringBootGemfireServer to the cluster:
Note: I have commented out the embedded locator start-up; I want to add this server to the existing locator that is already running.
@SpringBootApplication
@SuppressWarnings("unused")
public class SpringGemFireServerApplication {

    private static final boolean DEFAULT_AUTO_STARTUP = true;

    public static void main(String[] args) {
        SpringApplication.run(SpringGemFireServerApplication.class, args);
    }

    @Bean
    static PropertyPlaceholderConfigurer propertyPlaceholderConfigurer() {
        return new PropertyPlaceholderConfigurer();
    }

    private String applicationName() {
        return SpringGemFireServerApplication.class.getSimpleName();
    }

    @Bean
    Properties gemfireProperties(
            @Value("${gemfire.log.level:config}") String logLevel,
            @Value("${gemfire.locator.host-port:localhost[10334]}") String locatorHostPort,
            @Value("${gemfire.manager.port:1099}") String managerPort) {

        Properties gemfireProperties = new Properties();
        gemfireProperties.setProperty("name", applicationName());
        gemfireProperties.setProperty("log-level", logLevel);
        //gemfireProperties.setProperty("start-locator", locatorHostPort);
        //gemfireProperties.setProperty("jmx-manager", "true");
        //gemfireProperties.setProperty("jmx-manager-port", managerPort);
        //gemfireProperties.setProperty("jmx-manager-start", "true");
        return gemfireProperties;
    }

    @Bean
    CacheFactoryBean gemfireCache(@Qualifier("gemfireProperties") Properties gemfireProperties) {
        CacheFactoryBean gemfireCache = new CacheFactoryBean();
        gemfireCache.setClose(true);
        gemfireCache.setProperties(gemfireProperties);
        return gemfireCache;
    }

    @Bean
    CacheServerFactoryBean gemfireCacheServer(Cache gemfireCache,
            @Value("${gemfire.cache.server.bind-address:localhost}") String bindAddress,
            @Value("${gemfire.cache.server.hostname-for-clients:localhost}") String hostNameForClients,
            @Value("${gemfire.cache.server.port:40404}") int port) {

        CacheServerFactoryBean gemfireCacheServer = new CacheServerFactoryBean();
        gemfireCacheServer.setCache(gemfireCache);
        gemfireCacheServer.setAutoStartup(DEFAULT_AUTO_STARTUP);
        gemfireCacheServer.setBindAddress(bindAddress);
        gemfireCacheServer.setHostNameForClients(hostNameForClients);
        gemfireCacheServer.setPort(port);
        return gemfireCacheServer;
    }

    @Bean
    PartitionedRegionFactoryBean<Long, Long> factorialsRegion(Cache gemfireCache,
            @Qualifier("factorialsRegionAttributes") RegionAttributes<Long, Long> factorialsRegionAttributes) {

        PartitionedRegionFactoryBean<Long, Long> factorialsRegion = new PartitionedRegionFactoryBean<>();
        factorialsRegion.setAttributes(factorialsRegionAttributes);
        factorialsRegion.setCache(gemfireCache);
        factorialsRegion.setClose(false);
        factorialsRegion.setName("Factorials");
        factorialsRegion.setPersistent(false);
        return factorialsRegion;
    }

    @Bean
    @SuppressWarnings("unchecked")
    RegionAttributesFactoryBean factorialsRegionAttributes() {
        RegionAttributesFactoryBean factorialsRegionAttributes = new RegionAttributesFactoryBean();
        factorialsRegionAttributes.setCacheLoader(factorialsCacheLoader());
        factorialsRegionAttributes.setKeyConstraint(Long.class);
        factorialsRegionAttributes.setValueConstraint(Long.class);
        return factorialsRegionAttributes;
    }

    FactorialsCacheLoader factorialsCacheLoader() {
        return new FactorialsCacheLoader();
    }

    class FactorialsCacheLoader implements CacheLoader<Long, Long> {

        // stupid, naive implementation of Factorial!
        @Override
        public Long load(LoaderHelper<Long, Long> loaderHelper) throws CacheLoaderException {
            long number = loaderHelper.getKey();
            assert number >= 0 : String.format("Number [%d] must be greater than equal to 0", number);

            if (number <= 2L) {
                return (number < 2L ? 1L : 2L);
            }

            long result = number;
            while (number-- > 1L) {
                result *= number;
            }
            return result;
        }

        @Override
        public void close() {
        }
    }
}
When I connect with gfsh and run list members,
I only see the locator.
I haven't verified the full configuration to check whether there's something else wrong, but the main issue I see right now is that you seem to be confusing the start-locator property (which automatically starts a locator in the current process when the member connects to the distributed system and stops it when the member disconnects) with the locators property (the list of locators used by system members, which must be configured consistently for every member of the distributed system). Since you're not setting the locators property when configuring the server, it simply can't join the existing distributed system: it doesn't know which locator to connect to.
The default locator port used by GemFire is 10334, so you should change your gemfireProperties method as follows:
@Bean
Properties gemfireProperties(@Value("${gemfire.log.level:config}") String logLevel,
        @Value("${gemfire.locator.host-port:localhost[10334]}") String locatorHostPort,
        @Value("${gemfire.manager.port:1099}") String managerPort) {

    Properties gemfireProperties = new Properties();
    gemfireProperties.setProperty("name", applicationName());
    gemfireProperties.setProperty("log-level", logLevel);
    // You can directly use the locatorHostPort variable instead.
    gemfireProperties.setProperty("locators", "localhost[10334]");
    return gemfireProperties;
}
Hope this helps.
Cheers.
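Once the server starts with the locators property set, you can verify it joined by connecting gfsh to the existing locator (assuming the default host and port) and listing the members again:
gfsh> connect --locator=localhost[10334]
gfsh> list members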

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating adopting spring-data-mongodb for a project. In summary, my aim is:
Using existing XML schema files to generate Java classes.
This is achieved using JAXB xjc
The root class is TSDProductDataType and is further modeled as below:
The thing to note here is that ExtensionType contains protected List<Object> any; allowing it to store Objects of any class. In my case, it is amongst the classes named TSDModule_Name_HereModuleType and can be browsed here
Use spring-data-mongodb as persistence store
This is achieved using a simple ProductDataRepository
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {

    TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement which spring-data-mongodb doesn't seem to handle by itself and throws a CodecConfigurationException org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here; however, it seems I am missing something, since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: Works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose
It seems that I missed something while trying to add the converters earlier. So, I added them like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public JAXBElement convert(DBObject dbObject) {
        Class declaredType, scope;
        QName name = qNameFromString((String) dbObject.get("name"));
        Object rawValue = dbObject.get("value");
        try {
            declaredType = Class.forName((String) dbObject.get("declaredType"));
        } catch (ClassNotFoundException e) {
            if (rawValue.getClass().isArray()) declaredType = List.class;
            else declaredType = LinkedHashMap.class;
        }
        try {
            scope = Class.forName((String) dbObject.get("scope"));
        } catch (ClassNotFoundException e) {
            scope = JAXBElement.GlobalScope.class;
        }
        //Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
        Object value = "TODO";
        return new JAXBElement(name, declaredType, scope, value);
    }

    QName qNameFromString(String s) {
        String[] parts = s.split("[{}]");
        if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
        if (parts.length == 1) return new QName(parts[0]);
        return new QName("undef");
    }
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {

    //@Autowired
    //MongoConverter converter;

    @Override
    public DBObject convert(JAXBElement jaxbElement) {
        DBObject dbObject = new BasicDBObject();
        dbObject.put("name", qNameToString(jaxbElement.getName()));
        dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
        dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
        //dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
        dbObject.put("value", "TODO");
        dbObject.put("_class", JAXBElement.class.getName());
        return dbObject;
    }

    public String qNameToString(QName name) {
        if (name.getNamespaceURI() == XMLConstants.NULL_NS_URI) return name.getLocalPart();
        return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
    }
}
@SpringBootApplication
public class TsdApplication {

    public static void main(String[] args) {
        SpringApplication.run(TsdApplication.class, args);
    }

    @Bean
    public CustomConversions customConversions() {
        return new CustomConversions(Arrays.asList(
                new JAXBElementReadConverter(),
                new JAXBElementWriteConverter()
        ));
    }
}
So far so good. However, how do I instantiate MongoConverter converter;?
MongoConverter is an interface so I guess I need an instantiable class adhering to this interface. Any suggestions?
I understand the desire for convenience in being able to just map an existing domain object to the database layer with no boilerplate, but even if you weren't having the JAXB class structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain models need to change but your persisted data must remain in its existing state. If you are just straight persisting the data, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning of the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (about the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context with @Component plus auto class scanning, or manually via some @Configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;

public JAXBElement____Converter(MongoConverter converter) {
    this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
    return new CustomConversions(Arrays.asList(
            new JAXBElementReadConverter(converter),
            new JAXBElementWriteConverter(converter)
    ));
}

How to read all properties values from the source properties file in Spring-cloud-config client

I have this spring-cloud-config client class and I can access the individual properties using the @Value annotation just fine. However, I am interested to know how to read ALL the property values from a properties file without binding each property's key to a @Value annotation. Basically, the idea is that I would like to read all the property values from the properties file without even knowing anything about the properties defined in the file. Any idea how I can do that?
Client Class
@EnableAutoConfiguration
@ComponentScan
@RestController
@RefreshScope
public class ConfigDemoClientApplication
{
    @Value("${special}")
    String special;

    @RequestMapping("/restaurant")
    public String hello()
    {
        return "Hello " + special;
    }

    public static void main(String[] args) {
        SpringApplication.run(ConfigDemoClientApplication.class, args);
    }
}
Sample Properties file
special: bargain!
amount: 200
city: New York
In this example, I would like to read all three properties without defining a @Value annotation for each of them in my class. Is that possible?
Thanks for your help.
I just solved your problem by creating this applicationProps bean, which is a java.util.Properties object containing all the properties of the application.
The only thing needed is an autowired Environment object.
Here's the code:
@Autowired
Environment env;

//Load all the properties of the server and put them into a java Properties obj
@Bean(name = "applicationProps")
public Properties applicationProperties() {
    final Properties properties = new Properties();
    for (Iterator it = ((AbstractEnvironment) env).getPropertySources().iterator(); it.hasNext(); ) {
        PropertySource propertySource = (PropertySource) it.next();
        if (propertySource instanceof PropertiesPropertySource) {
            log.info("Adding all properties contained in " + propertySource.getName());
            properties.putAll(((MapPropertySource) propertySource).getSource());
        }
        if (propertySource instanceof CompositePropertySource) {
            properties.putAll(getPropertiesInCompositePropertySource((CompositePropertySource) propertySource));
        }
    }
    return properties;
}

private Properties getPropertiesInCompositePropertySource(CompositePropertySource compositePropertySource) {
    final Properties properties = new Properties();
    compositePropertySource.getPropertySources().forEach(propertySource -> {
        if (propertySource instanceof MapPropertySource) {
            log.info("Adding all properties contained in " + propertySource.getName());
            properties.putAll(((MapPropertySource) propertySource).getSource());
        }
        if (propertySource instanceof CompositePropertySource) {
            properties.putAll(getPropertiesInCompositePropertySource((CompositePropertySource) propertySource));
        }
    });
    return properties;
}

@Autowired
@Qualifier("applicationProps")
Properties applicationProperties;
The recursive step in the getPropertiesInCompositePropertySource method is needed because the properties fetched from the config server are recursively nested in a CompositePropertySource.
Hope it helps
Greetings
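As a quick illustration of using that bean, the controller from the question could then expose every property without naming a single key (the endpoint path is made up):
@Autowired
@Qualifier("applicationProps")
Properties applicationProperties;

@RequestMapping("/properties")
public Map<Object, Object> allProperties() {
    // java.util.Properties is a Map, so this returns every key/value pair collected above
    return applicationProperties;
}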
Try this: it's all Spring; you can maybe use it from a @PostConstruct method (a sketch of that follows the snippet).
Map<String, String> someMap = new HashMap<>();
Resource resource = new ClassPathResource("some.properties");
Properties props = PropertiesLoaderUtils.loadProperties(resource);
for (Object key : props.keySet()) {
    someMap.put(key.toString(), props.getProperty(key.toString()));
}
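Wrapped in a component with a @PostConstruct method, that could look roughly like this (the file name and accessor are placeholders); note that PropertiesLoaderUtils.loadProperties throws IOException, which has to be handled:
@Component
public class PropertiesHolder {

    private final Map<String, String> someMap = new HashMap<>();

    @PostConstruct
    public void loadProperties() {
        try {
            Resource resource = new ClassPathResource("some.properties");
            Properties props = PropertiesLoaderUtils.loadProperties(resource);
            for (Object key : props.keySet()) {
                someMap.put(key.toString(), props.getProperty(key.toString()));
            }
        } catch (IOException e) {
            throw new IllegalStateException("Could not load some.properties", e);
        }
    }

    public Map<String, String> getSomeMap() {
        return someMap;
    }
}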

Remote PropertySource

Has anyone had any luck constructing a PropertySource that uses a remote source (for example a database) from which to retrieve property values? The idea would be to construct a PropertySource (it needs some connection information such as host/port) and plug that into a PropertySourcesPlaceholderConfigurer.
The problem seems to be a chicken-and-egg problem. How can I get the connection information down to the PropertySource? I could first instantiate the PropertySourcesPlaceholderConfigurer with configuration to load a property file containing the remote host and port, and then later instantiate the PropertySource and inject that back into the configurer. However, I can't seem to figure out a way to ensure that the very first bean to be instantiated (and quickly injected into the configurer) is my property source. I need this because, of course, all my other beans depend on the remote properties.
Commons Configuration supports loading properties from a variety of sources (including JDBC Datasources) into a org.apache.commons.configuration.Configuration object via a org.apache.commons.configuration.ConfigurationBuilder.
Using the org.apache.commons.configuration.ConfigurationConverter, you can convert the Configuration object into a java.util.Properties object, which can be passed to the PropertySourcesPlaceholderConfigurer.
As to the chicken and egg question of how to configure the ConfigurationBuilder, I recommend using the org.springframework.core.env.Environment to query for system properties, command-line properties or JNDI properties.
In this example:
@Configuration
public class RemotePropertyConfig {

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer(Environment environment)
            throws Exception {
        final PropertySourcesPlaceholderConfigurer props = new PropertySourcesPlaceholderConfigurer();
        final ConfigurationBuilder configurationBuilder = new DefaultConfigurationBuilder(environment.getProperty("configuration.definition.file"));
        props.setProperties(ConfigurationConverter.getProperties(configurationBuilder.getConfiguration()));
        return props;
    }
}
You will need to specify the environment property configuration.definition.file which points to a file needed to configure Commons Configuration:
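As a rough illustration only (element names follow the Commons Configuration 1.x definition-file conventions; the referenced file name is a placeholder), such a definition file might combine system properties with a classpath properties file, and a JDBC-backed source would be declared in the same file:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <!-- system properties take precedence -->
  <system/>
  <!-- plus a plain properties file from the classpath -->
  <properties fileName="application.properties"/>
</configuration>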
Similar to Recardo's answer above, I used Spring's PropertiesLoaderUtils instead of Apache's, but it amounts to the same thing. It's not exactly ideal... hard-coded dependency injection, but hey, it works!
/**
 * This method must remain static as it's part of Spring's initialization effort.
 * @return the configurer, backed by the remote property source
 **/
@Bean
public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
    PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
    String dbHost = null;
    Integer dbPort = null;

    // check system / environment properties first
    Environment environment = new StandardEnvironment();
    if (environment.containsProperty(DB_HOST_KEY)) {
        dbHost = environment.getProperty(DB_HOST_KEY);
    }
    if (environment.containsProperty(DB_PORT_KEY)) {
        dbPort = Integer.valueOf(environment.getProperty(DB_PORT_KEY));
    }

    if (dbHost == null || dbPort == null) {
        // ok, one or (probably) both properties null, let's go find the database.properties file
        Properties dbProperties;
        try {
            dbProperties = PropertiesLoaderUtils.loadProperties(new EncodedResource(new ClassPathResource("database.properties"), "UTF-8"));
        }
        catch (IOException e) {
            throw new RuntimeException("Could not load database.properties. Please confirm the file is in the classpath");
        }
        if (dbHost == null) {
            dbHost = dbProperties.getProperty(DB_HOST_KEY);
        }
        if (dbPort == null) {
            dbPort = Integer.valueOf(dbProperties.getProperty(DB_PORT_KEY));
        }
    }

    PropertySourceService propertySourceService = new DBPropertySourceService(dbHost, dbPort);
    PropertySource<PropertySourceService> propertySource = new DBPropertySource(propertySourceService);
    MutablePropertySources propertySources = new MutablePropertySources();
    propertySources.addFirst(propertySource);
    configurer.setPropertySources(propertySources);
    return configurer;
}
Per request, here is the source of the remote property source. It depends on a 'service' class that might do... well... anything: remote access of a property over a socket, talking to a database, whatever.
/**
 * Property source for use with Spring's PropertySourcesPlaceholderConfigurer where the source is a service
 * that connects to a remote server for property values.
 **/
public class RemotePropertySource extends PropertySource<PropertySourceService> {

    private final Environment environment;

    /**
     * Constructor...
     * @param source the service used to resolve property values remotely
     **/
    public RemotePropertySource(PropertySourceService source) {
        super("RemotePropertySource", source);
        environment = new StandardEnvironment();
    }

    /* (non-Javadoc)
     * @see org.springframework.core.env.PropertySource#getProperty(java.lang.String)
     */
    @Override
    public Object getProperty(String name) {
        // check system / environment properties first
        String value;
        if (environment.containsProperty(name)) {
            value = environment.getProperty(name);
        }
        else {
            value = source.getProperty(name);
        }
        return value;
    }
}
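The PropertySourceService type referenced above boils down to a single lookup method; a minimal sketch, purely as an assumption about its shape:
public interface PropertySourceService {

    // resolve a single property from the remote source, or return null if it is unknown
    String getProperty(String name);
}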

Can't create an AbstractRoutingDataSource that needs some data from another database

We currently have an application which uses multiple databases with the same schema. At the moment we're using a custom solution for switching between them based on the user's session. This works via
public final class DataSourceProxy extends BasicDataSource {
    ...
    @Override
    public Connection getConnection() throws SQLException {
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        if (auth != null && auth.getDetails() instanceof Map) {
            Map<String, String> details = (Map<String, String>) auth.getDetails();
            String targetUrl = details.get("database");
            Connection c = super.getConnection();
            Statement s = c.createStatement();
            s.execute("USE " + targetUrl + ";");
            s.close();
            return c;
        } else {
            return super.getConnection();
        }
    }
}
Now we want to build a solution using AbstractRoutingDataSource. The problem is:
@Component
public class CustomRoutingDataSource extends AbstractRoutingDataSource {

    @Autowired
    Environment env;

    @Autowired
    DbDetailsRepository repo;

    public CustomRoutingDataSource() {
        Map<Object, Object> targetDataSources = new HashMap<Object, Object>();
        for (DBDetails dbd : repo.findAll()) {
            // create a DataSource and put it into the map
        }
        setTargetDataSources(targetDataSources);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        if (auth != null && auth.getDetails() instanceof Map) {
            Map<String, String> details = (Map<String, String>) auth.getDetails();
            return details.get("database");
        }
        return null;
    }
}
Inside the constructor (or even via @PostConstruct) we have to fill the targetDataSources map. But(!) for this we need the connection details, which are stored in another database that has its own DataSource and entity manager.
It seems like Spring can't determine the order of bean construction, or maybe I'm just missing something. It always gives a NullPointerException when accessing the repository (which, by the way, is a JpaRepository).
We're using Spring 3.2.3, Spring Data, and Hibernate 4.2, with complete annotation- and Java-based configuration of Spring and Spring Security.
Please help us!
Spring of course has to call the constructor before it can populate the properties. But that's not a Spring thing, that's basic Java 101 and one of the many downsides of using field injection.
To avoid this, simply add your dependencies to the constructor:
@Component
class CustomRoutingDataSource extends AbstractRoutingDataSource {

    @Autowired
    public CustomRoutingDataSource(DbDetailsRepository repo, Environment environment) {
        …
    }
    …
}
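Filled in along the lines of the question's original constructor, that might look roughly like this (the DataSource construction details are assumptions, since they depend on what DBDetails actually holds):
@Component
class CustomRoutingDataSource extends AbstractRoutingDataSource {

    @Autowired
    public CustomRoutingDataSource(DbDetailsRepository repo, Environment environment) {
        Map<Object, Object> targetDataSources = new HashMap<Object, Object>();
        for (DBDetails dbd : repo.findAll()) {
            // build one DataSource per configured database; the getters on DBDetails are assumed
            BasicDataSource dataSource = new BasicDataSource();
            dataSource.setUrl(dbd.getUrl());
            dataSource.setUsername(dbd.getUsername());
            dataSource.setPassword(dbd.getPassword());
            targetDataSources.put(dbd.getName(), dataSource);
        }
        setTargetDataSources(targetDataSources);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        Authentication auth = SecurityContextHolder.getContext().getAuthentication();
        if (auth != null && auth.getDetails() instanceof Map) {
            Map<String, String> details = (Map<String, String>) auth.getDetails();
            return details.get("database");
        }
        return null;
    }
}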
