How can I explicitly define an order in which Spring's out-of-the-box process of reading properties out of an available-in-classpath application.yml takes place before my @Configuration annotated class?

UPDATE: I just published this question also here; I might have done a better job phrasing it there.
How can I explicitly define an order in which Spring's out-of-the-box process of reading properties out of an available-in-classpath application.yml takes place BEFORE my @Configuration annotated class, which reads configuration data from ZooKeeper and places it as system properties that are later easily read and injected into members using @Value?
I have a @Configuration class which defines the creation of a @Bean, in which configuration data from ZooKeeper is read and placed as system properties, so that it can easily be read and injected into members using @Value.
#Profile("prod")
#Configuration
public class ZookeeperConfigurationReader {
#Value("${zookeeper.url}")
static String zkUrl;
#Bean
public static PropertySourcesPlaceholderConfigurer zkPropertySourcesPlaceHolderConfigurer() {
PropertySourcesConfigurerAdapter propertiesAdapter = new PropertySourcesConfigurerAdapter();
new ConfigurationBuilder().populateAdapterWithDataFromZk(propertiesAdapter);
return propertiesAdapter.getConfigurer();
}
public void populateAdapterWithDataFromZk(ConfigurerAdapter ca) {
...
}
}
Right now I pass zookeeper.url into the program with a -Dzookeeper.url=... argument on the command line and read it directly via System.getProperty("zookeeper.url").
Since this is a Spring Boot application, I also have an application.yml configuration file.
I would like to be able to set zookeeper.url in application.yml and keep the command line as clean as possible of explicit properties.
The mission turns out to be harder than I thought.
As you can see in the code snippet of ZookeeperConfigurationReader above, I'm trying to inject that value using @Value("${zookeeper.url}") into a member of the class that performs the actual read from ZooKeeper, but at the time the code that needs the value accesses it, it is still null. The reason is that, in terms of the Spring lifecycle, I'm still in the "configuration" phase (being a @Configuration annotated class myself), and the Spring code that reads the application.yml data and places it as system properties hasn't been executed yet.
So, bottom line, what I'm looking for is a way to control the order and tell Spring to first read application.yml into system properties, and only then load the ZookeeperConfigurationReader class.

You can try to use Spring Cloud Zookeeper. I posted a brief example of use here
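For illustration, here is a minimal sketch of that approach, assuming the spring-cloud-starter-zookeeper-config dependency is on the classpath and the ZooKeeper connection string is set in the bootstrap configuration (spring.cloud.zookeeper.connect-string); the property key and class name below are made up. ZooKeeper is then registered as a property source during the bootstrap phase, before @Configuration classes are processed, so plain @Value injection works:
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DemoApplication {

    // Resolved from ZooKeeper (or falls back to application.yml / the default) before this bean is created.
    @Value("${some.key:default-value}")
    private String someKey;

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}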

Related

Set a ListMoveChange filter when using OptaPlanner Spring Boot starter

We are using the OptaPlanner Spring Boot starter to create a vehicle routing problem solver based on the example in the OptaPlanner quickstarts:
https://github.com/kiegroup/optaplanner-quickstarts/tree/stable/use-cases/vehicle-routing
So we do not have a solverConfig.xml file by default. We would like to define a filter for ListChangeMoves, but it's not clear how we would register it without using an XML file. We have tried adding a solverConfig.xml, e.g.
<localSearch>
    <unionMoveSelector>
        <listChangeMoveSelector>
            <filterClass>my.filter.Class</filterClass>
        </listChangeMoveSelector>
    </unionMoveSelector>
</localSearch>
But this is not working. Is there an example of setting up a filter for list moves?
This is an XML solver config using a swap move selector and a change move selector with move filtering:
<constructionHeuristic/>
<localSearch>
    <unionMoveSelector>
        <changeMoveSelector>
            <filterClass>org.acme.vehiclerouting.solver.ChangeMoveSelectorFilter</filterClass>
        </changeMoveSelector>
        <swapMoveSelector/>
    </unionMoveSelector>
</localSearch>
If you don't want to use swap moves, then you don't need the union move selector and the configuration can be simplified to:
<constructionHeuristic/>
<localSearch>
    <changeMoveSelector>
        <filterClass>org.acme.vehiclerouting.solver.ChangeMoveSelectorFilter</filterClass>
    </changeMoveSelector>
</localSearch>
A few comments:
I'm including the construction heuristic (CH) phase because it is needed in a typical case. See "OptaPlanner terminates immediately if I add constructionHeuristic config" for an explanation.
The ChangeMoveSelector is automatically configured to produce ListChangeMoves if the planning entity has a @PlanningListVariable. There is no <listChangeMoveSelector> config element.
More information including how to implement the move selection filter can be found in the documentation.
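For reference, a move selection filter for a list variable might look roughly like the sketch below. The solution class, the exact move class to filter, and its package depend on the OptaPlanner version and your model, so treat the names here as assumptions; the filter SPI itself is SelectionFilter with an accept method:
import org.optaplanner.core.api.score.director.ScoreDirector;
import org.optaplanner.core.impl.heuristic.selector.common.decorator.SelectionFilter;
import org.optaplanner.core.impl.heuristic.selector.move.generic.list.ListChangeMove;

public class ChangeMoveSelectorFilter implements SelectionFilter<VrpSolution, ListChangeMove<VrpSolution>> {

    @Override
    public boolean accept(ScoreDirector<VrpSolution> scoreDirector, ListChangeMove<VrpSolution> move) {
        // Return false to discard the move before it is evaluated, true to keep it.
        // Inspect the move (and scoreDirector.getWorkingSolution()) here to decide.
        return true;
    }
}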
UPDATE: No XML configuration
It's possible to inject SolverConfig, modify it and then use it to create other objects, for example Solver, SolverManager, and ScoreManager.
The code would look something like this:
@Component
class MyService {

    // Don't inject these.
    private final SolverManager<VrpSolution, Long> solverManager;
    private final ScoreManager<VrpSolution, HardSoftScore> scoreManager;

    // But inject the SolverConfig.
    public MyService(SolverConfig solverConfig) {
        // And instantiate SolverManager and ScoreManager manually.
        this.solverManager = SolverManager.<VrpSolution, Long>create(
                solverConfig.withPhaseList(Arrays.asList(
                        new ConstructionHeuristicPhaseConfig(),
                        new LocalSearchPhaseConfig().withMoveSelectorConfig(
                                new ChangeMoveSelectorConfig()
                                        .withFilterClass(MyFilter.class)))));
        this.scoreManager = ScoreManager.create(SolverFactory.create(solverConfig));
    }
}
Pros:
SolverConfig will be initialized by OptaPlannerAutoConfiguration (from optaplanner-spring-boot-starter) before it's injected into your component. That means:
The solution and entity classes will be auto-discovered, so you don't have to specify them (as you would have to in a solverConfig.xml).
You can use application.properties to do minor solver config tweaks, for example set time-spent termination.
Cons:
You have to create Solver, SolverManager, and ScoreManager instances manually. Specifically, you can't have a @Bean method producing an instance of one of the types above, because that would deactivate OptaPlannerAutoConfiguration, which creates the SolverConfig bean.

Load Yml at runtime with Spring Boot

I have multiple YAML files in different folders. All the files in the folders share the same property structure, which I mapped to a Java bean.
At runtime, with a factory, I want to get the right bean populated with the values of the specific file chosen at runtime. How do I do that?
Thanks
The @ConfigurationProperties annotation, or rather the mechanism behind it, is built for configuring an application at startup, not for loading data at runtime.
I'm sure you could somehow start mini Spring environments at runtime just to read this data using different Spring profiles (this is, e.g., how spring-cloud-configserver loads properties), but this does not seem right, and there are better alternatives.
For example, if you need that data to be loaded at runtime, you can use Jackson's YAMLFactory; with it you can read your data in 3-4 statements. A good example is here: https://www.baeldung.com/jackson-yaml.
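A minimal sketch of that, assuming the jackson-dataformat-yaml dependency is available and using MyConfig as a stand-in for the bean class you mapped the files to:
import java.io.File;
import java.io.IOException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

// Reads one YAML file, chosen at runtime, into the mapped bean.
MyConfig readConfig(File yamlFile) throws IOException {
    ObjectMapper yamlMapper = new ObjectMapper(new YAMLFactory());
    return yamlMapper.readValue(yamlFile, MyConfig.class);
}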
Consider a bean like this (pseudo-code, just to explain the idea):
class MyConfigBean {

    private Properties currentProperties;
    private Map<String, Properties> allPropertiesMap;

    // Load every YAML file into the map, keyed by the file's name.
    void loadAllProperties() { ... }

    // Select which property set subsequent lookups read from.
    void switchProperties(String name) {
        this.currentProperties = this.allPropertiesMap.get(name);
    }

    String getProperty(String key) {
        return this.currentProperties.getProperty(key);
    }
}
You can load all of the YAML files into a Map in your bean. The Map's key could be the "name" of the properties file, and the value would be the Properties object.
A switchProperties(String name) method "selects" the properties file you wish to work with. Using the name, you get the appropriate Properties object from the Map and assign it to currentProperties.
This way, each time you get a property by key, it is fetched from currentProperties according to what you "switched" to.
Important - you'll have to decide which set of properties is the default after you load all of them.
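Usage might then look like this (the set name and key are made up):
// Pick the property set for the file chosen at runtime, then read keys from it.
myConfigBean.switchProperties("folder-a");
String url = myConfigBean.getProperty("service.url");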

Spring Boot & Vault: incomplete context initialization issue

I've faced an issue where, on rare occasions (it might take dozens of restarts), Spring doesn't initialize all properties correctly.
I define a bean of type CbKafkaConsumerConfig (my custom bean) and check its state in a thread created by a method marked with @EventListener(ApplicationReadyEvent.class), so I expect it to be completely initialized by that point. However, this is what I see:
Values that I expected to be filled are left with placeholders.
Here's how they are defined in the application.properties file. (And I've checked the spelling - it's correct, otherwise it would fail every time, not just occasionally.)
config-bean-prefix.msg-topics=${cb.kafka.tc-topic}
config-bean-prefix.unexpected-error-topic=${cb.kafka.unexpected-errors-topic}
These properties are defined in Vault, and I expected them to be fetched and set by Spring Cloud Vault. Here you can see that Vault is present as a property source AND that these properties are populated there.
At the same time, there are other beans of the same type CbKafkaConsumerConfig in the context that refer to these properties, and for them they resolve just fine.
Here's how the bean is defined
#Bean({"myBean"})
#ConfigurationProperties(
prefix = "config-bean-prefix"
)
public CbKafkaConsumerConfig myBeanConsumer() {
return new CbKafkaConsumerConfig();
}
And the bean itself:
@Data
public class CbKafkaConsumerConfig extends CbKafkaBaseConfig {

    @NotNull
    @Size(
        min = 1
    )
    private Collection<String> msgTopics;

    @NotNull
    private String unExpectedErrorTopic;
}
We're using Spring Boot 2.2.x, but this issue is also present with Spring Boot 2.1.x.
It's not specific to this type of bean; others might fail as well while being correctly set in Vault. What could be the reason for such unpredictable behavior, and what should I look into?
It turns out that, by default, Spring Cloud Vault does not simply fetch properties on startup; every so often it refreshes them. While refreshing, there is a short time window in which the properties have already been removed from the property source in the context but the new ones have not yet been set, and this can actually happen during context initialization (super questionable behavior in my opinion), leaving some beans corrupted.
If you don't want properties to be refreshed at runtime, just set spring.cloud.vault.config.lifecycle.enabled to false.

Java Configuration vs Component Scan Annotations

Java configuration allows us to manage bean creation within a configuration class. Classes annotated with @Component or @Service and picked up by component scanning do the same. However, I'm concerned about using these two mechanisms at the same time.
Should Java configuration and annotated component scans be avoided in the same project? I ask because the result is unclear in the following scenario:
@Configuration
public class MyConfig {

    @Bean
    public Foo foo() {
        return new Foo(500);
    }
}
...
@Component
public class Foo {

    private int value;

    public Foo() {
    }

    public Foo(int value) {
        this.value = value;
    }
}
...
public class Consumer {

    @Autowired
    Foo foo;
    ...
}
So, in the above situation, will the Consumer get a Foo instance with a 500 value or 0 value? I've tested locally and it appears that the Java configured Foo (with value 500) is created consistently. However, I'm concerned that my testing isn't thorough enough to be conclusive.
What is the real answer? Using both Java config and component scanning on @Component beans of the same type seems like a bad thing.
I think your concern really comes down to the following use case:
You have a custom spring-starter library that has its own @Configuration classes and @Bean definitions. BUT if you have @Component/@Service classes in this library, you will need to explicitly @ComponentScan their packages from your service, since the default @ComponentScan (see @SpringBootApplication) scans from the main class down through all sub-packages of your app, BUT not the packages inside the external library. For that purpose, you only need @Bean definitions in your external library, and you inject these external configurations via an @EnableSomething annotation used on your app's main class (using @Import(YourConfigurationAnnotatedClass.class)), OR via spring.factories in case you always need the external configuration to be applied.
Of course, you CAN have @Components in such a library, but the explicit usage of the @ComponentScan annotation may lead to unintended behaviour in some cases, so I would recommend avoiding that.
So, to answer your question: you can have both approaches to defining beans, but only inside your app; bean definitions outside your app (e.g. in a library) should be explicitly defined with @Bean inside a @Configuration class, as in the sketch below.
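As a rough sketch of that recommendation (class names are made up): the library exposes its beans through a @Configuration class, and the application pulls that configuration in explicitly instead of component-scanning the library's packages.
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;

// Library side (its own source file): beans are declared explicitly, no scanning of the library needed.
@Configuration
public class MyLibraryConfiguration {

    @Bean
    public LibraryClient libraryClient() {
        return new LibraryClient();   // LibraryClient is a made-up example class
    }
}

// Application side (separate source file): import the library configuration explicitly,
// or register it via spring.factories if it should always be applied.
@SpringBootApplication
@Import(MyLibraryConfiguration.class)
public class MyApplication {
}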
It is perfectly valid to have Java configuration and annotated component scanning in the same project because they serve different purposes.
@Component (@Service, @Repository, etc.) is used to auto-detect and auto-configure beans.
The @Bean annotation is used to explicitly declare a single bean, instead of letting Spring do it automatically.
You can do the following with @Bean, but it is not possible with @Component:
@Bean
public MyService myService(boolean someCondition) {
    if (someCondition) {
        return new MyServiceImpl1();
    } else {
        return new MyServiceImpl2();
    }
}
Haven't really faced a situation where both Java config and component scanning on the bean of the same type were required.
As per the spring documentation,
To declare a bean, simply annotate a method with the @Bean annotation.
When JavaConfig encounters such a method, it will execute that method
and register the return value as a bean within a BeanFactory. By
default, the bean name will be the same as the method name
So, as per this, it is returning the correct Foo (with value 500).
In general, there is nothing wrong with combining component scanning and explicit bean definitions in the same application context. I tend to use component scanning where possible, and create the few beans that need more setup with @Bean methods.
There is no upside to including classes in the component scan when you create beans of their type explicitly. Component scanning can easily be targeted at certain classes and packages. If you design your packages accordingly, you can component-scan only the packages without "special" bean classes (or else use more advanced filters on scanning), as sketched below.
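For illustration, a configuration along these lines (package name assumed) scans only the "plain" component packages and declares the special bean explicitly:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@ComponentScan(basePackages = "com.example.app.web")   // only packages without "special" bean classes
public class AppConfig {

    @Bean
    public Foo foo() {
        return new Foo(500);   // the "special" bean, created explicitly with its constructor argument
    }
}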
In a quick look I didn't find any clear information about bean definition precedence in such a case. Typically there is a deterministic and fairly stable order in which these are processed, but if it is not documented, it could change in some future Spring version.

Update field annotated with #Value in runtime

Let's imagine we have such a component in Spring:
@Component
public class MyComponent {

    @Value("${someProperty}")
    private String text;
}
If we define the property placeholder:
<context:property-placeholder location="classpath:myProps.properties"/>
and myProps.properties contains a value for someProperty, the value will be injected into the text field when the context is initialized. That's quite simple and easy.
But let's say that I have a service that enables the user to change the value of someProperty:
public void changeProp(String name, String newValue);
Is there a chance I can re-inject the newValue into the text field? It should be quite straightforward; basically it's no different from the after-initialization injection. I cannot imagine that Spring does not have support for this. Can I fire some event or something?
I could basically do this on my own, but I wonder whether something like this already exists. If not, does anyone know which Spring class actually handles the injection in the first place? I could probably reuse the code there to do this myself if a solution does not exist.
I expect Spring does not have support for this, because the normal injection is done while creating the bean, not while it is in service.
Anyway: in the blog entry "Reloadable Application Properties with Spring 3.1, Java 7 and Google Guava", you can find the idea for a solution.
The key idea is to use a post-processor to build a list of all fields annotated with property placeholders. If the properties change, one can use this list to update the fields.
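A rough sketch of that idea, with all names made up; it assumes a Spring version where BeanPostProcessor has default methods, that the fields are plain Strings, and that the changed value is visible to the Environment (e.g. because the backing property source or a system property was updated):
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Component;
import org.springframework.util.ReflectionUtils;

@Component
public class ValueFieldRegistry implements BeanPostProcessor {

    private final List<FieldHolder> holders = new ArrayList<>();
    private final Environment environment;

    public ValueFieldRegistry(Environment environment) {
        this.environment = environment;
    }

    // Remember every @Value-annotated field together with its placeholder expression.
    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        for (Field field : bean.getClass().getDeclaredFields()) {
            Value annotation = field.getAnnotation(Value.class);
            if (annotation != null) {
                holders.add(new FieldHolder(bean, field, annotation.value()));
            }
        }
        return bean;
    }

    // Call this after a property has changed to re-resolve and re-inject all recorded fields.
    public void refresh() {
        for (FieldHolder holder : holders) {
            String resolved = environment.resolvePlaceholders(holder.expression);
            ReflectionUtils.makeAccessible(holder.field);
            ReflectionUtils.setField(holder.field, holder.bean, resolved);
        }
    }

    private static class FieldHolder {
        final Object bean;
        final Field field;
        final String expression;

        FieldHolder(Object bean, Field field, String expression) {
            this.bean = bean;
            this.field = field;
            this.expression = expression;
        }
    }
}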
