@Bean method providing a list is ignored if there is a @Bean method providing a single bean - spring

I have a configuration providing a single bean and a configuration providing a list of beans. All these beans have the same type.
When I start up an application context with these configurations, I see that an autowired list of the bean type only contains the single bean. I want it to include all beans of that type. I use Spring 5.2.0.
I have boiled it down to one configuration: if I provide a single bean and a list of beans, only the single bean will be used.
This is reproduced in the following test. It fails, because the list only contains "A" and "D" (which shows it did not autowire the list of beans):
@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = { TestConfiguration.class })
class AutowiringListsTest {

    @Autowired
    private List<TestBean> testBeanList;

    @Test
    void testThatBothConfigurationsContributeToBeanList() {
        final List<String> idList = testBeanList.stream().map(TestBean::getId).sorted().collect(Collectors.toList());
        assertThat(idList, hasItems("A", "B", "C", "D"));
    }

    @Configuration
    public static class TestConfiguration {

        @Bean
        public TestBean someBean() {
            return new TestBean("A");
        }

        @Bean
        public List<TestBean> someMoreBeans() {
            return Arrays.asList(new TestBean("B"), new TestBean("C"));
        }

        @Bean
        public TestBean anotherBean() {
            return new TestBean("D");
        }
    }

    public static class TestBean {

        private final String id;

        public TestBean(final String id) {
            this.id = id;
        }

        private String getId() {
            return id;
        }
    }
}
I want to get this to run so that multiple modules can provide beans of a certain type.
Some modules want to provide multiple beans and their number depends on a property.
Some modules will always provide one bean.
The module using the beans (autowiring them as list) should autowire all beans.
How can I get this to run? In what scenario does Spring's behavior make sense?

I can work around the issue by introducing a TestBeanFactory. Each configuration that wants to contribute to the list of TestBeans instead provides a factory.
@Configuration
public static class TestConfiguration {

    /** Implemented once in the configuration that defines <code>TestBean</code>. */
    @Bean
    public List<TestBean> testBeansFromFactory(Collection<TestBeanFactory> factories) {
        return factories.stream().map(TestBeanFactory::createTestBeans).flatMap(Collection::stream)
                .collect(toList());
    }

    // Further methods can be defined in various configurations that want to add to the list of TestBeans.

    @Bean
    public TestBeanFactory someBean() {
        return () -> Arrays.asList(new TestBean("A"));
    }

    @Bean
    public TestBeanFactory someMoreBeans() {
        return () -> Arrays.asList(new TestBean("B"), new TestBean("C"));
    }

    @Bean
    public TestBeanFactory anotherBean() {
        return () -> Arrays.asList(new TestBean("D"));
    }
}

public static class TestBean { ... }

public static interface TestBeanFactory {
    public Collection<TestBean> createTestBeans();
}
That works and is only slightly more code.
M. Deinum makes a point in the comments about Spring's behavior being consistent:
As you defined a bean of type List it will be used. Autowiring is based on type, it will try to detect the bean of the certain type. The collection (and map as well) is a special one looking up all dependencies of the given type.
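To make the comment concrete: with the original TestConfiguration, collection injection gathers all beans whose type is TestBean, while the bean whose own type is List<TestBean> has to be requested explicitly, for example by name. A minimal sketch (not from the thread):

@Autowired
private List<TestBean> collectedByElementType; // ends up containing "A" and "D"

@Resource(name = "someMoreBeans")
private List<TestBean> declaredListBean; // the someMoreBeans() bean itself: "B" and "C"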

Related

Is it safe to return "this" from a Spring bean method?

I want to create a chained API on a prototype bean.
@Component
@Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public class ReportGenerator {

    List<String> lines = new ArrayList<>();

    public ReportGenerator append(String line) {
        lines.add(line);
        return this;
    }
}

public class SomeSingletonBean {

    @Autowired
    ReportGenerator rg;

    void doSomeWork() {
        rg.append("a").append("b");
    }
}
The above seems to work fine.
But is this safe in all cases?
Also, can I safely do the same in a bean scoped "session"?
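One thing to keep in mind before relying on this: the prototype ReportGenerator is resolved only once, when SomeSingletonBean is created, so every call to doSomeWork() reuses the same instance and its lines list keeps growing. A minimal sketch of one common alternative, assuming a fresh generator per call is what you actually want (not from the original question):

@Component
public class SomeSingletonBean {

    private final ObjectProvider<ReportGenerator> reportGenerators;

    public SomeSingletonBean(ObjectProvider<ReportGenerator> reportGenerators) {
        this.reportGenerators = reportGenerators;
    }

    void doSomeWork() {
        // getObject() creates a new prototype instance on each call
        ReportGenerator rg = reportGenerators.getObject();
        rg.append("a").append("b");
    }
}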

Spring Boot: how to inject two classes having the same name

In my application, I have two classes having the same name, but of course in different packages.
Both classes need to be injected in the application. Unfortunately, I get the following error message:
Caused by: org.springframework.context.annotation.ConflictingBeanDefinitionException: Annotation-specified bean name 'myFeature' for bean class [org.pmesmeur.springboot.training.service.feature2.MyFeature] conflicts with existing, non-compatible bean definition of same name and class [org.pmesmeur.springboot.training.service.feature1.MyFeature]
My issue can be reproduced by the following sample:
@Component
@EnableConfigurationProperties(ServiceProperties.class)
public class MyService implements IService {

    private final ServiceProperties serviceProperties;
    private final IProvider provider;
    private final org.pmesmeur.springboot.training.service.feature1.IMyFeature f1;
    private final org.pmesmeur.springboot.training.service.feature2.IMyFeature f2;

    @Autowired
    public MyService(ServiceProperties serviceProperties,
                     IProvider provider,
                     org.pmesmeur.springboot.training.service.feature1.IMyFeature f1,
                     org.pmesmeur.springboot.training.service.feature2.IMyFeature f2) {
        this.serviceProperties = serviceProperties;
        this.provider = provider;
        this.f1 = f1;
        this.f2 = f2;
    }
    ...

package org.pmesmeur.springboot.training.service.feature1;

public interface IMyFeature {
    void print();
}

package org.pmesmeur.springboot.training.service.feature1;

import org.springframework.stereotype.Component;

@Component
public class MyFeature implements IMyFeature {
    @Override
    public void print() {
        System.out.print("HelloWorld");
    }
}

package org.pmesmeur.springboot.training.service.feature2;

public interface IMyFeature {
    void print();
}

package org.pmesmeur.springboot.training.service.feature2;

import org.springframework.stereotype.Component;

@Component
public class MyFeature implements IMyFeature {
    @Override
    public void print() {
        System.out.print("FooBar");
    }
}
If I use different names for my MyFeature classes, my problem disappears!
I am used to working with Guice, and that framework does not have this kind of problem/limitation.
It seems that the Spring dependency injection framework uses only the class name, instead of package name + class name, in order to select its beans.
In "real life" I have this problem in a far bigger project, and I would strongly prefer not to have to rename my classes: can anyone help me?
One last point: I would prefer to avoid "tricks" such as using @Qualifier(value = "ABC") when injecting my classes; in my sample there should be no ambiguity in finding the correct instance of MyFeature, as the two classes do not implement the same interface.
Simply re-implementing BeanNameGenerator adds a new problem for beans declared/instantiated by name:

@Component("HelloWorld")
class MyComponent implements IComponent {
    ...
}

@Qualifier(value = "HelloWorld") IComponent component
I solved this issue by extending AnnotationBeanNameGenerator and overriding the buildDefaultBeanName() method:

static class BeanNameGeneratorIncludingPackageName extends AnnotationBeanNameGenerator {

    public BeanNameGeneratorIncludingPackageName() {
    }

    @Override
    public String buildDefaultBeanName(BeanDefinition beanDefinition, BeanDefinitionRegistry beanDefinitionRegistry) {
        return beanDefinition.getBeanClassName();
    }
}
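To actually have Spring use this generator, it has to be registered with component scanning; a minimal sketch, assuming an @ComponentScan-driven configuration and that the generator class is visible to it (the base package here is made up):

@Configuration
@ComponentScan(
        basePackages = "org.pmesmeur.springboot.training",
        nameGenerator = BeanNameGeneratorIncludingPackageName.class)
public class ScanConfig {
}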
You can assign a value to each component, e.g. @Component(value = "someBean"), and then inject it with @Qualifier, e.g.

@Autowired
public SomeService(@Qualifier("someBean") Some s) {
    //....
}
Spring provides autowiring by type and by name. Your class names are the same, and by default Spring considers only the class name, not the package. You can override this behaviour by defining a custom implementation of the BeanNameGenerator interface, in which you generate the name from both package and class name. I am not providing a code solution because I think you should explore this further yourself.
You can do something like this:
In package a:

public class MyFeature implements IMyFeature {
    @Override
    public void print() {
        System.out.print("FooBar");
    }
}

In package b:

public class MyFeature implements IMyFeature {
    @Override
    public void print() {
        System.out.print("HelloWorld");
    }
}

And in some config class:

@Configuration
public class MyConfiguration {

    @Bean
    public a.MyFeature f1() {
        return new a.MyFeature();
    }

    @Bean
    public b.MyFeature f2() {
        return new b.MyFeature();
    }
}

Then you can autowire them with the names f1 and f2, which are the names of their respective bean factory methods.
You can do a similar thing with @Component("f1") and @Component("f2").
Even though different interfaces are implemented and the classes live in different packages, the identical bean name causes this trouble, and you have to use some sort of custom naming to distinguish them. Hooking in custom Spring logic would be far uglier than the solutions above.

Multiple Spring Configuration files (one per Profile)

I'm a Spring rookie and trying to benefit from the advantages of the easy 'profile' handling of Spring. I already worked through this tutorial: https://spring.io/blog/2011/02/14/spring-3-1-m1-introducing-profile and now I'd like to adapt that concept to an easy example.
I've got two profiles: dev and prod. I imagine a @Configuration class for each profile, where I can instantiate different beans (each implementing a common interface) depending on the active profile.
My currently used classes look like this:
StatusController.java

@RestController
@RequestMapping("/status")
public class StatusController {

    private final EnvironmentAwareBean environmentBean;

    @Autowired
    public StatusController(EnvironmentAwareBean environmentBean) {
        this.environmentBean = environmentBean;
    }

    @RequestMapping(method = RequestMethod.GET)
    Status getStatus() {
        Status status = new Status();
        status.setExtra("environmentBean=" + environmentBean.getString());
        return status;
    }
}

EnvironmentAwareBean.java

public interface EnvironmentAwareBean {
    String getString();
}
DevBean.java

@Service
public class DevBean implements EnvironmentAwareBean {
    @Override
    public String getString() {
        return "development";
    }
}

ProdBean.java

@Service
public class ProdBean implements EnvironmentAwareBean {
    @Override
    public String getString() {
        return "production";
    }
}
DevConfig.java

@Configuration
@Profile("dev")
public class DevConfig {

    @Bean
    public EnvironmentAwareBean getDevBean() {
        return new DevBean();
    }
}

ProdConfig.java

@Configuration
@Profile("prod")
public class ProdConfig {

    @Bean
    public EnvironmentAwareBean getProdBean() {
        return new ProdBean();
    }
}
Running the example throws this exception during startup (SPRING_PROFILES_DEFAULT is set to dev):
(...) UnsatisfiedDependencyException: (...) nested exception is org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [EnvironmentAwareBean] is defined: expected single matching bean but found 3: prodBean,devBean,getDevBean
Is my approach far from a recommended configuration? In my opinion it would make more sense to annotate each Configuration class with @Profile instead of doing it for each and every bean, and possibly forgetting some variants when new classes are added later on.
Your implementations of EnvironmentAwareBean are all annotated with @Service.
This means they will all be picked up by component scanning, and hence you get more than one matching bean. Do they need to be annotated with @Service?
Annotating each @Configuration class with @Profile is fine. Another way, as an educational exercise, would be to not use @Profile and instead annotate the @Bean methods or config classes with your own implementation of @Conditional.
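For illustration, a minimal sketch of what such a @Conditional could look like, reusing the DevConfig from the question (the condition class itself is made up, not part of the question or answer):

public class DevProfileCondition implements Condition {

    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        // assumption: mirror @Profile("dev") by checking the active profiles directly
        return context.getEnvironment().acceptsProfiles(Profiles.of("dev"));
    }
}

@Configuration
@Conditional(DevProfileCondition.class)
public class DevConfig {

    @Bean
    public EnvironmentAwareBean getDevBean() {
        return new DevBean();
    }
}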

Spring @Required properties when creating @Bean annotated beans

I'm developing a Spring Boot application and am trying out Java annotation-based bean creation (using @Configuration and @Bean) rather than the familiar old XML-based bean creation. I'm puzzled, though. If I attempt to create a bean in XML but fail to set an @Required property, I get a BeanInitializationException when the application context is created. In my trials so far with annotation-based bean creation, this does not seem to be the case.
For example:
public class MyClass {
    ...
    @Required
    public void setSomeProp(String val){
    }
}
Then in Spring XML:
<bean class="MyClass"/>
This will blow up during application startup (and IntelliJ flags it) because the required property is not set. But the same does not seem to be true of this:
@Configuration
public class MyConfig {

    @Bean
    public MyClass myClass() {
        return new MyClass();
    }
}
This application starts up just fine even though the required property is not ever set. I must be missing something here, because this seems like a pretty key feature in Spring.
UPDATE
I did some digging & debugging, and it turns out that the bean definition is somehow being flagged to skip checking that @Required fields are set. In the Spring class RequiredAnnotationBeanPostProcessor, the boolean method shouldSkip() returns true for beans created this way. When I used the debugger to force that method to return false, bean creation did indeed blow up with the expected exception.
Seeing as I'm making a pretty basic Spring Boot application I'm inclined (as Zergleb suggests) to submit this as a bug.
UPDATE 2
Some further debugging has revealed that even if the field is set, forcing the check still throws the same exception, as if it hadn't been set. So perhaps dunni is correct and there is no way for this to work with @Bean notation.
As you said, I also could not get @Required to work as expected; this may be a bug and needs to be reported. I have a few other suggestions that did work for me.
Class annotated with @Configuration:

// With the bean set up as usual, these all worked
@Bean
public MyClass myClass() {
    return new MyClass();
}
When you annotate the class with @Component and load it via component scanning, it works as expected. (The component scanning part is important: you either need your @Configuration class to have @ComponentScan, or you can remove @Configuration and replace it with @SpringBootApplication, which enables scanning for components without needing to wire them up using @Bean configs.)

@Component // Added this
public class MyClass {
    ...
    @Required // Failed as expected
    public void setSomeProp(String val){
    }
}
Use @Autowired(required = true) // Fails with BeanCreationException: No qualifying bean of type [java.lang.String] found for dependency

// No more @Component
public class MyClass {
    ...
    @Autowired(required = true) // Fails
    public void setSomeProp(String val){
    }
}
@Autowired(required = false) // Does not crash

public class MyClass {
    ...
    @Autowired(required = false) // Simply never gets called if missing
    public void setSomeProp(String val){
    }
}
@Value // Does not work if test.property is missing: Could not resolve placeholder 'test.property' in string value "${test.property}"

public class MyClass {

    @Value("${test.property}")
    String someProp;

    // This getter is not necessary, neither is a setter
    public String getSomeProp() {
        return this.someProp;
    }
}
@Value with a default value // Does not crash. When getSomeProp is called it returns "My Default Value" (unless you have test.property=Anything in your application.properties file, in which case it returns "Anything")

public class MyClass {

    @Value("${test.property:My Default Value}")
    String someProp;

    // This getter is not necessary, neither is a setter
    public String getSomeProp() {
        return this.someProp; // Returns "My Default Value"
    }
}
Inside your @Configuration class, this also fails if the container cannot find anything to populate the someProp argument of the myClass method:

@Bean
public MyClass myClass(String someProp) { // Fails if this argument cannot be populated
    MyClass myObj = new MyClass();
    myObj.setSomeProp(someProp);
    return myObj;
}
Of course this won't work, since you create the object of MyClass yourself (new MyClass()), so the annotations are not evaluated. If you create a bean with a @Bean method, the container will only make sure that all dependencies are there (method parameters) and that the bean scope is adhered to, meaning that if it's a singleton bean, only one instance is created per application context. The creation of the bean/object itself is solely the responsibility of the developer.
The equivalent of the XML <bean> tag is annotating the class with @Component, where the bean is created completely by the container, and thus the annotations are evaluated.
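If the object must keep being created in a @Bean method, one hedged alternative (not mentioned in the thread) is to let the class verify its own required properties during initialization; the container still runs lifecycle callbacks such as InitializingBean for @Bean-defined beans:

public class MyClass implements InitializingBean {

    private String someProp;

    public void setSomeProp(String val) {
        this.someProp = val;
    }

    @Override
    public void afterPropertiesSet() {
        // Runs after the @Bean method has returned; fails startup if the property was never set.
        org.springframework.util.Assert.notNull(someProp, "someProp must be set");
    }
}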
As has been said, when you have your own @Configuration class where you create the bean yourself, @Required doesn't apply there.
When you already have a @Component, let Spring Boot do the component scan, and on the required setter you can add @Autowired and it will work fine.
Found this link on the web: https://www.boraji.com/spring-required-annotation-example
For example:
I have a Component called Employee having Id and Name.
@Component
public class Employee {

    int id;
    String name;

    public int getId() {
        return id;
    }

    @Autowired
    @Required
    public void setId(int id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
I have a Configuration class called AppConfig.java
@Configuration
public class AppConfig {

    @Bean
    public int getId() {
        return 1;
    }
}
So now we see that the Employee component needs an id value to be bound during startup, so I wrote a bean method of type Integer, which gets autowired at runtime. If you do not provide a bean of type Integer, it will result in a BeanCreationException.
And here is my main class file.
@SpringBootApplication
public class SingletonApplication {

    public static void main(String[] args) {
        ApplicationContext ctx =
            SpringApplication.run(SingletonApplication.class, args);

        Employee emp = (Employee) ctx.getBean(Employee.class);
        System.out.println(emp.getId());
    }
}

Accessing Beans outside of the Step Scope in Spring Batch

Is it possible to access beans defined outside of the step scope? For example, if I define a strategy "strategyA" and pass it in the job parameters, I would like the @Value to resolve to the strategyA bean. Is this possible? I am currently working around the problem by getting the bean manually from the applicationContext.
@Bean
@StepScope
public Tasklet myTasklet(
        @Value("#{jobParameters['strategy']}") MyCustomClass myCustomStrategy) {
    MyTasklet myTasklet = new MyTasklet();
    myTasklet.setStrategy(myCustomStrategy);
    return myTasklet;
}
I would like to have the ability to add more strategies without having to modify the code.
The short answer is yes. This is a more general Spring/design-pattern issue rather than a Spring Batch one.
The tricky Spring Batch parts are the configuration and understanding the scope of bean creation.
Let's assume all your strategies implement a Strategy interface that looks like:

interface Strategy {
    int execute(int a, int b);
}

Every strategy should implement Strategy and carry the @Component annotation to allow automatic discovery of new strategies. Make sure all new strategies are placed under the correct package so that component scanning will find them.
For example:
@Component
public class StrategyA implements Strategy {
    @Override
    public int execute(int a, int b) {
        return a + b;
    }
}
The above are singletons and will be created during application context initialization.
That stage is too early to use @Value("#{jobParameters['strategy']}"), as the job parameters have not been created yet.
So I suggest a locator bean that will be used later, when myTasklet is created (step scope).
StrategyLocator class:
public class StrategyLocator {

    private Map<String, ? extends Strategy> strategyMap;

    public Strategy lookup(String strategy) {
        return strategyMap.get(strategy);
    }

    public void setStrategyMap(Map<String, ? extends Strategy> strategyMap) {
        this.strategyMap = strategyMap;
    }
}
The configuration will look like:

@Bean
@StepScope
public MyTasklet myTasklet() {
    MyTasklet myTasklet = new MyTasklet();
    // set the strategyLocator
    myTasklet.setStrategyLocator(strategyLocator());
    return myTasklet;
}

@Bean
protected StrategyLocator strategyLocator() {
    return new StrategyLocator();
}
To initialize the StrategyLocator we need to make sure all strategies have already been created, so the best approach is to use an ApplicationListener for the ContextRefreshedEvent (note: in this example the strategy names start with a lower-case letter, as they are the default bean names; changing this is easy).
@Component
public class PlugableStrategyMapper implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private StrategyLocator strategyLocator;

    @Override
    public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
        ApplicationContext applicationContext = contextRefreshedEvent.getApplicationContext();
        Map<String, Strategy> beansOfTypeStrategy = applicationContext.getBeansOfType(Strategy.class);
        strategyLocator.setStrategyMap(beansOfTypeStrategy);
    }
}
The tasklet holds a String field that is injected with the strategy name via @Value, and the actual Strategy is resolved through the locator in a "before step" listener.
public class MyTasklet implements Tasklet, StepExecutionListener {

    @Value("#{jobParameters['strategy']}")
    private String strategyName;

    private Strategy strategy;
    private StrategyLocator strategyLocator;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        strategy = strategyLocator.lookup(strategyName);
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        int executeStrategyResult = strategy.execute(1, 2);
        return RepeatStatus.FINISHED;
    }

    public void setStrategyLocator(StrategyLocator strategyLocator) {
        this.strategyLocator = strategyLocator;
    }
}
To attach the listener to the tasklet, you need to set it in your step configuration:
@Bean
protected Step myTaskletstep() {
    return steps.get("myTaskletstep")
            .transactionManager(transactionManager())
            .tasklet(myTasklet())
            .listener(myTasklet())
            .build();
}
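For completeness, a hedged sketch (not part of the answer) of how the "strategy" job parameter read by @Value("#{jobParameters['strategy']}") would typically be supplied at launch time; jobLauncher and myJob are assumed to be injected elsewhere:

public void launchWithStrategy() throws Exception {
    JobParameters jobParameters = new JobParametersBuilder()
            .addString("strategy", "strategyA") // must match the bean name of the desired Strategy
            .addLong("run.id", System.currentTimeMillis()) // keeps each job instance unique
            .toJobParameters();

    jobLauncher.run(myJob, jobParameters);
}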
jobParameters holds just a String and not the real object (and I think it is not good practice to store a bean definition in the parameters).
I'd move in this direction:
@Component
@StepScope
class MyStrategyHolder {

    private MyCustomClass myStrategy;
    // Add getter/setter

    @BeforeJob
    void beforeJob(JobExecution jobExecution) {
        myStrategy = ...; // bind the right strategy using the job parameter value
    }
}

and register MyStrategyHolder as a listener.
In your tasklet use @Value("#{myStrategyHolder.myStrategy}") or access the MyStrategyHolder instance and call getMyStrategy().
