I need to conditionally create one of three possible implementations of a service, depending on the environment detected by a Spring application at runtime. If Service A is available, I want to create a concrete implementation class that uses Service A as a dependency. If Service A is not available, I want to create an implementation using Service B as a dependency. And so on.
Classes which depend on the implementation will autowire the interface and not care which underlying service got selected for the particular environment.
My first stab at this was to implement multiple @Bean methods which either return a bean or null, depending on whether the service is available, and to then have a separate @Configuration class which uses @Autowired(required=false) on the two possible services, conditionally creating the implementation depending on which of the @Autowired fields was non-null.
The problem here is that when required=false, Spring doesn't appear to care whether it waits around for candidates to be constructed; that is to say, the class which tries to pick the implementation might be constructed before one or both of the required=false beans gets constructed, so one or both might always be null, regardless of whether they would eventually initialize correctly.
It kind of feels like I'm going against the grain at this point, so I'm looking for advice on the "right" way to do this sort of thing, where a whole set of beans might get switched out based on the availability of some outside service or environment.
Profiles don't look like the right answer, because I won't know until after my Service beans try to initialize which implementation I want to choose; I certainly won't know it at the time I create the context.
@Order doesn't achieve the goal either. Nor does @Conditional with a test for the existence of the bean (because it still might not be constructed yet). Same problem with FactoryBean: it does no good to check for the existence of beans that might not have been constructed at the time the FactoryBean is asked to create an instance.
What I really need to do is create a bean based on the availability of other beans, but only after those beans have at least had a chance to try to initialize.
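For concreteness, here is a rough, hedged sketch of the shape being asked for, using ObjectProvider so that the choice happens only when the @Bean method actually runs. All names below are placeholders, and it assumes the optional client beans are only registered when their backing service is available (rather than @Bean methods returning null):
import org.springframework.beans.factory.ObjectProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// All of these types are placeholders for illustration.
interface MyService { }
class ServiceAClient { }
class ServiceBClient { }
class ServiceABackedImpl implements MyService {
    ServiceABackedImpl(ServiceAClient client) { }
}
class ServiceBBackedImpl implements MyService {
    ServiceBBackedImpl(ServiceBClient client) { }
}
class LocalFallbackImpl implements MyService { }

@Configuration
public class MyServiceConfig {

    // ObjectProvider is resolved lazily, at the time this @Bean method executes,
    // so the optional client beans have had their chance to initialize by then.
    @Bean
    public MyService myService(ObjectProvider<ServiceAClient> serviceA,
                               ObjectProvider<ServiceBClient> serviceB) {
        ServiceAClient a = serviceA.getIfAvailable();
        if (a != null) {
            return new ServiceABackedImpl(a);
        }
        ServiceBClient b = serviceB.getIfAvailable();
        if (b != null) {
            return new ServiceBBackedImpl(b);
        }
        return new LocalFallbackImpl(); // plain in-memory fallback
    }
}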
Spring Profiles is your friend. You can set the current profile by way of an environment variable, a command-line argument, and other methods. You can annotate a Spring-managed component so that it's only created for a certain profile.
Spring Profiles from the Spring Documentation
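For example (a minimal sketch; the profile names and the GreetingService type are made up for illustration):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

interface GreetingService { String greet(); }

@Configuration
public class GreetingConfig {

    @Bean
    @Profile("dev")
    public GreetingService devGreeting() {
        return () -> "hello from dev";
    }

    @Bean
    @Profile("prod")
    public GreetingService prodGreeting() {
        return () -> "hello from prod";
    }
}

// Selected at startup with e.g. -Dspring.profiles.active=dev
// or the SPRING_PROFILES_ACTIVE environment variable.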
Well, in this case it turned out to be a tangential mistake that caused the whole wrong behavior.
To give some background, my first, naive (but workable) approach looked like this:
@Autowired(required = false)
@Qualifier(RedisConfig.HISTORY)
private RLocalCachedMap<String, History> redisHistoryMap;

@Autowired(required = false)
@Qualifier(HazelcastConfig.HISTORY)
private IMap<String, History> hazelcastHistoryMap;

// RequestHistory is an interface
@Bean
public RequestHistory requestHistory() {
    if (redisHistoryMap != null) {
        return new RedisClusteredHistory(redisHistoryMap);
    } else if (hazelcastHistoryMap != null) {
        return new HazelcastClusteredHistory(hazelcastHistoryMap);
    } else {
        return new LocalRequestHistory(); // dumb hashmap
    }
}
In other @Configuration classes, if the beans that get @Autowired here aren't available (due to missing configuration, exceptions during initialization, etc.), the @Bean methods that create them return null.
The observed behavior was that this @Bean method was getting called after the RLocalCachedMap<> @Bean method got called, but before Spring attempted to create the IMap<> by calling its @Bean method. I had incorrectly thought that this had something to do with required=false, but in fact that had nothing to do with it.
What actually happened was that I accidentally used the same constant for both @Bean names (and consequently @Qualifiers), so presumably Spring couldn't tell the difference when it was calculating its dependency graph for this @Configuration class, because the two @Autowired beans appeared to be the same thing (they had the same name).
(There's a secondary reason for using @Qualifier in this case, which I won't go into here, but suffice it to say it's possible to have many maps of the same type.)
Once I qualified the names better, the code did exactly what I wanted it to, albeit in a way that's somewhat inelegant/ugly.
At some point I'll go back and see whether it looks more elegant / less ugly, and works just as well, to use @Conditional and @Primary instead of the if/else foulness.
The lesson here is that if you explicitly name beans, make absolutely sure your names are unique across your application, even if you plan to swap things around like this.
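For reference, a hedged sketch of what the @Conditional / @Primary variant mentioned above might look like. The availability check here is a made-up environment-property test, not the original code; checking for the existence of another bean would run into the same ordering problem described in the question.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Condition;
import org.springframework.context.annotation.ConditionContext;
import org.springframework.context.annotation.Conditional;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.core.type.AnnotatedTypeMetadata;

class RedisAvailableCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        // Placeholder: decide availability from configuration rather than from other beans.
        return context.getEnvironment().containsProperty("redis.url");
    }
}

@Configuration
public class RequestHistoryConfig {

    // Only registered when Redis looks available; marked @Primary so it wins over the
    // fallback when both bean definitions are present. This assumes the map @Bean
    // itself is guarded by the same condition.
    @Bean
    @Primary
    @Conditional(RedisAvailableCondition.class)
    public RequestHistory redisRequestHistory(RLocalCachedMap<String, History> redisHistoryMap) {
        return new RedisClusteredHistory(redisHistoryMap);
    }

    @Bean
    public RequestHistory localRequestHistory() {
        return new LocalRequestHistory();
    }
}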
Related
I have a case where a Spring auto-configuration class gets its dependencies through field injection and creates and exposes certain beans after interacting with them.
I would like to override one of its dependencies so the exposed beans are initialized in the way I expect.
Obviously I could disable the auto-configuration class and duplicate it completely locally with my desired dependency, but that would not be a maintainable solution, since the amount of behaviour to reproduce is huge and it might break on each Spring update.
Is there any easy mechanism to let the auto-configuration be loaded, and later on use the BeanFactory or something to re-inject a particular instance into a particular bean?
I cannot guarantee that this is the ideal solution, since it works at the level of whole auto-configuration classes ("topics") rather than individual classes, but for most cases it will do the trick.
The auto-configuration can be disabled for one topic, and any bean from that topic can then be created by a method in your own @Configuration class (as usual).
List of auto-configuration classes (= topics)
Syntax (to exclude from auto-configuration):
@Configuration
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class MyConfiguration {

    @Bean
    public SpecificClass getSpecificClass() {
        // init the instance as you want, then return it
        return new SpecificClass();
    }
}
In a Spring Boot Web Application layout, I have defined a Service Interface named ApplicationUserService. An implementation called ApplicationUserServiceImpl implements all the methods present in ApplicationUserService.
Now, I have a Controller called ApplicationUserController that calls all the methods of ApplicationUserServiceImpl under different @GetMapping annotations.
As suggested by my instructor, I have defined a Dependency Injection as follows:
public class ApplicationUserController {

    private final ApplicationUserService applicationUserService; // This variable will work as an object now.

    public ApplicationUserController(ApplicationUserService applicationUserService) {
        this.applicationUserService = applicationUserService;
    }

    @GetMapping
    // REST OF THE CODE
}
I am new to Spring Boot and I tried understanding Dependency Injection in plain English and I understood how it works. I understood that the basic idea is to separate the dependency from the usage. But I am totally confused about how this works in my case.
My Questions:
Here ApplicationUserService is an interface and its implementation has various methods defined. In the code above, applicationUserService now has access to every method from ApplicationUserServiceImpl. How did that happen?
I want to know how the Object creation works here.
Could you tell me the difference between not using DI and using DI in this case?
Answers
The interface layer is used for abstraction of the code; it is really helpful when you want to provide different implementations for it. When you create an instance, you are simply creating an ApplicationUserServiceImpl and assigning it to a reference variable of type ApplicationUserService; this concept is called upcasting. Even though you have created an object of the child/implementation class, you can only access the properties and methods defined in the parent class/interface. Please refer to this for more info:
https://stackoverflow.com/questions/62599259/purpose-of-service-interface-class-in-spring-boot#:~:text=There%20are%20various%20reasons%20why,you%20can%20create%20test%20stubs.
In this example, when an ApplicationUserController object is created, the Spring engine (or IoC container) will create an object of ApplicationUserServiceImpl (since there is a single implementation of the interface) and pass it to the controller object as a constructor argument, which you assign to the reference variable. Also refer to the concept called IoC (Inversion of Control).
As explained in the previous answer, Spring will take care of object creation (the object lifecycle) rather than you explicitly doing it. This makes the objects loosely coupled; the control of creating the instances is with Spring.
The non-DI way of doing this is:
private ApplicationUserService applicationUserService = new ApplicationUserServiceImpl();
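For completeness, a hedged sketch of how the implementation side is typically registered so Spring can find it. The @Service annotation and the getUsernames method are illustrative, not from your code:
import java.util.List;
import org.springframework.stereotype.Service;

interface ApplicationUserService {
    List<String> getUsernames(); // illustrative method
}

// @Service puts ApplicationUserServiceImpl into the container; since it is the only
// bean implementing ApplicationUserService, Spring passes it to the controller's constructor.
@Service
class ApplicationUserServiceImpl implements ApplicationUserService {
    @Override
    public List<String> getUsernames() {
        return List.of("alice", "bob"); // placeholder data
    }
}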
Hope I have answered your questions
This analogy may help you understand better. Consider an example wherein you have the ability to cook. According to the IoC principle (the principle which is the basis of DI), you can invert the control: instead of cooking food yourself, you can order from outside and receive food at your doorstep. The process of food being delivered to your doorstep is called Inversion of Control.
You do not have to cook yourself; instead, you can order the food and let a delivery executive deliver it for you. This way, you do not have to take care of the additional responsibilities and can just focus on the main work.
Now that you know the principle behind Dependency Injection, let me take you through the types of Dependency Injection, sketched below.
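A hedged sketch of the three common styles; the service types here are made up to fit the food-delivery analogy:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

interface DeliveryService { }
interface PaymentService { }
interface NotificationService { }

@Component
class FoodOrderClient {

    // 1. Constructor injection (generally preferred: dependencies are explicit and final)
    private final DeliveryService deliveryService;

    FoodOrderClient(DeliveryService deliveryService) {
        this.deliveryService = deliveryService;
    }

    // 2. Setter injection
    private PaymentService paymentService;

    @Autowired
    void setPaymentService(PaymentService paymentService) {
        this.paymentService = paymentService;
    }

    // 3. Field injection (works, but harder to test in isolation)
    @Autowired
    private NotificationService notificationService;
}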
I am trying to find the 'controller' (not @Controller) within Spring 5.0 that is responsible for resolving which instance of an implementation to inject. I want to provide my own implementation of that controller (or extend it), so that I can add my own logic for state-based bean resolution.
For example, given some interface Foo, with implementations FooImpl1 and FooImpl2, and some state Baz.
Then, when Baz = 1, I want to step into my own logic to decide to provide FooImpl1 instead of FooImpl2 for the required inject of Foo implementation.
Spring does this today, the logic seems to be:
Given the need to inject class X, find its implementations
If only one of X is found, use that
If more than one X is found, use Primary
If more than one X is found and no Primary, find Qualifier
If more than one X is found and there is no Primary and no Qualifier, attempt to match X with the property or parameter name (i.e. don't inject Y if the parameter or property is named x and not y).
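To illustrate those rules, a sketch using the Foo/FooImpl1/FooImpl2 names from above (the wiring is mine, not Spring internals):
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Primary;
import org.springframework.stereotype.Component;

interface Foo { }

@Component
@Primary        // wins when more than one Foo candidate exists and no qualifier is given
class FooImpl1 implements Foo { }

@Component("fooImpl2")
class FooImpl2 implements Foo { }

@Component
class FooConsumer {

    private final Foo defaultFoo;   // resolves to FooImpl1 via @Primary
    private final Foo specificFoo;  // resolves to FooImpl2 via @Qualifier

    FooConsumer(Foo defaultFoo, @Qualifier("fooImpl2") Foo specificFoo) {
        this.defaultFoo = defaultFoo;
        this.specificFoo = specificFoo;
    }
}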
What I want to do is at some point in the logic above, to invoke my own disambiguation/resolution of the required implementation to be injected, based on my own logic and state.
So, before I go and dig into Spring to locate where that logic is implemented, I am hoping to find that it is implemented in some controller/service that I can extend, better still if it is backed by some configuration...
You can implement your own @Configuration class that returns a Spring @Bean:
@Configuration
public class Config {

    private final Baz baz;

    @Autowired
    Config(Baz baz) {
        this.baz = baz;
    }

    @Bean
    public Foo getFoo() {
        // assumes Baz exposes its state as an int, e.g. via getValue()
        switch (baz.getValue()) {
            case 1:
                return new FooImpl1();
            default:
                return new FooImpl2();
        }
    }
}
Please also read the section "Full @Configuration vs 'lite' @Bean mode?" in the Spring documentation. Its last paragraph states that:
In common scenarios, #Bean methods are to be declared within #Configuration classes, ensuring that “full” mode is always used and that cross-method references therefore get redirected to the container’s lifecycle management. This prevents the same #Bean method from accidentally being invoked through a regular Java call, which helps to reduce subtle bugs that can be hard to track down when operating in “lite” mode.
It seems what I am looking for are the interfaces BeanFactory and, most likely, ConfigurableBeanFactory. There are still some things to work through, of course, but this is definitely the right direction.
For those immediately curious as to what I am talking about ... see https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/beans/factory/config/ConfigurableBeanFactory.html
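For anyone following the same trail, a hedged sketch of one way this area of the API can be used: a BeanFactoryPostProcessor that registers the chosen implementation. readBazState() and the Foo types are placeholders for the state logic in the question.
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.stereotype.Component;

// Runs after bean definitions are loaded but before beans are instantiated,
// so state-based logic can decide which implementation to expose as "foo".
@Component
class StateBasedFooRegistrar implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        int baz = readBazState();
        Foo chosen = (baz == 1) ? new FooImpl1() : new FooImpl2();
        beanFactory.registerSingleton("foo", chosen);
    }

    private int readBazState() {
        return Integer.getInteger("baz.state", 1); // placeholder for the real state lookup
    }
}
Note that instances registered this way bypass the usual bean lifecycle (no autowiring or init callbacks on the registered object), which may or may not matter for your case.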
I'm new to using Spring with Neo4j and I have a question about @Autowired for a GraphRepository.
Most examples I've seen use one @Autowired repository per controller, but I have two nodes I need to modify at the same time when a particular method is called in the controller. Should I simply @Autowire the repositories for both nodes (e.g. per the code below)? Is there any impact if I do this in a second controller with the same repositories as well (so if I had a ChatSessionController which also @Autowired ChatMessageService and ChatSessionService)?
ChatMessageController.java
@Controller
public class ChatMessageController {

    @Autowired
    private ChatMessageService chatMessageService;

    @Autowired
    private ChatSessionService chatSessionService;

    @RequestMapping(value = "/message/add/{chatSessionId}", method = RequestMethod.POST)
    @ResponseBody
    @Transactional
    public void addMessage(@RequestBody ChatMessagePack chatMessagePack,
                           @PathVariable("chatSessionId") Long chatSessionId) {
        ChatMessage chatMessage = new ChatMessage(chatMessagePack);
        chatMessageService.save(chatMessage);
        // TODO: Make some modifications to the ChatSession as well
    }
}
Any help would be much appreciated! I've been googling and looking through Stackoverflow to understand this better but I haven't found anything yet. Any pointers in the right directions would be great.
Another underlying question is: should I be (and can I be) modifying other nodes in a GraphRepository that handles a particular node type? E.g. should the GraphRepository for one node type be able to modify nodes managed by another GraphRepository?
Thanks!
I'm not convinced that this is an SO question; it's not really a Neo4j or Spring question either, it is more about the architecture of your application. However, assuming that you understand the negatives of class fan-out, and how to use the @Transactional annotation to achieve what you want, then the answer to your question is that it is just fine to have many repositories (Neo4j or otherwise, autowired or otherwise) in your class, and in as many classes as you want.
Neo4j transactions default to isolation level READ_COMMITTED, and if you need anything else, you need to add the guards/locks yourself. Nested transactions are considered to be the same transaction. The Spring @Transactional annotation relies on proxies that you should be aware of, as they have implications when calling methods from within the same class.
I would go through this tutorial over at Spring Data and get your head around how real-world vs domain vs node models differ. There will be cases where one repository impacts another node type, but I would think it is often transparent to you (i.e. adding relationships). You can do what you like in each repository (the generic nature of them is largely confined to the built-in CRUD and queries derived from finder-method names (see documentation)) using the @Query annotation, and some queries have side effects, but largely you should avoid it.
As you start adding multiple repositories to multiple controllers I think that your code will begin to smell bad, and that you should consider encapsulating this business logic off on its own somewhere, neatly unit tested (see the sketch after the links below). I also wouldn't tie myself to one controller per data object; it would be fine to have a single ChatController with a POST /chat/ to create a new session and POST /chat/{sessionId} to add a message. Interesting questions on Programmers:
How accurate is "Business logic should be in a service, not in a model?"
Best Practices for MVC Architecture
MVC Architecture — How many Controllers do I need?
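A hedged sketch of that encapsulation, using the types from the question; the ChatService name and the exact method shape are illustrative:
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ChatService {

    private final ChatMessageService chatMessageService;
    private final ChatSessionService chatSessionService;

    public ChatService(ChatMessageService chatMessageService,
                       ChatSessionService chatSessionService) {
        this.chatMessageService = chatMessageService;
        this.chatSessionService = chatSessionService;
    }

    // The controller just delegates here; both node updates stay in one transaction.
    @Transactional
    public void addMessage(Long chatSessionId, ChatMessagePack chatMessagePack) {
        ChatMessage chatMessage = new ChatMessage(chatMessagePack);
        chatMessageService.save(chatMessage);
        // load and update the ChatSession identified by chatSessionId here as well
    }
}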
I am reading "Spring in Action", and it says "singleton beans in Spring often don't maintain state because they're usually shared among multiple threads", but in my opinion, a bean in Spring is a POJO, so how can it not maintain state?
Yes, it's better for a Spring singleton not to have state (of course it can use other Spring singletons, themselves also stateless), so you can call its methods from different threads without worrying that they could mess up its state (it doesn't have one :-)).
Think about a calculator that stores its intermediate results on an internal stack: what can happen if two threads try to calculate something at the same time?
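A minimal sketch of that hazard (the calculator class is made up):
import java.util.ArrayDeque;
import java.util.Deque;
import org.springframework.stereotype.Component;

@Component // singleton by default, so this one stack is shared by every thread
class StackCalculator {

    private final Deque<Integer> stack = new ArrayDeque<>(); // mutable shared state

    int add(int a, int b) {
        stack.push(a);
        stack.push(b);
        // If another thread pushes or pops in between, these pops can return its values
        // instead of a and b, producing a wrong result.
        return stack.pop() + stack.pop();
    }
}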
A Spring singleton is annotated (and even when it's not, it behaves as if it were) and lives inside the Spring context; it's not just a POJO.
If you want a Spring bean with state, you have to use the "prototype" scope; with this kind of scope, every time you get the bean you will get a different instance.
Sorry for the bad English.
The book is implying that the state of a bean may not be trustworthy at a given point in time due to manipulation of its state by another thread. If multiple threads are accessing the same instance of a bean you cannot be sure of what state changes have occurred on the bean prior to using it. Since Spring uses Singletons by default there is only one instance of a bean. If multiple threads hit the bean at the same time there could be issues with the state of the bean.
So you're correct that the beans will maintain state; however, the state may be unreliable due to modifications from other threads.
In a typical web application, you have multiple threads (one per request) and all the requests use the same beans. In a typical layered application, they will use a Controller, which then uses a Service, which then uses a Repository. All of these beans should be singletons without any state; otherwise you will have bugs due to concurrency (e.g. one thread modifies data in a singleton while another thread does the same).
I think that your misunderstanding comes from saying that beans in Spring are POJOs. Most Spring beans are stateless beans, which I would not describe as POJOs.
A Spring bean is considered a POJO in the sense that it does not need to adhere to special requirements (implement certain interfaces, extend particular class, be specific in some way, etc.) to be used with Spring.
Depending on the requirements a Spring bean might need to maintain, and operate on, its state.
It also depends on the requirements to consider if the Spring bean should be a singleton or not.
If a singleton Spring bean is maintaining its own state, proper safeguards must be taken to ensure correct concurrent access/modification.
Confusion comes from a general pattern used in enterprise applications where Spring beans are used to implement the bulk of the system logic and supporting operations.
In these scenarios, it is generally good practice not to keep any state in the Spring bean itself, and instead to let it operate on the Value Objects/Data Transfer Objects that contain the actual state.
Since these Spring beans are not going to maintain their own state, they are almost always created as singletons.
SomeClass, SomeOtherClass and Situation are all POJOs in the code below.
SomeClass and SomeOtherClass are Spring beans, but Situation is not; it is a DTO.
class SomeClass {

    private int x; // not restricted

    public void handleSituation(Situation s) {
        // may or may not use/modify SomeClass state represented by 'x'
        // one approach is to provide all required information through parameters
    }
}

class SomeOtherClass {

    @Autowired
    private SomeClass sc;

    public void process() {
        // gets called from another object
        Situation z = new Situation();
        sc.handleSituation(z);
        // other processing
    }
}

<!-- default scope = singleton -->
<bean id="someClassBean" class="SomeClass"/>
<bean id="someOtherClassBean" class="SomeOtherClass">
    <property name="someClass" ref="someClassBean"/>
</bean>
This is a slight variation from pure OOP, in which the above would be coded similarly to the following.
class SomeOtherClass {

    public void process() {
        // gets called from another object
        Situation z = new Situation();
        z.handle();
        // other processing
    }
}