Do I use dependency injection in the following example:
@Scope("prototype")
@Component
public class Order {

    @Autowired
    public Order(User user, List<OrderItem> items, /* ... */) {
        // ...
    }
}
Now somewhere else:
@Component
public class PersistOrder {

    @Autowired
    Provider<Order> orderProvider;

    public void prepareOrder() {
        Order order = orderProvider.get();
        // ...
    }
}
As JSR-330 states, the get() method of Provider returns a new instance of Order, but which objects are passed to the new order's constructor? As you can see, the order has its own dependencies that have to be injected before the actual order object is returned to the method.
Without DI I would simply create all necessary arguments and pass them to the constructor of the new order. So should I use DI here?
EDIT:
This is what the code would look like without DI:
@Component
public class PersistOrder {

    public void prepareOrder() {
        User user = userDao.get(userId);
        List<OrderItem> orderItems = orderItemDao.getAll(orderItemIds);
        Order order = new SmartPhoneOrder(user, orderItems);
        // ...
    }
}
As you see, I have the ids of the user and the order items and can get their instances from DAOs. Then I pass these instances to the constructor of a subclass of Order to get an instance. The process seems very clear to me. How can I do this with DI so that my code enjoys decoupling between the PersistOrder and Order classes? Or does using DI in this example make the logic more complex?
Without dependency injection, domain objects can be considered anaemic, and this is arguably an anti-pattern. You're losing many of the benefits of OO by having data structures without associated behaviors. For example, to evaluate Order.isValid() you may need to inject some dependency to perform the validation.
You can get dependency injection on classes outside the context of the Spring container by using:
@Configurable
You then declare the recipe for the bean as a prototype, and Spring will use AOP to swizzle the constructor and look up the required dependencies. Even though it's looking up the dependencies, the net effect for you is dependency injection, because you'd only have to change one line to inject something else that evaluates Order.isValid().
This way, when you do new Order(), or get it from your persistence factory, it already has the dependencies "injected".
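A minimal sketch of the idea (Validator is a made-up stand-in for whatever Order.isValid() needs; AspectJ weaving is assumed to be configured):

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Configurable;

@Configurable
public class Order {

    @Autowired
    private transient Validator validator; // injected by AnnotationBeanConfigurerAspect, even for plain "new Order(...)"

    private final User user;
    private final List<OrderItem> items;

    public Order(User user, List<OrderItem> items) {
        this.user = user;
        this.items = items;
    }

    public boolean isValid() {
        return validator.validate(this); // behavior stays on the domain object
    }
}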
This requires AspectJ weaving, not just the default JDK proxy/CGLIB proxying. You can use either load-time weaving with a Java agent or build-time weaving. I recommend the former for integration tests and the latter for deployment.
Service vs Domain Object:
Now that you have the option of rich domain objects, the question becomes where to put stuff: service vs. entity? My take is that the service should orchestrate a use-case-specific process based on reusable domain objects. The service is your non-OO gateway for outside subscribers into your system.
Google "Spring @Configurable" for more information and tutorials on AspectJ-based dependency injection of domain classes.
Related
I am wondering what the current best practice is regarding the use of the factory pattern within the context of the Spring framework and its dependency injection. My question is whether the factory pattern is still relevant nowadays in light of Spring dependency injection. I did some searching and saw some past discussion (Dependency Injection vs Factory Pattern), but there seem to be differing views.
In some real-life projects I have seen a Map used to hold all the beans, relying on autowiring to create those beans. When a bean is needed, it is fetched from the map using its key.
public abstract class Service {
    // some methods
}

@Component
public class ServiceA extends Service {
    // implementation
}

@Component
public class ServiceB extends Service {
    // implementation
}

// somewhere else, all Service beans are autowired into a map keyed by bean name
@Autowired
private Map<String, Service> services;
But I see there are some differences between the two approaches.
Using the above approach, all beans are created on application startup and object creation is handled by the framework. It also implies there is only one bean for each type.
With the factory pattern, on the other hand, the factory class creates the object on request, and it can create a new object for each request.
I think a deeper question may be: when the Spring framework is used in a project, should we strive not to create any objects inside a class, which would mean the factory pattern (or any creational design pattern?) should not be used, since Spring is supposed to be the central handler of object dependencies?
The answer to this question can be really deep and broad; I'll try to provide some points that will hopefully help.
First off, Spring stores its beans (singletons) in the ApplicationContext. Essentially, this is the map you're talking about. In a nutshell, it allows getting a bean by name, type, etc.
The ApplicationContext, while being a really important concept, is not the whole of Spring; in fact, the Spring framework allows much more flexibility:
You say that using a map implies that all the beans will be created at application startup and that there is one instance of each bean.
Spring has a concept of lazy beans: a bean is actually created only when it is required for the first time, so Spring supports "delayed" bean initialization.
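A minimal sketch of a lazy bean (the class name is made up):

import org.springframework.context.annotation.Lazy;
import org.springframework.stereotype.Component;

@Component
@Lazy // not created at startup; instantiated only on first injection or lookup
class ExpensiveService {

    ExpensiveService() {
        // runs only when the bean is first needed
    }
}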
Spring also allows more than one bean per type, so this map is more "advanced". For example, you can create more than one implementation of an interface and declare both as beans. As long as you provide enough information about which bean should be injected into the class that uses them (for example with the help of qualifiers supported in Spring), you're good to go. In addition, the Spring IoC container can inject all registered implementations of an interface into a list:
interface Foo {}

@Component
class FooImpl1 implements Foo {}

@Component
class FooImpl2 implements Foo {}

@Component
class Client {

    @Autowired
    List<Foo> allFoos;
}
Now you say:
With the factory pattern, on the other hand, the factory class creates the object on request, and it can create a new object for each request.
Actually, Spring can create objects per request. Not all beans have to be singletons; in general, Spring has a concept of scopes for this purpose.
For example, the prototype scope means that Spring will create a new bean instance upon each usage. One interesting case that Spring supports in a variety of ways is injecting a prototype bean into a singleton. Some solutions work exactly like a factory (read about the @Lookup annotation), while others rely on a proxy auto-generated at runtime (like javax.inject.Provider). Prototype-scoped beans are not held in the application context, so here again Spring goes beyond a simple map abstraction.
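A minimal sketch of both variants (the Task and TaskRunner names are made up):

import javax.inject.Provider;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Lookup;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

@Component
@Scope("prototype")
class Task {
}

@Component
class TaskRunner {

    // variant 1: JSR-330 Provider; a fresh prototype Task on every get()
    @Autowired
    private Provider<Task> taskProvider;

    void runWithProvider() {
        Task task = taskProvider.get();
        // ...
    }

    // variant 2: method injection; Spring overrides this method to return a new Task
    @Lookup
    protected Task createTask() {
        return null; // replaced by the container
    }
}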
The last feature you haven't mentioned is that sometimes, even for singletons, initialization can be a little more complicated than calling a constructor with parameters. Spring addresses that with Java configuration:
@Configuration
public class MyConfig {

    @Bean
    public SomeComplicatedObject foo(@Value("...") String config, Bar bar) {
        // let's pretend this object comes from some third party: it only has a no-arg
        // constructor and you can't place Spring annotations on it (you can't change it)
        SomeComplicatedObject obj = new SomeComplicatedObject();
        obj.setConfig(config);
        obj.setBar(bar);
        return obj;
    }
}
The method foo here initializes a SomeComplicatedObject and returns it. This can be used instead of factories to integrate "legacy" code (well, Java configuration goes way beyond this, but that's out of scope for this question).
So, bottom line: Spring as an IoC container provides many different ways to deal with object creation; in particular, it can do everything the factory design pattern offers.
Now, I would also like to refer to your last sentence:
I think a deeper question may be: when the Spring framework is used in a project, should we strive not to create any objects inside a class, which would mean the factory pattern (or any creational design pattern?) should not be used, since Spring is supposed to be the central handler of object dependencies?
Indeed, you don't have to use the factory pattern when using Spring, since (as I hopefully have convinced you) it provides everything a factory can do and more.
I also agree that Spring is supposed to be the central handler of object dependencies (unless there are parts of the application written in a different manner, so you have to support both :) ).
I don't think we should avoid using "new" altogether; not everything should or can be a bean. But I do find (from my subjective experience, so this is arguable) that you use it much less, leaving the creation of most objects to Spring.
Should we avoid the usage of any creational design pattern? I don't think so. Sometimes you can opt for implementing the "builder" pattern, for example; it's also a creational pattern, but Spring doesn't provide a similar abstraction.
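For illustration, a minimal builder sketch (Report is a made-up type); Spring has no equivalent abstraction for this:

public class Report {

    private final String title;
    private final boolean includeSummary;

    private Report(Builder builder) {
        this.title = builder.title;
        this.includeSummary = builder.includeSummary;
    }

    public static class Builder {
        private String title;
        private boolean includeSummary;

        public Builder title(String title) { this.title = title; return this; }
        public Builder includeSummary(boolean includeSummary) { this.includeSummary = includeSummary; return this; }
        public Report build() { return new Report(this); }
    }
}

// usage: Report report = new Report.Builder().title("Q1").includeSummary(true).build();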
I think if your project uses the Spring framework, you should use it. It depends on your project design, though: you may use creational patterns alongside Spring IoC, e.g. when you have abstraction layers that are not framework-dependent (framework-agnostic code):
interface ServiceFactory {
    Service create(String type);
}

@Component
class SpringServiceFactory implements ServiceFactory {

    @Autowired
    private ApplicationContext context;

    @Override
    public Service create(String type) {
        return (Service) context.getBean(type);
    }
}
I also use the factory pattern when I refactor legacy, non-unit-testable code (which also uses the Spring framework) in order to implement unit tests.
// legacy service impossible to mock
class LegacyApiClient implements Closeable {...}

@Component
class LegacyApiClientFactory {
    LegacyApiClient create(String endpoint) {
        return new LegacyApiClient(endpoint);
    }
}

@Component
class OtherService {
    private final String endpoint;
    private final LegacyApiClientFactory factory;

    OtherService(@Value("${post.endpoint}") String endpoint,
                 LegacyApiClientFactory factory) {...}

    void doCall() throws IOException {
        try (LegacyApiClient client = factory.create(endpoint)) {
            client.postSomething();
        }
    }
}
....
// a random unit test
LegacyApiClient client = mock(LegacyApiClient.class);
LegacyApiClientFactory factory = mock(LegacyApiClientFactory.class);
OtherService service = new OtherService("http://scxsc", factory);
when(factory.create(any())).thenReturn(client);
service.doCall();
....
I am new to Spring, so my understanding of it is very superficial. Nevertheless, suppose we are looking at the following example:
class ServiceImpl implements Service {

    @Autowired
    private Mapper mapper;

    public void executeService() {
        mapper.executeService();
    }
}
So, I am trying to build some service which calls a mapper from the persistence layer. Mapper is an abstract class. From my understanding, @Autowired will automatically inject one implementation of Mapper here.
Then my question is:
What if there are multiple implementations of Mapper? After some searching, it seems that in this case one needs to use @Qualifier to designate which implementation we want to use.
Suppose we are using the implementation powerfulMapper; then we will need to use @Qualifier("powerfulMapper").
Then how is this different from just instantiating Mapper powerfulMapper here?
If you have only one Mapper, you only need @Autowired to inject it. If there is more than one Mapper implementation registered as a Spring bean, you have to use @Qualifier or @Resource to tell Spring which implementation you want to inject. See this for more details.
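For example, a sketch with made-up bean names (the abstract methods of Mapper are omitted):

@Component("powerfulMapper")
class PowerfulMapper extends Mapper {
    // implementation of Mapper's abstract methods
}

@Component("simpleMapper")
class SimpleMapper extends Mapper {
    // implementation of Mapper's abstract methods
}

@Component
class ServiceImpl implements Service {

    // picks the bean named "powerfulMapper"; @Resource(name = "powerfulMapper") would work as well
    @Autowired
    @Qualifier("powerfulMapper")
    private Mapper mapper;
}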
Then how is this different from just instantiating Mapper
powerfulMapper here?
The difference is that if a class is a Spring bean, we can apply some Spring features to it, such as:
Applying some AOP magic to it, such as @Async, @Transactional, @PreAuthorize, etc.
Think about the case where a class has a lot of dependencies, which in turn have a lot of dependencies: creating an instance of this class and configuring the whole dependency graph is not an enjoyable thing to do. Not to mention that different dependencies can have different requirements (e.g. one may need to be instantiated as a singleton shared by different classes, while another may need to be in the prototype scope, where each class needs a separate instance, etc.).
Using @Autowired and letting Spring configure such a dependency graph is much easier than doing it manually.
On the other hand, if the Mapper has very simple dependencies, is only used internally inside ServiceImpl, and you don't need any of the benefits provided by Spring, you can simply instantiate it without declaring it as a Spring bean.
Dependency injection (DI) exists so that you avoid having a complicated tree of dependencies.
Imagine having a tree-like structure where class A instantiates class B and class B instantiates class C; then A instantiates D and D instantiates E.
A --> B ---> C
\
\--> D ---> E
This is all fine, until class E needs class C.
Then we need to rearrange everything, instantiate C higher up, and pass it through classes B and D down to both sides.
This is where DI comes into play.
Instead, we decide that A instantiates all the classes:
A --> B
--> C
--> D
--> E
So A is the owner of all the classes, and it can then pass any class into any other class to meet whatever demand each class has.
This is what the Spring context does with the help of the @Autowired annotation. You declare to Spring which class you want to be passed into the instantiated classes. During startup, Spring will instantiate all classes and then figure out which class should be fitted where (this is a massive simplification). All classes will by default be instantiated as singletons (but this can be customised).
When Spring instantiates your class, it is called a Spring-managed bean. Spring manages its lifecycle, not you, because you are not calling new; the framework is.
During this process, Spring does a number of checks and also instantiates in a specific order: configuration classes first, then @Bean-annotated methods, and lastly @Component, @Service, etc. (overly simplifying this here). It will scan for the classes that should be instantiated, decide on the order in which they should be created, and then work out which classes should be @Autowired into which other classes. There are a number of helper annotations that assist Spring during this phase.
@Qualifier is one: you basically name a bean, and then you can tell Spring where you want it @Autowired.
It's very useful if you have two singletons of the same type that are configured differently.
As you said, if you want different implementations you need to use @Qualifier("powerfulMapper"), etc.
Let's suppose you have two different implementations: one uses @Qualifier("powerfulMapper") and the other @Qualifier("notAPowerfulMapper").
Then, while autowiring, you also need to specify which one you want injected, like:
@Autowired
@Qualifier("powerfulMapper")
private Mapper mapper;
I am trying to find a way to programmatically create a bean in Quarkus DI, but without success. Is this possible in this framework? It seems that BeanManager does not implement the needed method yet.
First, we should clarify what "programmatically create a bean" exactly means.
But first of all, we should define what "bean" means. In CDI, we talk about beans in two senses:
Component metadata - this one describes the component attributes and how a component instance is created; the SPI is javax.enterprise.inject.spi.Bean
Component instance - the real instance used in the application; in the spec this is called a "contextual reference".
The metadata is usually derived from the application classes. Such metadata is "backed by a class". By "backed by a class" I mean all the kinds described in the spec, that is, class beans, producer methods, and producer fields.
Now, if you want to programmatically obtain a component instance (option 2), you can:
Inject javax.enterprise.inject.Instance; see for example the Weld docs
Make use of CDI.current().select(Foo.class).get()
Make use of quarkus-specific Arc.container().instance(Foo.class).get()
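A minimal sketch of the first two options (Foo stands in for whatever bean you want to obtain):

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Instance;
import javax.enterprise.inject.spi.CDI;
import javax.inject.Inject;

@ApplicationScoped
class FooClient {

    @Inject
    Instance<Foo> fooInstance; // programmatic, lazy handle to the Foo bean

    void useFoo() {
        // injected Instance
        Foo fromInstance = fooInstance.get();

        // static lookup through the portable CDI API
        Foo fromCdi = CDI.current().select(Foo.class).get();
    }
}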
However, if you want to add/register component metadata that is not backed by a class (option 1), you need to write an extension that makes use of Quarkus-specific SPIs, such as BeanRegistrar.
If you are looking for the Quarkus equivalent of Spring's @Configuration, then you want a "bean producer" (as mentioned in the comments above).
Here is an example (Kotlin) of how to manually register a clock:
import java.time.Clock
import javax.enterprise.context.ApplicationScoped
import javax.enterprise.inject.Produces

@ApplicationScoped
class AppConfig {

    @Produces
    @ApplicationScoped
    fun utcClock(): Clock {
        return Clock.systemUTC()
    }
}
@Produces is actually not required if the method is already annotated with @ApplicationScoped.
@ApplicationScoped at the class level of AppConfig is also not required.
Still, I find those extra annotations useful, especially if you are used to Spring.
You can inject your beans using Instance:
@Inject
public TestExecutorService(final ManagedExecutor managedExecutor,
                           final Instance<YourTask> yourTask) {
    this.managedExecutor = managedExecutor;
    this.yourTask = yourTask;
}
And if you need to create more than one instance, you can use the managed executor:
tasks.forEach(task -> managedExecutor.submit(task::execute));
Keep in mind that, depending on the way you create the bean, you may need to destroy it, and only the "creator class" has its reference, meaning you have to create and destroy the bean in the same class (you can use something like events to handle that).
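A minimal sketch of creating and destroying a bean through the same Instance handle (YourTask and its execute() method are placeholders from the snippet above):

import javax.enterprise.inject.Instance;
import javax.inject.Inject;

class TaskLifecycle {

    @Inject
    Instance<YourTask> taskInstance;

    void runOnce() {
        YourTask task = taskInstance.get(); // creates a new instance (e.g. @Dependent)
        try {
            task.execute();
        } finally {
            taskInstance.destroy(task); // release it explicitly instead of tying it to this bean's lifecycle
        }
    }
}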
For more information please check: CDI Documentation
I want to initialize a model class annotated with @SliceResource inside an OSGi service.
Is there a way to do this? Thanks!
In an AEM project that uses Slice, the idiomatic way to obtain a graph of objects in an OSGi service is by obtaining a reference to an Injector.
try (InjectorWithContext injector = InjectorUtil.getInjector(INJECTOR_NAME, resolver)) {
final ModelProvider modelProvider = injector.getInstance(ModelProvider.class);
MyModel myModel = modelProvider.get(MyModel.class, knownResource);
//do something with the model
}
Keep in mind that this can be used not just to instantiate a class annotated with @SliceResource but to build an arbitrary graph of objects using Guice as the dependency injection framework. It's a very powerful tool that allows you to manage all sorts of objects, possibly in different injection contexts (more information here).
Remember that the injector needs to be closed once you're done using it. Fortunately, the InjectorWithContext interface extends the AutoCloseable interface so you can use it in a try-with-resources block, as shown above.
Use InjectorUtil to obtain an injector. The INJECTOR_NAME can be found in your Activator where the Injector is instantiated and bindings between interfaces and implementations are registered.
ModelProvider#get allows you to inject a model in a context specified by the second argument. This can be a Resource instance or a path.
Please consider the following situation with Spring 4.0.7.
For EclipseLink, we use a load-time weaver. So we wanted to experiment with Spring's @Configurable annotation, using @EnableSpringConfigured together with @EnableLoadTimeWeaving.
This is fully functional, and Spring beans are injected perfectly into POJOs during construction. This functionality is helpful for us, because we want to keep some code regarding validation of these POJOs inside them and not somewhere else in a bean.
Some of our Spring context contains beans that do not implement any interface, because they are local to some package and used only there. Let's say FooLogicBean is one of them. If it is to be injected into another bean, and some Spring AOP aspect (not AspectJ), like a performance measurement aspect, is in the execution path, Spring will create a CGLIB autoproxy for the FooLogicBean and inject that. This is fully functional and works perfectly too.
Problems arise when we want to actually use a POJO annotated with @Configurable as a parameter in a method of FooLogicBean (like fooLogicBean.doValidate(myPojo);), i.e. of the CGLIB proxy. In this case, some non-trivial magic stops that POJO from being woven through AspectJ (AnnotationBeanConfigurerAspect from spring-aspects). It is never woven anywhere in the code, regardless of whether the aforementioned doValidate() method is called.
If we create that POJO inside FooLogicBean but don't use it as a method parameter, it gets woven again due to @Configurable.
Without knowing the autoproxy creation code, we assume some fancy marking routine hinders the class from being detected by AspectJ if that class was already used in Spring AOP ("used" in this case meaning accessed).
Has anyone experimented with such an obscure constellation and knows a solution?
Thanks in advance!
Proxies are created by subclassing, i.e. when you create a proxy of an annotated class Foo:
class Foo {
    @Bar
    void bar() { }
}
the proxy is created by generating a class like:
class Foo$Proxy extends Foo {
#Override
void bar() {
// Proxy logic
}
// Overridden methods of Object
}
This makes the Foo$Proxy class a valid Liskov substitute for Foo. However, normal override semantics apply, i.e. non-inherited annotations such as @Bar are no longer present on the overriding methods. Once another annotation-based API processes your beans, all annotations have disappeared, which leads to your outcome. This is true for all kinds of annotations: those on types, methods, and method parameters.
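A small self-contained sketch (all names made up) that shows the effect; the overriding method in the subclass no longer carries the annotation:

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

public class AnnotationLossDemo {

    @Retention(RetentionPolicy.RUNTIME)
    @interface Bar {}

    static class Foo {
        @Bar
        void bar() { }
    }

    // stand-in for a generated proxy: a subclass overriding bar() without the annotation
    static class FooProxy extends Foo {
        @Override
        void bar() { }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(Foo.class.getDeclaredMethod("bar").isAnnotationPresent(Bar.class));      // true
        System.out.println(FooProxy.class.getDeclaredMethod("bar").isAnnotationPresent(Bar.class)); // false
    }
}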
But is it avoidable? It most certainly is, by using a proxy generator that was written more recently and built with knowledge of what annotations are, something that is not true for cglib, which first shipped in the early days of the Java virtual machine. However, as long as Spring does not move away from cglib, you will have to live with today's limitations.