Is it still loose coupling if we use @Qualifier? - Spring

We use an interface for autowiring a service into a controller.
This gives loose coupling, because the interface variable can hold an object of any of its implementations, so there is no need to name the implementation class and create tight coupling.
However, when we have more than one implementation of an interface, we add @Qualifier.
My question is: if we have to write a qualifier to say which implementation needs to be injected, should we still call it loose coupling?
interface ServiceInterface {
    void interfaceMethod();
}

Implementation 1:

@Component("service1")
class ServiceImpl1 implements ServiceInterface {
    public void interfaceMethod() {
    }
}

Implementation 2:

@Component("service2")
class ServiceImpl2 implements ServiceInterface {
    public void interfaceMethod() {
    }
}

Now, instead of directly creating an object of ServiceImpl1 with new:

ServiceImpl1 obj = new ServiceImpl1();

we write:

class Controller {

    @Autowired
    @Qualifier("service1")
    ServiceInterface se;

    void handle() {
        se.interfaceMethod();
    }
}

Partially yes, because the component that uses the injected qualified bean still does not create it or manage its lifecycle.
But using @Qualifier does indeed introduce some coupling. If you want to avoid it, consider making one of your ServiceInterface beans the primary implementation of the interface by annotating its class with @Primary, as follows:
@Component
@Primary
class ServiceImpl1 implements ServiceInterface {
    public void interfaceMethod() {
    }
}
With this, every time you need a ServiceInterface implementation and you don't explicitly specify which one you want (with @Qualifier), Spring injects the primary one.
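For example (a minimal sketch, reusing the classes from the question; the handle() method is just illustrative):

class Controller {

    @Autowired
    ServiceInterface service; // no @Qualifier needed: ServiceImpl1 is injected because it is @Primary

    void handle() {
        service.interfaceMethod();
    }
}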

Even though you are using the @Qualifier annotation, you are still using inversion of control and letting the framework manage your dependencies.
Furthermore, suppose you did not autowire the implementation but used 'new' to create your object.
When the implementation changes, you would need to update every place where it is created. With dependency injection you don't need to, so you still keep the loose-coupling advantages of dependency injection.
If you would like your target class to be less coupled to a particular implementation, you could do a few things:
Use @Primary on a bean to determine the default implementation.
Autowire your implementations into a List<ServiceInterface> (see the sketch below).
Use Spring's ObjectFactory to determine which bean to use at runtime.
Use profiles to determine which bean to autowire.
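For instance, the List option could look roughly like this (a minimal sketch; the class and method names are illustrative):

class MultiServiceController {

    @Autowired
    List<ServiceInterface> services; // Spring injects every ServiceInterface bean it knows about

    void handleAll() {
        // choose or iterate at runtime instead of hard-coding a qualifier
        services.forEach(ServiceInterface::interfaceMethod);
    }
}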

Related

Factory design pattern and Spring

I am wondering what the current best practice is regarding the factory pattern in the context of the Spring framework and its dependency injection. My question is whether the factory pattern is still relevant nowadays, given Spring's dependency injection. I did some searching and found some past discussion (Dependency Injection vs Factory Pattern), but there seem to be different views.
I see some real-life projects use a Map to hold all the beans and rely on autowiring to create those beans. When a bean is needed, it is fetched from the map by its key.
public abstract class Service {
    // some methods
}

@Component
public class ServiceA extends Service {
    // implementation
}

@Component
public class ServiceB extends Service {
    // implementation
}

@Autowired
Map<String, Service> services; // keyed by bean name, e.g. "serviceA" and "serviceB"
But I see some differences between the two approaches.
With the approach above, all beans are created at application startup and the creation of the objects is handled by the framework. It also implies there is only one bean for each type.
With the factory pattern, the factory class creates the object on request, and it can create a new object for each request.
I think a deeper question may be: when the Spring framework is used in a project, should we strive not to create any objects inside a class, meaning the factory pattern (or any creational design pattern) should not be used, since Spring is supposed to be the central handler of object dependencies?
The answer to this question can be really deep and broad; I'll try to provide some points that hopefully will help.
First off, Spring stores its beans (singletons) in the ApplicationContext. Essentially this is the map you're talking about; in a nutshell, it allows getting a bean by name, type, etc.
The ApplicationContext, while being a really important concept, is not the whole of Spring; in fact, the Spring framework allows much more flexibility:
You say that using a map implies that all the beans will be created at the beginning of the application and that there is one instance of each bean.
Spring has a concept of lazy beans: a bean is actually created only when it is required for the first time, so Spring supports "delayed" bean initialization.
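A minimal sketch (the class name is illustrative):

@Component
@Lazy // created only on first use, not at application startup
class ExpensiveService {
}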
Spring also allows more than one bean per type, so this map is more "advanced". For example, you can create more than one implementation of an interface and declare both as beans. As long as you provide enough information about which bean should be injected into the class that uses them (for example with the help of qualifiers, supported in Spring), you're good to go. In addition, the Spring IoC container can inject all registered implementations of an interface into a list:
interface Foo {}

@Component
class FooImpl1 implements Foo {}

@Component
class FooImpl2 implements Foo {}

class Client {

    @Autowired
    List<Foo> allFoos;
}
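Spring can also inject a Map keyed by bean name, which is essentially the map from the question (a sketch under the same assumptions as the snippet above):

class MapClient {

    @Autowired
    Map<String, Foo> foosByName; // e.g. "fooImpl1" -> FooImpl1 instance, "fooImpl2" -> FooImpl2 instance
}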
Now you say:
While for factory pattern, the factory class creates the object on request. And it can create a new object for each request.
Actually, Spring can create objects per request. Not all beans have to be singletons; in general Spring has a concept of scopes for this purpose.
For example, the scope prototype means that Spring creates a new bean instance upon each usage. One particularly interesting usage that Spring supports in a variety of ways is injecting a prototype bean into a singleton. Some solutions work exactly like a factory (read about the @Lookup annotation), others rely on a proxy auto-generated at runtime (like javax.inject.Provider). Prototype-scoped beans are not held in the application context, so here again Spring goes beyond a simple map abstraction.
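As a hedged sketch of the @Lookup variant mentioned above (class names are illustrative), the singleton asks Spring for a fresh prototype bean on every call:

@Component
@Scope("prototype")
class Task {
}

@Component
abstract class TaskRunner {

    void run() {
        Task task = createTask(); // a new Task instance on every call
        // ... use the task ...
    }

    @Lookup
    protected abstract Task createTask(); // Spring overrides this method at runtime
}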
The last feature you haven't mentioned is that sometimes, even for singletons, initialization can be a little more complicated than calling a constructor with parameters. Spring can address that with Java configuration:
@Configuration
public class MyConfig {

    @Bean
    public SomeComplicatedObject foo(@Value("...") String config, Bar bar) {
        // let's pretend this object comes from a third party: it only has a no-arg
        // constructor, and you can't place Spring annotations on it (you can't change it)
        SomeComplicatedObject obj = new SomeComplicatedObject();
        obj.setConfig(config);
        obj.setBar(bar);
        return obj;
    }
}
The method foo here initializes SomeComplicatedObject and returns it. This can be used instead of a factory to integrate "legacy" code (Java configuration goes way beyond this, but that's out of scope for this question).
So, bottom line: Spring as an IoC container provides many different ways to deal with object creation; in particular, it can do everything the factory design pattern offers.
Now, I would also like to refer to your last sentence:
I think a deeper question may be: when the Spring framework is used in a project, should we strive not to create any objects inside a class, meaning the factory pattern (or any creational design pattern) should not be used, since Spring is supposed to be the central handler of object dependencies?
Indeed, you don't have to use the factory pattern when using Spring, since (as I hope I have convinced you) it provides everything a factory can do and more.
I also agree that Spring is supposed to be the central handler of object dependencies (unless parts of the application are written in a different manner, so you have to support both :) ).
I don't think we should avoid using "new" altogether; not everything should or can be a bean. But I do see (from my subjective experience, so this is arguable) that you use it much less, leaving the creation of most objects to Spring.
Should we avoid using any creational design pattern? I don't think so. Sometimes you can opt for the builder pattern, for example; it is also a creational pattern, but Spring doesn't provide a similar abstraction.
I think if your project uses the Spring framework you should use it, although it depends on your project's design. For example, you may use creational patterns alongside Spring IoC when you have abstraction layers that are not framework-dependent (framework-agnostic code):
interface ServiceFactory {
    Service create(String type);
}

@Component
class SpringServiceFactory implements ServiceFactory {

    @Autowired
    private ApplicationContext context;

    public Service create(String type) {
        return context.getBean(type, Service.class);
    }
}
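Usage could then look like this (assuming the default bean names Spring derives from the class names in the question, e.g. "serviceA"):

@Autowired
private ServiceFactory serviceFactory;

void useService() {
    Service service = serviceFactory.create("serviceA"); // resolved from the ApplicationContext by bean name
    // ...
}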
I also use the factory pattern when I refactor legacy, non-unit-testable code (which also uses the Spring framework) in order to make it unit testable:
// legacy service that is impossible to mock
class LegacyApiClient implements Closeable {...}

@Component
class LegacyApiClientFactory {
    LegacyApiClient create(String endpoint) {
        return new LegacyApiClient(endpoint);
    }
}

@Component
class OtherService {

    private final String endpoint;
    private final LegacyApiClientFactory factory;

    OtherService(@Value("${post.endpoint}") String endpoint,
                 LegacyApiClientFactory factory) {...}

    void doCall() {
        try (LegacyApiClient client = factory.create(endpoint)) {
            client.postSomething();
        }
    }
}
....
// a typical unit test
LegacyApiClient client = mock(LegacyApiClient.class);
LegacyApiClientFactory factory = mock(LegacyApiClientFactory.class);
when(factory.create(any())).thenReturn(client);

OtherService service = new OtherService("http://scxsc", factory);
service.doCall();
....

Java Configuration vs Component Scan Annotations

Java configuration allows us to manage bean creation within a configuration class. Classes annotated with @Component, @Service, etc., combined with component scanning, do the same. However, I'm concerned about using these two mechanisms at the same time.
Should Java configuration and annotation-based component scanning be avoided in the same project? I ask because the result is unclear in the following scenario:
@Configuration
public class MyConfig {

    @Bean
    public Foo foo() {
        return new Foo(500);
    }
}
...
@Component
public class Foo {

    private int value;

    public Foo() {
    }

    public Foo(int value) {
        this.value = value;
    }
}
...
public class Consumer {

    @Autowired
    Foo foo;
    ...
}
So, in the above situation, will the Consumer get a Foo instance with value 500 or value 0? I've tested locally and it appears that the Java-configured Foo (with value 500) is created consistently. However, I'm concerned that my testing isn't thorough enough to be conclusive.
What is the real answer? Using both Java configuration and component scanning for beans of the same type seems like a bad thing.
I think your concern is really about the following use case:
You have a custom Spring starter library that has its own @Configuration classes and @Bean definitions. But if you have @Component/@Service classes in this library, you will need to explicitly @ComponentScan those packages from your service, since the default component scan (see @SpringBootApplication) scans from the main class down into the sub-packages of your app, but not into the packages of the external library. For that reason, you should only have @Bean definitions in your external library, and inject those external configurations via an @EnableSomething annotation used on your app's main class (using @Import(YourConfigurationAnnotatedClass.class)), or via spring.factories in case you always need the external configuration to be applied.
Of course, you CAN have @Components in such a library, but the required explicit use of @ComponentScan may lead to unintended behaviour in some cases, so I would recommend avoiding that.
So, to answer your question: you can use both approaches for defining beans inside your app, but bean definitions outside your app (e.g. in a library) should be explicitly defined with @Bean inside a @Configuration class.
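A minimal sketch of the @Enable... plus @Import approach (all names here are illustrative, not from the original post):

// inside the external library
@Configuration
public class MyLibraryConfiguration {

    @Bean
    public MyLibraryService myLibraryService() {
        return new MyLibraryService();
    }
}

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Import(MyLibraryConfiguration.class)
public @interface EnableMyLibrary {
}

// inside the application
@SpringBootApplication
@EnableMyLibrary
public class MyApp {
}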
It is perfectly valid to have Java configuration and annotated component scanning in the same project because they serve different purposes.
@Component (@Service, @Repository, etc.) is used to auto-detect and auto-configure beans.
The @Bean annotation is used to explicitly declare a single bean, instead of letting Spring do it automatically.
You can do the following with @Bean, but it is not possible with @Component:
@Bean
public MyService myService(boolean someCondition) {
    if (someCondition) {
        return new MyServiceImpl1();
    } else {
        return new MyServiceImpl2();
    }
}
I haven't really faced a situation where both Java configuration and component scanning for a bean of the same type were required.
As per the spring documentation,
To declare a bean, simply annotate a method with the #Bean annotation.
When JavaConfig encounters such a method, it will execute that method
and register the return value as a bean within a BeanFactory. By
default, the bean name will be the same as the method name
So, as per this, the correct Foo (with value 500) is returned.
In general, there is nothing wrong with component scanning and explicit bean definitions in the same application context. I tend to use component scanning where possible, and create the few beans that need more setup with #Bean methods.
There is no upside to including classes in the component scan when you create beans of their type explicitly. Component scanning can easily be targeted at certain classes and packages. If you design your packages accordingly, you can component-scan only the packages without "special" bean classes (or else use more advanced filters on scanning).
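A hedged sketch of such targeted scanning (the package name is illustrative): exclude the explicitly defined type from the scan so only the @Bean definition remains.

@Configuration
@ComponentScan(
        basePackages = "com.example.app",
        excludeFilters = @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = Foo.class))
public class AppConfig {

    @Bean
    public Foo foo() {
        return new Foo(500); // now the only Foo bean definition
    }
}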
At a quick glance I didn't find any clear information about bean definition precedence in such a case. Typically there is a deterministic and fairly stable order in which these are processed, but if it is not documented, it could change in some future Spring version.

Methods of Autowiring in Spring - Difference between the two possible alternatives below

I have a basic autowiring question. I see the following two implementations that are possible with Spring autowiring.
Method1
public class SimpleMovieLister {

    private MovieFinder movieFinder;

    @Autowired
    public void setMovieFinder(MovieFinder movieFinder) {
        this.movieFinder = movieFinder;
    }
    // ...
}
Method2
public class SimpleMovieLister {

    @Autowired
    private MovieFinder movieFinder;
}
My understanding is that both are the same, and I mostly use Method 2 in my code. In what situations is Method 1 useful? Or is it just a case of Spring's evolution leaving us with both possible ways to implement it?
Sorry if the question is too basic, but I need to get this cleared up.
Method 1 is Setter Injection.
Method 2 is Field Injection.
A third method is Constructor Injection.
Example:
public class SimpleMovieLister {

    private MovieFinder movieFinder;

    @Autowired
    public SimpleMovieLister(MovieFinder movieFinder) {
        this.movieFinder = movieFinder;
    }
    // ...
}
Method 3, constructor injection, is preferred because it makes testing significantly easier: you can simply pass in the required dependencies.
Also, if your bean has only one constructor, you can omit the @Autowired annotation; Spring will automatically use that constructor when creating the bean.
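For example (a minimal sketch of the single-constructor case; this works on Spring 4.3 and later):

public class SimpleMovieLister {

    private final MovieFinder movieFinder;

    // only one constructor, so Spring uses it for injection even without @Autowired
    public SimpleMovieLister(MovieFinder movieFinder) {
        this.movieFinder = movieFinder;
    }
}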
A good snippet from the docs:
The Spring team generally advocates constructor injection as it
enables one to implement application components as immutable objects
and to ensure that required dependencies are not null. Furthermore
constructor-injected components are always returned to client
(calling) code in a fully initialized state. As a side note, a large
number of constructor arguments is a bad code smell, implying that the
class likely has too many responsibilities and should be refactored to
better address proper separation of concerns.
Setter injection should primarily only be used for optional
dependencies that can be assigned reasonable default values within the
class. Otherwise, not-null checks must be performed everywhere the
code uses the dependency. One benefit of setter injection is that
setter methods make objects of that class amenable to reconfiguration
or re-injection later. Management through JMX MBeans is therefore a
compelling use case for setter injection.
Use the DI style that makes the most sense for a particular class.
Sometimes, when dealing with third-party classes for which you do not
have the source, the choice is made for you. For example, if a
third-party class does not expose any setter methods, then constructor
injection may be the only available form of DI.
https://docs.spring.io/spring/docs/current/spring-framework-reference/html/beans.html
Method 1 is setter Injection
Setter injection should primarily only be used for optional
dependencies that can be assigned reasonable default values within the
class. Otherwise, not-null checks must be performed everywhere the
code uses the dependency. One benefit of setter injection is that
setter methods make objects of that class amenable to reconfiguration
or re-injection later.
Method 2 is field Injection

What are possible causes for Spring @ComponentScan being unable to auto-create a class annotated with @Repository?

I came across a tutorial which seemed to fit my use case and tried implementing it. It failed, but I wasn't sure why. So I tried to find another example with similar code and looked at the book "Spring in Action, Fourth Edition" by Craig Walls.
The book describes the same basic approach on page 300. First define a JdbcTemplate bean:
@Bean
NamedParameterJdbcTemplate jdbcTemplate(DataSource dataSource) {
    return new NamedParameterJdbcTemplate(dataSource);
}
Then a repository implementing an interface:
@Repository
public class CustomRepositoryImpl implements CustomRepository {

    private final NamedParameterJdbcOperations jdbcOperations;
    private static final String TEST_STRING = "";

    @Autowired
    public CustomRepositoryImpl(NamedParameterJdbcOperations jdbcOperations) {
        this.jdbcOperations = jdbcOperations;
    }
}
So I did as the example in the book suggests and wrote a test, but got the error message:
Error creating bean with name 'de.myproject.config.SpringJPAPerformanceConfigTest': Unsatisfied dependency expressed through field 'abc'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'de.myproject.CustomRepository' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {#org.springframework.beans.factory.annotation.Autowired(required=true)}
To my understanding, as both the book and the tutorial describe it, the repository should be recognized as a bean definition by the component scan.
To test this, I created a context and asked for all registered beans:

AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext();
context.getBeanDefinitionNames();

As assumed, my repository wasn't among them. So I increased, for test purposes only, the scope of the search in my project and set it to the base package. Every other bean was shown, except the repository.
As an alternative to component scanning and autowiring, the book describes the possibility of simply declaring the repository as a bean, which I did:

@Bean
public CustomRepository customRepository(NamedParameterJdbcOperations jdbcOperations) {
    return new CustomRepositoryImpl(jdbcOperations);
}
After that, Spring was able to wire the repository. I looked at the GitHub code for the book, hoping for a better understanding, but unfortunately only the bean solution, which runs, is implemented there.
So here are my questions:
1.) What possible reasons are there for a bean definition, in a scenario like this one, not to be recognized by the component scan?
2.) This project already uses Spring Data JPA repositories; are there any reasons not to use both approaches at the same time?
The problem is the naming of your classes. There are a few things to understand here.
You define a repository interface; @Repository is optional provided it extends CrudRepository or one of the repository interfaces provided by Spring Data. In this interface you can declare query methods (findBy...), and Spring Data will formulate the query based on the underlying database. You can also specify your query using @Query.
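For illustration, a derived query method next to an explicit @Query (a sketch; the Customer entity and the JPQL are hypothetical):

public interface CustomerRepository extends CrudRepository<Customer, Long> {

    // derived query: Spring Data builds it from the method name
    List<Customer> findByLastName(String lastName);

    // explicit query via @Query
    @Query("select c from Customer c where c.lastName = :lastName order by c.firstName")
    List<Customer> findByLastNameOrdered(@Param("lastName") String lastName);
}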
Suppose you have a method that involves a complex query or something Spring Data cannot do out of the box; in such a case we can use the underlying template class, for example JdbcTemplate or MongoTemplate.
The procedure is to create another interface and an Impl class. The name of this interface should end with Custom, your implementation class should end with Impl, and all of them should be in the same package.
For example, if your repository is named AbcRepository, then your custom interface should be named AbcRepositoryCustom and the implementation should be named AbcRepositoryImpl. AbcRepository extends AbcRepositoryCustom (and also one of the Spring Data repository interfaces), and AbcRepositoryImpl implements AbcRepositoryCustom.
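A minimal sketch of that naming convention (the Abc entity and the query method are hypothetical):

public interface AbcRepositoryCustom {
    List<Abc> findWithComplexQuery(String criteria);
}

// must be named AbcRepositoryImpl so Spring Data picks it up as the fragment implementation
public class AbcRepositoryImpl implements AbcRepositoryCustom {

    private final NamedParameterJdbcOperations jdbcOperations;

    public AbcRepositoryImpl(NamedParameterJdbcOperations jdbcOperations) {
        this.jdbcOperations = jdbcOperations;
    }

    @Override
    public List<Abc> findWithComplexQuery(String criteria) {
        // custom SQL on the underlying template would go here
        return List.of();
    }
}

public interface AbcRepository extends CrudRepository<Abc, Long>, AbcRepositoryCustom {
}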
I was able to "solve" the problem myself.
We also have a front-end configuration class annotated with a @ComponentScan for the same base package:
@EnableWebMvc
@Configuration
@ComponentScan(basePackages = {"de.myproject.*"})
So there were actually two identical @ComponentScan annotations, which I wasn't aware of, and this led to a conflict. It seems the order in which the whole application had to be loaded changed, but that's only me guessing.
I simply moved my repository and its Impl to a subpackage and changed the scan to
@ComponentScan(basePackages = {"de.myproject.subpackage.*"})
and now everything works fine. Though it escapes me what the exact reason behind this behavior is.

Spring DI? Interface Type?

I understand the how, but I can't seem to formally shape the definitions.
As is known, DI can be done via constructor, setter, or interface.
I am confused about the last one, interface-based DI: is it used in Spring?
UPDATE: I gave a bad example here, which led to a wrong understanding.
To fix it up:
Say we have a setter, and in the setter we inject an interface implemented by some class. Is that considered DI via setter or via interface?
http://martinfowler.com/articles/injection.html#UsingAServiceLocator
This article divides DI into:
"There are three main styles of dependency injection. The names I'm using for them are Constructor Injection, Setter Injection, and Interface Injection. If you read about this stuff in the current discussions about Inversion of Control you'll hear these referred to as type 1 IoC (interface injection), type 2 IoC (setter injection) and type 3 IoC (constructor injection). I find numeric names rather hard to remember, which is why I've used the names I have here."
Also, regarding the Service Locator pattern used for IoC: is it what actually makes @Autowired possible? That is, not all classes explicitly need to be declared in XML for DI, as we can declare them with @Repository or @Controller or the like, if I recall correctly.
Thanks,
Autowiring an interface means wiring a bean that implements that interface. This relies on an implementation actually existing in the bean factory.
@Autowired
UserService us; // wire a bean implementing UserService
--
@Service
public class UserServiceImpl implements UserService {
    // the @Service annotation causes this implementation of UserService to
    // be made available for wiring in the bean factory.
}
Worth noting: if you wire by interface, Spring expects there to be exactly one bean in the bean factory implementing that interface. If more than one bean is found, an error is thrown and you will have to specify which bean to wire (using the @Qualifier annotation).
EDIT:
When wiring, you can either wire a member variable or a setter method.
@Autowired
UserService us;
--
@Autowired
public void setUserService(UserService us) {
    this.us = us;
}
These two produce the same result. The difference is that with the former, Spring uses reflection to set the variable us to a bean implementing UserService; with the latter, Spring invokes the setUserService method, passing the same UserService implementation.
