I am porting an existing JBOSS JEE application to Quarkus. I am using a number of HV custom validators that require injection.
For that purpose I've defined all custom validators that require injection as beans in my libraries, like this:
@ApplicationScoped
public class SomeValidator implements ConstraintValidator<SomeValidation, AnObject> {

    @Inject
    public BeanUsingEntityManager bean;

    // isValid(...) omitted for brevity
}
Note: It is common code, so it should remain working on JBOSS as well
Next I defined a REST service. The REST service makes use of an application scoped bean like this.
@ApplicationScoped
public class ApplicationContext {

    @PersistenceContext( unitName = "A" )
    EntityManager em;

    @Produces
    @EntityManagerA // required qualifier to make the datasource unique in the JEE context (there are more)
    public EntityManager produce() {
        return em;
    }

    // NOTE: Quarkus does not allow @Produces on a field, which is allowed in JBOSS, hence the method
    @Produces
    public BeanUsingEntityManager createBeanUsingEntityManager() {
        // some logic that requires the entity manager.
    }
}
The problem is simplified here, but I keep running into the following error message.
Caused by: javax.enterprise.inject.CreationException: Synthetic bean instance for javax.persistence.EntityManager not initialized yet: javax_persistence_EntityManager_b60c51739990fc921960fc78caeb075a811a91a6
- a synthetic bean initialized during RUNTIME_INIT must not be accessed during STATIC_INIT
- RUNTIME_INIT build steps that require access to synthetic beans initialized during RUNTIME_INIT should consume the SyntheticBeansRuntimeInitBuildItem
at javax.persistence.EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.create(EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.zig:167)
at javax.persistence.EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.create(EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.zig:190)
at io.quarkus.arc.impl.AbstractSharedContext.createInstanceHandle(AbstractSharedContext.java:96)
at io.quarkus.arc.impl.AbstractSharedContext.access$000(AbstractSharedContext.java:14)
at io.quarkus.arc.impl.AbstractSharedContext$1.get(AbstractSharedContext.java:29)
at io.quarkus.arc.impl.AbstractSharedContext$1.get(AbstractSharedContext.java:26)
at io.quarkus.arc.impl.LazyValue.get(LazyValue.java:26)
at io.quarkus.arc.impl.ComputingCache.computeIfAbsent(ComputingCache.java:69)
at io.quarkus.arc.impl.AbstractSharedContext.get(AbstractSharedContext.java:26)
at javax.persistence.EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.get(EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.zig:222)
at javax.persistence.EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.get(EntityManager_e1903961aa3b05f292293ca76e991dd812f3e90e_Synthetic_Bean.zig:238)
at nl.bro.gm.gmw.dispatch.resources.ApplicationContext_Bean.create(ApplicationContext_Bean.zig:131)
... 59 more
I'm new to Quarkus, so I'm not sure how to handle this issue or even whether I'm making the correct assumptions. I can imagine that Quarkus wants to give me a fresh EntityManager for each request (which I understand), but that poses a problem for my application scoped beans.
What am I doing wrong here?
So, the full answer is that the EntityManager is created at the runtime init phase whereas the ValidatorFactory (and the ConstraintValidators) are created at static init time.
The Quarkus bootstrap goes static init -> runtime init.
So in your case, you can't access a @Singleton bean which uses the EntityManager during static init, as it's not yet available.
Making your bean @ApplicationScoped will create a proxy and avoid this chicken-and-egg problem.
You will have only one BeanUsingEntityManager for your whole application.
The EntityManager is a bit different because we wrap it and you will get one new EntityManager/Session per transaction, which is what is expected as EntityManagers/Sessions are not thread safe.
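For illustration, a minimal sketch of how the producer from the question could look once the produced bean is @ApplicationScoped; the BeanUsingEntityManager constructor used here is an assumption, not code from the question:
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@ApplicationScoped
public class ApplicationContext {

    @PersistenceContext(unitName = "A")
    EntityManager em;

    // Producing the bean as @ApplicationScoped gives clients a client proxy,
    // so the EntityManager is only resolved at runtime (after RUNTIME_INIT)
    // instead of during the static-init validator bootstrap.
    @Produces
    @ApplicationScoped
    public BeanUsingEntityManager createBeanUsingEntityManager() {
        return new BeanUsingEntityManager(em); // assumed constructor
    }
}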
Related
Java configuration allows us to manage bean creation within a configuration file. Classes annotated with @Component or @Service and picked up by component scanning do the same. However, I'm concerned about using these two mechanisms at the same time.
Should Java configuration and annotated component scans be avoided in the same project? I ask because the result is unclear in the following scenario:
@Configuration
public class MyConfig {

    @Bean
    public Foo foo() {
        return new Foo(500);
    }
}
...
@Component
public class Foo {

    private int value;

    public Foo() {
    }

    public Foo(int value) {
        this.value = value;
    }
}
...
public class Consumer {

    @Autowired
    Foo foo;
    ...
}
So, in the above situation, will the Consumer get a Foo instance with a 500 value or 0 value? I've tested locally and it appears that the Java configured Foo (with value 500) is created consistently. However, I'm concerned that my testing isn't thorough enough to be conclusive.
What is the real answer? Using both Java config and component scanning on @Component beans of the same type seems like a bad thing.
I think your concern arises mostly from the following use case:
You have a custom spring-starter-library that has its own @Configuration classes and @Bean definitions. If that library also contains @Component/@Service classes, you will need to explicitly @ComponentScan those packages from your service, since the default @ComponentScan (see @SpringBootApplication) scans from the main class down through all sub-packages of your app, but not the packages inside the external library. For that reason, the external library should only contain @Bean definitions, and you inject those external configurations via an @EnableSomething annotation on your app's main class (using @Import(YourConfigurationAnnotatedClass.class)), or via spring.factories in case you always need the external configuration to be applied.
Of course, you CAN have @Components in such a library, but the explicit @ComponentScan this requires may lead to unintended behaviour in some cases, so I would recommend avoiding that.
So, to answer your question: you can use both approaches to define beans as long as they are inside your app, but bean definitions outside your app (e.g. in a library) should be declared explicitly with @Bean inside a @Configuration class.
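A hedged sketch of that setup; LibraryConfig, EnableLibraryFoo, and MyApplication are illustrative names, not from the question:
// Inside the external library: explicit bean definitions only.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LibraryConfig {

    @Bean
    public Foo foo() {
        return new Foo(500); // declared explicitly, no component scanning needed
    }
}

// Also in the library: an @EnableSomething-style annotation that imports the configuration.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.springframework.context.annotation.Import;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
@Import(LibraryConfig.class)
public @interface EnableLibraryFoo {
}

// In the consuming application's main class:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
@EnableLibraryFoo // pulls the library's @Bean definitions in explicitly
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}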
It is perfectly valid to have Java configuration and annotated component scans in the same project because they serve different purposes.
@Component (@Service, @Repository, etc.) is used to auto-detect and auto-configure beans.
The @Bean annotation is used to explicitly declare a single bean, instead of letting Spring do it automatically.
You can do the following with @Bean, but this is not possible with @Component:
@Bean
public MyService myService(boolean someCondition) {
    if (someCondition) {
        return new MyServiceImpl1();
    } else {
        return new MyServiceImpl2();
    }
}
I haven't really faced a situation where both Java config and component scanning for a bean of the same type were required.
As per the spring documentation,
To declare a bean, simply annotate a method with the @Bean annotation. When JavaConfig encounters such a method, it will execute that method and register the return value as a bean within a BeanFactory. By default, the bean name will be the same as the method name.
So, as per this, it is returning the correct Foo (with value 500).
In general, there is nothing wrong with component scanning and explicit bean definitions in the same application context. I tend to use component scanning where possible, and create the few beans that need more setup with @Bean methods.
There is no upside to including classes in the component scan when you create beans of their type explicitly. Component scanning can easily be targeted at certain classes and packages. If you design your packages accordingly, you can component-scan only the packages without "special" bean classes (or else use more advanced filters on scanning), as sketched below.
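A hedged sketch of that targeting, assuming illustrative package and class names (not from the question): the explicitly defined type is excluded from scanning so only the @Bean definition exists.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.FilterType;

@Configuration
@ComponentScan(
        basePackages = "com.example.app", // assumed package name
        excludeFilters = @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = Foo.class))
public class AppConfig {

    // Foo is excluded from scanning above and created explicitly here,
    // so there is never a second, component-scanned definition of it.
    @Bean
    public Foo foo() {
        return new Foo(500);
    }
}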
In a quick look I didn't find any clear information about bean definition precedence in such a case. Typically there is a deterministic and fairly stable order in which these are processed, but if it is not documented, it could change in some future Spring version.
Can we inject a bean into a service at runtime? I'm working on a Spring MVC application and have two different beans which provide the same functionality. I need to inject one of them at runtime based on some parameters. How do I do that in Spring?
If you want to switch between beans which are already created, use this approach.
Autowire the ApplicationContext in the class:
@Autowired
ApplicationContext ctx;
Then, in the method, get those beans from the ApplicationContext and switch between them. I would use an interface and have those 2 (or more) classes (which you want to switch at runtime) implement it, so that there is a contract:
BeanInterface beanName;
if (x) {
    beanName = (BeanClass1) ctx.getBean("beanClass1");
} else {
    beanName = (BeanClass2) ctx.getBean("beanClass2");
}
Disclaimer: Did not test this out, you might need some tweaks if this is not working.
If you want even the bean creation to be based on certain runtime parameters, take a look here https://stackoverflow.com/a/34350983/6785908
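Pulling the pieces above together, a hedged sketch of the interface-based switch; the interface, bean names, and doWork() method are illustrative assumptions, not from the question:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

// Common contract for the two switchable implementations.
interface BeanInterface {
    void doWork();
}

@Component("beanClass1")
class BeanClass1 implements BeanInterface {
    public void doWork() { /* variant 1 */ }
}

@Component("beanClass2")
class BeanClass2 implements BeanInterface {
    public void doWork() { /* variant 2 */ }
}

@Service
class SwitchingService {

    @Autowired
    private ApplicationContext ctx;

    public void handle(boolean x) {
        // Choose the implementation at runtime based on the parameter.
        BeanInterface bean = x
                ? ctx.getBean("beanClass1", BeanInterface.class)
                : ctx.getBean("beanClass2", BeanInterface.class);
        bean.doWork();
    }
}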
I have built a web application and scheduled a cron job using cron4j. When the cron fires, the run() method is called, and inside run() all the other bean objects are null, so I am getting a NullPointerException. Below is my sample code.
class Employee {

    @Autowired
    IEmployeeService employeeService;

    public void run() {
        employeeService.getEmployeeDetails();
    }
}
In the above example the employeeService object is null, all other bean objects inside getEmployeeDetails() are null, and getJdbcTemplate() is also null.
How do I initialize Spring bean objects when executing a cron job with cron4j?
You can use a BeanFactoryPostProcessor to influence the order in which beans are created, and create/run your job at the end of bean initialization.
A BeanFactoryPostProcessor operates on the bean configuration metadata: the Spring IoC container lets it read, and possibly modify, that metadata before the container actually instantiates any other bean. You can configure multiple BeanFactoryPostProcessors and control their execution order by setting the 'order' property.
From Spring documentation:
The next extension point that we will look at is the org.springframework.beans.factory.config.BeanFactoryPostProcessor. The semantics of this interface are similar to those of the BeanPostProcessor, with one major difference: BeanFactoryPostProcessor operates on the bean configuration metadata; that is, the Spring IoC container allows a BeanFactoryPostProcessor to read the configuration metadata and potentially change it before the container instantiates any beans other than BeanFactoryPostProcessors.
Please find more information on Spring docs: http://docs.spring.io/spring/docs/4.1.3.BUILD-SNAPSHOT/spring-framework-reference/htmlsingle/#beans-factory-extension-factory-postprocessors
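A hedged sketch of what such a post-processor can look like; the bean name "employeeService" and the eager-initialization tweak are illustrative assumptions, not a drop-in fix for the cron4j problem:
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.core.Ordered;

public class EarlyInitPostProcessor implements BeanFactoryPostProcessor, Ordered {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // Runs before any other bean is instantiated; it may read and
        // modify bean definitions, e.g. force eager initialization here.
        BeanDefinition definition = beanFactory.getBeanDefinition("employeeService");
        definition.setLazyInit(false);
    }

    @Override
    public int getOrder() {
        // The 'order' value controls execution order when several
        // BeanFactoryPostProcessors are configured.
        return Ordered.LOWEST_PRECEDENCE;
    }
}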
The spring framework documentation states:
In the unlikely case that a test may 'dirty' the application context, requiring reloading - for example, by changing a bean definition or the state of an application object - Spring's testing support provides mechanisms to cause the test fixture to reload the configurations and rebuild the application context before executing the next test.
Can someone elaborate on this? I am just not getting it. Examples would be nice.
Each JUnit test method is assumed to be isolated, that is, to have no side effects that could cause another test method to behave differently. Such side effects can occur when a test modifies the state of beans that are managed by Spring.
For example, say you have a bean managed by Spring of class MySpringBean which has a string property with a value of "string". The following test method testBeanString will have a different result depending on whether it is called before or after the method testModify.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/base-context.xml"})
public class SpringTests {

    @Autowired
    private MySpringBean bean;

    @Test
    public void testModify() {
        // dirties the state of a managed bean
        bean.setString("newString");
    }

    @Test
    public void testBeanString() {
        assertEquals("string", bean.getString());
    }
}
Use the @DirtiesContext annotation to indicate that a test method may change the state of Spring-managed beans.
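For example, a sketch reusing the test class from above (imports assumed):
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"/base-context.xml"})
public class SpringTests {

    @Autowired
    private MySpringBean bean;

    @Test
    @DirtiesContext // the context is rebuilt after this method, so the change cannot leak
    public void testModify() {
        bean.setString("newString");
    }

    @Test
    public void testBeanString() {
        // now passes regardless of the order in which the tests run
        assertEquals("string", bean.getString());
    }
}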
We have a legacy system where something like a Service Locator is used to instantiate and provide all service objects:
class ServiceLocator {

    ServiceA serviceA;
    ServiceB serviceB;

    public ServiceLocator() {
        serviceA = ...;
        serviceB = ...;
    }

    public ServiceA getServiceA() {
        return serviceA;
    }

    public ServiceB getServiceB() {
        return serviceB;
    }
}
(imagine 70 more fields and getters...)
This object is then passed around from class to class to provide access to the service objects.
It is outside the scope of the project to change this design for existing code, but to at least not make things worse, we would like to introduce Spring to progressively instantiate future services with DI similar to Introducing an IoC Container to Legacy Code.
In contrast to the aforementioned situation, we already know how we will access the Spring-created bean objects from our legacy code. Our problem is the objects we plan to create with Spring that need any of the service objects created outside of the Spring context.
We came up with the following solution:
Create a static accessor for the ServiceLocator and set it in the constructor, then load the Spring application context object. In the Spring configuration, create a bean for the ServiceLocator using the static accessor as the factory method, as described in Section 3.3.2.2 of the Spring reference (a sketch of the accessor follows the XML below):
<bean id="serviceLocator"
class="ServiceLocator"
factory-method="getInstance"/>
For each service, create another bean using an "instance factory method" as described in Section 3.3.2.3:
<bean id="serviceA"
factory-bean="serviceLocator"
factory-method="getServiceA"/>
Create other beans referencing these "dummy beans".
I guess this would work, but it creates a lot of seemingly unnecessary pseudo configuration. What I'd rather like is something like this:
"If a bean is referenced and that bean is not explicitly defined, search for a method with the needed signature and name in the ServiceLocator class and use this object."
Is it possible to do so? Are there any entry points into the spring bean instantiation process that I am not aware of and that can be used here? Can I do this by subclassing the spring application context class?
Any help would be greatly appreciated!
You can define a BeanFactoryPostProcessor to populate your application context with beans from ServiceLocator.
In the BeanFactoryPostProcessor, use beanFactory.registerSingleton(...) to add a fully instantiated bean, or ((BeanDefinitionRegistry) beanFactory).registerBeanDefinition(...) to add a definition (note that the bean factory of some application contexts may not implement BeanDefinitionRegistry, though all typical contexts do).
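A hedged sketch of such a post-processor, assuming the static getInstance() accessor described in the question:
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;

public class ServiceLocatorPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        ServiceLocator locator = ServiceLocator.getInstance();
        // Register the locator and its already constructed services as
        // singletons, so Spring-created beans can simply have them injected.
        beanFactory.registerSingleton("serviceLocator", locator);
        beanFactory.registerSingleton("serviceA", locator.getServiceA());
        beanFactory.registerSingleton("serviceB", locator.getServiceB());
    }
}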