I am trying to make a class a singleton, and the changes below are what I have done to achieve that.
Beans.xml has this:
<bean id="LdapUti" class="com.amazon.bpmsawsproxy.util.LdapUtil" scope="singleton" />
LdapUtil class:
import java.util.Hashtable;
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.directory.DirContext;
import javax.naming.ldap.InitialLdapContext;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class LdapUtil {
    private static Log logger = LogFactory.getLog(LdapUtil.class);

    public DirContext GetLdapDirContext() throws NamingException {
        Hashtable<String, Object> env = new Hashtable<String, Object>(11);
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "*********");
        env.put(Context.SECURITY_CREDENTIALS, "******");
        env.put(Context.SECURITY_PROTOCOL, "ssl");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        DirContext ctx = new InitialLdapContext(env, null);
        return ctx;
    }
}
This is the unit test case written to test the singleton class:
@Test
public void testSingleton(){
LdapUtil ctx1 = new LdapUtil();
LdapUtil ctx2 = new LdapUtil();
assertEquals(System.identityHashCode(ctx1), System.identityHashCode(ctx2));
}
From the unit test I am getting two different hashcodes, which I believe means more than one instance is being created. Please let me know if I have missed something.
Kindly note that the class is not a singleton; only the scope of the bean is set to singleton in the Spring config XML.
You created new instances yourself, which are not managed by Spring, hence you are getting different hashcodes.
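For reference, here is a minimal sketch of a test that asks the container for the bean instead of calling new (the test class name and the classpath location of Beans.xml are assumptions; the bean id is taken from your config):
import static org.junit.Assert.assertSame;

import org.junit.Test;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

import com.amazon.bpmsawsproxy.util.LdapUtil;

public class LdapUtilSingletonTest {

    @Test
    public void testSingletonScope() {
        // Load the same Beans.xml that declares the singleton-scoped bean (classpath location assumed)
        ApplicationContext context = new ClassPathXmlApplicationContext("Beans.xml");

        // Ask the container twice; for singleton scope both calls return the same instance
        LdapUtil first = context.getBean("LdapUti", LdapUtil.class);
        LdapUtil second = context.getBean("LdapUti", LdapUtil.class);

        assertSame(first, second);
    }
}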
I am using Spring.
I have a configured ObjectMapper for the entire project and I use it to set up a Kafka deserializer.
I need a custom Kafka deserializer to be used in a KafkaListener.
I'm configuring the @KafkaListener via autoconfiguration, not via a @Configuration class.
@Component
@RequiredArgsConstructor
public class CustomMessageDeserializer implements Deserializer<MyMessage> {
private final ObjectMapper objectMapper;
@SneakyThrows
@Override
public MyMessage deserialize(String topic, byte[] data) {
return objectMapper.readValue(data, MyMessage.class);
}
}
If I do it like this:
@KafkaListener(
topics = {"${topics.invite-user-topic}"},
properties = {"value.deserializer=com.service.deserializer.CustomMessageDeserializer"}
)
public void receiveInviteUserMessages(MyMessage myMessage) {}
I receive KafkaException: Could not find a public no-argument constructor.
But with a public no-argument constructor in the CustomMessageDeserializer class I get an NPE, because objectMapper is null: Kafka creates and uses a new instance of the class, not the Spring component.
@KafkaListener supports SpEL expressions, and I think this problem can be solved using SpEL.
Do you have any idea how to inject the Spring bean CustomMessageDeserializer with SpEL?
There is no easy way to do it with SpEL.
Analysis
To get started, see the JavaDoc for @KafkaListener#properties:
/**
*
* SpEL expressions must resolve to a String ...
*/
The value of value.deserializer is used to instantiate the specified deserializer class. Let's follow the call chain:
You specify this value in the @KafkaListener annotation, and you are probably not creating a ConsumerFactory bean yourself, so Spring Boot creates that bean for you - see KafkaAutoConfiguration#kafkaConsumerFactory.
That method returns a new DefaultKafkaConsumerFactory(...) as ConsumerFactory<?,?>, built with the constructor whose default supplier lambdas are keyDeserializer/valueDeserializer = () -> null.
This factory is used to create the Kafka consumer (the entry point is the constructor of KafkaMessageListenerContainer#ListenerConsumer, then KafkaMessageListenerContainer.this.consumerFactory.createConsumer...).
In the KafkaConsumer constructor, the valueDeserializer object is created because it is null (from the default factory in point 2 above):
if (valueDeserializer == null) {
this.valueDeserializer = config.getConfiguredInstance(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, Deserializer.class);
The implementation of config.getConfiguredInstance instantiates your deserializer class via its parameterless constructor, using reflection and the String class name "com.service.deserializer.CustomMessageDeserializer".
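Roughly, that step is equivalent to the following sketch (simplified; this is not the actual Kafka source, and the class name is only illustrative):
import java.util.Map;

import org.apache.kafka.common.serialization.Deserializer;

public class ReflectiveDeserializerSketch {

    // Simplified illustration of why a public no-argument constructor is required:
    // the class name from the config string is instantiated reflectively,
    // completely outside the Spring container, so no dependencies are injected.
    @SuppressWarnings("unchecked")
    static Deserializer<Object> instantiate(String className, Map<String, ?> configs) throws Exception {
        Class<?> klass = Class.forName(className); // e.g. "com.service.deserializer.CustomMessageDeserializer"
        Deserializer<Object> deserializer =
                (Deserializer<Object>) klass.getDeclaredConstructor().newInstance(); // needs a no-arg constructor
        deserializer.configure(configs, false); // configured afterwards, but never wired by Spring
        return deserializer;
    }
}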
Solutions
To use value.deserializer with your customized ObjectMapper, you must create the ConsumerFactory bean yourself and call its setValueDeserializer(...) method. This is also mentioned in the second Important callout of the JSON Mapping Types section of the Spring Kafka documentation.
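A minimal sketch of that option (the config class name is made up; it assumes Spring Boot's KafkaProperties is available and reuses the constructor that Lombok generates for your CustomMessageDeserializer):
@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, MyMessage> consumerFactory(KafkaProperties properties,
                                                              ObjectMapper objectMapper) {
        Map<String, Object> props = properties.buildConsumerProperties();
        DefaultKafkaConsumerFactory<String, MyMessage> factory = new DefaultKafkaConsumerFactory<>(props);
        // Hand Spring the fully constructed deserializer so its ObjectMapper is injected,
        // instead of letting the Kafka client instantiate the class reflectively.
        factory.setValueDeserializer(new CustomMessageDeserializer(objectMapper));
        return factory;
    }
}
With this bean in place, the value.deserializer entry in the @KafkaListener properties attribute is no longer needed.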
If you don't want to create a ConsumerFactory bean, and your deserializer has no complicated logic (you only have return objectMapper.readValue(data, MyMessage.class);), then register a DefaultKafkaConsumerFactoryCustomizer instead:
@Bean
// inject your custom objectMapper
public DefaultKafkaConsumerFactoryCustomizer customizeJsonDeserializer(ObjectMapper objectMapper) {
return consumerFactory ->
consumerFactory.setValueDeserializerSupplier(() ->
new org.springframework.kafka.support.serializer.JsonDeserializer<>(objectMapper));
}
In this case, you don't need to create your own CustomMessageDeserializer class (remove it) and Spring will automatically parse the message into your MyMessage.
The @KafkaListener annotation should then no longer contain the property properties = {"value.deserializer=com.my.kafka_test.component.CustomMessageDeserializer"}. The DefaultKafkaConsumerFactoryCustomizer bean will automatically be used to configure the default ConsumerFactory<?, ?> (see the implementation of the KafkaAutoConfiguration#kafkaConsumerFactory method).
Here is how it works for me:
@KafkaListener(topics = "${solr.kafka.topic}", containerFactory = "batchFactory")
public void listen(List<SolrInputDocument> docs, @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers, Acknowledgment ack) throws IOException {...}
And then I have two beans defined in my configuration:
@Profile("!test")
@Bean
@Autowired
public ConsumerFactory<String, SolrInputDocument> consumerFactory(KafkaProperties properties) {
Map<String, Object> props = properties.buildConsumerProperties();
props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
DefaultKafkaConsumerFactory<String, SolrInputDocument> result = new DefaultKafkaConsumerFactory<>(props);
String validatedKeyDeserializerName = KafkaMessageType.valueOf(keyDeserializerName).toString();
ZiDeserializer<SolrInputDocument> deserializer = ZiDeserializerFactory.getInstance(validatedKeyDeserializerName);
result.setValueDeserializer(deserializer);
return result;
}
@Profile("!test")
@Bean
@Autowired
public ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> batchFactory(ConsumerFactory<String, SolrInputDocument> consumerFactory) {
ConcurrentKafkaListenerContainerFactory<String, SolrInputDocument> factory = new ConcurrentKafkaListenerContainerFactory<>();
factory.setConsumerFactory(consumerFactory);
factory.setBatchListener(true);
factory.setConcurrency(2);
ExponentialBackOffWithMaxRetries backoff = new ExponentialBackOffWithMaxRetries(10);
backoff.setMultiplier(3); // Default is 1.5 but this seems more reasonable
factory.setCommonErrorHandler(new DefaultErrorHandler(null, backoff));
// Needed for manual commits
factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
return factory;
}
Note that ZiDeserializer<SolrInputDocument> deserializer is my own interface, ZiDeserializerFactory.getInstance(validatedKeyDeserializerName) returns my custom implementation of ZiDeserializer, and ZiDeserializer extends org.apache.kafka.common.serialization.Deserializer. This works for me.
I am using Spring AOP to fire metrics in our application. I have created an annotation @CaptureMetrics which has an @Around advice associated with it. The advice is invoked fine from all the methods tagged with @CaptureMetrics, except for the case when the method is invoked on a prototype bean.
The annotation has @Target({ElementType.TYPE, ElementType.METHOD})
Pointcut expression:
@Around(value = "execution(* *.*(..)) && @annotation(captureMetrics)",
argNames = "joinPoint,captureMetrics")
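For reference, the advice is declared in an aspect along these lines (the aspect name and the method body are a simplified sketch, not the actual implementation):
@Aspect
@Component
public class CaptureMetricsAspect {

    @Around(value = "execution(* *.*(..)) && @annotation(captureMetrics)",
            argNames = "joinPoint,captureMetrics")
    public Object captureMetrics(ProceedingJoinPoint joinPoint, CaptureMetrics captureMetrics) throws Throwable {
        long start = System.nanoTime();
        try {
            return joinPoint.proceed();
        } finally {
            long elapsedNanos = System.nanoTime() - start;
            // fire the metric here, using elapsedNanos plus captureMetrics.type() and captureMetrics.name()
        }
    }
}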
Prototype bean creation
@Bean
@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public DummyService getDummyServicePrototypeBean(int a, String b) {
return new DummyService(a, b);
}
DummyService has a method called dummyMethod(String dummyString)
@CaptureMetrics(type = MetricType.SOME_TYPE, name = "XYZ")
public Response dummyMethod(String id) throws Exception {
// Do some work here
}
When dummyService.dummyMethod("123") is invoked from some other service, the @Around advice is not called.
Config class
@Configuration
public class DummyServiceConfig {
@Bean
public DummyServiceRegistry dummyServiceRegistry(
@Value("${timeout}") Integer timeout,
@Value("${dummy.secrets.path}") Resource dummySecretsPath) throws IOException {
ObjectMapper mapper = new ObjectMapper();
Map<String, String> transactionSourceToTokens = mapper.readValue(
dummySecretsPath.getFile(), new TypeReference<Map<String, String>>() {
});
DummyServiceRegistry registry = new DummyServiceRegistry();
transactionSourceToTokens.forEach((transactionSource, token) ->
registry.register(transactionSource,
getDummyServicePrototypeBean(timeout, token)));
return registry;
}
@Bean
@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public DummyService getDummyServicePrototypeBean(int a, String b) {
return new DummyService(a, b);
}
}
Singleton Registry class
public class DummyServiceRegistry {
private final Map<String, DummyService> transactionSourceToService = new HashMap<>();
public void register(String transactionSource, DummyService dummyService) {
this.transactionSourceToService.put(transactionSource, dummyService);
}
public Optional<DummyService> lookup(String transactionSource) {
return Optional.ofNullable(transactionSourceToService.get(transactionSource));
}
}
Any advice on this please?
Note:
The prototype DummyService is used to call a third-party client. It is a prototype bean because it carries state that varies depending on whose behalf it calls the third party.
A singleton registry bean during initialization builds a map of {source_of_request, dummyService_prototype}. To get the dummyService prototype it calls getDummyServicePrototypeBean()
The configuration, registry, and prototype dummy bean were correct.
I was testing the flow using an existing integration test, and there, instead of obtaining the prototype bean from Spring, new objects of DummyService were instantiated with the new keyword, so they were not Spring-managed beans.
Spring AOP works only with Spring managed beans.
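A quick way to see the difference (hypothetical snippet; the constructor arguments are made up and the configuration above is assumed to be loaded into applicationContext):
public void compareManagedAndUnmanaged(ApplicationContext applicationContext) throws Exception {
    // Spring-managed prototype: obtained from the container, so AOP proxying applies.
    DummyService managed = applicationContext.getBean(DummyService.class, 30, "token");
    managed.dummyMethod("123"); // @Around advice fires

    // Plain instantiation bypasses the container: no proxy, no advice.
    DummyService unmanaged = new DummyService(30, "token");
    unmanaged.dummyMethod("123"); // @Around advice does NOT fire
}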
I have a spring web application which registers multiple spring batch jobs for lengthy background processing at launch time. As near as I can tell, the spring-batch contexts are not aware of the singletons declared in the root AnnotationConfigWebApplicationContext bean, so multiple copies of complex-to-initialize beans are being instantiated for every job both at initial application load time and again at execution time.
I have determined that in AbstractBeanFactory.doGetBean() the bean is being correctly identified as a singleton, but for some reason the bean factory for the job is unaware of the bean factory for the parent context.
I have refactored some of the beans as Application scoped, but the application scope bean is (apparently) not legal in the spring batch context.
Either I am grossly misunderstanding something about spring scopes, or there is something out of kilter with my initialization of the spring-batch elements (code below). I am leaning towards both, as the one would lead to the other.
As I understand Spring scopes, I should see something like this, with each child scope being able to see singletons defined in the parent scope:
AnnotationConfigWebApplicationContext (web application context)
|
v
ResourceXmlApplicationContext (1 per registered job)
|
v
ResourceXmlApplicationContext (1 per registered step)
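As a sanity check of that expectation outside of Spring Batch, a child context created with its parent set can resolve the parent's singletons; a bare-bones sketch (the config class and XML file name below are made up):
// Parent context owns the singleton-scoped bean.
AnnotationConfigApplicationContext parent = new AnnotationConfigApplicationContext(RootConfig.class);

// Child context created with the parent; its BeanFactory falls back to the parent
// for any bean definition it does not contain itself.
ClassPathXmlApplicationContext child =
        new ClassPathXmlApplicationContext(new String[] {"job-definition.xml"}, parent);

// Resolves to the parent's single instance instead of creating a second copy.
MySingletonScopedBean shared = child.getBean(MySingletonScopedBean.class);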
Initialization code in question:
@Component("mySingletonScopedBean")
@Scope(value = "singleton", proxyMode = ScopedProxyMode.DEFAULT)
@Order(1)
public class MySingletonScopedBean {
// getters, setters, etcetera
}
// EDIT: Added in response to comment below
@Autowired
public ApplicationContext applicationContext;
@Bean
public ClasspathXmlApplicationContextsFactoryBean classpathXmlApplicationContextsFactoryBean() throws IOException
{
String resourcePath = somePath + "*.xml";
logger.trace("classpathXmlApplicationContextsFactoryBean() :: {} ", resourcePath);
Resource[] resources = applicationContext.getResources(resourcePath);
ClasspathXmlApplicationContextsFactoryBean bean = new ClasspathXmlApplicationContextsFactoryBean();
bean.setApplicationContext(applicationContext);
bean.setResources(resources);
return bean;
}
@Bean
public AutomaticJobRegistrar automaticJobRegistrar() throws IOException, Exception {
ClasspathXmlApplicationContextsFactoryBean c = classpathXmlApplicationContextsFactoryBean();
AutomaticJobRegistrar automaticJobRegistrar = new AutomaticJobRegistrar();
automaticJobRegistrar.setApplicationContext(applicationContext);
automaticJobRegistrar.setApplicationContextFactories(c.getObject());
automaticJobRegistrar.setJobLoader(jobLoader());
return automaticJobRegistrar;
}
@Bean
public JobLoader jobLoader() {
DefaultJobLoader jobLoader = new DefaultJobLoader(jobRegistry(), stepRegistry());
return jobLoader;
}
@Bean
public StepRegistry stepRegistry() {
MapStepRegistry stepRegistry = new MapStepRegistry();
return stepRegistry;
}
@Bean
public JobRegistry jobRegistry() {
JobRegistry jobRegistry = new MapJobRegistry();
return jobRegistry;
}
Edit: Closing
The whole Spring environment initialization was messed up.
I'm saving my rants about the "clean" injection model and what a pain it is to figure things out when they break for a massive blog post after this project.
I created a Spring bean in a configuration class like this:
@Bean
MyClass getMyClass() {
MyClass mc = new MyClass();
return mc;
}
Whenever MyClass is autowired into another class that needs it injected, will it always create a new object by virtue of the new in the bean definition?
Is a bean created this way a real singleton?
Spring guarantees that whatever you do in the method annotated with the @Bean annotation will be done only once; internal Spring factories take care of that.
Of course it depends on the scope, but the default scope is singleton. See the docs:
Scopes
Bean
Is spring default scope singleton or not?
A small example which should help you understand how it works:
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import java.time.LocalDateTime;
import java.util.Random;
@Configuration
public class SpringApp {
public static void main(String[] args) {
ApplicationContext ctx = new AnnotationConfigApplicationContext(SpringApp.class);
System.out.println(ctx.getBean(MyClass.class));
System.out.println(ctx.getBean(MyClass.class));
System.out.println(ctx.getBean(MyClass.class));
System.out.println(ctx.getBean(MyClass.class));
}
@Bean
public MyClass getMyClass() {
System.out.println("Create instance of MyClass at " + LocalDateTime.now());
MyClass myClass = new MyClass();
return myClass;
}
}
class MyClass {
private int value = new Random().nextInt();
@Override
public String toString() {
return super.toString() + " with values = " + value;
}
}
prints:
Create instance of MyClass at 2019-01-09T22:54:37.025
com.celoxity.spring.MyClass@32a068d1 with values = -1518464221
com.celoxity.spring.MyClass@32a068d1 with values = -1518464221
com.celoxity.spring.MyClass@32a068d1 with values = -1518464221
com.celoxity.spring.MyClass@32a068d1 with values = -1518464221
When you define the bean with scope prototype:
@Scope("prototype")
@Bean
public MyClass getMyClass()
App prints:
Create instance of MyClass at 2019-01-09T22:57:12.585
com.celoxity.spring.MyClass@282003e1 with values = -677868705
Create instance of MyClass at 2019-01-09T22:57:12.587
com.celoxity.spring.MyClass@7fad8c79 with values = 18948996
Create instance of MyClass at 2019-01-09T22:57:12.587
com.celoxity.spring.MyClass@71a794e5 with values = 358780038
Create instance of MyClass at 2019-01-09T22:57:12.587
com.celoxity.spring.MyClass@76329302 with values = 868257220
In your example, yes.
What will happen practically is that when Spring starts up, it will call the getMyClass() method which will new up an instance of the object. Spring will then retain that single instance and inject it into all other beans that require an instance of MyClass.
It will work this way since you've not declared a scope on the bean -- the default is singleton as indicated by the other answer.
Spring's concept of a singleton bean is quite different from the
Singleton pattern.
The scope of the Spring singleton is best described as per container
and per bean
Section 4.4.1
This means that if you create a Spring bean, that bean's lifecycle is managed within the Spring IoC container.
If you want a genuinely new instance of a "singleton" bean, you can only get one from a new Spring IoC container.
Let me illustrate this with an example.
public static void main(String[] args) {
ApplicationContext ctx = new AnnotationConfigApplicationContext(SpringApp.class);
ApplicationContext ctxReallyNewOne = new AnnotationConfigApplicationContext(SpringApp.class);
ApplicationContext ctx2ReallySecondOne = new AnnotationConfigApplicationContext(SpringApp.class);
System.out.println(ThreadColors.Red + "Beans created via same Ioc container with same class");
System.out.println(ctx.getBean(MyClass.class));
System.out.println(ctx.getBean(MyClass.class));
System.out.println(ThreadColors.Cyan + "Beans created via different Ioc container with SAME CLASS");
System.out.println(ctxReallyNewOne.getBean(MyClass.class));
System.out.println(ctx2ReallySecondOne.getBean(MyClass.class));
}
After running this code, the console shows that both lookups from the same container return the same instance, while the beans from the two additional containers are different instances.
Also, if you want to get the creation date of a bean, you do not need to use LocalDateTime.now() inside the factory method; it is better practice to hook into the Spring bean lifecycle for this:
The Lifecycle of Spring Beans
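For instance, a minimal sketch of recording the creation time through a lifecycle callback instead of in the @Bean method (names are illustrative; this uses javax.annotation.PostConstruct, which is jakarta.annotation in newer Spring versions):
import java.time.LocalDateTime;

import javax.annotation.PostConstruct;

class MyClass {

    private LocalDateTime createdAt;

    @PostConstruct
    void markCreated() {
        // Invoked once by the container after construction and dependency injection,
        // so the timestamp reflects when the bean actually became ready.
        this.createdAt = LocalDateTime.now();
    }

    public LocalDateTime getCreatedAt() {
        return createdAt;
    }
}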
I am a newbie to Spring and I am trying to modify my app to use the Spring framework.
My requirement is to create a new bean for every new request and then refer to that bean later in the code, in order to set values on it from a singleton bean.
I am trying to declare the bean as prototype and refer to that bean in my singleton bean using a lookup method.
But my problem is that when I try to get the created prototype bean later to set values on it, I see that a new bean is created again when getting it.
@Component
public class PersonTransaction {
@Autowired
PersonContext btContext;
@Autowired
PersonMapper personMapper;
public void setPersonMapper(PersonViewMapper personMapper) {
this.personMapper = personMapper;
}
public PersonBTContext createContext() throws ContextException {
return btContext = getInitializedPersonBTInstance();
}
private PersonBTContext getContext() throws ContextException{
return this.btContext;
}
public void populateView(UserProfileBean userProfile) throws ContextException {
personMapper.populateView(userProfile,getContext());
}
@Lookup(value="personContext")
public PersonBTContext getInitializedPersonBTInstance(){
return null;
}
}
Below is my prototype class:
@Component("personContext")
@Scope(value = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public class PersonContext extends ReporterAdapterContext {
private List<Person> persons = null;
private Person person = null;
private List<String> attributes = null;
private boolean multiplePersons = false;
private boolean attributeSelected = false;
public boolean isMultiple() {
return multiplePersons;
}
public boolean isAttributeSelected() {
return attributeSelected;
}
private void setAttributeSelected(boolean attributeSelected) {
this.attributeSelected = attributeSelected;
}
// remaining getters/setters
}
When I call createContext from the singleton PersonTransaction class, it should create a new prototype bean. How can I then get that same created prototype bean later by calling the getContext() method? (What I am doing with this.btContext seems to return a new bean again, I guess.)
I need help getting the previously created prototype bean later so I can set values on it.
I appreciate your help.
You want to create a request-scoped bean, not a prototype-scoped bean. Take a look at Quick Guide to Spring Bean Scopes, which describes the different bean scopes, including the request scope:
@Bean
@Scope(value = WebApplicationContext.SCOPE_REQUEST, proxyMode = ScopedProxyMode.TARGET_CLASS)
public PersonContext personContext() {
return new PersonContext();
}
This should simplify your logic as long as you can discard the bean after the request is processed.
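With the scoped proxy in place, the singleton can simply inject PersonContext and each HTTP request transparently gets its own instance; a sketch based on the classes above (it assumes PersonMapper can accept a PersonContext):
@Component
public class PersonTransaction {

    // The injected reference is a scoped proxy; every call on it is routed to the
    // PersonContext instance bound to the current HTTP request.
    @Autowired
    private PersonContext personContext;

    @Autowired
    private PersonMapper personMapper;

    public void populateView(UserProfileBean userProfile) throws ContextException {
        // No createContext()/getContext() bookkeeping needed: all calls made while
        // handling the same request see the same PersonContext instance.
        personMapper.populateView(userProfile, personContext);
    }
}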