@Autowired works but not @Inject - Spring

I have a Resource which injects the following class
@Component
public class CustomDozerBeanMapper implements Mapper {

    private final DozerBeanMapper beanMapper;

    public CustomDozerBeanMapper() {
        this.beanMapper = new DozerBeanMapper();
        BeanMappingBuilder builder = new BeanMappingBuilder() {
            protected void configure() {
                //some mapping stuff
            }
        };
        beanMapper.addMapping(builder);
    }

    @Override
    public <T> T map(Object o, Class<T> aClass) throws MappingException {
        return beanMapper.map(o, aClass);
    }

    @Override
    public void map(Object o, Object o1) throws MappingException {
        beanMapper.map(o, o1);
    }

    @Override
    public <T> T map(Object o, Class<T> aClass, String s) throws MappingException {
        return beanMapper.map(o, aClass, s);
    }

    @Override
    public void map(Object o, Object o1, String s) throws MappingException {
        beanMapper.map(o, o1, s);
    }
}
In my applicationContext.xml I have declared
<context:annotation-config/>
<context:component-scan base-package="foo.bar"/>
<bean id="customDozerMapper" class="foo.bar.CustomDozerBeanMapper" />
Then in our resource I inject it
class SomeResource {

    @Inject CustomDozerMapper customDozerMapper;

    //We have loads of other Injects which work just fine, only this class has problems
}
Caused by: A MultiException has 1 exceptions. They are:
1. org.glassfish.hk2.api.UnsatisfiedDependencyException: There was no object available for injection at SystemInjecteeImpl(requiredType=CustomDozerBeanMapper,parent=SomeResource,qualifiers={},position=-1,optional=false,self=false,unqualified=null,1098507248)
at org.jvnet.hk2.internal.ThreeThirtyResolver.resolve(ThreeThirtyResolver.java:75)
at org.jvnet.hk2.internal.Utilities.justInject(Utilities.java:947)
at org.jvnet.hk2.internal.ServiceLocatorImpl.inject(ServiceLocatorImpl.java:975)
at org.jvnet.hk2.internal.ServiceLocatorImpl.inject(ServiceLocatorImpl.java:965)
at org.glassfish.jersey.server.spring.SpringComponentProvider$SpringManagedBeanFactory.provide(SpringComponentProvider.java:191)
at org.jvnet.hk2.internal.FactoryCreator.create(FactoryCreator.java:153)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:471)
at org.jvnet.hk2.internal.PerLookupContext.findOrCreate(PerLookupContext.java:70)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2072)
at org.jvnet.hk2.internal.ServiceLocatorImpl.internalGetService(ServiceLocatorImpl.java:761)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getService(ServiceLocatorImpl.java:700)
at org.glassfish.jersey.internal.inject.Injections.getOrCreate(Injections.java:172)
at org.glassfish.jersey.server.model.MethodHandler$ClassBasedMethodHandler.getInstance(MethodHandler.java:284)
at org.glassfish.jersey.server.internal.routing.PushMethodHandlerRouter.apply(PushMethodHandlerRouter.java:74)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:109)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:112)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:112)
Now if I change it to use @Autowired, it works fine.
We are using Spring for dependency management, but for some reason HK2 is being used, and I get the exception shown above.
Can anyone please explain what the problem might be?
Why does it work with @Autowired and not @Inject?
Why is HK2 being used, and not Spring?

The problem is most likely caused by the two bean declarations (one in the XML configuration and another via @Component), so the DI container cannot decide which one to pick up.
Possible solutions:
removing one of the bean definitions (I'd prefer to drop the XML one)
specifying the bean to inject with the @Qualifier or @Named annotation (see the sketch below)
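For the second option, a minimal sketch of what the injection point could look like, assuming you keep the XML definition and its customDozerMapper id (the exact wiring is an assumption, not from the original post):

import javax.inject.Inject;
import javax.inject.Named;

class SomeResource {

    // @Named disambiguates by bean name when two definitions of the same type exist
    @Inject
    @Named("customDozerMapper")
    private CustomDozerBeanMapper customDozerMapper;
}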

The problem might also be that the bean name in the config file (customDozerMapper) and the name at the injection point (customerDozerMapper) do not match. If @Inject does not find a matching bean, it throws an exception. @Autowired, however, has a required attribute that can be set to false, in which case it injects null when no matching bean is found (see the sketch below).
Note: Configuration in the XML config file overrides the annotation configuration.
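To illustrate the difference this answer describes, a sketch (not from the original post):

import org.springframework.beans.factory.annotation.Autowired;

class SomeResource {

    // @Autowired can be declared optional: if no matching bean exists,
    // the field is simply left as null instead of the context failing to start.
    @Autowired(required = false)
    private CustomDozerBeanMapper customDozerMapper;

    // @Inject has no "required" attribute, so a missing matching bean is always an error.
}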

Related

Is it possible to read property file values inside @Repository?

Is it possible to read property file values inside @Repository? Any help is appreciated.
As M.Deinum mentioned in the comment section, @Repository is just a variation of @Component. You can read your property file by injecting Environment, or you can just use @Value.
If your repository is a class, you can use the @Value annotation on any non-final field to have it populated automatically.
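For example, a minimal sketch of the class-based case (the class name and property key are hypothetical):

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Repository;

@Repository
public class MyJdbcRepository {

    // populated from application.properties at startup; "my.property" is just an example key
    @Value("${my.property}")
    private int myProperty;
}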
On the other hand, if you are using an interface repository, you cannot read a property directly, as you have nowhere to inject your components/values.
Anyway, I found a sort of work-around which can be used to achieve a similar result.
You start by creating a class that implements ApplicationContextAware:
@Component
public class StaticPropertiesProvider implements ApplicationContextAware {

    private static ApplicationContext applicationContext;

    @Override
    public void setApplicationContext(@Nonnull final ApplicationContext applicationContext) throws BeansException {
        StaticPropertiesProvider.applicationContext = applicationContext;
    }

    public static <T> T getProperty(final String property, final Class<T> clazz) {
        return applicationContext.getEnvironment().getProperty(property, clazz);
    }
}
When your application starts, the static applicationContext field will be populated with your application context, so that you can use it afterwards from the static getProperty method.
You can then call it from a default method inside your repository, e.g.
public interface MyRepository extends MongoRepository<MyDocument, String> {

    @Query(value = "{'myProperty': ?0}")
    List<MyDocument> findByMyProperty(int myProperty);

    default List<MyDocument> findByMyProperty() {
        final var myPropertyDefault = StaticPropertiesProvider.getProperty("my.property", Integer.class);
        return this.findByMyProperty(myPropertyDefault);
    }
}
Of course, there must be a property named my.property with an int value inside application.properties.
(you can do the same even with JPA repositories)

Spring Data Rest: @Autowire in Custom JsonDeserializer

I am trying to autowire a component into a custom JsonDeserializer but cannot get it right even with the following suggestions I found:
Autowiring in JsonDeserializer: SpringBeanAutowiringSupport vs HandlerInstantiator
Right way to write JSON deserializer in Spring or extend it
How to customise the Jackson JSON mapper implicitly used by Spring Boot?
Spring Boot Autowiring of JsonDeserializer in Integration test
My final goal is to accept URLs to resources in different microservices and store only the ID of the resource locally. But I don't want to just extract the ID from the URL but also verify that the rest of the URL is correct.
I have tried many things and lost track a bit of what I tried but I believe I tried everything mentioned in the links above. I created tons of beans for SpringHandlerInstantiator, Jackson2ObjectMapperBuilder, MappingJackson2HttpMessageConverter, RestTemplate and others and also tried with setting the SpringHandlerInstantiator in RepositoryRestConfigurer#configureJacksonObjectMapper.
I am using Spring Boot 2.1.6.RELEASE which makes me think something might have changed since some of the linked threads are quite old.
Here's my last attempt:
@Configuration
public class JacksonConfig {

    @Bean
    public HandlerInstantiator handlerInstantiator(ApplicationContext applicationContext) {
        return new SpringHandlerInstantiator(applicationContext.getAutowireCapableBeanFactory());
    }
}

@Configuration
public class RestConfiguration implements RepositoryRestConfigurer {

    @Autowired
    private Validator validator;

    @Autowired
    private HandlerInstantiator handlerInstantiator;

    @Override
    public void configureValidatingRepositoryEventListener(ValidatingRepositoryEventListener validatingListener) {
        validatingListener.addValidator("beforeCreate", validator);
        validatingListener.addValidator("beforeSave", validator);
    }

    @Override
    public void configureJacksonObjectMapper(ObjectMapper objectMapper) {
        objectMapper.setHandlerInstantiator(handlerInstantiator);
    }
}

@Component
public class RestResourceURLSerializer extends JsonDeserializer<Long> {

    @Autowired
    private MyConfig config;

    @Override
    public Long deserialize(JsonParser p, DeserializationContext ctxt) throws IOException, JsonProcessingException {
        ServiceConfig serviceConfig = config.getServices().get("identity");
        URI serviceUri = serviceConfig.getExternalUrl();
        String servicePath = serviceUri.getPath();
        URL givenUrl = p.readValueAs(URL.class);
        String givenPath = givenUrl.getPath();
        if (servicePath.equals(givenPath)) {
            return Long.parseLong(givenPath.substring(givenPath.lastIndexOf('/') + 1));
        }
        return null;
    }
}
I keep getting a NullPointerException when POSTing something to the API endpoint that is deserialized with the JsonDeserializer above.

I was able to solve a similar problem by making my deserializer constructor accept a parameter (and therefore removing the empty constructor) and marking the constructor as @Autowired.
public class MyDeserializer extends JsonDeserializer<MyEntity> {

    private final MyBean bean;

    // no default constructor
    @Autowired
    public MyDeserializer(MyBean bean) {
        this.bean = bean;
    }
    ...
}

@JsonDeserialize(using = MyDeserializer.class)
public class MyEntity { ... }
My entity is marked with the @JsonDeserialize annotation, so I don't have to explicitly register the deserializer with the ObjectMapper.
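If you prefer not to rely on @JsonDeserialize, an alternative (a sketch, not part of the original answer) is to declare the deserializer as a Spring bean and register it yourself on the Spring Data REST ObjectMapper, reusing the configureJacksonObjectMapper hook shown in the question; this assumes MyDeserializer is itself annotated with @Component:

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.rest.webmvc.config.RepositoryRestConfigurer;

@Configuration
public class RestJacksonConfiguration implements RepositoryRestConfigurer {

    // assumes MyDeserializer is declared as a Spring bean, so its dependencies are injected
    @Autowired
    private MyDeserializer myDeserializer;

    @Override
    public void configureJacksonObjectMapper(ObjectMapper objectMapper) {
        // register the Spring-managed deserializer explicitly instead of relying on @JsonDeserialize
        SimpleModule module = new SimpleModule();
        module.addDeserializer(MyEntity.class, myDeserializer);
        objectMapper.registerModule(module);
    }
}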

Dynamic context - autowiring

I am dynamically adding contexts to the application context with the following code:
@Component
@Scope("singleton")
public class DynamicContextLoader implements ApplicationContextAware {

    private static ApplicationContext context;

    private Map<String, InterfacePropertyDto> contextMap;

    @Autowired
    IJpaDao jpaDao;

    @PostConstruct
    public void init() {
        contextMap = (Map<String, InterfacePropertyDto>) context.getBean("contextMap");
        contextMap.forEach((contextName, property) -> {
            String p = jpaDao.getProperty(property.getPropertyName(), property.getPropertyType());
            if (p != null) {
                ConfigurableApplicationContext ctx = new ClassPathXmlApplicationContext(
                        new String[]{"/META-INF/spring/integration/" + contextName},
                        false, context);
                ctx.refresh();
            }
        });
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        context = applicationContext;
    }
}
This works well, and all the beans defined in the new context are created. However, @Autowired does not work for any of these new beans.
For example a bean defined in the new context as:
<bean id="outboundContractJdbcFileMapper" class="com.......integration.model.contract.ContractMapper"/>
has the following autowiring:
public class ContractMapper implements RowMapper<ContractFile> {

    @Autowired
    IIntegrationDao integrationDao;

    @Override
    public ContractFile mapRow(ResultSet rs, int rowNum) throws SQLException {
        ......
    }
}
At runtime, the integrationDao property of outboundContractJdbcFileMapper is null.
Is there a way to force the autowiring to occur when the beans are created? I was hoping that ctx.refresh() would do this.
That doesn't work automatically for the ClassPathXmlApplicationContext. You have to add <context:annotation-config/> to that child context as well (a minimal child context file is sketched after the schema documentation below):
<xsd:element name="annotation-config">
<xsd:annotation>
<xsd:documentation><![CDATA[
Activates various annotations to be detected in bean classes: Spring's #Required and
#Autowired, as well as JSR 250's #PostConstruct, #PreDestroy and #Resource (if available),
JAX-WS's #WebServiceRef (if available), EJB 3's #EJB (if available), and JPA's
#PersistenceContext and #PersistenceUnit (if available). Alternatively, you may
choose to activate the individual BeanPostProcessors for those annotations.
Note: This tag does not activate processing of Spring's #Transactional or EJB 3's
#TransactionAttribute annotation. Consider the use of the <tx:annotation-driven>
tag for that purpose.
See javadoc for org.springframework.context.annotation.AnnotationConfigApplicationContext
for information on code-based alternatives to bootstrapping annotation-driven support.
]]></xsd:documentation>
</xsd:annotation>
</xsd:element>
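As a sketch, one of the child context files under /META-INF/spring/integration/ could then look like this (only the two relevant namespaces are shown; the bean definition is copied from the question):

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
           http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/context
           http://www.springframework.org/schema/context/spring-context.xsd">

    <!-- enables @Autowired, @PostConstruct, etc. for beans defined in this child context -->
    <context:annotation-config/>

    <bean id="outboundContractJdbcFileMapper" class="com.......integration.model.contract.ContractMapper"/>

</beans>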

Spring Boot - using Externalized Configuration values in #Configuration classes

I needed to externalize our session storage, so I have used spring-session.
Following their examples at https://github.com/spring-projects/spring-session/blob/master/samples/boot/src/main/java/sample/config/EmbeddedRedisConfiguration.java, I created my EmbeddedRedisConfiguration and everything works as it should.
I decided that I wanted optional support for specifying the Redis executable path, in case a local Redis server already exists, so I added the following key/value to /resources/config/application.properties: redis.embedded.executable.path==/path/to/redis
My immediate thought was then to just use the @Value annotation in my configuration and have access to the value:
static class RedisServerBean implements InitializingBean, DisposableBean, BeanDefinitionRegistryPostProcessor {

    private RedisServer redisServer;

    @Value("${redis.embedded.executable.path}")
    String executablePath;

    public void afterPropertiesSet() throws Exception {
        if (executablePath != null) {
            redisServer = new RedisServer(new File(executablePath), Protocol.DEFAULT_PORT);
        } else {
            redisServer = new RedisServer(Protocol.DEFAULT_PORT);
        }
        redisServer.start();
    }

    public void destroy() throws Exception {
        if (redisServer != null) {
            redisServer.stop();
        }
    }

    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {}

    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {}
}
However, executablePath is always null. As you know, if you use @Value in a @Service class or equivalent, the value is populated.
I assume that this configuration is being invoked before the beans that load the properties, but I also know this should be possible, because e.g. DataSourceAutoConfiguration can use the spring.datasource.* properties.
I am obviously overlooking something simple here. Do I need my own @ConfigurationProperties?
Change your property file to use a single equals sign:
redis.embedded.executable.path=/path/to/redis
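Separately, since the path was meant to be optional, you may also want to give the placeholder a default so it still resolves when the property is absent entirely; a sketch, not part of the original answer:

// resolves to null when redis.embedded.executable.path is not set,
// instead of failing on an unresolvable placeholder
@Value("${redis.embedded.executable.path:#{null}}")
String executablePath;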

Proxy cannot be cast to CLASS

I'm using Spring for wiring dependencies specifically for DAO classes that use Hibernate, but I'm getting an exception that has me puzzled:
$Proxy58 cannot be cast to UserDao
My DAO is configured as follows:
<bean id="userDao" class="com.domain.app.dao.UserDao">
<property name="sessionFactory" ref="sessionFactory" />
</bean>
And I have an interface, abstract base class and a final implementation as follows.
Interface:
public interface Dao {
    public void save(Object object);
    public Object load(long id);
    public void delete(Object object);
    public void setSessionFactory(SessionFactory sessionFactory);
}
Abstract Base Class:
public abstract class BaseDao implements Dao {

    private SessionFactory sessionFactory;

    @Transactional
    @Override
    public void save(Object object) {
        PersistentEntity obj = (PersistentEntity) object;
        currentSession().saveOrUpdate(obj);
    }

    @Transactional
    @Override
    public abstract Object load(long id);

    @Transactional
    @Override
    public void delete(Object object) {
        // TODO: this method!
    }

    public void setSessionFactory(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public Session currentSession() {
        return sessionFactory.getCurrentSession();
    }
}
Implementation:
public class UserDao extends BaseDao implements Dao {

    @Transactional(readOnly = true)
    @Override
    public Object load(long id) {
        Object user = currentSession().get(User.class, id);
        return user;
    }
}
The following throws the exception mentioned above:
UserDao dao = (UserDao) context.getBean("userDao");
This, however, does not throw an exception:
Dao dao = (Dao) context.getBean("userDao");
If anyone can offer any assistance or guidance as to why this exception is happening, I would be very appreciative.
Spring uses JDK dynamic proxies by default ($Proxy58 is one of them), that can only proxy interfaces. This means that the dynamically created type $Proxy58 will implement one or more of the interfaces implemented by the wrapped/target class (UserDao), but it won't be an actual subclass of it. That's basically why you can cast the userDao bean to the Dao interface, but not to the UserDao class.
You can use <tx:annotation-driven proxy-target-class="true"/> to instruct Spring to use CGLIB proxies, which are actual subclasses of the proxied class, but I think it's better practice to program against interfaces. If you need to access some methods of the proxied class that are not declared in one of its interfaces, you should first ask yourself why that is the case.
(Also, in your code above there are no new methods introduced in UserDao, so there is no point in casting the bean to this concrete implementation type anyway.)
See more about different proxying mechanisms in the official Spring reference.
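A quick way to see the difference at runtime (a sketch; the bean name comes from the configuration above):

Object proxy = context.getBean("userDao");
System.out.println(proxy instanceof Dao);     // true  - the JDK proxy implements the interface
System.out.println(proxy instanceof UserDao); // false - it is not a subclass of the target class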
I was writing unit tests and needed to be able to stub out the DAOs for some calls.
Per this guy's post:
http://www.techper.net/2009/06/05/how-to-acess-target-object-behind-a-spring-proxy/
I used the method he provided:
@SuppressWarnings({"unchecked"})
protected <T> T getTargetObject(Object proxy, Class<T> targetClass) throws Exception {
    if (AopUtils.isJdkDynamicProxy(proxy)) {
        return (T) ((Advised) proxy).getTargetSource().getTarget();
    } else {
        return (T) proxy; // expected to be cglib proxy then, which is simply a specialized class
    }
}
Then you can call it with the proxy, get the object behind the proxy, and manipulate it directly as needed.
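For example, in a test you could unwrap and stub the DAO like this (stubSessionFactory is a hypothetical test double; the test method needs to declare throws Exception):

UserDao userDao = getTargetObject(context.getBean("userDao"), UserDao.class);
// userDao now refers to the raw target instance behind the proxy,
// so its collaborators can be swapped directly
userDao.setSessionFactory(stubSessionFactory);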
