Is it necessary to add getters and setters for beans defined in an OSGi Blueprint file? - osgi

My project is based on Apache Karaf (OSGi), and dependency injection is done through a Blueprint file. I would like to know whether getters and setters are really required for such beans. I tested it without getter and setter methods and it works, but I'm not sure whether that follows best practice. My motive is simply to reduce the LOC in that file.
<?xml version="1.0" encoding="UTF-8"?>
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"..>
.
.
<bean id="emailServiceImpl" class="com.mycompany.EmailServiceImpl">
<property name="applicationEnvironment" value="$(staging)" />
.
.
<bean id="orderDispatcherImpl" class="com.myCompany.OrderDispatcherImpl"
ext:field-injection="true" init-method="init">
<property name="emailService" ref="emailServiceImpl"/>

The Blueprint specification only defines injection of properties via JavaBeans-style setter methods.
Field injection is an extension specific to the Apache Aries implementation of Blueprint used in Karaf, so it will not work in other Blueprint implementations.
If you want your Blueprint container definition to be portable across implementations, it is better to use JavaBeans-style setter methods. If you don't care about that, you can use field injection and drop the setter methods.
Note, however, that another reason to keep the setter methods is unit testing.
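To make the difference concrete, here is a minimal sketch of the bean class behind the question's orderDispatcherImpl definition; the EmailService interface is an assumption, only the property and init-method names come from the question's XML:
public class OrderDispatcherImpl {

    // Target of <property name="emailService" ref="emailServiceImpl"/>.
    // EmailService is an assumed interface implemented by EmailServiceImpl.
    private EmailService emailService;

    // JavaBeans-style setter: the only property-injection style the Blueprint
    // specification itself defines.
    public void setEmailService(EmailService emailService) {
        this.emailService = emailService;
    }

    // Referenced by init-method="init" in the bean definition.
    public void init() {
        // initialization logic
    }
}
With ext:field-injection="true", Aries writes the private field directly, so the setter could be removed at the cost of portability.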

Related

Spring AOP not working on all annotation methods

I have created a custom annotation in my Spring MVC project.
The annotation is used to drive an AOP aspect:
@Around("execution(@Cached * * (..)) && @annotation(cache)")
Here the annotation I have created is "Cached"; any method carrying the annotation is cached in Couchbase, with the response as the value and the method arguments as the key.
The problem is that the AOP works fine on the controllers. However, from the controllers I call various callable classes and utility methods, and when I add the @Cached annotation to those callable classes or util functions, the AOP doesn't work.
In the XML file, the following is what I have declared.
<aop:aspectj-autoproxy/>
<context:spring-configured/>
<context:component-scan base-package="com.abc.xyz">
<!--<context:include-filter type="annotation" expression="org.aspectj.lang.annotation.Aspect"/>-->
</context:component-scan>
<bean id="universalController" class="com.abc.xyz.misc.UniversalController"/>
<bean class="com.abc.xyz.api.metric.SystemTiming"/>
<bean class="com.abc.xyz.api.annotations.URLCacheImpl"/>
With Spring AOP, the classes that match the pointcut (here, the ones where you have placed your @Cached annotation) must themselves be Spring beans. So my best guess is that your utility classes are most likely not Spring beans, which is why the advice is not being applied to them. You have two options that I can think of:
Make your utility classes Spring beans as well, as in the sketch below
Use full AspectJ support (compile-time or load-time weaving) - that way your utility classes are woven with the advice even though they are not Spring beans.
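A minimal sketch of the first option (the class name and the annotation's package are assumptions; the base package is taken from the question's component-scan element):
package com.abc.xyz.util;

import org.springframework.stereotype.Component;

import com.abc.xyz.api.annotations.Cached; // assumed location of the question's annotation

@Component // picked up by <context:component-scan base-package="com.abc.xyz">
public class ReportUtil {

    // Advised only when invoked through the Spring-managed proxy, i.e. on an
    // instance injected by Spring, not one created with `new`.
    @Cached
    public String buildReport(String key) {
        return "report-for-" + key;
    }
}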

Combine OSGi blueprint and spring configuration

Are there any good/best practices regarding the combination of Spring configuration and OSGi Blueprint (e.g. Gemini Blueprint)? Which XML files do you use? Where do you put them in your OSGi bundles (META-INF/spring, OSGi-INF)? Which of these practices will allow you to reuse your bundles in combination with a non-Gemini-implementation of Blueprint?
Background: We are in the process of switching from Spring/Spring DM to Spring/Blueprint. I am aware that Blueprint defines a <bean> element. However, we occasionally find that the limited bean-definition capabilities of the Blueprint specification do not meet all our needs, so it seems a good choice to use Spring configuration within our bundles and Blueprint for wiring bundles via OSGi services.
Which XML files do you use? Where do you put them in your OSGi bundles (META-INF/spring, OSGi-INF)? Which of these practices will allow you to reuse your bundles in combination with a non-Gemini implementation of Blueprint?
Gemini Blueprint treats both of these directories equally, but OSGI-INF/blueprint/*.xml is the only one specified in the generic OSGi Blueprint specification.
A suggested practice from the Gemini Blueprint documentation is:
[...] A suggested practice is to split the application context configuration into at least two files, named by convention modulename-context.xml and modulename-osgi-context.xml. The modulename-context.xml file contains regular bean definitions independent of any knowledge of OSGi. The modulename-osgi-context.xml file contains the bean definitions for importing and exporting OSGi services. It may (but is not required to) use the Gemini Blueprint OSGi schema as the top-level namespace instead of the Spring 'beans' namespace.
I tried this, and it works great. I use Gemini Blueprint for one of my projects which has the files META-INF/spring/context.xml, which defines my beans and their relationships, and META-INF/spring/osgi-context.xml, which defines which beans to expose as/import from OSGi services and how. context.xml looks like
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.0.xsd">
<bean id="myOrdinarySpringBean" class="com.acme.impl.Foo"/>
</beans>
and is a regular ordinary Spring application context with no Blueprint/OSGi configuration at all. osgi-context.xml looks like
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
<service id="myOsgiService" ref="myOrdinarySpringBean" interface="com.acme.Foo"/>
</blueprint>
You could, of course, use the <beans> namespace and root element here as well, but you would have to declare an xmlns:osgi namespace and prefix the service element, as in <osgi:service .../>, for that to work. In my case I don't need the Gemini-specific Blueprint features, so I'm happy with this generic Blueprint configuration. Likewise, I could use the <blueprint> namespace in context.xml as well, but this particular application is an old one being ported to OSGi, so I prefer to keep that configuration Spring-specific for now.
Another application in turn has its own osgi-context.xml like
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
<reference id="myOrdinarySpringBeanImportedFromOsgi" interface="com.acme.Foo" availability="mandatory"/>
</blueprint>
and at this time doesn't, but could, have its own context.xml like
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.0.xsd">
<bean id="myOrdinaryOtherSpringBean" class="com.acme.impl.Bar">
<property name="foo" ref="myOrdinarySpringBeanImportedFromOsgi"/>
</bean>
</beans>
and couldn't really care less whether myOrdinarySpringBeanImportedFromOsgi is imported from an OSGi service or defined as a regular ordinary Spring bean in the same application context.
These META-INF/spring/osgi-context.xml configurations could trivially be moved to OSGI-INF/blueprint/ if I wanted to decouple myself from the Gemini Blueprint implementation, but for the time being I prefer to keep the two halves in the same place to avoid making a mess of the directory structure.
Blueprint files should go under OSGI-INF/blueprint/ and are named *.xml (typically blueprint.xml). This location is per the OSGi 4.2 Blueprint spec and will work with Aries or Gemini.
Spring-DM files (as you probably know) go under META-INF/spring/ and are also named *.xml (typically beans.xml).
Both files should be able to peacefully co-exist. They'll only work, though, if you have support for each container installed.
Wiring should be done via the OSGi Service Registry.
As for migration, we have stayed on Spring-DM for capabilities that we couldn't do in Blueprint. Everything else has been migrated to Blueprint.

Injecting static references in Struts action class

Our application uses Struts and Spring. The Struts action classes are also configured as Spring beans in applicationContext.xml, and Spring bean references are wired into the action classes using 'property' elements.
For example,
applicationContext.xml
<bean id="sampleAction" class="com.arizona.sample.action.SampleAction">
<property name="sampleManager" ref="sampleManager" />
</bean>
In SampleAction I have to write a static method that uses the 'sampleManager' reference, so I declared 'sampleManager' as a static variable. At runtime I get a NullPointerException where 'sampleManager' is used, so I have concluded that 'sampleManager' never gets initialized.
Can anyone please help me in this regard?
P.S.: I have provided setSampleManager(..) and also tried with @Autowired.
If you find yourself trying to interact with an inherently non-static object (sampleManager) from a static context (a static method), your design has a fundamental flaw. Go back and refactor your solution to employ proper OO design rather than trying to fix it with an ugly hack.
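A rough sketch of that refactoring (the method and the SampleManager API are hypothetical): sampleManager stays an ordinary instance field populated by the existing <property> element, and the method that uses it becomes an instance method instead of a static one.
public class SampleAction {

    private SampleManager sampleManager; // non-static, set by Spring via the setter below

    public void setSampleManager(SampleManager sampleManager) {
        this.sampleManager = sampleManager;
    }

    // Formerly static; as an instance method it can safely use the injected
    // sampleManager.
    public String buildSummary(String orderId) {
        return sampleManager.summarize(orderId); // summarize() is a placeholder method
    }
}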

Using EasyMock 3 IMockBuilder with Spring

I've been looking into using EasyMock 3's IMockBuilder as a means of generating partial mocks (I know partial mocking may suggest a design flaw, but I'm writing tests for old code). Presumably I can use the deprecated static EasyMock.createMock() methods to create beans in my Spring config, like this:
<bean id="myBean" class="org.easymock.EasyMock" factory-method="createMock">
<constructor-arg value="org.mypackage.MyClass" />
</bean>
When creating a partial mock using an IMockBuilder, I need to make several calls to addMockedMethod() in order to define the methods I want mocked. Is there a way I can do this in a Spring XML configuration file? Ideally I'd like all the dependencies of all my beans set by Spring, and don't want to have to override them in my test cases in order to pass in mock objects created in this way.
Thanks
No, XML config isn't capable of that sort of flexibility.
You have two options:
Write an implementation of FactoryBean which creates the mock, configures it (see the sketch below), and returns it to Spring. See Customizing instantiation logic with the FactoryBean Interface.
Use @Configuration-style configuration in Java, instead of XML-style configuration. This is the most flexible approach, and is generally better than XML config. See Java-based container configuration.
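A sketch of the first option, assuming EasyMock 3's createMockBuilder() and the question's org.mypackage.MyClass; the factory class name and the mocked method names are placeholders:
package org.mypackage.test;

import org.easymock.EasyMock;
import org.mypackage.MyClass;
import org.springframework.beans.factory.FactoryBean;

public class MyClassPartialMockFactoryBean implements FactoryBean<MyClass> {

    public MyClass getObject() {
        // Only the listed methods are mocked; every other method keeps its
        // real implementation.
        return EasyMock.createMockBuilder(MyClass.class)
                .addMockedMethod("expensiveRemoteCall")
                .addMockedMethod("sendEmail")
                .createMock();
    }

    public Class<?> getObjectType() {
        return MyClass.class;
    }

    public boolean isSingleton() {
        return true;
    }
}
The test context can then declare it as an ordinary bean, e.g. <bean id="myBean" class="org.mypackage.test.MyClassPartialMockFactoryBean"/>, and Spring will inject the partial mock wherever a MyClass is required.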

Spring custom annotation - how to make it part of a library?

I've created a custom annotation (in Spring 3.0.5) that works great. I'd like to take that code and make it part of a library, packaged in a jar file, so I don't have to include my custom annotation code in each web app I write.
I'm unable to get Spring to act on the annotation, however. My library jar is in my web app's classpath and I tried scanning for it in applicationContext.xml:
<context:component-scan base-package="my.annotation.pkg" />
The field annotated with my custom annotation continues to be null.
Ideally I'd like this to just work with a minimum of fuss and configuration, but so far I haven't had any success.
What part of Spring's wiring am I missing to get my custom annotation recognized when it's part of an external library?
Update
Here is how I "solved" it...just had to read a little more closely. In each context file (i.e. applicationContext.xml, dispatch-servlet.xml) I added the line:
<bean class="my.annotation.CustomInjector" />
...where my CustomInjector implements BeanPostProcessor. I based this on the code at this blog post: Implementing Seam-style @Logger injection with Spring.
The author says I needed to do exactly what I did, so bad on me for not reading thoroughly. Why, though, is adding that bean definition required? Maybe Spring annotations are configured similarly under the hood - I just don't get why having the jar file on the classpath isn't enough.
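For context, a rough sketch of what such a BeanPostProcessor typically looks like (the annotation name and injection logic are hypothetical; the real code is in the linked blog post). The bean definition is required because Spring only runs post-processors that are registered in the container; a class that merely sits on the classpath is never instantiated.
package my.annotation;

import java.lang.reflect.Field;

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.util.ReflectionUtils;

public class CustomInjector implements BeanPostProcessor {

    public Object postProcessBeforeInitialization(final Object bean, String beanName)
            throws BeansException {
        // Look for fields carrying the custom annotation (which must have
        // RUNTIME retention) and fill them in before the bean is used.
        ReflectionUtils.doWithFields(bean.getClass(), new ReflectionUtils.FieldCallback() {
            public void doWith(Field field) throws IllegalAccessException {
                if (field.isAnnotationPresent(MyCustomAnnotation.class)) {
                    ReflectionUtils.makeAccessible(field);
                    field.set(bean, createValueFor(field));
                }
            }
        });
        return bean;
    }

    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }

    private Object createValueFor(Field field) {
        return null; // placeholder for whatever the annotation is meant to inject
    }
}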
Is your custom annotation annotated with the @Component annotation? From the Spring reference manual:
By default, classes annotated with @Component, @Repository, @Service, @Controller, or a custom annotation that itself is annotated with @Component are the only detected candidate components.
Alternatively, you could add a custom include-filter to the component-scan element in your XML configuration.
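As an illustration of the first suggestion, a minimal sketch of a custom class-level annotation meta-annotated with @Component (the annotation name is hypothetical; the package matches the question's component-scan element):
package my.annotation.pkg;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.springframework.stereotype.Component;

@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Component // makes classes carrying @MyService detectable by component scanning
public @interface MyService {
}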
