Spring - usage of alias vs names

I am confused about the usage of alias. I understand what an alias is and how it is used, but I don't see how it is any different from using names on a bean definition.
<bean id="xyx" name="abc,def" .. />
<alias name="xyx" alias="pqr"/>
Why use the alias when I can use abc or def?

In my mind, bean aliasing can be helpful in a large system where you cannot manipulate the bean names. It gives you the option to create your own name (alias) specific to your part of the system...
From the Spring documentation (3.0.x):
http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/htmlsingle/
...it is sometimes desirable to give a single bean multiple names,
otherwise known as bean aliasing...
Therefore, creating multiple names and/or aliasing are the same thing.
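To see that equivalence, here is a minimal sketch (the file name beans.xml is an assumption) that resolves the bean from the question through its id, one of its names, and the alias:
import java.util.Arrays;

import org.springframework.context.support.ClassPathXmlApplicationContext;

public class AliasVsNamesDemo {

    public static void main(String[] args) {
        // Assumes the <bean id="xyx" name="abc,def"/> and <alias/> definitions live in beans.xml.
        ClassPathXmlApplicationContext ctx = new ClassPathXmlApplicationContext("beans.xml");

        // The id, the extra names and the alias all resolve to the same singleton.
        System.out.println(ctx.getBean("xyx") == ctx.getBean("abc")); // true
        System.out.println(ctx.getBean("xyx") == ctx.getBean("pqr")); // true

        // getAliases() reports the extra names and aliases together, e.g. [abc, def, pqr].
        System.out.println(Arrays.toString(ctx.getAliases("xyx")));

        ctx.close();
    }
}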

A use case may be when you want to customize a bean that is already defined somewhere in a modular application (where each module is a Spring project, for example); the bean may be defined by a third-party framework/API or even by your own team. In that case you want only your Spring project to call the customized version, without altering the other modules (projects). To do that, just add the alias in your Spring configuration, which is indeed a powerful feature:
<alias alias="globalBeanService" name="customizedBeanService" />
Hence, whenever Spring finds a reference to globalBeanService, it will inject customizedBeanService for you inside your specific module.
Without this feature, you would have to go through all the classes and change the bean references manually!
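If you prefer doing this in Java rather than XML, the same alias can also be registered programmatically with a BeanFactoryPostProcessor (a sketch, not something the original answer shows; it assumes the module's context defines customizedBeanService and does not itself define a bean with the id globalBeanService):
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;

public class GlobalServiceAliasRegistrar implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // Equivalent of <alias alias="globalBeanService" name="customizedBeanService"/>:
        // lookups and injections by the name "globalBeanService" should now resolve
        // to the customizedBeanService bean.
        beanFactory.registerAlias("customizedBeanService", "globalBeanService");
    }
}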

An aliased bean will always take priority over a non-aliased one, and if several beans are given the same alias, the one declared last takes priority. In other words, the aliased bean overrides the non-aliased beans.
This can be particularly useful when creating big projects or when you are building extensions to your project and don't want to touch the original bean definition.

Alias covers a specific scenario that multiple names don't:
Imagine multiple config XML files in your project, most of which are authored by your colleagues, and you need to add your own config.xml file. Using <alias>, you'll be able to refer to a bean defined in another config file under a different name that is perhaps more meaningful to your config, without having to touch your colleagues' config files.

I recently found another use case where alias easily solved a problem.
When auto configuration is active, Spring Boot provides the bean serverProperties which can be used to access information about the server currently running the web app.
In integration tests (i.e. when the @SpringBootTest annotation is present) the same bean is available under the name org.springframework.boot.autoconfigure.web.ServerProperties.
Of course it is possible to use a different profile for integration testing, but that would require manually changing the configuration in multiple places. However, simply by adding
<alias name="serverProperties" alias="org.springframework.boot.autoconfigure.web.ServerProperties"/>
the same configuration files can be used for integration tests and in production.
This might be a bug in Spring Boot, but the alias easily solves the problem without waiting for a new release. And I most certainly have no way to alter the Boot configuration myself.

Related

@Value config injection failing

I am trying to run a Spring JUnit test case using
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration({ "classpath:some.xml" })
The XML has the bean defined along with the in-memory DB details:
<bean id="orderService" class="com.example.OrderServiceImpl">
<!-- set properties, etc. -->
</bean>
I am doing @Value injection inside the bean class OrderServiceImpl, but it does not happen while executing the test case, although the same runs fine when I run the application. Can you please help?
You need to add a PropertySourcesPlaceholderConfigurer or PropertyPlaceholderConfigurer to your test context. This SO question may give you a hint: Populating Spring @Value during Unit Test.
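In Java config, a minimal sketch of that could look like this (the file name some.properties is an assumption; note the @Bean method must be static because the configurer is a BeanFactoryPostProcessor):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.core.io.ClassPathResource;

@Configuration
public class TestPropertyConfig {

    // static so the post-processor is created before the rest of this configuration class.
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyPlaceholder() {
        PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
        configurer.setLocation(new ClassPathResource("some.properties"));
        return configurer;
    }
}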
Check to see if a some.xml exists in both the main and test source trees. If it exists in both, the one in the test tree should override the one in the main tree.
Make sure the some.xml you are actually loading has a property-placeholder, such as
<context:property-placeholder location="classpath:some.properties"/>
I realize that there are more modern ways to manage properties, but this is simple and easy to specify for unit tests.
I find that multiple tests become very awkward with config files on the classpath, so I like to take advantage of a feature of @ContextConfiguration that lets me create a dedicated, minimal config for each test. The way it works is that for each test class, by convention, it looks for a config file in the same relative directory path as your test class package, named after your test case. That way you can completely control the config and properties for each test case. You might try it; it can eliminate confusion caused by shared config files.
To do it, remove the value in @ContextConfiguration. Then, say you have a test case com.myCompany.SomeTest located in src/test/java/com/myCompany/. Create a file called SomeTest-context.xml in the directory src/test/resources/com/myCompany and put the minimal config you need for that unit in the file. @ContextConfiguration will, by convention, find the config file of that name in that location and use it for your test.
Although not part of the convention I just described, I put a properties file for each test in the same directory with just the properties I need for that test, named after the test case as well (e.g. SomeTest.properties). In your test-case-specific context, add a property-placeholder line like this to get your test-specific properties:
<context:property-placeholder location="classpath:com/myCompany/SomeTest.properties"/>
At the top of your test case, you would put
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration // no parameters
public class FileEncryptionUtilsTest { ...
If you do that, you'll be able to inject beans and values to your heart's content, without worrying about side-effects of things being added to a shared context or dealing with errors arising from multiple context files with the same name.
The key advantage is that you are testing a unit here, not the integration represented by an application context file. If you share an application context file in your unit tests, you're testing the application context along with your bean, and that's more of an integration test goal, not a unit test need.
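Putting it together, a test following that convention might look like this (a sketch: the property key some.prop and the expected value are invented, and SomeTest-context.xml is assumed to contain the property-placeholder line shown above):
package com.myCompany;

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration // picks up src/test/resources/com/myCompany/SomeTest-context.xml by convention
public class SomeTest {

    // Resolved from SomeTest.properties via the property-placeholder in SomeTest-context.xml.
    @Value("${some.prop}")
    private String someProp;

    @Test
    public void valueIsInjected() {
        assertEquals("expected-value", someProp);
    }
}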

Deploying BEAN in OSGi plugin

I am currently deploying my custom controls as OSGi plugins and I wanted to do the same thing with my beans. I have tried putting them into the OSGi plugin and it works fine but the only problem I have is the faces-config.
It seems it has to be called faces-config in the OSGi plugin to work, but that means I can't use beans in the NSF anymore because it seems to ignore the local faces-config.
Is there a way to change the name of the faces-config in the OSGi plugin?
Something like FEATURE-faces-config.xml?
In the class in your plugin that extends AbstractXspLibrary, you can override "getFacesConfigFiles", which should return an array of strings representing paths within the plugin to additional files of any name to load as faces-config additions. For example:
@Override
public String[] getFacesConfigFiles() {
    return new String[] {
        "com/example/config/beans.xml"
    };
}
Then you can put the config file in that path within your Java source folder (or another folder that is included in build.properties) and it will be loaded in addition to your app's normal faces-config, beans and all.
The NSFs are running as separate, distinct Java applications. The OSGi plugin is running in the OSGi layer, above all those distinct Java applications, as a single code base. Consequently, the faces-config is only at that level.
It's possible to load them dynamically, by using an ImplicitObjectFactory, loaded from an XspContributor. That's what is done in OpenNTF Domino API for e.g. userScope (which is a bean stored in applicationScope of an NSF). See org.openntf.domino.xsp.helpers.OpenntfDominoImplicitObjectFactory, which is referenced in OpenntfDominoXspContributor, loaded via the extension point of type "com.ibm.xsp.library.Contributor".
A few caveats:
You have no control over what happens if you try to register your bean with a name the developer also uses for a different variable in that scope.
Unless you add code to check if the library is enabled, as we do, you'll be adding the bean to every database on the server.
You still need to add the library to the NSF. Unless you also provide a component that those databases will all use, there's no way you can programmatically add it, as far as I know.
It might be easier to skip the bean approach and just add an instance of the Java class in beforePageLoad, page controller class, or however you're managing the backing to the relevant XPage (if viewScope) or application (if sessionScope / applicationScope).

Attribute 'local' is not allowed to appear in element 'ref' in Spring 4.1.5

I am currently going through Mkyong Spring 3.0 tutorial.
He goes on to clarify these 2 basic things.
1. Bean in different XML files
If you are referring to a bean in a different XML file, you can reference it with a 'ref' tag, 'bean' attribute.
2. Bean in same XML file
If you are referring to a bean in the same XML file, you can reference it with a 'ref' tag, 'local' attribute.
The only thing I can make out is that the local attribute is not supported any more in version 4.1.5. For the time being, I have stuck with
<ref bean="thebeaninstance"/>
and it works for both. <ref bean="..."> only requires the bean to be in the same context or in a parent context.
But I am looking for a way to resolve the difference if I somehow encounter the same id in 2 different XML files and want to refer to the one defined in the other file, since preference goes to the one defined in the same XML. Please suggest.
Spring dropped support for the ref element's local attribute in the 4.0 beans schema. This is from the notes:
Removed the ref 'local' attribute in spring-beans-4.0.xsd since 'local' lost its differentiating role to a regular bean ref back in the 3.1 days when we started allowing for the same bean id to reappear in a different beans section of the same configuration file (with a different profile). Issue: SPR-10437
See here: link
Basically, you should use the bean attribute instead.

Look up a dynamic property at run-time in Spring from PropertySourcesPlaceholderConfigurer?

Not sure of the best approach to this. We've created a jar that could be used by different projects. The other projects relying on this jar need to provide certain properties defined in one of their spring properties files. (Our jar shouldn't care what they name those property files.)
Using @Value("${some.prop}") works great for most properties; however, we now have the requirement that the name of the property to look up is dynamic. For example:
int val = getSomeVal();
String propNeeded = foo.getProperty("foo."+val+".dynamic.prop");
Not sure what "foo" should be to get that access. I looked into injecting Environment; however, from all my googling it looks like that will not load properties from an XML property-placeholder definition (even if it is defined as a bean definition for PropertySourcesPlaceholderConfigurer). You seem to have to use @PropertySource, yet my main config is an XML file, so I'm not sure how to get Environment to work. (I can't really go 'old school' and look up the property file as a classpath Resource either, since I don't know the name of the file the users defined.)
I don't mind making this particular service class ApplicationContextAware, but if I did that, how could I get access to the underlying PropertySourcesPlaceholderConfigurer, which I would seem to need in order to read a property dynamically?
The other option is to force users of the jar to declare a bean with a name that I can look up:
<util:properties id="appProps" location="classpath:application.properties" />
And I then inject appProps as Properties and look up values from there. I don't like this approach, though, since it forces the users of the library to name a file by a common id. I would think the best solution is to just get a handle, in some way, to the underlying PropertySourcesPlaceholderConfigurer in my service class... I'm just not sure how to do it.
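For illustration, injecting that shared bean in my service would look something like this (a sketch of the option I just described; the class name is made up and appProps is the agreed-on id):
import java.util.Properties;

import javax.annotation.Resource;

import org.springframework.stereotype.Service;

@Service
public class DynamicPropertyService {

    // Injects the <util:properties id="appProps" .../> bean by its agreed-on id.
    @Resource(name = "appProps")
    private Properties appProps;

    public String lookup(int val) {
        return appProps.getProperty("foo." + val + ".dynamic.prop");
    }
}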
Why doesn't Spring simply allow a PropertySource to be defined somehow via your XML config, so that I could just inject Environment?
Thanks for any suggestions how to accomplish what I want.
You could declare a ReloadableResourceBundleMessageSource that reads from the same source as the PropertySourcesPlaceholderConfigurer. This way you could just autowire a MessageSource (or make your bean implement MessageSourceAware) and use that to retrieve your properties.
The main reason for using ReloadableResourceBundleMessageSource is to retrieve i18n messages, so this is admittedly kind of hacky...
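A sketch of that idea (the basename classpath:application, the bean/class names and the use of @Qualifier are assumptions; the MessageSource reads the same application.properties the placeholder configurer uses):
import java.util.Locale;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.MessageSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.support.ReloadableResourceBundleMessageSource;
import org.springframework.stereotype.Service;

@Configuration
class DynamicPropertyLookupConfig {

    // Reads the same file the placeholder configurer uses; note the basename has no .properties suffix.
    @Bean
    public MessageSource propertyMessageSource() {
        ReloadableResourceBundleMessageSource messageSource = new ReloadableResourceBundleMessageSource();
        messageSource.setBasename("classpath:application");
        return messageSource;
    }
}

@Service
class DynamicPropertyLookupService {

    @Autowired
    @Qualifier("propertyMessageSource")
    private MessageSource messages;

    public String lookup(int val) {
        // The key is built at run time, e.g. "foo.3.dynamic.prop".
        return messages.getMessage("foo." + val + ".dynamic.prop", null, Locale.getDefault());
    }
}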

Spring namespaces and placeholder in springs.schemas

Spring namespaces allow you to define your own structure for how Spring beans can be configured. Very cool.
I have to use third-party software (Assentis Docbase) which defines the following in its spring.schemas (example simplified below):
http\://com.apress.prospring2/ch07/custom.xsd=custDir:/custom.xsd
Meaning: if a user declares the schema location "http://com.apress.prospring2/ch07/custom.xsd" in their Spring XML, Spring will validate the file against custom.xsd.
custDir is a directory OUTSIDE the provided jar. Does anyone have an idea how I can make custDir point to a valid path during a JUnit test? I already tried -DcustDir=/pathToXsd/ but it did not work.
If I remove custDir then everything works as expected, but I cannot remove it from the provided spring.schemas since it is third-party software.
Maybe this is an issue with how property files are handled in Java, but I have no idea.
You may be able to "override" this entry by providing your own spring.schemas file with the same key but pointing to your own copy of the XSD. The catch is that this is highly dependent on the order in which the spring.schemas files are loaded, but it could be worth a try.
Since custDir is not a placeholder, you cannot replace it the way you are trying to. I am surprised that the third-party schema location is outside of the classpath.
The spring.schemas syntax I showed in my question is a proprietary extension of the third-party software. They implemented their own EntityResolver which reacts to the "custDir:" prefix and runs its own resolution logic. So I came up with the following workaround.
You have to create your own my_spring.schemas, which must live in META-INF/. Then you have to make sure that Spring loads my_spring.schemas and NOT spring.schemas.
I achieved this by implementing my own TestingContext, a subclass of ClassPathXmlApplicationContext. In TestingContext I overrode the protected method loadBeanDefinitions(DefaultListableBeanFactory beanFactory) throws IOException and filled it with the implementation from org.springframework.context.support.AbstractXmlApplicationContext. The only change was to replace the line beanDefinitionReader.setEntityResolver(new ResourceEntityResolver(this)) with beanDefinitionReader.setEntityResolver(new PluggableSchemaResolver(getClassLoader(), "META-INF/my_spring.schemas")). And voila, if I use TestingContext, my own my_spring.schemas is loaded.
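For reference, the override described above looks roughly like this (a sketch reconstructed from that description):
import java.io.IOException;

import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.PluggableSchemaResolver;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class TestingContext extends ClassPathXmlApplicationContext {

    public TestingContext(String... configLocations) {
        super(configLocations);
    }

    @Override
    protected void loadBeanDefinitions(DefaultListableBeanFactory beanFactory) throws IOException {
        // Same as AbstractXmlApplicationContext.loadBeanDefinitions, except for the entity
        // resolver, which now reads META-INF/my_spring.schemas instead of META-INF/spring.schemas.
        XmlBeanDefinitionReader beanDefinitionReader = new XmlBeanDefinitionReader(beanFactory);
        beanDefinitionReader.setEnvironment(getEnvironment());
        beanDefinitionReader.setResourceLoader(this);
        beanDefinitionReader.setEntityResolver(
                new PluggableSchemaResolver(getClassLoader(), "META-INF/my_spring.schemas"));
        initBeanDefinitionReader(beanDefinitionReader);
        loadBeanDefinitions(beanDefinitionReader);
    }
}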
The drawback of this solution is that you have to provide all the XSDs in your jar, because the default location where Spring looks up the schema definitions has been changed.
