Deploying beans in an OSGi plugin

I am currently deploying my custom controls as OSGi plugins and I wanted to do the same thing with my beans. I have tried putting them into the OSGi plugin and it works fine, but the only problem I have is the faces-config.
It seems the file has to be called faces-config in the OSGi plugin for it to work, but that means I can't use beans in the NSF anymore, because the local faces-config then seems to be ignored.
Is there a way to change the name of the faces-config in the OSGi plugin?
Something like FEATURE-faces-config.xml?

In the class in your plugin that extends AbstractXspLibrary, you can override "getFacesConfigFiles", which should return an array of strings representing paths within the plugin to additional files of any name to load as faces-config additions. For example:
@Override
public String[] getFacesConfigFiles() {
    return new String[] {
        "com/example/config/beans.xml"
    };
}
Then you can put the config file in that path within your Java source folder (or another folder that is included in build.properties) and it will be loaded in addition to your app's normal faces-config, beans and all.
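For example, a minimal com/example/config/beans.xml could look like the following (the bean name, class, and scope here are illustrative, not part of the original answer):

<faces-config>
  <managed-bean>
    <managed-bean-name>myBean</managed-bean-name>
    <managed-bean-class>com.example.MyBean</managed-bean-class>
    <managed-bean-scope>application</managed-bean-scope>
  </managed-bean>
</faces-config>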

The NSFs are running as separate, distinct Java applications. The OSGi plugin is running in the OSGi layer, above all those distinct Java applications, as a single code base. Consequently, the faces-config is only at that level.
It's possible to load them dynamically, by using an ImplicitObjectFactory, loaded from an XspContributor. That's what is done in OpenNTF Domino API for e.g. userScope (which is a bean stored in applicationScope of an NSF). See org.openntf.domino.xsp.helpers.OpenntfDominoImplicitObjectFactory, which is referenced in OpenntfDominoXspContributor, loaded via the extension point of type "com.ibm.xsp.library.Contributor".
A few caveats:
You have no control over what happens if you try to register your bean with a name the developer also uses for a different variable in that scope.
Unless you add code to check if the library is enabled, as we do, you'll be adding the bean to every database on the server.
You still need to add the library to the NSF. Unless you also provide a component that those databases will all use, there's no way you can programmatically add it, as far as I know.
It might be easier to skip the bean approach and just add an instance of the Java class in beforePageLoad, page controller class, or however you're managing the backing to the relevant XPage (if viewScope) or application (if sessionScope / applicationScope).
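If you go that route, here is a minimal sketch, assuming a hypothetical bean class com.example.MyBean and a page controller method wired to beforePageLoad; it resolves the scope map through the standard JSF variable resolver and drops an instance in:

import java.util.Map;
import javax.faces.context.FacesContext;

public class PageController {
    @SuppressWarnings("unchecked")
    public void beforePageLoad() {
        FacesContext ctx = FacesContext.getCurrentInstance();
        // "viewScope" is resolved by the XPages runtime; swap for sessionScope/applicationScope as needed
        Map<String, Object> viewScope = (Map<String, Object>) ctx.getApplication()
                .getVariableResolver().resolveVariable(ctx, "viewScope");
        if (!viewScope.containsKey("myBean")) {
            viewScope.put("myBean", new com.example.MyBean());
        }
    }
}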

Related

Read Properties File in Karaf From Another Bundle

In Karaf, is there a way for a bundle to read a properties file from another bundle?
I have bundle1, which contains some classes that bundle2 uses (bundle1 exports the package containing those classes in its maven pom via maven-bundle-plugin and bundle2 imports it). But bundle2 also needs to use a properties file from bundle1. Is there a way that in addition to classes, bundle2 can access a file from bundle1?
From what I've read, one option is to deploy the properties to the Karaf etc folder via the features file, and then they can be accessed from bundle2 via Blueprint. I would like to avoid that if possible, as bundle1 is currently not deployed as a feature. So I'm hoping for an alternative approach.
The nicest way is to wrap the access through a class of bundle1. Assume bundle1 contains a class named MyClass. Inside this class you can do this.getClass().getResourceAsStream(path). The path is relative to the package of the class.
So a method of this class could return an InputStream for the properties file or allow access to the actual properties.
In fact you can also access the properties file from bundle2. Simply use MyClass.class.getResourceAsStream(path) from a class in bundle2. This works because each class is by default loaded by the classloader of the bundle it resides in.
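A minimal sketch of that wrapper, assuming bundle1 ships a config.properties next to MyClass (all names here are illustrative):

package com.example.bundle1;

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class MyClass {
    // Loads config.properties via bundle1's own classloader, so it also works when called from bundle2
    public static Properties loadConfig() throws IOException {
        Properties props = new Properties();
        try (InputStream in = MyClass.class.getResourceAsStream("config.properties")) {
            props.load(in);
        }
        return props;
    }
}

From bundle2 (which imports the com.example.bundle1 package) you can then simply call MyClass.loadConfig() to obtain the properties.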

Look up a dynamic property at run-time in Spring from PropertySourcesPlaceholderConfigurer?

Not sure of the best approach to this. We've created a jar that could be used by different projects. The other projects relying on this jar need to provide certain properties defined in one of their spring properties files. (Our jar shouldn't care what they name those property files.)
Using @Value("${some.prop}") works great for most properties; however, we now have the requirement that the name of the property to look up is dynamic. For example:
int val = getSomeVal();
String propNeeded = foo.getProperty("foo."+val+".dynamic.prop");
Not sure what "foo" should be to get access to the property. I looked into injecting Environment, however from all my googling it looks like that will not load from an XML property-placeholder definition (even if defined as a bean def for PropertySourcesPlaceholderConfigurer). You seem to have to use @PropertySource, yet my main config is an XML file, so I'm not sure how to get Environment to work. (I can't really go old school and look up the property file as a classpath Resource either, since I'm not aware of the name of the file the users defined.)
I don't mind making this particular Service class ApplicationContextAware, but if I did that, how could I get access to the underlying PropertySourcesPlaceholderConfigurer, which I would seem to need in order to look up a property dynamically?
The other option is that I force users of the jar to declare a bean by a name that I can look up
<util:properties id="appProps" location="classpath:application.properties" />
And I then inject appProps as Properties and look the value up from there. I don't like this approach though, since it forces the users of the library to name a file by a common id. I would think the best solution is to just get a handle in some way to the underlying PropertySourcesPlaceholderConfigurer in my service class... I'm just not sure how to do it.
Why doesn't Spring simply allow a PropertySource to be defined somehow via your XML config? Then I could just inject Environment.
Thanks for any suggestions on how to accomplish this.
You could have a ReloadableResourceBundleMessageSource declared to read from the same source as the PropertySourcesPlaceholderConfigurer. This way you could just @Autowire a MessageSource (or make your bean implement MessageSourceAware) and use that to retrieve your properties.
The main reason for using ReloadableResourceBundleMessageSource is to retrieve I18N messages, so this would be kind of hacky...
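Hacky or not, here is a minimal sketch of that setup in XML, assuming the placeholder configurer reads classpath:application.properties (the bean id is illustrative):

<bean id="propertyMessageSource"
      class="org.springframework.context.support.ReloadableResourceBundleMessageSource">
    <property name="basename" value="classpath:application" />
</bean>

Your service can then have the MessageSource injected and call getMessage("foo." + val + ".dynamic.prop", null, null, Locale.getDefault()) to look the property up at run-time.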

Spring environment validation

We're building a Spring-based application which will be delivered to end users as a distribution package. Users are responsible for properly configuring whatever needs to be configured (it's mostly about various filesystem locations, folder access permissions, etc.). It would be a good idea to make the app help users understand what is not configured or which parts of the configuration are invalid.
Our current approach is a custom ApplicationContextInitializer which does all the environment validation "manually" and then explicitly registers a few "low level" beans in the application context. If something is wrong, the initializer throws an exception, which is caught somewhere in main(), interpreted (converted into plain English) and then displayed.
While this approach works fine, I'm wondering if there are any best practices to minimize hand-written code and use Spring whenever possible.
Here's an illustrative example. The application requires a folder for file uploads. This means:
There should be a configuration file
This file should be accessible by the app
This file should have no syntax errors
This file should explicitly define some specific property (let it be app.uploads.folder)
This property should refer to an existing filesystem entity
This entity should be a folder
The app should have read/write access to this folder
Does Spring provide any tools to implement this sort of validation easily?
Spring Boot has a nice feature for context and external configuration validation. If you define a POJO class and declare it as @ConfigurationProperties, then Spring will bind the Environment (typically external properties and System/OS properties) to its properties using a DataBinder. E.g.
@ConfigurationProperties(name = "app.uploads")
public class FileUploadProperties {
    private File folder;
    // getters and setters omitted
}
will bind to app.uploads.folder and ensure that it is a File. For extra validation you can do it manually in the setter, or you can implement Validator in your FileUploadProperties or you can use JSR-303 annotations on the fields. By default an external property in app.uploads.* that doesn't bind will throw an exception (e.g. a mis-spelled property name, or a conversion/format error).
If you use Spring Boot autoconfiguration (@EnableAutoConfiguration) you don't have to do anything else, but if it's just vanilla Spring (Boot) you need to add @EnableConfigurationProperties to a @Configuration class somewhere as well.
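For the vanilla case, a minimal sketch (the configuration class name is illustrative):

@Configuration
@EnableConfigurationProperties(FileUploadProperties.class)
public class UploadConfiguration {
    // FileUploadProperties can now be injected wherever the upload folder is needed
}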
A bonus feature: if you also use the Spring Boot Actuator, you get JMX and HTTP support (in a webapp) for inspecting the bindable and bound properties of @ConfigurationProperties beans. The HTTP endpoint is "/configprops".

Spring - usage of alias vs names

I am confused about the usage of alias. I understand what an alias is and how it is used, but I don't see how it is any different from using names on a bean definition.
<bean id="xyx" name="abc,def" .. />
<alias name="xyx" alias="pqr"/>
Why use the alias when I can use abc or def?
In my mind bean aliasing can be helpful in a large system where you cannot manipulate bean names. You have the option to create your own name (alias) specific to your part of the system...
from Spring documentation (3.0.x)
http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/htmlsingle/
...it is sometimes desirable to give a single bean multiple names,
otherwise known as bean aliasing...
therefore creating multiple names and/or aliasing are the same thing.
A use case may be when you want to customize a bean that is already defined somewhere in a modular application (where each module is a Spring project, for example); the bean may be defined by a third-party framework/API or even by your team. In that case you want only your Spring project to use the customized version, without altering the other modules (projects). To do that, just add the alias in your Spring configuration, which is indeed a powerful feature:
<alias alias="globalBeanService" name="customizedBeanService" />
Hence, whenever Spring finds a reference to globalBeanService, it will inject customizedBeanService for you inside your specific module.
Without this feature, you would have to go through all the classes and modify the bean reference manually!
An aliased bean always takes priority over a non-aliased one, and if several beans share the same alias, the last one declared takes priority. In other words, the aliased bean overrides the non-aliased beans.
This can be particularly useful when creating big projects or when you are building extensions to your project and don't want to touch the original bean definition.
Alias has a specific usage scenario which multiple names don't have:
Imagine multiple config XML files in your project, most of which are authored by your colleagues, and you need to add your own config.xml file. Using <alias> you'll be able to refer to a bean defined in another config file by a different name that's maybe more meaningful to your config, without having to touch your colleagues' config files.
I recently found another use case where alias easily solved a problem.
When auto configuration is active, Spring Boot provides the bean serverProperties which can be used to access information about the server currently running the web app.
In integration tests (i.e. when #SpringBootTest annotation is present) the same bean is available under the name org.springframework.boot.autoconfigure.web.ServerProperties.
Of course it is possible to use a different profile for integration testing, but that would require manual configuration changes in multiple places. However, simply by adding
<alias name="serverProperties" alias="org.springframework.boot.autoconfigure.web.ServerProperties"/>
the same configuration files can be used for integration tests and in production.
This might be a bug in Spring Boot; however, the alias easily solves the problem without waiting for a new release. And in any case I have no way to alter the Boot configuration myself.

Spring namespaces and placeholder in spring.schemas

Spring namespaces allow you to define your own structure for how Spring beans can be configured. Very cool.
I have to use third-party software (Assentis Docbase) which defines the following in its spring.schemas (example below simplified):
http\://com.apress.prospring2/ch07/custom.xsd=custDir:/custom.xsd
Meaning: if the user's Spring XML declares the schema location "http://com.apress.prospring2/ch07/custom.xsd", Spring will validate the file against custom.xsd.
custDir is a directory OUTSIDE the provided jar. Does anyone have an idea how I can make custDir point to a valid path during a JUnit test? I already tried -DcustDir=/pathToXsd/ but it did not work.
If I remove custDir then everything works as expected, but I cannot remove it from the provided spring.schemas since it is third-party software.
Maybe this is an issue with how property files are handled in Java, but I have no idea.
You may be able to "override" this entry by providing your own custom spring.schemas file with the same entry but pointing to the location of your custom XSD file. The catch is that this is highly dependent on the order in which the spring.schemas files are loaded, but it could be worth a try.
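For example, a spring.schemas in your own test resources' META-INF directory could re-map the same schema location to an XSD shipped on your classpath (the target path here is an assumption, not taken from the third-party jar):

http\://com.apress.prospring2/ch07/custom.xsd=custom.xsd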
Since custDir is not a placeholder, you cannot replace it the way you are doing. I am surprised that the third-party schema location is outside of the classpath.
The spring.schemas syntax I showed in my question is a proprietary definition of the third-party software. They implemented their own EntityResolver which manually reacts to "custDir:" and starts some magic algorithm. So I came up with the following workaround.
You have to create your own my_spring.schemas which must live in META-INF/. Then you have to make sure that Spring loads my_spring.schemas and NOT spring.schemas.
I achieved this by implementing my own TestingContext, a subclass of ClassPathXmlApplicationContext. In TestingContext I overrode the method protected void loadBeanDefinitions(DefaultListableBeanFactory beanFactory) throws IOException and filled it with the implementation from org.springframework.context.support.AbstractXmlApplicationContext. The only change I made was to replace the line beanDefinitionReader.setEntityResolver(new ResourceEntityResolver(this)) with beanDefinitionReader.setEntityResolver(new PluggableSchemaResolver(getClassLoader(), "META-INF/my_spring.schemas")). And voilà, if I use TestingContext my own my_spring.schemas is loaded.
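A minimal sketch of such a TestingContext, adapted from AbstractXmlApplicationContext.loadBeanDefinitions as described above (details may vary between Spring versions):

import java.io.IOException;

import org.springframework.beans.factory.support.DefaultListableBeanFactory;
import org.springframework.beans.factory.xml.PluggableSchemaResolver;
import org.springframework.beans.factory.xml.XmlBeanDefinitionReader;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class TestingContext extends ClassPathXmlApplicationContext {

    public TestingContext(String... configLocations) {
        super(configLocations);
    }

    @Override
    protected void loadBeanDefinitions(DefaultListableBeanFactory beanFactory) throws IOException {
        XmlBeanDefinitionReader beanDefinitionReader = new XmlBeanDefinitionReader(beanFactory);
        beanDefinitionReader.setResourceLoader(this);
        // Point the reader at our own schema mapping file instead of META-INF/spring.schemas
        beanDefinitionReader.setEntityResolver(
                new PluggableSchemaResolver(getClassLoader(), "META-INF/my_spring.schemas"));
        initBeanDefinitionReader(beanDefinitionReader);
        loadBeanDefinitions(beanDefinitionReader);
    }
}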
The drawback with this solution is that you have to provide all the XSDs in your jar, because the default location where Spring looks up the schema definitions has been changed.
