Using a map of maps as Maven plugin parameters

Is it possible to use a map of maps as a Maven plugin parameter? For example:
@Parameter
private Map<String, Map<String, String>> converters;
and then use it like this:
<converters>
<json>
<indent>true</indent>
<strict>true</strict>
</json>
<yaml>
<stripComments>false</stripComments>
</yaml>
</converters>
If I use it like this, converters only contains the keys json and yaml, with null as values.
I know it is possible to have complex objects as values, but is it also somehow possible to use maps for variable element values like in this example?

This is apparently a limitation of the sisu.plexus project internally used by the Mojo API. If you peek inside the MapConverter source, you'll find out that it first tries to fetch the value of the map by trying to interpret the configuration as a String (invoking fromExpression), and when this fails, looks up the expected type of the value. However this method doesn't check for parameterized types, which is our case here (since the type of the map value is Map<String, String>). I filed the bug 498757 on the Bugzilla of this project to track this.
Using a custom wrapper object
One workaround would be to not use a Map<String, String> as value but use a custom object:
@Parameter
private Map<String, Converter> converters;
with a class Converter, located in the same package as the Mojo, being:
public class Converter {
@Parameter
private Map<String, String> properties;
@Override
public String toString() { return properties.toString(); } // to test
}
You can then configure your Mojo with:
<converters>
<json>
<properties>
<indent>true</indent>
<strict>true</strict>
</properties>
</json>
<yaml>
<properties>
<stripComments>false</stripComments>
</properties>
</yaml>
</converters>
This configuration will correctly inject the values into the inner maps. It also keeps the variable aspect (the element names are still free): the object is only introduced as a wrapper around the inner map. I tested this with a simple test mojo having
public void execute() throws MojoExecutionException, MojoFailureException {
getLog().info(converters.toString());
}
and the output was the expected {json={indent=true, strict=true}, yaml={stripComments=false}}.
Using a custom configurator
I also found a way to keep a Map<String, Map<String, String>> by using a custom ComponentConfigurator.
So we want to fix MapConverter by inheriting from it; the trouble is how to register this new FixedMapConverter. By default, Maven uses a BasicComponentConfigurator to configure the Mojo, and it relies on a DefaultConverterLookup to look up the converter to use for a specific class. In this case, we want to provide a custom converter for Map that returns our fixed version. Therefore, we need to extend the basic configurator and register our new converter.
import org.codehaus.plexus.classworlds.realm.ClassRealm;
import org.codehaus.plexus.component.configurator.BasicComponentConfigurator;
import org.codehaus.plexus.component.configurator.ComponentConfigurationException;
import org.codehaus.plexus.component.configurator.ConfigurationListener;
import org.codehaus.plexus.component.configurator.expression.ExpressionEvaluator;
import org.codehaus.plexus.configuration.PlexusConfiguration;
public class CustomBasicComponentConfigurator extends BasicComponentConfigurator {
@Override
public void configureComponent(final Object component, final PlexusConfiguration configuration,
final ExpressionEvaluator evaluator, final ClassRealm realm, final ConfigurationListener listener)
throws ComponentConfigurationException {
converterLookup.registerConverter(new FixedMapConverter());
super.configureComponent(component, configuration, evaluator, realm, listener);
}
}
Then we need to tell Maven to use this new configurator instead of the basic one. This is a 2-step process:
Inside your Maven plugin, create a file src/main/resources/META-INF/plexus/components.xml registering the new component:
<?xml version="1.0" encoding="UTF-8"?>
<component-set>
<components>
<component>
<role>org.codehaus.plexus.component.configurator.ComponentConfigurator</role>
<role-hint>custom-basic</role-hint>
<implementation>package.to.CustomBasicComponentConfigurator</implementation>
</component>
</components>
</component-set>
Note a few things: we declare a new component with the hint "custom-basic", which serves as an id to refer to it, and the <implementation> element contains the fully qualified class name of our configurator.
Tell our Mojo to use this configurator with the configurator attribute of the @Mojo annotation:
@Mojo(name = "test", configurator = "custom-basic")
The configurator passed here corresponds to the role-hint specified in the components.xml above.
With such a set-up, you can finally declare
@Parameter
private Map<String, Map<String, String>> converters;
and everything will be injected properly: Maven will use our custom configurator, which will register our fixed version of the map converter and correctly convert the inner maps.
Full code of FixedMapConverter (which pretty much copy-pastes MapConverter because we can't override the faulty method):
public class FixedMapConverter extends MapConverter {
public Object fromConfiguration(final ConverterLookup lookup, final PlexusConfiguration configuration,
final Class<?> type, final Type[] typeArguments, final Class<?> enclosingType, final ClassLoader loader,
final ExpressionEvaluator evaluator, final ConfigurationListener listener)
throws ComponentConfigurationException {
final Object value = fromExpression(configuration, evaluator, type);
if (null != value) {
return value;
}
try {
final Map<Object, Object> map = instantiateMap(configuration, type, loader);
final Class<?> elementType = findElementType(typeArguments);
if (Object.class == elementType || String.class == elementType) {
for (int i = 0, size = configuration.getChildCount(); i < size; i++) {
final PlexusConfiguration element = configuration.getChild(i);
map.put(element.getName(), fromExpression(element, evaluator));
}
return map;
}
// handle maps with complex element types...
final ConfigurationConverter converter = lookup.lookupConverterForType(elementType);
for (int i = 0, size = configuration.getChildCount(); i < size; i++) {
Object elementValue;
final PlexusConfiguration element = configuration.getChild(i);
try {
elementValue = converter.fromConfiguration(lookup, element, elementType, enclosingType, //
loader, evaluator, listener);
}
// TEMP: remove when http://jira.codehaus.org/browse/MSHADE-168
// is fixed
catch (final ComponentConfigurationException e) {
elementValue = fromExpression(element, evaluator);
Logs.warn("Map in " + enclosingType + " declares value type as: {} but saw: {} at runtime",
elementType, null != elementValue ? elementValue.getClass() : null);
}
// ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
map.put(element.getName(), elementValue);
}
return map;
} catch (final ComponentConfigurationException e) {
if (null == e.getFailedConfiguration()) {
e.setFailedConfiguration(configuration);
}
throw e;
}
}
@SuppressWarnings("unchecked")
private Map<Object, Object> instantiateMap(final PlexusConfiguration configuration, final Class<?> type,
final ClassLoader loader) throws ComponentConfigurationException {
final Class<?> implType = getClassForImplementationHint(type, configuration, loader);
if (null == implType || Modifier.isAbstract(implType.getModifiers())) {
return new TreeMap<Object, Object>();
}
final Object impl = instantiateObject(implType);
failIfNotTypeCompatible(impl, type, configuration);
return (Map<Object, Object>) impl;
}
private static Class<?> findElementType( final Type[] typeArguments )
{
if ( null != typeArguments && typeArguments.length > 1 )
{
if ( typeArguments[1] instanceof Class<?> )
{
return (Class<?>) typeArguments[1];
}
// begin fix here
if ( typeArguments[1] instanceof ParameterizedType )
{
return (Class<?>) ((ParameterizedType) typeArguments[1]).getRawType();
}
// end fix here
}
return Object.class;
}
}

One solution is quite simple and works for 1-level nesting. A more sophisticated approach can be found in the alternative answer which possibly also allows for deeper nesting of Maps.
Instead of using an interface as the type parameter, simply use a concrete class like TreeMap:
@Parameter
private Map<String, TreeMap> converters;
The reason is this check in MapConverter, which fails for an interface but succeeds for a concrete class:
private static Class<?> findElementType( final Type[] typeArguments )
{
if ( null != typeArguments && typeArguments.length > 1
&& typeArguments[1] instanceof Class<?> )
{
return (Class<?>) typeArguments[1];
}
return Object.class;
}
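For illustration, a minimal sketch of this workaround (reusing the test mojo from the first answer; the configuration from the question stays exactly as it was):
@Parameter
private Map<String, TreeMap> converters;

public void execute() throws MojoExecutionException, MojoFailureException {
    // Expected output: {json={indent=true, strict=true}, yaml={stripComments=false}}
    getLog().info(converters.toString());
}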
As a side note, and as it is also related to this answer: for Maven > 3.3.x it also works to install a custom converter by subclassing BasicComponentConfigurator and using it as a Plexus component. BasicComponentConfigurator has the DefaultConverterLookup as a protected member variable, so it is easily accessible for registering custom converters.
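A hedged sketch of that annotation-based registration (this assumes the plexus-component-annotations module plus the plexus-component-metadata plugin are set up to generate the component descriptor; the hint mirrors the earlier example):
import org.codehaus.plexus.component.annotations.Component;
import org.codehaus.plexus.component.configurator.BasicComponentConfigurator;
import org.codehaus.plexus.component.configurator.ComponentConfigurator;

// Sketch only: registers the subclass as a Plexus component, replacing the components.xml step,
// so it can be referenced from @Mojo(configurator = "custom-basic").
@Component(role = ComponentConfigurator.class, hint = "custom-basic")
public class CustomBasicComponentConfigurator extends BasicComponentConfigurator {
    public CustomBasicComponentConfigurator() {
        // converterLookup is a protected member of BasicComponentConfigurator,
        // so the fixed converter can be registered directly.
        converterLookup.registerConverter(new FixedMapConverter());
    }
}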

Related

JAXBElement: providing codec (/converter?) for class java.lang.Class

I have been evaluating whether to adopt spring-data-mongodb for a project. In summary, my aims are:
Using existing XML schema files to generate Java classes.
This is achieved using JAXB xjc
The root class is TSDProductDataType and is further modeled as below:
The thing to note here is that ExtensionType contains protected List<Object> any; allowing it to store Objects of any class. In my case, it is amongst the classes named TSDModule_Name_HereModuleType and can be browsed here
Use spring-data-mongodb as persistence store
This is achieved using a simple ProductDataRepository
@RepositoryRestResource(collectionResourceRel = "product", path = "product")
public interface ProductDataRepository extends MongoRepository<TSDProductDataType, String> {
TSDProductDataType queryByGtin(@Param("gtin") String gtin);
}
The unmarshalled TSDProductDataType, however, contains JAXBElement, which spring-data-mongodb doesn't seem to handle by itself; it throws a CodecConfigurationException: org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class java.lang.Class.
Here is the faulty statement:
TSDProductDataType tsdProductDataType = jaxbElement.getValue();
repository.save(tsdProductDataType);
I tried playing around with Converters for spring-data-mongodb as explained here, however, it seems I am missing something since the exception is about "Codecs" and not "Converters".
Any help is appreciated.
EDIT:
Adding converters for JAXBElement
Note: Works with version 1.5.6.RELEASE of org.springframework.boot::spring-boot-starter-parent. With version 2.0.0.M3, hell breaks loose
It seems that I missed something while trying to add converter earlier. So, I added it like below for testing:
@Component
@ReadingConverter
public class JAXBElementReadConverter implements Converter<DBObject, JAXBElement> {
//@Autowired
//MongoConverter converter;
@Override
public JAXBElement convert(DBObject dbObject) {
Class declaredType, scope;
QName name = qNameFromString((String)dbObject.get("name"));
Object rawValue = dbObject.get("value");
try {
declaredType = Class.forName((String)dbObject.get("declaredType"));
} catch (ClassNotFoundException e) {
if (rawValue.getClass().isArray()) declaredType = List.class;
else declaredType = LinkedHashMap.class;
}
try {
scope = Class.forName((String) dbObject.get("scope"));
} catch (ClassNotFoundException e) {
scope = JAXBElement.GlobalScope.class;
}
//Object value = rawValue instanceof DBObject ? converter.read(declaredType, (DBObject) rawValue) : rawValue;
Object value = "TODO";
return new JAXBElement(name, declaredType, scope, value);
}
QName qNameFromString(String s) {
String[] parts = s.split("[{}]");
if (parts.length > 2) return new QName(parts[1], parts[2], parts[0]);
if (parts.length == 1) return new QName(parts[0]);
return new QName("undef");
}
}
@Component
@WritingConverter
public class JAXBElementWriteConverter implements Converter<JAXBElement, DBObject> {
//@Autowired
//MongoConverter converter;
@Override
public DBObject convert(JAXBElement jaxbElement) {
DBObject dbObject = new BasicDBObject();
dbObject.put("name", qNameToString(jaxbElement.getName()));
dbObject.put("declaredType", jaxbElement.getDeclaredType().getName());
dbObject.put("scope", jaxbElement.getScope().getCanonicalName());
//dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));
dbObject.put("value", "TODO");
dbObject.put("_class", JAXBElement.class.getName());
return dbObject;
}
public String qNameToString(QName name) {
if (name.getNamespaceURI() == XMLConstants.NULL_NS_URI) return name.getLocalPart();
return name.getPrefix() + '{' + name.getNamespaceURI() + '}' + name.getLocalPart();
}
}
@SpringBootApplication
public class TsdApplication {
public static void main(String[] args) {
SpringApplication.run(TsdApplication.class, args);
}
@Bean
public CustomConversions customConversions() {
return new CustomConversions(Arrays.asList(
new JAXBElementReadConverter(),
new JAXBElementWriteConverter()
));
}
}
So far so good. However, how do I instantiate MongoConverter converter;?
MongoConverter is an interface so I guess I need an instantiable class adhering to this interface. Any suggestions?
I understand the desire for the convenience of being able to just map an existing domain object to the database layer with no boilerplate, but even if you weren't having the JAXB class structure issue, I would still recommend against using it verbatim. Unless this is a simple one-off project, you will almost certainly hit a point where your domain models need to change but your persisted data needs to remain in its existing state. If you are just persisting the data directly, you have no mechanism to convert between a newer domain schema and an older persisted data schema. Versioning of the persisted data schema would be wise too.
The link you posted for writing the custom converters is one way to achieve this and fits in nicely with the Spring ecosystem. That method should also solve the issue you are experiencing (the underlying messy JAXB data structure not converting cleanly).
Are you unable to get that method working? Ensure you are loading the converters into the Spring context with @Component plus auto class scanning, or manually via some Configuration class.
EDIT to address your EDIT:
Add the following to each of your converters:
private final MongoConverter converter;
public JAXBElement____Converter(MongoConverter converter) {
this.converter = converter;
}
Try changing your bean definition to:
@Bean
public CustomConversions customConversions(@Lazy MongoConverter converter) {
return new CustomConversions(Arrays.asList(
new JAXBElementReadConverter(converter),
new JAXBElementWriteConverter(converter)
));
}
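With the converter injected, the TODO placeholders in the converters above can then be filled in along the lines of the commented-out code (a sketch, not verified here):
// In JAXBElementReadConverter.convert: recursively convert nested documents.
Object value = rawValue instanceof DBObject
        ? converter.read(declaredType, (DBObject) rawValue)
        : rawValue;
return new JAXBElement(name, declaredType, scope, value);

// In JAXBElementWriteConverter.convert: let the MongoConverter map the wrapped value.
dbObject.put("value", converter.convertToMongoType(jaxbElement.getValue()));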

Configuring snake_case query parameter using Gson in Spring Boot

I am trying to configure Gson as my JSON mapper to accept "snake_case" query parameters and translate them into standard Java "camelCase" parameters.
First of all, I know I could use the @SerializedName annotation to customise the serialized name of each field, but this will involve some manual work.
After doing some searching, I believe the following approach should work (please correct me if I am wrong).
Use Gson as the default JSON mapper of Spring Boot
spring.http.converters.preferred-json-mapper=gson
Configure Gson before GsonHttpMessageConverter is created, as described here
Customise the Gson naming policy in step 2 according to the GSON Field Naming Policy:
private GsonHttpMessageConverter createGsonHttpMessageConverter() {
Gson gson = new GsonBuilder()
.setFieldNamingPolicy(FieldNamingPolicy.LOWER_CASE_WITH_UNDERSCORES)
.create();
GsonHttpMessageConverter gsonConverter = new GsonHttpMessageConverter();
gsonConverter.setGson(gson);
return gsonConverter;
}
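For reference, a minimal sketch of how such a converter might be exposed as a bean so Spring Boot picks it up (this assumes Boot's auto-configuration backs off when a GsonHttpMessageConverter bean is present; the method name is arbitrary):
@Bean
public GsonHttpMessageConverter gsonHttpMessageConverter() {
    // Exposes the customised converter so it is used instead of the auto-configured one.
    return createGsonHttpMessageConverter();
}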
Then I create a simple controller like this:
@RequestMapping(value = "/example/gson-naming-policy")
public Object testNamingPolicy(ExampleParam data) {
return data.getCamelCase();
}
With the following Param class:
import lombok.Data;
@Data
public class ExampleParam {
private String camelCase;
}
But when I call the controller with the query parameter ?camel_case=hello, data.camelCase is not populated (it's null). When I change the query parameter to ?camelCase=hello it is set, which means my setup is not working as expected.
Any hint would be highly appreciated. Thanks in advance!
It's a nice question. If I understand how Spring MVC works behind the scenes, no HTTP converters are used for @ModelAttribute-driven parameters. This can be verified easily by throwing an exception from your ExampleParam constructor or the ExampleParam.setCamelCase method (de-Lombok first) -- Spring uses its bean utilities, which call the public (!) ExampleParam.setCamelCase to set the DTO value. Further proof is that Gson.fromJson is never invoked, regardless of how your Gson converter is configured. So camelCase works for you simply because Spring's bean property binding expects that name by default, just as a default Gson instance would -- it's just a matter of confusion.
In order to make it work, you have to create a custom Gson-aware HandlerMethodArgumentResolver implementation. Let's assume we support POJO only (not lists, maps or primitives).
@Configuration
@EnableWebMvc
class WebMvcConfiguration
extends WebMvcConfigurerAdapter {
private static final Gson gson = new GsonBuilder()
.setFieldNamingPolicy(LOWER_CASE_WITH_UNDERSCORES)
.create();
@Override
public void addArgumentResolvers(final List<HandlerMethodArgumentResolver> argumentResolvers) {
argumentResolvers.add(new HandlerMethodArgumentResolver() {
@Override
public boolean supportsParameter(final MethodParameter parameter) {
// It must never be a primitive, array, string, boxed number, map or list -- and whatever else you configure ;)
final Class<?> parameterType = parameter.getParameterType();
return !parameterType.isPrimitive()
&& !parameterType.isArray()
&& parameterType != String.class
&& !Number.class.isAssignableFrom(parameterType)
&& !Map.class.isAssignableFrom(parameterType)
&& !List.class.isAssignableFrom(parameterType);
}
@Override
public Object resolveArgument(final MethodParameter parameter, final ModelAndViewContainer mavContainer, final NativeWebRequest webRequest,
final WebDataBinderFactory binderFactory) {
// Now we're deconstructing the request parameters creating a JSON tree, because Gson can convert from JSON trees to POJOs transparently
// Also note parameter.getGenericParameterType() -- it's better than Class<?>, which cannot hold generic type parameterization
return gson.fromJson(
parameterMapToJsonElement(webRequest.getParameterMap()),
parameter.getGenericParameterType()
);
}
});
}
...
private static JsonElement parameterMapToJsonElement(final Map<String, String[]> parameters) {
final JsonObject jsonObject = new JsonObject();
for ( final Entry<String, String[]> e : parameters.entrySet() ) {
final String key = e.getKey();
final String[] value = e.getValue();
final JsonElement jsonValue;
switch ( value.length ) {
case 0:
// As far as I understand, this must never happen, but I'm not sure
jsonValue = JsonNull.INSTANCE;
break;
case 1:
// If there's a single value only, let's convert it to a string literal
// Gson is good at "weak typing": strings can be parsed automatically to numbers and booleans
jsonValue = new JsonPrimitive(value[0]);
break;
default:
// If there are more than 1 element -- make it an array
final JsonArray jsonArray = new JsonArray();
for ( int i = 0; i < value.length; i++ ) {
jsonArray.add(value[i]);
}
jsonValue = jsonArray;
break;
}
jsonObject.add(key, jsonValue);
}
return jsonObject;
}
}
So, here are the results:
http://localhost:8080/?camelCase=hello => (empty)
http://localhost:8080/?camel_case=hello => "hello"

Get a specific service implementation based on a parameter

In my Sling app I have data representing documents, with pages and content nodes. We mostly serve those documents as HTML, but now I would like to have a servlet to serve these documents as PDF and PPT.
Basically, I thought about implementing the factory pattern: in my servlet, depending on the extension of the request (pdf or ppt), I would get the proper DocumentBuilder implementation, either PdfDocumentBuilder or PptDocumentBuilder, from a DocumentBuilderFactory.
So first I had this:
public class PlanExportBuilderFactory {
public PlanExportBuilder getBuilder(String type) {
PlanExportBuilder builder = null;
switch (type) {
case "pdf":
builder = new PdfPlanExportBuilder();
break;
default:
logger.error("Unsupported plan export builder, type: " + type);
}
return builder;
}
}
In the servlet:
@Component(metatype = false)
@Service(Servlet.class)
@Properties({
@Property(name = "sling.servlet.resourceTypes", value = "myApp/document"),
@Property(name = "sling.servlet.extensions", value = { "ppt", "pdf" }),
@Property(name = "sling.servlet.methods", value = "GET")
})
public class PlanExportServlet extends SlingSafeMethodsServlet {
@Reference
PlanExportBuilderFactory builderFactory;
@Override
protected void doGet(SlingHttpServletRequest request, SlingHttpServletResponse response) throws ServletException, IOException {
Resource resource = request.getResource();
PlanExportBuilder builder = builderFactory.getBuilder(request.getRequestPathInfo().getExtension());
}
}
But the problem is that in the builder I would like to reference other services to access Sling resources, and with this solution, they're not bound.
I looked at OSGi Service Factories, but from what I've understood, you use them to configure the same implementation of a service differently.
Then I found that you can get a specific implementation by naming it, or use a property and a filter.
So I've ended up with this:
public class PlanExportBuilderFactory {
@Reference(target = "(builderType=pdf)")
PlanExportBuilder pdfPlanExportBuilder;
public PlanExportBuilder getBuilder(String type) {
PlanExportBuilder builder = null;
switch (type) {
case "pdf":
return pdfPlanExportBuilder;
default:
logger.error("Unsupported plan export builder, type: " + type);
}
return builder;
}
}
The builder defines a "builderType" property:
// AbstractPlanExportBuilder implements PlanExportBuilder interface
@Component
@Service(value=PlanExportBuilder.class)
public class PdfPlanExportBuilder extends AbstractPlanExportBuilder {
@Property(name="builderType", value="pdf")
public PdfPlanExportBuilder() {
planDocument = new PdfPlanDocument();
}
}
I would like to know whether this is a good way to retrieve my PDF builder implementation with regard to OSGi good practices.
EDIT 1
From Peter's answer I've tried to add multiple references but with Felix it doesn't seem to work:
@Reference(name = "planExportBuilder", cardinality = ReferenceCardinality.MANDATORY_MULTIPLE, policy = ReferencePolicy.DYNAMIC)
private Map<String, PlanExportBuilder> builders = new ConcurrentHashMap<String, PlanExportBuilder>();
protected final void bindPlanExportBuilder(PlanExportBuilder b, Map<String, Object> props) {
final String type = PropertiesUtil.toString(props.get("type"), null);
if (type != null) {
this.builders.put((String) props.get("type"), b);
}
}
protected final void unbindPlanExportBuilder(final PlanExportBuilder b, Map<String, Object> props) {
final String type = PropertiesUtil.toString(props.get("type"), null);
if (type != null) {
this.builders.remove(type);
}
}
I get these errors:
@Reference(builders) : Missing method bind for reference planExportBuilder
@Reference(builders) : Something went wrong: false - true - MANDATORY_MULTIPLE
@Reference(builders) : Missing method unbind for reference planExportBuilder
The Felix documentation here http://felix.apache.org/documentation/subprojects/apache-felix-maven-scr-plugin/scr-annotations.html#reference says for the bind method:
The default value is the name created by appending the reference name to the string bind. The method must be declared public or protected and take single argument which is declared with the service interface type
So according to this, I understand it cannot work with Felix, as I'm trying to pass two arguments. However, I found an example here that seems to match what you've suggested but I cannot make it work: https://github.com/Adobe-Consulting-Services/acs-aem-samples/blob/master/bundle/src/main/java/com/adobe/acs/samples/services/impl/SampleMultiReferenceServiceImpl.java
EDIT 2
Just had to move the reference above the class to make it work:
@References({
@Reference(
name = "planExportBuilder",
referenceInterface = PlanExportBuilder.class,
policy = ReferencePolicy.DYNAMIC,
cardinality = ReferenceCardinality.OPTIONAL_MULTIPLE)
})
public class PlanExportServlet extends SlingSafeMethodsServlet {
Factories are evil :-) The main reason is of course the yucky class-loading hacks that are usually used, but also because they tend to have global knowledge. In general, you want to be able to add a bundle with a new DocumentBuilder and then have that type become available.
A more OSGi-oriented solution is therefore to use service properties. This could look like:
@Component( property=HTTP_WHITEBOARD_FILTER_REGEX+"=/as")
public class DocumentServlet {
final Map<String,DocBuilder> builders = new ConcurrentHashMap<>();
public void doGet( HttpServletRequest rq, HttpServletResponse rsp )
throws IOException, ServletException {
InputStream in = getInputStream( rq.getPathInfo() );
if ( in == null )
....
String type = toType( rq.getPathInfo(), rq.getParameter("type") );
DocBuilder docbuilder = builders.get( type );
if ( docbuilder == null)
....
docbuilder.convert( type, in, rsp.getOutputStream() );
}
@Reference( cardinality=MULTIPLE, policy=DYNAMIC )
void addDocBuilder( DocBuilder db, Map<String,Object> props ) {
builders.put( (String) props.get("type"), db );
}
void removeDocBuilder( Map<String,Object> props ) {
builders.remove( props.get("type") );
}
}
A DocBuilder could look like:
@Component( property = "type=ppt-pdf" )
public class PowerPointToPdf implements DocBuilder {
...
}
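For reference, a minimal sketch of the DocBuilder interface these components assume (the signature is inferred from the docbuilder.convert(type, in, rsp.getOutputStream()) call above; it is not part of any standard API):
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical service interface; each implementation registers itself with a "type" property.
public interface DocBuilder {
    void convert(String type, InputStream in, OutputStream out) throws IOException;
}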

@MessageMapping with placeholders

I am working with Spring-websocket and I have the following problem:
I am trying to put a placeholder inside a @MessageMapping annotation in order to get the URL from properties. It works with @RequestMapping but not with @MessageMapping.
If I use this placeholder, the URL is null. Any idea or suggestion?
Example:
@RequestMapping(value= "${myProperty}")
@MessageMapping("${myProperty}")
Rossen Stoyanchev added placeholder support for @MessageMapping and @SubscribeMapping methods.
See Jira issue: https://jira.spring.io/browse/SPR-13271
Spring allows you to use property placeholders in @RequestMapping, but not in @MessageMapping. This is because of the MessageHandler. So we need to override the default MessageHandler to do this.
WebSocketAnnotationMethodMessageHandler does not support placeholders, and you need to add this support yourself.
For simplicity I just created another WebSocketAnnotationMethodMessageHandler class in my project, in the same package as the original, org.springframework.web.socket.messaging, and overrode the getMappingForMethod method from SimpAnnotationMethodMessageHandler with the same content, changing only how SimpMessageMappingInfo is constructed, using these methods (private in WebSocketAnnotationMethodMessageHandler):
private SimpMessageMappingInfo createMessageMappingCondition(final MessageMapping annotation) {
return new SimpMessageMappingInfo(SimpMessageTypeMessageCondition.MESSAGE, new DestinationPatternsMessageCondition(
this.resolveAnnotationValues(annotation.value()), this.getPathMatcher()));
}
private SimpMessageMappingInfo createSubscribeCondition(final SubscribeMapping annotation) {
final SimpMessageTypeMessageCondition messageTypeMessageCondition = SimpMessageTypeMessageCondition.SUBSCRIBE;
return new SimpMessageMappingInfo(messageTypeMessageCondition, new DestinationPatternsMessageCondition(
this.resolveAnnotationValues(annotation.value()), this.getPathMatcher()));
}
These methods will now resolve the values taking properties into account (by calling the resolveAnnotationValues method), so we need to use something like this:
private String[] resolveAnnotationValues(final String[] destinationNames) {
final int length = destinationNames.length;
final String[] result = new String[length];
for (int i = 0; i < length; i++) {
result[i] = this.resolveAnnotationValue(destinationNames[i]);
}
return result;
}
private String resolveAnnotationValue(final String name) {
if (!(this.getApplicationContext() instanceof ConfigurableApplicationContext)) {
return name;
}
final ConfigurableApplicationContext applicationContext = (ConfigurableApplicationContext) this.getApplicationContext();
final ConfigurableBeanFactory configurableBeanFactory = applicationContext.getBeanFactory();
final String placeholdersResolved = configurableBeanFactory.resolveEmbeddedValue(name);
final BeanExpressionResolver exprResolver = configurableBeanFactory.getBeanExpressionResolver();
if (exprResolver == null) {
return name;
}
final Object result = exprResolver.evaluate(placeholdersResolved, new BeanExpressionContext(configurableBeanFactory, null));
return result != null ? result.toString() : name;
}
You still need to define a PropertySourcesPlaceholderConfigurer bean in your configuration.
If you are using XML based configuration, include something like this:
<context:property-placeholder location="classpath:/META-INF/spring/url-mapping-config.properties" />
If you are using Java based configuration, you can try in this way:
@Configuration
@PropertySources(value = @PropertySource("classpath:/META-INF/spring/url-mapping-config.properties"))
public class URLMappingConfig {
@Bean
public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
return new PropertySourcesPlaceholderConfigurer();
}
}
Note: in this case, the url-mapping-config.properties file is in a Gradle/Maven project, in the src\main\resources\META-INF\spring folder, and its content looks like this:
myPropertyWS=urlvaluews
This is my sample controller:
@Controller
public class WebSocketController {
@SendTo("/topic/test")
@MessageMapping("${myPropertyWS}")
public String test() throws Exception {
Thread.sleep(4000); // simulated delay
return "OK";
}
}
With the default MessageHandler, the startup log will print something like this:
INFO: Mapped "{[/${myPropertyWS}],messageType=[MESSAGE]}" onto public java.lang.String com.brunocesar.controller.WebSocketController.test() throws java.lang.Exception
And with our MessageHandler it now prints this:
INFO: Mapped "{[/urlvaluews],messageType=[MESSAGE]}" onto public java.lang.String com.brunocesar.controller.WebSocketController.test() throws java.lang.Exception
See in this gist the full WebSocketAnnotationMethodMessageHandler implementation.
EDIT: this solution resolves the problem for versions before 4.2 GA. For more information, see this jira.
Update:
Now I understand what you mean, but I think that is not possible (yet).
The documentation does not mention anything related to path mapping URIs.
Old answer
Use
#MessageMapping("/handler/{myProperty}")
instead of
#MessageMapping("/handler/${myProperty}")
And use it like this:
#MessageMapping("/myHandler/{username}")
public void handleTextMessage(#DestinationVariable String username,Message message) {
//do something
}
#MessageMapping("/chat/{roomId}")
public Message handleMessages(#DestinationVariable("roomId") String roomId, #Payload Message message, Traveler traveler) throws Exception {
System.out.println("Message received for room: " + roomId);
System.out.println("User: " + traveler.toString());
// store message in database
message.setAuthor(traveler);
message.setChatRoomId(Integer.parseInt(roomId));
int id = MessageRepository.getInstance().save(message);
message.setId(id);
return message;
}

Access all Environment properties as a Map or Properties object

I am using annotations to configure my spring environment like this:
@Configuration
...
#PropertySource("classpath:/config/default.properties")
...
public class GeneralApplicationConfiguration implements WebApplicationInitializer
{
@Autowired
Environment env;
}
This leads to my properties from default.properties being part of the Environment. I want to use the @PropertySource mechanism here, because it already provides the possibility to override properties through several fallback layers and different dynamic locations, based on the environment settings (e.g. the config_dir location). I just stripped the fallback to make the example easier.
However, my problem now is that I want to configure, for example, my datasource properties in default.properties. You can pass the settings to the datasource without knowing in detail what settings the datasource expects, using
Properties p = ...
datasource.setProperties(p);
However, the problem is, the Environment object is neither a Properties object nor a Map nor anything comparable. From my point of view it is simply not possible to access all values of the environment, because there is no keySet or iterator method or anything comparable.
Properties p <=== Environment env?
Am I missing something? Is it possible to access all entries of the Environment object somehow? If yes, I could map the entries to a Map or Properties object, I could even filter or map them by prefix - create subsets as a standard java Map ... This is what I would like to do. Any suggestions?
You need something like this, maybe it can be improved. This is a first attempt:
...
import org.springframework.core.env.PropertySource;
import org.springframework.core.env.AbstractEnvironment;
import org.springframework.core.env.Environment;
import org.springframework.core.env.MapPropertySource;
...
@Configuration
...
@org.springframework.context.annotation.PropertySource("classpath:/config/default.properties")
...
public class GeneralApplicationConfiguration implements WebApplicationInitializer
{
@Autowired
Environment env;
public void someMethod() {
...
Map<String, Object> map = new HashMap();
for(Iterator it = ((AbstractEnvironment) env).getPropertySources().iterator(); it.hasNext(); ) {
PropertySource propertySource = (PropertySource) it.next();
if (propertySource instanceof MapPropertySource) {
map.putAll(((MapPropertySource) propertySource).getSource());
}
}
...
}
...
Basically, everything from the Environment that's a MapPropertySource (and there are quite a lot of implementations) can be accessed as a Map of properties.
This is an old question, but the accepted answer has a serious flaw. If the Spring Environment object contains any overriding values (as described in Externalized Configuration), there is no guarantee that the map of property values it produces will match those returned from the Environment object. I found that simply iterating through the PropertySources of the Environment did not, in fact, give any overriding values. Instead it produced the original value, the one that should have been overridden.
Here is a better solution. This uses the EnumerablePropertySources of the Environment to iterate through the known property names, but then reads the actual value out of the real Spring environment. This guarantees that the value is the one actually resolved by Spring, including any overriding values.
Properties props = new Properties();
MutablePropertySources propSrcs = ((AbstractEnvironment) springEnv).getPropertySources();
StreamSupport.stream(propSrcs.spliterator(), false)
.filter(ps -> ps instanceof EnumerablePropertySource)
.map(ps -> ((EnumerablePropertySource) ps).getPropertyNames())
.flatMap(Arrays::<String>stream)
.forEach(propName -> props.setProperty(propName, springEnv.getProperty(propName)));
I had the requirement to retrieve all properties whose key starts with a distinct prefix (e.g. all properties starting with "log4j.appender.") and wrote the following code (using streams and lambdas of Java 8).
public static Map<String,Object> getPropertiesStartingWith( ConfigurableEnvironment aEnv,
String aKeyPrefix )
{
Map<String,Object> result = new HashMap<>();
Map<String,Object> map = getAllProperties( aEnv );
for (Entry<String, Object> entry : map.entrySet())
{
String key = entry.getKey();
if ( key.startsWith( aKeyPrefix ) )
{
result.put( key, entry.getValue() );
}
}
return result;
}
public static Map<String,Object> getAllProperties( ConfigurableEnvironment aEnv )
{
Map<String,Object> result = new HashMap<>();
aEnv.getPropertySources().forEach( ps -> addAll( result, getAllProperties( ps ) ) );
return result;
}
public static Map<String,Object> getAllProperties( PropertySource<?> aPropSource )
{
Map<String,Object> result = new HashMap<>();
if ( aPropSource instanceof CompositePropertySource)
{
CompositePropertySource cps = (CompositePropertySource) aPropSource;
cps.getPropertySources().forEach( ps -> addAll( result, getAllProperties( ps ) ) );
return result;
}
if ( aPropSource instanceof EnumerablePropertySource<?> )
{
EnumerablePropertySource<?> ps = (EnumerablePropertySource<?>) aPropSource;
Arrays.asList( ps.getPropertyNames() ).forEach( key -> result.put( key, ps.getProperty( key ) ) );
return result;
}
// note: Most descendants of PropertySource are EnumerablePropertySource. There are some
// few others like JndiPropertySource or StubPropertySource
myLog.debug( "Given PropertySource is instanceof " + aPropSource.getClass().getName()
+ " and cannot be iterated" );
return result;
}
private static void addAll( Map<String, Object> aBase, Map<String, Object> aToBeAdded )
{
for (Entry<String, Object> entry : aToBeAdded.entrySet())
{
if ( aBase.containsKey( entry.getKey() ) )
{
continue;
}
aBase.put( entry.getKey(), entry.getValue() );
}
}
Note that the starting point is the ConfigurableEnvironment which is able to return the embedded PropertySources (the ConfigurableEnvironment is a direct descendant of Environment). You can autowire it by:
@Autowired
private ConfigurableEnvironment myEnv;
If you are not using very special kinds of property sources (like JndiPropertySource, which is usually not used in Spring autoconfiguration), you can retrieve all properties held in the environment.
The implementation relies on the iteration order which Spring itself provides and takes the first property found; all properties with the same name found later are discarded. This should ensure the same behaviour as if the environment were asked directly for a property (returning the first one found).
Note also that the returned properties are not yet resolved if they contain aliases with the ${...} operator. If you want to have a particular key resolved you have to ask the Environment directly again:
myEnv.getProperty( key );
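If you need every collected key with its value resolved, a small sketch combining the two (it simply re-reads each key found by getAllProperties through the Environment):
Map<String, Object> raw = getAllProperties(myEnv);
Map<String, Object> resolved = new HashMap<>();
for (String key : raw.keySet()) {
    // getProperty resolves ${...} placeholders against the whole environment
    resolved.put(key, myEnv.getProperty(key));
}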
The original question hinted that it would be nice to be able to filter all the properties based on a prefix. I have just confirmed that this works as of Spring Boot 2.1.1.RELEASE, for Properties or Map<String,String>. I'm sure it has worked for a while now. Interestingly, it does not work without the prefix = qualification, i.e. I do not know how to get the entire environment loaded into a map. As I said, this might actually be what the OP wanted to begin with. The prefix and the following '.' will be stripped off, which might or might not be what one wants:
@ConfigurationProperties(prefix = "abc")
@Bean
public Properties getAsProperties() {
return new Properties();
}
@Bean
public MyService createService() {
Properties properties = getAsProperties();
return new MyService(properties);
}
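For illustration, with hypothetical entries like these in application.properties, the Properties bean above would contain the keys url and timeout (the abc prefix and the following dot are stripped):
abc.url=https://example.org/api
abc.timeout=30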
Postscript: It is indeed possible, and shamefully easy, to get the entire environment. I don't know how this escaped me:
@ConfigurationProperties
@Bean
public Properties getProperties() {
return new Properties();
}
As per this Spring Jira ticket, it is an intentional design. But the following code works for me.
public static Map<String, Object> getAllKnownProperties(Environment env) {
Map<String, Object> rtn = new HashMap<>();
if (env instanceof ConfigurableEnvironment) {
for (PropertySource<?> propertySource : ((ConfigurableEnvironment) env).getPropertySources()) {
if (propertySource instanceof EnumerablePropertySource) {
for (String key : ((EnumerablePropertySource) propertySource).getPropertyNames()) {
rtn.put(key, propertySource.getProperty(key));
}
}
}
}
return rtn;
}
Spring won't let you decouple from the Spring Environment via java.util.Properties.
But Properties.load() still works in a Spring Boot application:
Properties p = new Properties();
try (InputStream is = getClass().getResourceAsStream("/my.properties")) {
p.load(is);
}
The other answers have pointed out the solution for the majority of cases involving PropertySources, but none have mentioned that certain property sources cannot be cast into useful types.
One such example is the property source for command line arguments. The class that is used is SimpleCommandLinePropertySource. This private class is returned by a public method, thus making it extremely tricky to access the data inside the object. I had to use reflection in order to read the data and eventually replace the property source.
If anyone out there has a better solution, I would really like to see it; however, this is the only hack I have gotten to work.
Working with Spring Boot 2, I needed to do something similar. Most of the answers above work fine; just beware that at various phases in the app lifecycle the results will be different.
For example, after an ApplicationEnvironmentPreparedEvent any properties inside application.properties are not present. However, after an ApplicationPreparedEvent they are.
For Spring Boot, the accepted answer will overwrite duplicate properties with lower-priority ones. This solution collects the properties into a SortedMap and takes only the highest-priority value for duplicate properties.
final SortedMap<String, Object> sortedMap = new TreeMap<>();
for (final PropertySource<?> propertySource : env.getPropertySources()) {
if (!(propertySource instanceof EnumerablePropertySource))
continue;
for (final String name : ((EnumerablePropertySource<?>) propertySource).getPropertyNames())
sortedMap.computeIfAbsent(name, propertySource::getProperty);
}
I thought I'd add one more way. In my case I supply this to com.hazelcast.config.XmlConfigBuilder, which only needs java.util.Properties to resolve some properties inside the Hazelcast XML configuration file, i.e. it only calls the getProperty(String) method. So this allowed me to do what I needed:
@RequiredArgsConstructor
public class SpringReadOnlyProperties extends Properties {
private final org.springframework.core.env.Environment delegate;
@Override
public String getProperty(String key) {
return delegate.getProperty(key);
}
@Override
public String getProperty(String key, String defaultValue) {
return delegate.getProperty(key, defaultValue);
}
@Override
public synchronized String toString() {
return getClass().getName() + "{" + delegate + "}";
}
@Override
public boolean equals(Object o) {
if (this == o) return true;
if (o == null || getClass() != o.getClass()) return false;
if (!super.equals(o)) return false;
SpringReadOnlyProperties that = (SpringReadOnlyProperties) o;
return delegate.equals(that.delegate);
}
@Override
public int hashCode() {
return Objects.hash(super.hashCode(), delegate);
}
private void throwException() {
throw new RuntimeException("This method is not supported");
}
//all methods below throw the exception
* override all methods *
}
P.S. I ended up not using this specifically for Hazelcast, because it only resolves properties for the XML file but not at runtime. Since I also use Spring, I decided to go with a custom org.springframework.cache.interceptor.AbstractCacheResolver#getCacheNames. This resolves properties in both situations, at least if you use properties in cache names.
To get ONLY the properties defined in my hibernate.properties file:
@PropertySource(SomeClass.HIBERNATE_PROPERTIES)
public class SomeClass {
public static final String HIBERNATE_PROPERTIES = "hibernate.properties";
@Autowired
private Environment env;
public void someMethod() {
final Properties hibProps = asProperties(HIBERNATE_PROPERTIES);
}
private Properties asProperties(String fileName) {
return StreamSupport.stream(
((AbstractEnvironment) env).getPropertySources().spliterator(), false)
.filter(ps -> ps instanceof ResourcePropertySource)
.map(ps -> (ResourcePropertySource) ps)
.filter(rps -> rps.getName().contains(fileName))
.collect(
Properties::new,
(props, rps) -> props.putAll(rps.getSource()),
Properties::putAll);
}
}
A little helper to analyze the sources of a property, which sometimes drives me crazy. I used this discussion to write SpringConfigurableEnvironment.java on GitHub.
It could be used in a test:
@SpringBootTest
public class SpringConfigurableEnvironmentTest {
@Autowired
private ConfigurableEnvironment springEnv;
@Test
public void testProperties() {
SpringConfigurableEnvironment properties = new SpringConfigurableEnvironment(springEnv);
SpringConfigurableEnvironment.PropertyInfo info = properties.get("profile.env");
assertEquals("default", properties.get(info.getValue());
assertEquals(
"Config resource 'class path resource [application.properties]' via location 'optional:classpath:/'",
info.getSourceList().get(0));
}
}
All the answers above pretty much cover everything, but be aware of overridden values from environment variables. They may have different keys.
For example, if a user overrides my.property[1].value using the environment variable MY_PROPERTY[1]_VALUE, iterating through EnumerablePropertySource.getPropertyNames() would give you both the my.property[1].value and MY_PROPERTY[1]_VALUE keys.
What is even worse is that if my.property[1].value is not defined in application.properties (or application.yml), a MY_PROPERTY[1]_VALUE environment variable would not give you a my.property[1].value key but only the MY_PROPERTY[1]_VALUE key from EnumerablePropertySource.getPropertyNames().
So it is the developer's job to cover those properties from environment variables. Unfortunately, there is no one-to-one mapping between the environment variable naming scheme and the normal scheme; see the source code of SystemEnvironmentPropertySource. For example, MY_PROPERTY[1]_VALUE could be either my.property[1].value or my-property[1].value.
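A hedged illustration of the consequence: when a value may only exist under its environment-variable spelling, you may have to look up both forms yourself (env is the Spring Environment; the property names are just examples):
// Prefer the relaxed/normal key, fall back to the raw environment-variable key.
String value = env.getProperty("my.property[1].value");
if (value == null) {
    value = env.getProperty("MY_PROPERTY[1]_VALUE");
}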
