Get a specific service implementation based on a parameter - osgi

In my Sling app I have data representing documents, with pages and content nodes. We mostly serve those documents as HTML, but now I would like to have a servlet that serves these documents as PDF and PPT.
Basically, I thought about implementing the factory pattern: in my servlet, depending on the extension of the request (pdf or ppt), I would get the proper DocumentBuilder implementation, either PdfDocumentBuilder or PptDocumentBuilder, from a DocumentBuilderFactory.
So first I had this:
public class PlanExportBuilderFactory {

    // assumed SLF4J logger; the original snippet references 'logger' without declaring it
    private static final Logger logger = LoggerFactory.getLogger(PlanExportBuilderFactory.class);

    public PlanExportBuilder getBuilder(String type) {
        PlanExportBuilder builder = null;
        switch (type) {
            case "pdf":
                builder = new PdfPlanExportBuilder();
                break;
            default:
                logger.error("Unsupported plan export builder, type: " + type);
        }
        return builder;
    }
}
In the servlet:
@Component(metatype = false)
@Service(Servlet.class)
@Properties({
    @Property(name = "sling.servlet.resourceTypes", value = "myApp/document"),
    @Property(name = "sling.servlet.extensions", value = { "ppt", "pdf" }),
    @Property(name = "sling.servlet.methods", value = "GET")
})
public class PlanExportServlet extends SlingSafeMethodsServlet {

    @Reference
    PlanExportBuilderFactory builderFactory;

    @Override
    protected void doGet(SlingHttpServletRequest request, SlingHttpServletResponse response) throws ServletException, IOException {
        Resource resource = request.getResource();
        PlanExportBuilder builder = builderFactory.getBuilder(request.getRequestPathInfo().getExtension());
    }
}
But the problem is that in the builders I would like to reference other services to access Sling resources, and with this solution those references are never bound, because the builders are created with new rather than managed by OSGi.
I looked at OSGi Service Factories, but from what I've understood you use them to configure the same service implementation differently, not to select a different implementation.
Then I found that you can get a specific implementation by naming it, or by using a property and a filter.
So I've ended up with this:
public class PlanExportBuilderFactory {

    @Reference(target = "(builderType=pdf)")
    PlanExportBuilder pdfPlanExportBuilder;

    public PlanExportBuilder getBuilder(String type) {
        PlanExportBuilder builder = null;
        switch (type) {
            case "pdf":
                return pdfPlanExportBuilder;
            default:
                logger.error("Unsupported plan export builder, type: " + type);
        }
        return builder;
    }
}
The builder defines a "builderType" property:
// AbstractPlanExportBuilder implements the PlanExportBuilder interface
@Component
@Service(value = PlanExportBuilder.class)
@Property(name = "builderType", value = "pdf")
public class PdfPlanExportBuilder extends AbstractPlanExportBuilder {
    public PdfPlanExportBuilder() {
        planDocument = new PdfPlanDocument();
    }
}
I would like to know whether this is a good way to retrieve my PDF builder implementation with respect to OSGi good practices.
EDIT 1
Following Peter's answer I've tried to add multiple references, but with Felix it doesn't seem to work:
@Reference(name = "planExportBuilder", cardinality = ReferenceCardinality.MANDATORY_MULTIPLE, policy = ReferencePolicy.DYNAMIC)
private Map<String, PlanExportBuilder> builders = new ConcurrentHashMap<String, PlanExportBuilder>();

protected final void bindPlanExportBuilder(PlanExportBuilder b, Map<String, Object> props) {
    final String type = PropertiesUtil.toString(props.get("type"), null);
    if (type != null) {
        this.builders.put(type, b);
    }
}

protected final void unbindPlanExportBuilder(final PlanExportBuilder b, Map<String, Object> props) {
    final String type = PropertiesUtil.toString(props.get("type"), null);
    if (type != null) {
        this.builders.remove(type);
    }
}
I get these errors:
@Reference(builders) : Missing method bind for reference planExportBuilder
@Reference(builders) : Something went wrong: false - true - MANDATORY_MULTIPLE
@Reference(builders) : Missing method unbind for reference planExportBuilder
The Felix documentation here http://felix.apache.org/documentation/subprojects/apache-felix-maven-scr-plugin/scr-annotations.html#reference says for the bind method:
The default value is the name created by appending the reference name to the string bind. The method must be declared public or protected and take single argument which is declared with the service interface type
So according to this, I understand it cannot work with Felix, as I'm trying to pass two arguments. However, I found an example here that seems to match what you've suggested but I cannot make it work: https://github.com/Adobe-Consulting-Services/acs-aem-samples/blob/master/bundle/src/main/java/com/adobe/acs/samples/services/impl/SampleMultiReferenceServiceImpl.java
EDIT 2
Just had to move the reference above the class to make it work:
@References({
    @Reference(
        name = "planExportBuilder",
        referenceInterface = PlanExportBuilder.class,
        policy = ReferencePolicy.DYNAMIC,
        cardinality = ReferenceCardinality.OPTIONAL_MULTIPLE)
})
public class PlanExportServlet extends SlingSafeMethodsServlet {

Factories are evil :-) The main reason is of course the yucky class-loading hacks that are usually used, but also that they tend to have global knowledge. In general, you want to be able to add a bundle with a new DocumentBuilder and have that type become available.
A more OSGi oriented solution is therefore to use service properties. This could look like:
@Component(property = HTTP_WHITEBOARD_FILTER_REGEX + "=/as")
public class DocumentServlet {

    final Map<String, DocBuilder> builders = new ConcurrentHashMap<>();

    public void doGet(HttpServletRequest rq, HttpServletResponse rsp)
            throws IOException, ServletException {
        InputStream in = getInputStream(rq.getPathInfo());
        if (in == null)
            ....
        String type = toType(rq.getPathInfo(), rq.getParameter("type"));
        DocBuilder docbuilder = builders.get(type);
        if (docbuilder == null)
            ....
        docbuilder.convert(type, in, rsp.getOutputStream());
    }

    @Reference(cardinality = MULTIPLE, policy = DYNAMIC)
    void addDocBuilder(DocBuilder db, Map<String, Object> props) {
        builders.put((String) props.get("type"), db);
    }

    void removeDocBuilder(Map<String, Object> props) {
        builders.remove(props.get("type"));
    }
}
A DocBuilder could look like:
@Component(property = "type=ppt-pdf")
public class PowerPointToPdf implements DocBuilder {
    ...
}
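The nice property of this approach is that a new converter needs no changes anywhere else. As a hedged sketch (the HtmlToPdf name and the exact DocBuilder method signature are assumptions inferred from the servlet above), a second type could be contributed from any bundle like this:
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

import org.osgi.service.component.annotations.Component;

// Hypothetical second converter: deploying this component makes "html-pdf" available
// to the DocumentServlet above without touching any existing code.
@Component(property = "type=html-pdf")
public class HtmlToPdf implements DocBuilder {

    @Override
    public void convert(String type, InputStream in, OutputStream out) throws IOException {
        // render the HTML read from 'in' as PDF on 'out' (conversion logic omitted)
    }
}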

Related

Update list items in PagingLibrary w/o using Room (Network only)

I'm using the Paging Library to load data from the network using an ItemKeyedDataSource. After fetching items the user can edit them; these updates are done in an in-memory cache (no database like Room is used).
Now, since the PagedList itself cannot be updated (discussed here), I have to recreate the PagedList and pass it to the PagedListAdapter.
The update itself is no problem, but after updating the recyclerView with the new PagedList, the list jumps to the beginning, destroying the previous scroll position. Is there any way to update the PagedList while keeping the scroll position (like how it works with Room)?
DataSource is implemented this way:
public class MentionKeyedDataSource extends ItemKeyedDataSource<Long, Mention> {

    private Repository repository;
    ...
    private List<Mention> cachedItems;

    public MentionKeyedDataSource(Repository repository, ..., List<Mention> cachedItems) {
        super();
        this.repository = repository;
        this.teamId = teamId;
        this.inboxId = inboxId;
        this.filter = filter;
        this.cachedItems = new ArrayList<>(cachedItems);
    }

    @Override
    public void loadInitial(@NonNull LoadInitialParams<Long> params, final @NonNull ItemKeyedDataSource.LoadInitialCallback<Mention> callback) {
        Observable.just(cachedItems)
                .filter(items -> items != null && !items.isEmpty())
                .switchIfEmpty(repository.getItems(..., params.requestedLoadSize).map(...))
                .subscribeOn(Schedulers.io())
                .observeOn(AndroidSchedulers.mainThread())
                .subscribe(response -> callback.onResult(response.data.list));
    }

    @Override
    public void loadAfter(@NonNull LoadParams<Long> params, final @NonNull ItemKeyedDataSource.LoadCallback<Mention> callback) {
        repository.getOlderItems(..., params.key, params.requestedLoadSize)
                .subscribeOn(Schedulers.io())
                .observeOn(AndroidSchedulers.mainThread())
                .subscribe(response -> callback.onResult(response.data.list));
    }

    @Override
    public void loadBefore(@NonNull LoadParams<Long> params, final @NonNull ItemKeyedDataSource.LoadCallback<Mention> callback) {
        repository.getNewerItems(..., params.key, params.requestedLoadSize)
                .subscribeOn(Schedulers.io())
                .observeOn(AndroidSchedulers.mainThread())
                .subscribe(response -> callback.onResult(response.data.list));
    }

    @NonNull
    @Override
    public Long getKey(@NonNull Mention item) {
        return item.id;
    }
}
The PagedList is created like this:
PagedList.Config config = new PagedList.Config.Builder()
        .setPageSize(PAGE_SIZE)
        .setInitialLoadSizeHint(preFetchedItems != null && !preFetchedItems.isEmpty()
                ? preFetchedItems.size()
                : PAGE_SIZE * 2
        ).build();

pagedMentionsList = new PagedList.Builder<>(new MentionKeyedDataSource(mRepository, team.id, inbox.id, mCurrentFilter, preFetchedItems), config)
        .setFetchExecutor(ApplicationThreadPool.getBackgroundThreadExecutor())
        .setNotifyExecutor(ApplicationThreadPool.getUIThreadExecutor())
        .build();
The PagedListAdapter is created like this:
public class ItemAdapter extends PagedListAdapter<Item, ItemAdapter.ItemHolder> {
    // Adapter from the Google guide, nothing special here...
}

mAdapter = new ItemAdapter(new DiffUtil.ItemCallback<Item>() {
    @Override
    public boolean areItemsTheSame(Item oldItem, Item newItem) {
        return oldItem.id == newItem.id;
    }

    @Override
    public boolean areContentsTheSame(Item oldItem, Item newItem) {
        return oldItem.equals(newItem);
    }
});
and updated like this:
mAdapter.submitList(pagedList);
P.S: If there is a better way to update list items without using Room please share.
All you need to do is invalidate the current DataSource each time you update your data.
What I would do:
Move the networking logic from MentionKeyedDataSource to a new class that extends PagedList.BoundaryCallback and set it when building your PagedList.
Move all your data to some DataProvider that holds all your downloaded data and has a reference to the DataSource. Each time data is updated in the DataProvider, invalidate the current DataSource.
Something like this, maybe:
val dataProvider = PagedDataProvider()
val dataSourceFactory = InMemoryDataSourceFactory(dataProvider = dataProvider)
where
class PagedDataProvider : DataProvider<Int, DataRow> {

    private val dataCache = mutableMapOf<Int, List<DataRow>>()

    override val sourceLiveData = MutableLiveData<DataSource<Int, DataRow>>()

    // implement data add/remove/update here
    // and after each modification call
    // sourceLiveData.value?.invalidate()
}
and
class InMemoryDataSourceFactory<Key, Value>(
        private val dataProvider: DataProvider<Key, Value>
) : DataSource.Factory<Key, Value>() {

    override fun create(): DataSource<Key, Value> {
        val source = InMemoryDataSource(dataProvider = dataProvider)
        dataProvider.sourceLiveData.postValue(source)
        return source
    }
}
This approach is very similar to what Room does: every time a table row is updated, the current DataSource is invalidated and the DataSourceFactory creates a new data source.
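A minimal sketch of the invalidation step in Java (the field and helper names below are made up; they just mirror the DataProvider/sourceLiveData outline above, applied to the question's Mention data):
// Hedged sketch: after mutating the in-memory cache, invalidate the current DataSource
// so that the factory's create() builds a new one over the updated data; DiffUtil then
// keeps the scroll position because unchanged items are recognized as the same.
void updateMention(Mention edited) {
    int index = indexOfById(cachedItems, edited.id);   // hypothetical lookup helper
    cachedItems.set(index, edited);

    DataSource<Long, Mention> current = sourceLiveData.getValue();
    if (current != null) {
        current.invalidate();                          // triggers DataSource.Factory.create()
    }
}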
You can also modify items directly in your adapter through currentList, like this:
class ItemsAdapter() : PagedListAdapter<Item, ItemsAdapter.ViewHolder>(ITEMS_COMPARATOR) {
    ...
    fun changeItem(position: Int, newData: String) {
        currentList?.get(position)?.data = newData
        notifyItemChanged(position)
    }
}

Using map of maps as Maven plugin parameters

Is it possible to use a map of maps as a Maven plugin parameter? E.g.
@Parameter
private Map<String, Map<String, String>> converters;
and then to use it like
<converters>
    <json>
        <indent>true</indent>
        <strict>true</strict>
    </json>
    <yaml>
        <stripComments>false</stripComments>
    </yaml>
</converters>
If I use it like this, converters only contains the keys json and yaml, with null as values.
I know it is possible to have complex objects as values, but is it also somehow possible to use maps for variable element values like in this example?
This is apparently a limitation of the sisu.plexus project internally used by the Mojo API. If you peek inside the MapConverter source, you'll find that it first tries to fetch the value of the map by interpreting the configuration as a String (invoking fromExpression), and when this fails, it looks up the expected type of the value. However, this method doesn't check for parameterized types, which is the case here (since the type of the map value is Map<String, String>). I filed bug 498757 on the project's Bugzilla to track this.
Using a custom wrapper object
One workaround would be to not use a Map<String, String> as value but use a custom object:
@Parameter
private Map<String, Converter> converters;
with a class Converter, located in the same package as the Mojo, being:
public class Converter {

    @Parameter
    private Map<String, String> properties;

    @Override
    public String toString() { return properties.toString(); } // to test
}
You can then configure your Mojo with:
<converters>
    <json>
        <properties>
            <indent>true</indent>
            <strict>true</strict>
        </properties>
    </json>
    <yaml>
        <properties>
            <stripComments>false</stripComments>
        </properties>
    </yaml>
</converters>
This configuration correctly injects the values into the inner maps. It also keeps the variable aspect: the object is only introduced as a wrapper around the inner map. I tested this with a simple test Mojo having
public void execute() throws MojoExecutionException, MojoFailureException {
    getLog().info(converters.toString());
}
and the output was the expected {json={indent=true, strict=true}, yaml={stripComments=false}}.
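Since the Mojo ultimately wants plain maps, the wrapper can simply expose the inner map through an accessor; a small hedged sketch (getProperties is not part of the answer above, just an obvious addition):
// Hypothetical accessor on the Converter wrapper class shown above.
public Map<String, String> getProperties() {
    return properties;
}

// ...and in the Mojo:
Map<String, String> jsonOptions = converters.get("json").getProperties(); // {indent=true, strict=true}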
Using a custom configurator
I also found a way to keep a Map<String, Map<String, String>> by using a custom ComponentConfigurator.
So we want to fix MapConverter by inheriting from it; the trouble is how to register this new FixedMapConverter. By default, Maven uses a BasicComponentConfigurator to configure the Mojo, and it relies on a DefaultConverterLookup to look up the converters to use for a specific class. In this case, we want to provide a custom converter for Map that returns our fixed version. Therefore, we need to extend this basic configurator and register our new converter.
import org.codehaus.plexus.classworlds.realm.ClassRealm;
import org.codehaus.plexus.component.configurator.BasicComponentConfigurator;
import org.codehaus.plexus.component.configurator.ComponentConfigurationException;
import org.codehaus.plexus.component.configurator.ConfigurationListener;
import org.codehaus.plexus.component.configurator.expression.ExpressionEvaluator;
import org.codehaus.plexus.configuration.PlexusConfiguration;

public class CustomBasicComponentConfigurator extends BasicComponentConfigurator {

    @Override
    public void configureComponent(final Object component, final PlexusConfiguration configuration,
            final ExpressionEvaluator evaluator, final ClassRealm realm, final ConfigurationListener listener)
            throws ComponentConfigurationException {
        converterLookup.registerConverter(new FixedMapConverter());
        super.configureComponent(component, configuration, evaluator, realm, listener);
    }
}
Then we need to tell Maven to use this new configurator instead of the basic one. This is a 2-step process:
Inside your Maven plugin, create a file src/main/resources/META-INF/plexus/components.xml registering the new component:
<?xml version="1.0" encoding="UTF-8"?>
<component-set>
  <components>
    <component>
      <role>org.codehaus.plexus.component.configurator.ComponentConfigurator</role>
      <role-hint>custom-basic</role-hint>
      <implementation>package.to.CustomBasicComponentConfigurator</implementation>
    </component>
  </components>
</component-set>
Note a few things: we declare a new component having the hint "custom-basic", this will serve as an id to refer to it and the <implementation> refers to the fully qualified class name of our configurator.
Tell our Mojo to use this configurator with the configurator attribute of the @Mojo annotation:
@Mojo(name = "test", configurator = "custom-basic")
The configurator passed here corresponds to the role-hint specified in the components.xml above.
With such a set-up, you can finally declare
@Parameter
private Map<String, Map<String, String>> converters;
and everything will be injected properly: Maven will use our custom configurator, that will register our fixed version of the map converter and will correctly convert the inner-maps.
Full code of FixedMapConverter (which pretty much copy-pastes MapConverter because we can't override the faulty method):
public class FixedMapConverter extends MapConverter {
public Object fromConfiguration(final ConverterLookup lookup, final PlexusConfiguration configuration,
final Class<?> type, final Type[] typeArguments, final Class<?> enclosingType, final ClassLoader loader,
final ExpressionEvaluator evaluator, final ConfigurationListener listener)
throws ComponentConfigurationException {
final Object value = fromExpression(configuration, evaluator, type);
if (null != value) {
return value;
}
try {
final Map<Object, Object> map = instantiateMap(configuration, type, loader);
final Class<?> elementType = findElementType(typeArguments);
if (Object.class == elementType || String.class == elementType) {
for (int i = 0, size = configuration.getChildCount(); i < size; i++) {
final PlexusConfiguration element = configuration.getChild(i);
map.put(element.getName(), fromExpression(element, evaluator));
}
return map;
}
// handle maps with complex element types...
final ConfigurationConverter converter = lookup.lookupConverterForType(elementType);
for (int i = 0, size = configuration.getChildCount(); i < size; i++) {
Object elementValue;
final PlexusConfiguration element = configuration.getChild(i);
try {
elementValue = converter.fromConfiguration(lookup, element, elementType, enclosingType, //
loader, evaluator, listener);
}
// TEMP: remove when http://jira.codehaus.org/browse/MSHADE-168
// is fixed
catch (final ComponentConfigurationException e) {
elementValue = fromExpression(element, evaluator);
Logs.warn("Map in " + enclosingType + " declares value type as: {} but saw: {} at runtime",
elementType, null != elementValue ? elementValue.getClass() : null);
}
// ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
map.put(element.getName(), elementValue);
}
return map;
} catch (final ComponentConfigurationException e) {
if (null == e.getFailedConfiguration()) {
e.setFailedConfiguration(configuration);
}
throw e;
}
}
@SuppressWarnings("unchecked")
private Map<Object, Object> instantiateMap(final PlexusConfiguration configuration, final Class<?> type,
final ClassLoader loader) throws ComponentConfigurationException {
final Class<?> implType = getClassForImplementationHint(type, configuration, loader);
if (null == implType || Modifier.isAbstract(implType.getModifiers())) {
return new TreeMap<Object, Object>();
}
final Object impl = instantiateObject(implType);
failIfNotTypeCompatible(impl, type, configuration);
return (Map<Object, Object>) impl;
}
private static Class<?> findElementType( final Type[] typeArguments )
{
if ( null != typeArguments && typeArguments.length > 1 )
{
if ( typeArguments[1] instanceof Class<?> )
{
return (Class<?>) typeArguments[1];
}
// begin fix here
if ( typeArguments[1] instanceof ParameterizedType )
{
return (Class<?>) ((ParameterizedType) typeArguments[1]).getRawType();
}
// end fix here
}
return Object.class;
}
}
One solution is quite simple and works for 1-level nesting. A more sophisticated approach can be found in the alternative answer which possibly also allows for deeper nesting of Maps.
Instead of using an interface as type parameter, simply use a concrete class like TreeMap
@Parameter
private Map<String, TreeMap> converters;
The reason is this check in MapConverter, which fails for an interface but succeeds for a concrete class:
private static Class<?> findElementType( final Type[] typeArguments )
{
    if ( null != typeArguments && typeArguments.length > 1
        && typeArguments[1] instanceof Class<?> )
    {
        return (Class<?>) typeArguments[1];
    }
    return Object.class;
}
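Coming back to the workaround itself, a hedged sketch of a Mojo consuming the injected structure (the inner values arrive as raw TreeMap instances):
// Iterate over the map-of-TreeMaps injected from the <converters> configuration.
public void execute() throws MojoExecutionException, MojoFailureException {
    for (Map.Entry<String, TreeMap> entry : converters.entrySet()) {
        getLog().info(entry.getKey() + " -> " + entry.getValue());
    }
}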
As a side note, and as it is also related to this answer: for Maven > 3.3.x it also works to install a custom converter by subclassing BasicComponentConfigurator and using it as a Plexus component. BasicComponentConfigurator has the DefaultConverterLookup as a protected member variable, so it is easily accessible for registering custom converters.
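If you prefer not to hand-write the components.xml descriptor, the same registration can be sketched with plexus-component-annotations plus the plexus-component-metadata plugin, which generates the descriptor at build time (a hedged sketch, not taken from the answers above):
import org.codehaus.plexus.component.annotations.Component;
import org.codehaus.plexus.component.configurator.BasicComponentConfigurator;
import org.codehaus.plexus.component.configurator.ComponentConfigurator;

// The role/hint pair mirrors the <role>/<role-hint> entries of the components.xml shown earlier;
// reference the hint from @Mojo(configurator = "custom-basic") as before.
@Component(role = ComponentConfigurator.class, hint = "custom-basic")
public class CustomBasicComponentConfigurator extends BasicComponentConfigurator {
    // register FixedMapConverter in configureComponent(..) exactly as shown above
}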

Read Application Object from GemFire using Spring Data GemFire. Data stored using SpringXD's gemfire-json-server

I'm using the gemfire-json-server module in SpringXD to populate a GemFire grid with JSON representations of “Order” objects. I understand the gemfire-json-server module saves data in PDX form in GemFire. I'd like to read the contents of the GemFire grid into an “Order” object in my application. I get a ClassCastException that reads:
java.lang.ClassCastException: com.gemstone.gemfire.pdx.internal.PdxInstanceImpl cannot be cast to org.apache.geode.demo.cc.model.Order
I’m using the Spring Data GemFire libraries to read contents of the cluster. The code snippet to read the contents of the Grid follows:
public interface OrderRepository extends GemfireRepository<Order, String>{
Order findByTransactionId(String transactionId);
}
How can I use Spring Data GemFire to convert data read from the GemFire cluster into an Order object?
Note: The data was initially stored in GemFire using SpringXD's gemfire-json-server-module
Still waiting to hear back from the GemFire PDX engineering team, specifically on Region.get(key), but, interestingly enough, if you annotate your application domain object with...
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public class Order ... {
    ...
}
This works!
Under-the-hood I knew the GemFire JSONFormatter class (see here) used Jackson's API to un/marshal (de/serialize) JSON data to and from PDX.
However, the orderRepository.findOne(ID) and ordersRegion.get(key) still do not function as I would expect. See updated test class below for more details.
Will report back again when I have more information.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = GemFireConfiguration.class)
@SuppressWarnings("unused")
public class JsonToPdxToObjectDataAccessIntegrationTest {
protected static final AtomicLong ID_SEQUENCE = new AtomicLong(0l);
private Order amazon;
private Order bestBuy;
private Order target;
private Order walmart;
@Autowired
private OrderRepository orderRepository;
@Resource(name = "Orders")
private com.gemstone.gemfire.cache.Region<Long, Object> orders;
protected Order createOrder(String name) {
return createOrder(ID_SEQUENCE.incrementAndGet(), name);
}
protected Order createOrder(Long id, String name) {
return new Order(id, name);
}
protected <T> T fromPdx(Object pdxInstance, Class<T> toType) {
try {
if (pdxInstance == null) {
return null;
}
else if (toType.isInstance(pdxInstance)) {
return toType.cast(pdxInstance);
}
else if (pdxInstance instanceof PdxInstance) {
return new ObjectMapper().readValue(JSONFormatter.toJSON(((PdxInstance) pdxInstance)), toType);
}
else {
throw new IllegalArgumentException(String.format("Expected object of type PdxInstance; but was (%1$s)",
pdxInstance.getClass().getName()));
}
}
catch (IOException e) {
throw new RuntimeException(String.format("Failed to convert PDX to object of type (%1$s)", toType), e);
}
}
protected void log(Object value) {
System.out.printf("Object of Type (%1$s) has Value (%2$s)", ObjectUtils.nullSafeClassName(value), value);
}
protected Order put(Order order) {
Object existingOrder = orders.putIfAbsent(order.getTransactionId(), toPdx(order));
return (existingOrder != null ? fromPdx(existingOrder, Order.class) : order);
}
protected PdxInstance toPdx(Object obj) {
try {
return JSONFormatter.fromJSON(new ObjectMapper().writeValueAsString(obj));
}
catch (JsonProcessingException e) {
throw new RuntimeException(String.format("Failed to convert object (%1$s) to JSON", obj), e);
}
}
@Before
public void setup() {
amazon = put(createOrder("Amazon Order"));
bestBuy = put(createOrder("BestBuy Order"));
target = put(createOrder("Target Order"));
walmart = put(createOrder("Wal-Mart Order"));
}
@Test
public void regionGet() {
assertThat((Order) orders.get(amazon.getTransactionId()), is(equalTo(amazon)));
}
@Test
public void repositoryFindOneMethod() {
log(orderRepository.findOne(target.getTransactionId()));
assertThat(orderRepository.findOne(target.getTransactionId()), is(equalTo(target)));
}
@Test
public void repositoryQueryMethod() {
assertThat(orderRepository.findByTransactionId(amazon.getTransactionId()), is(equalTo(amazon)));
assertThat(orderRepository.findByTransactionId(bestBuy.getTransactionId()), is(equalTo(bestBuy)));
assertThat(orderRepository.findByTransactionId(target.getTransactionId()), is(equalTo(target)));
assertThat(orderRepository.findByTransactionId(walmart.getTransactionId()), is(equalTo(walmart)));
}
@Region("Orders")
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public static class Order implements PdxSerializable {
protected static final OrderPdxSerializer pdxSerializer = new OrderPdxSerializer();
@Id
private Long transactionId;
private String name;
public Order() {
}
public Order(Long transactionId) {
this.transactionId = transactionId;
}
public Order(Long transactionId, String name) {
this.transactionId = transactionId;
this.name = name;
}
public String getName() {
return name;
}
public void setName(final String name) {
this.name = name;
}
public Long getTransactionId() {
return transactionId;
}
public void setTransactionId(final Long transactionId) {
this.transactionId = transactionId;
}
@Override
public void fromData(PdxReader reader) {
Order order = (Order) pdxSerializer.fromData(Order.class, reader);
if (order != null) {
this.transactionId = order.getTransactionId();
this.name = order.getName();
}
}
@Override
public void toData(PdxWriter writer) {
pdxSerializer.toData(this, writer);
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof Order)) {
return false;
}
Order that = (Order) obj;
return ObjectUtils.nullSafeEquals(this.getTransactionId(), that.getTransactionId());
}
@Override
public int hashCode() {
int hashValue = 17;
hashValue = 37 * hashValue + ObjectUtils.nullSafeHashCode(getTransactionId());
return hashValue;
}
@Override
public String toString() {
return String.format("{ @type = %1$s, id = %2$d, name = %3$s }",
getClass().getName(), getTransactionId(), getName());
}
}
public static class OrderPdxSerializer implements PdxSerializer {
@Override
public Object fromData(Class<?> type, PdxReader in) {
if (Order.class.equals(type)) {
return new Order(in.readLong("transactionId"), in.readString("name"));
}
return null;
}
@Override
public boolean toData(Object obj, PdxWriter out) {
if (obj instanceof Order) {
Order order = (Order) obj;
out.writeLong("transactionId", order.getTransactionId());
out.writeString("name", order.getName());
return true;
}
return false;
}
}
public interface OrderRepository extends GemfireRepository<Order, Long> {
Order findByTransactionId(Long transactionId);
}
@Configuration
protected static class GemFireConfiguration {
@Bean
public Properties gemfireProperties() {
Properties gemfireProperties = new Properties();
gemfireProperties.setProperty("name", JsonToPdxToObjectDataAccessIntegrationTest.class.getSimpleName());
gemfireProperties.setProperty("mcast-port", "0");
gemfireProperties.setProperty("log-level", "warning");
return gemfireProperties;
}
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
cacheFactoryBean.setProperties(gemfireProperties);
//cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
cacheFactoryBean.setPdxSerializer(new OrderPdxSerializer());
cacheFactoryBean.setPdxReadSerialized(false);
return cacheFactoryBean;
}
@Bean(name = "Orders")
public PartitionedRegionFactoryBean ordersRegion(Cache gemfireCache) {
PartitionedRegionFactoryBean regionFactoryBean = new PartitionedRegionFactoryBean();
regionFactoryBean.setCache(gemfireCache);
regionFactoryBean.setName("Orders");
regionFactoryBean.setPersistent(false);
return regionFactoryBean;
}
@Bean
public GemfireRepositoryFactoryBean orderRepository() {
GemfireRepositoryFactoryBean<OrderRepository, Order, Long> repositoryFactoryBean =
new GemfireRepositoryFactoryBean<>();
repositoryFactoryBean.setRepositoryInterface(OrderRepository.class);
return repositoryFactoryBean;
}
}
}
So, as you are aware, GemFire (and by extension, Apache Geode) stores JSON in PDX format (as a PdxInstance). This is so GemFire can interoperate with many different language-based clients (native C++/C#, web-oriented ones such as JavaScript, Python, Ruby, etc. using the Developer REST API, in addition to Java) and also to be able to use OQL to query the JSON data.
After a bit of experimentation, I am surprised GemFire is not behaving as I would expect. I created an example, self-contained test class (i.e. no Spring XD, of course) that simulates your use case... essentially storing JSON data in GemFire as PDX and then attempting to read the data back out as the Order application domain object type using the Repository abstraction, logical enough.
Given the use of the Repository abstraction and implementation from Spring Data GemFire, the infrastructure will attempt to access the application domain object based on the Repository generic type parameter (in this case "Order" from the "OrderRepository" definition).
However, the data is stored in PDX, so now what?
No matter, Spring Data GemFire provides the MappingPdxSerializer class to convert PDX instances back to application domain objects using the same "mapping meta-data" that the Repository infrastructure uses. Cool, so I plug that in...
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
    CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
    cacheFactoryBean.setProperties(gemfireProperties);
    cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
    cacheFactoryBean.setPdxReadSerialized(false);
    return cacheFactoryBean;
}
You will also notice, I set the PDX 'read-serialized' property (cacheFactoryBean.setPdxReadSerialized(false);) to false in order to ensure data access operations return the domain object and not the PDX instance.
However, this had no effect on the query method. In fact, it had no effect on the following operations either...
orderRepository.findOne(amazonOrder.getTransactionId());
ordersRegion.get(amazonOrder.getTransactionId());
Both calls returned a PdxInstance. Note, the implementation of OrderRepository.findOne(..) is based on SimpleGemfireRepository.findOne(key), which uses GemfireTemplate.get(key), which just performs Region.get(key), and so is effectively the same as ordersRegion.get(amazonOrder.getTransactionId()). The outcome should not be a PdxInstance, especially with Region.get() and read-serialized set to false.
With the OQL query (SELECT * FROM /Orders WHERE transactionId = $1) generated from the findByTransactionId(String id), the Repository infrastructure has a bit less control over what the GemFire query engine will return based on what the caller (OrderRepository) expects (based on the generic type parameter), so running OQL statements could potentially behave differently than direct Region access using get.
Next, I went on to try modifying the Order type to implement PdxSerializable, to handle the conversion during data access operations (direct Region access with get, OQL, or otherwise). This had no effect.
So I tried to implement a custom PdxSerializer for Order objects. This had no effect either.
The only thing I can conclude at this point is that something is getting lost in translation between Order -> JSON -> PDX and then from PDX -> Order. Seemingly, GemFire needs the additional type metadata required by PDX (something like @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")) in the JSON data that the JSONFormatter recognizes, though I am not certain it does.
Note, in my test class, I used Jackson's ObjectMapper to serialize the Order to JSON and then GemFire's JSONFormatter to serialize the JSON to PDX, which I suspect Spring XD is doing similarly under-the-hood. In fact, Spring XD uses Spring Data GemFire and is most likely using the JSON Region Auto Proxy support. That is exactly what SDG's JSONRegionAdvice object does (see here).
Anyway, I have an inquiry out to the rest of the GemFire engineering team. There are also things that could be done in Spring Data GemFire to ensure the PDX data is converted, such as making use of the MappingPdxSerializer directly to convert the data automatically on behalf of the caller if the data is indeed of type PdxInstance. Similar to how JSON Region Auto Proxying works, you could write an AOP interceptor for the Orders Region to automagically convert PDX to an Order.
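For what it's worth, a hedged sketch of such an interceptor (plain Spring AOP, names are illustrative, not an SDG-provided class) that converts a PdxInstance returned by Region.get(..) on the Orders Region back into an Order, reusing the JSONFormatter/Jackson round trip from the test above:
import com.fasterxml.jackson.databind.ObjectMapper;
import com.gemstone.gemfire.cache.Region;
import com.gemstone.gemfire.pdx.JSONFormatter;
import com.gemstone.gemfire.pdx.PdxInstance;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class OrdersRegionPdxToObjectAdvice {

    private final ObjectMapper objectMapper = new ObjectMapper();

    // Wrap Region.get(..) and, for the Orders Region only, convert a PDX result back to Order.
    @Around("execution(* com.gemstone.gemfire.cache.Region.get(..)) && target(region)")
    public Object convertPdx(ProceedingJoinPoint joinPoint, Region<?, ?> region) throws Throwable {
        Object result = joinPoint.proceed();
        if ("/Orders".equals(region.getFullPath()) && result instanceof PdxInstance) {
            return objectMapper.readValue(JSONFormatter.toJSON((PdxInstance) result), Order.class);
        }
        return result;
    }
}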
Though, I don't think any of this should be necessary as GemFire should be doing the right thing in this case. Sorry I don't have a better answer right now. Let's see what I find out.
Cheers and stay tuned!
See subsequent post for test code.

@MessageMapping with placeholders

I am working with Spring-websocket and I have the following problem:
I am trying to put a placeholder inside a @MessageMapping annotation in order to get the URL from properties. It works with @RequestMapping but not with @MessageMapping.
If I use this placeholder, the URL is null. Any idea or suggestion?
Example:
@RequestMapping(value = "${myProperty}")
@MessageMapping("${myProperty}")
Rossen Stoyanchev added placeholder support for @MessageMapping and @SubscribeMapping methods.
See Jira issue: https://jira.spring.io/browse/SPR-13271
Spring allows you to use property placeholders in @RequestMapping, but not in @MessageMapping. This is because of the MessageHandler, so we need to override the default MessageHandler to do this.
WebSocketAnnotationMethodMessageHandler does not support placeholders, so you need to add that support yourself.
For simplicity I just created another WebSocketAnnotationMethodMessageHandler class in my project in the same package as the original, org.springframework.web.socket.messaging, and overrode the getMappingForMethod method from SimpAnnotationMethodMessageHandler with the same content, changing only how SimpMessageMappingInfo is constructed, using these methods (private in WebSocketAnnotationMethodMessageHandler):
private SimpMessageMappingInfo createMessageMappingCondition(final MessageMapping annotation) {
    return new SimpMessageMappingInfo(SimpMessageTypeMessageCondition.MESSAGE, new DestinationPatternsMessageCondition(
            this.resolveAnnotationValues(annotation.value()), this.getPathMatcher()));
}

private SimpMessageMappingInfo createSubscribeCondition(final SubscribeMapping annotation) {
    final SimpMessageTypeMessageCondition messageTypeMessageCondition = SimpMessageTypeMessageCondition.SUBSCRIBE;
    return new SimpMessageMappingInfo(messageTypeMessageCondition, new DestinationPatternsMessageCondition(
            this.resolveAnnotationValues(annotation.value()), this.getPathMatcher()));
}
These methods will now resolve the destination values against the available properties (by calling resolveAnnotationValues), so we need something like this:
private String[] resolveAnnotationValues(final String[] destinationNames) {
    final int length = destinationNames.length;
    final String[] result = new String[length];
    for (int i = 0; i < length; i++) {
        result[i] = this.resolveAnnotationValue(destinationNames[i]);
    }
    return result;
}

private String resolveAnnotationValue(final String name) {
    if (!(this.getApplicationContext() instanceof ConfigurableApplicationContext)) {
        return name;
    }
    final ConfigurableApplicationContext applicationContext = (ConfigurableApplicationContext) this.getApplicationContext();
    final ConfigurableBeanFactory configurableBeanFactory = applicationContext.getBeanFactory();
    final String placeholdersResolved = configurableBeanFactory.resolveEmbeddedValue(name);
    final BeanExpressionResolver exprResolver = configurableBeanFactory.getBeanExpressionResolver();
    if (exprResolver == null) {
        return name;
    }
    final Object result = exprResolver.evaluate(placeholdersResolved, new BeanExpressionContext(configurableBeanFactory, null));
    return result != null ? result.toString() : name;
}
You still need to define a PropertySourcesPlaceholderConfigurer bean in your configuration.
If you are using XML based configuration, include something like this:
<context:property-placeholder location="classpath:/META-INF/spring/url-mapping-config.properties" />
If you are using Java based configuration, you can try in this way:
@Configuration
@PropertySources(value = @PropertySource("classpath:/META-INF/spring/url-mapping-config.properties"))
public class URLMappingConfig {

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
Obs.: in this case, the url-mapping-config.properties file lives in a Gradle/Maven project under src\main\resources\META-INF\spring and its content looks like this:
myPropertyWS=urlvaluews
This is my sample controller:
@Controller
public class WebSocketController {

    @SendTo("/topic/test")
    @MessageMapping("${myPropertyWS}")
    public String test() throws Exception {
        Thread.sleep(4000); // simulated delay
        return "OK";
    }
}
With the default MessageHandler, the startup log prints something like this:
INFO: Mapped "{[/${myPropertyWS}],messageType=[MESSAGE]}" onto public java.lang.String com.brunocesar.controller.WebSocketController.test() throws java.lang.Exception
And with our MessageHandler it now prints this:
INFO: Mapped "{[/urlvaluews],messageType=[MESSAGE]}" onto public java.lang.String com.brunocesar.controller.WebSocketController.test() throws java.lang.Exception
See in this gist the full WebSocketAnnotationMethodMessageHandler implementation.
EDIT: this solution resolves the problem for versions before 4.2 GA. For more information, see this jira.
Update:
Now I understand what you mean, but I think that is not possible (yet).
The documentation does not mention anything related to path-mapping URIs.
Old answer
Use
@MessageMapping("/handler/{myProperty}")
instead of
@MessageMapping("/handler/${myProperty}")
And use it like this:
@MessageMapping("/myHandler/{username}")
public void handleTextMessage(@DestinationVariable String username, Message message) {
    // do something
}
@MessageMapping("/chat/{roomId}")
public Message handleMessages(@DestinationVariable("roomId") String roomId, @Payload Message message, Traveler traveler) throws Exception {
    System.out.println("Message received for room: " + roomId);
    System.out.println("User: " + traveler.toString());
    // store message in database
    message.setAuthor(traveler);
    message.setChatRoomId(Integer.parseInt(roomId));
    int id = MessageRepository.getInstance().save(message);
    message.setId(id);
    return message;
}

Template variables with ControllerLinkBuilder

I want my response to include this:
"keyMaps":{
"href":"http://localhost/api/keyMaps{/keyMapId}",
"templated":true
}
That's easy enough to achieve:
add(new Link("http://localhost/api/keyMaps{/keyMapId}", "keyMaps"));
But, of course, I'd rather use the ControllerLinkBuilder, like this:
add(linkTo(methodOn(KeyMapController.class).getKeyMap("{keyMapId}")).withRel("keyMaps"));
The problem is that by the time the variable "{keyMapId}" reaches the UriTemplate constructor, it's been included in an encoded URL:
http://localhost/api/keyMaps/%7BkeyMapId%7D
So UriTemplate's constructor doesn't recognise it as containing a variable.
How can I persuade ControllerLinkBuilder that I want to use template variables?
It looks to me like the current state of Spring-HATEOAS doesn't allow this via the ControllerLinkBuilder (I'd very much like to be proven wrong), so I have implemented this myself using the following classes for templating query parameters:
public class TemplatedLinkBuilder {
private static final TemplatedLinkBuilderFactory FACTORY = new TemplatedLinkBuilderFactory();
public static final String ENCODED_LEFT_BRACE = "%7B";
public static final String ENCODED_RIGHT_BRACE = "%7D";
private UriComponentsBuilder uriComponentsBuilder;
TemplatedLinkBuilder(UriComponentsBuilder builder) {
uriComponentsBuilder = builder;
}
public static TemplatedLinkBuilder linkTo(Object invocationValue) {
return FACTORY.linkTo(invocationValue);
}
public static <T> T methodOn(Class<T> controller, Object... parameters) {
return DummyInvocationUtils.methodOn(controller, parameters);
}
public Link withRel(String rel) {
return new Link(replaceTemplateMarkers(uriComponentsBuilder.build().toString()), rel);
}
public Link withSelfRel() {
return withRel(Link.REL_SELF);
}
private String replaceTemplateMarkers(String encodedUri) {
return encodedUri.replaceAll(ENCODED_LEFT_BRACE, "{").replaceAll(ENCODED_RIGHT_BRACE, "}");
}
}
and
public class TemplatedLinkBuilderFactory {
private final ControllerLinkBuilderFactory controllerLinkBuilderFactory;
public TemplatedLinkBuilderFactory() {
this.controllerLinkBuilderFactory = new ControllerLinkBuilderFactory();
}
public TemplatedLinkBuilder linkTo(Object invocationValue) {
ControllerLinkBuilder controllerLinkBuilder = controllerLinkBuilderFactory.linkTo(invocationValue);
UriComponentsBuilder uriComponentsBuilder = controllerLinkBuilder.toUriComponentsBuilder();
Assert.isInstanceOf(DummyInvocationUtils.LastInvocationAware.class, invocationValue);
DummyInvocationUtils.LastInvocationAware invocations = (DummyInvocationUtils.LastInvocationAware) invocationValue;
DummyInvocationUtils.MethodInvocation invocation = invocations.getLastInvocation();
Object[] arguments = invocation.getArguments();
MethodParameters parameters = new MethodParameters(invocation.getMethod());
for (MethodParameter requestParameter : parameters.getParametersWith(RequestParam.class)) {
Object value = arguments[requestParameter.getParameterIndex()];
if (value == null) {
uriComponentsBuilder.queryParam(requestParameter.getParameterName(), "{" + requestParameter.getParameterName() + "}");
}
}
return new TemplatedLinkBuilder(uriComponentsBuilder);
}
}
This embeds the normal ControllerLinkBuilder and then uses similar logic to parse for @RequestParam-annotated parameters that are null and add these to the query parameters as template variables. Also, our client reuses these templated URIs to perform further requests to the server. To achieve this without having to worry about stripping out the unused templated params, I have to perform the reverse operation (swapping {params} with null), which I do using a custom Spring RequestParamMethodArgumentResolver as follows:
public class TemplatedRequestParamResolver extends RequestParamMethodArgumentResolver {
public TemplatedRequestParamResolver() {
super(false);
}
@Override
protected Object resolveName(String name, MethodParameter parameter, NativeWebRequest webRequest) throws Exception {
Object value = super.resolveName(name, parameter, webRequest);
if (value instanceof Object[]) {
Object[] valueAsCollection = (Object[])value;
List<Object> resultList = new LinkedList<Object>();
for (Object collectionEntry : valueAsCollection) {
if (nullifyTemplatedValue(collectionEntry) != null) {
resultList.add(collectionEntry);
}
}
if (resultList.isEmpty()) {
value = null;
} else {
value = resultList.toArray();
}
} else{
value = nullifyTemplatedValue(value);
}
return value;
}
private Object nullifyTemplatedValue(Object value) {
if (value != null && value.toString().startsWith("{") && value.toString().endsWith("}")) {
value = null;
}
return value;
}
}
Also this needs to replace the existing RequestParamMethodArgumentResolver which I do with:
@Configuration
public class ConfigureTemplatedRequestParamResolver {
private @Autowired RequestMappingHandlerAdapter adapter;
@PostConstruct
public void replaceArgumentMethodHandlers() {
List<HandlerMethodArgumentResolver> argumentResolvers = new ArrayList<HandlerMethodArgumentResolver>(adapter.getArgumentResolvers());
for (int cursor = 0; cursor < argumentResolvers.size(); ++cursor) {
HandlerMethodArgumentResolver handlerMethodArgumentResolver = argumentResolvers.get(cursor);
if (handlerMethodArgumentResolver instanceof RequestParamMethodArgumentResolver) {
argumentResolvers.remove(cursor);
argumentResolvers.add(cursor, new TemplatedRequestParamResolver());
break;
}
}
adapter.setArgumentResolvers(argumentResolvers);
}
}
Unfortunately, although { and } are valid characters in a templated URI, they are not valid in a URI, which may be a problem for your client code depending on how strict it is. I'd much prefer a neater solution built into Spring-HATEOAS!
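For reference, a hedged usage sketch of the TemplatedLinkBuilder above (the controller method and its @RequestParam parameters are made up):
// Passing null for the @RequestParam arguments makes them appear as query template variables,
// e.g. http://localhost/api/keyMaps?filter={filter}&page={page}
resource.add(TemplatedLinkBuilder.linkTo(
        TemplatedLinkBuilder.methodOn(KeyMapController.class).listKeyMaps(null, null))
        .withRel("keyMaps"));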
With latest versions of spring-hateoas you can do the following:
UriComponents uriComponents = UriComponentsBuilder.fromUri(linkBuilder.toUri()).build();
UriTemplate template = new UriTemplate(uriComponents.toUriString())
        .with("keyMapId", TemplateVariable.SEGMENT);
will give you: http://localhost:8080/bla{/keyMapId}
Starting with this commit:
https://github.com/spring-projects/spring-hateoas/commit/2daf8aabfb78b6767bf27ac3e473832c872302c7
You can now pass null where path variable is expected. It works for me, without workarounds.
resource.add(linkTo(methodOn(UsersController.class).someMethod(null)).withRel("someMethod"));
And the result:
"someMethod": {
"href": "http://localhost:8080/api/v1/users/{userId}",
"templated": true
},
Also check related issues: https://github.com/spring-projects/spring-hateoas/issues/545
We've run into the same problem. The general workaround is that we have our own LinkBuilder class with a bunch of static helpers. Templated ones look like this:
public static Link linkToSubcategoriesTemplated(String categoryId){
return new Link(
new UriTemplate(
linkTo(methodOn(CategoryController.class).subcategories(null, null, categoryId))
.toUriComponentsBuilder().build().toUriString(),
// register it as variable
getBaseTemplateVariables()
),
REL_SUBCATEGORIES
);
}
private static TemplateVariables getBaseTemplateVariables() {
return new TemplateVariables(
new TemplateVariable("page", TemplateVariable.VariableType.REQUEST_PARAM),
new TemplateVariable("sort", TemplateVariable.VariableType.REQUEST_PARAM),
new TemplateVariable("size", TemplateVariable.VariableType.REQUEST_PARAM)
);
}
This is for exposing the parameters of a controller response for a PagedResource.
Then in the controllers we call this and append a withRel as needed.
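For example (hedged; the resource and category variables are made up):
// Attach the templated subcategories link while assembling a category resource.
resource.add(LinkBuilder.linkToSubcategoriesTemplated(category.getId()).withRel("subcategories"));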
According to this issue comment, this will be addressed in an upcoming release of spring-hateoas.
For now, there's a drop-in replacement for ControllerLinkBuilder available from de.escalon.hypermedia:spring-hateoas-ext in Maven Central.
I can now do this:
import static de.escalon.hypermedia.spring.AffordanceBuilder.*
...
add(linkTo(methodOn(KeyMapController.class).getKeyMap(null)).withRel("keyMaps"));
I pass in null as the parameter value to indicate I want to use a template variable. The name of the variable is automatically pulled from the controller.
I needed to include a link with template variables in the root of a Spring Data REST application, to get access via Traverson to an OAuth2 token. This is working fine; maybe it's useful:
@Component
class RepositoryLinksResourceProcessor implements ResourceProcessor<RepositoryLinksResource> {
@Override
RepositoryLinksResource process(RepositoryLinksResource resource) {
UriTemplate uriTemplate = new UriTemplate(
ControllerLinkBuilder.
linkTo(
TokenEndpoint,
TokenEndpoint.getDeclaredMethod("postAccessToken", java.security.Principal, Map )).
toUriComponentsBuilder().
build().
toString(),
new TemplateVariables([
new TemplateVariable("username", TemplateVariable.VariableType.REQUEST_PARAM),
new TemplateVariable("password", TemplateVariable.VariableType.REQUEST_PARAM),
new TemplateVariable("clientId", TemplateVariable.VariableType.REQUEST_PARAM),
new TemplateVariable("clientSecret", TemplateVariable.VariableType.REQUEST_PARAM)
])
)
resource.add(
new Link( uriTemplate,
"token"
)
)
return resource
}
}
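The rendered link then looks roughly like this (hedged; the exact token path depends on how the TokenEndpoint is mapped in your setup):
"token": {
    "href": "http://localhost:8080/oauth/token{?username,password,clientId,clientSecret}",
    "templated": true
}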
Based on the previous comments, I have implemented a generic helper method (against spring-hateoas-0.20.0) as a "temporary" workaround. The implementation only considers request parameters and is far from being optimized or well tested. It might come in handy for some other poor soul traveling down the same rabbit hole, though:
public static Link getTemplatedLink(final Method m, final String rel) {
DefaultParameterNameDiscoverer disco = new DefaultParameterNameDiscoverer();
ControllerLinkBuilder builder = ControllerLinkBuilder.linkTo(m.getDeclaringClass(), m);
UriTemplate uriTemplate = new UriTemplate(UriComponentsBuilder.fromUri(builder.toUri()).build().toUriString());
Annotation[][] parameterAnnotations = m.getParameterAnnotations();
int param = 0;
for (Annotation[] parameterAnnotation : parameterAnnotations) {
for (Annotation annotation : parameterAnnotation) {
if (annotation.annotationType().equals(RequestParam.class)) {
RequestParam rpa = (RequestParam) annotation;
String parameterName = rpa.name();
if (StringUtils.isEmpty(parameterName)) parameterName = disco.getParameterNames(m)[param];
uriTemplate = uriTemplate.with(parameterName, TemplateVariable.VariableType.REQUEST_PARAM);
}
}
param++;
}
return new Link(uriTemplate, rel);
}
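A hedged usage sketch (ProductController and its search method are made up; getMethod's checked exception is left to the caller):
// Builds a link whose @RequestParam parameters become request-parameter template variables,
// e.g. { "href": "http://localhost/api/products{?query,page}", "templated": true }
Method searchMethod = ProductController.class.getMethod("search", String.class, Integer.class);
resource.add(getTemplatedLink(searchMethod, "search"));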
