MTOM support ws:outbound-gateway - spring

We have a dynamic ws-service client, like this:
<si-ws:outbound-gateway id="wsOutboundGateway" uri="{uri}" reply-timeout="120000" fault-message-resolver="wsFaultMessageResolver">
<si-ws:uri-variable name="uri" expression="headers.URL"/>
<si-ws:request-handler-advice-chain>
<ref bean="wsRetryAdvice"/>
</si-ws:request-handler-advice-chain>
</si-ws:outbound-gateway>
Is it possible to handle MTOM attachments in a dynamic way, without custom marshalling?
Or could it, for example, work like this, but where the response containing an MTOM attachment is handled as a separate case?
private static class DefaultSourceExtractor extends TransformerObjectSupport implements SourceExtractor<DOMSource> {
@Override
public DOMSource extractData(Source source) throws IOException, TransformerException {
if (source instanceof DOMSource) {
return (DOMSource) source;
}
DOMResult result = new DOMResult();
this.transform(source, result);
return new DOMSource(result.getNode());
}
}

Related

Custom FileInputFormat always assigns one file split to one slot

I have been writing protobuf records to our S3 buckets, and I want to use the Flink DataSet API to read them. So I implemented a custom FileInputFormat to achieve this. The code is below.
public class ProtobufInputFormat extends FileInputFormat<StandardLog.Pageview> {
public ProtobufInputFormat() {
}
private transient boolean reachedEnd = false;
@Override
public boolean reachedEnd() throws IOException {
return reachedEnd;
}
@Override
public StandardLog.Pageview nextRecord(StandardLog.Pageview reuse) throws IOException {
StandardLog.Pageview pageview = StandardLog.Pageview.parseDelimitedFrom(stream);
if (pageview == null) {
reachedEnd = true;
}
return pageview;
}
@Override
public boolean supportsMultiPaths() {
return true;
}
}
public class BatchReadJob {
public static void main(String... args) throws Exception {
String readPath1 = args[0];
ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
ProtobufInputFormat inputFormat = new ProtobufInputFormat();
inputFormat.setNestedFileEnumeration(true);
inputFormat.setFilePaths(readPath1);
DataSet<StandardLog.Pageview> dataSource = env.createInput(inputFormat);
dataSource.map(new MapFunction<StandardLog.Pageview, String>() {
@Override
public String map(StandardLog.Pageview value) throws Exception {
return value.getId();
}
}).writeAsText("s3://xxx", FileSystem.WriteMode.OVERWRITE);
env.execute();
}
}
The problem is that Flink always assigns one file split to one parallelism slot. In other words, it always processes the same number of file splits as the configured parallelism.
I want to know the correct way of implementing a custom FileInputFormat.
Thanks.
I believe the behavior you're seeing is because ExecutionJobVertex calls the FileInputFormat.createInputSplits() method with a minNumSplits parameter equal to the vertex (data source) parallelism. So if you want a different behavior, then you'd have to override the createInputSplits method.
Though you didn't say what behavior you actually wanted. If, for example, you just want one split per file, then you could override the testForUnsplittable() method in your subclass of FileInputFormat to always return true; it should also set the (protected) unsplittable boolean to true.
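A minimal sketch of that second option (untested; it assumes Flink's FileInputFormat exposes the protected testForUnsplittable(FileStatus) method and the protected unsplittable field, as the DataSet API does):
import org.apache.flink.api.common.io.FileInputFormat;
import org.apache.flink.core.fs.FileStatus;

// Sketch: force one input split per file by marking every file as unsplittable.
public class ProtobufInputFormat extends FileInputFormat<StandardLog.Pageview> {

    @Override
    protected boolean testForUnsplittable(FileStatus pathFile) {
        // createInputSplits() will then emit exactly one split for this file,
        // regardless of the minNumSplits hint passed in by the data source vertex.
        this.unsplittable = true;
        return true;
    }

    // reachedEnd(), nextRecord(), supportsMultiPaths() as in the question ...
}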

Spring-WS - return valid response on Exception

I have a SOAP endpoint which should return a response of type A based on a request of type B. But during processing of the request I'm expecting errors (like being unable to call a downstream service), which throw a custom exception, for example of type ExpEx. Now I want to do custom error mapping, because in case of errors I don't want to return type A but type CFault (which is also defined in the WSDL).
Now the questions:
- is it possible to create a custom error handler which returns CFault instead of A?
- or is it possible to make the endpoint able to return two types of response, A and CFault (I think Object)?
My endpoint:
public class FantasticEndpoint extends WebServiceEndpoint {
private static final String NAMESPACE = "http://www.fantastic.com/SOA/tmp/FantasticService/v_2_4";
@PayloadRoot(namespace = NAMESPACE, localPart = "handleBOperation")
@ResponsePayload
public A createConsumers(@RequestPayload B b) {
// do some dangerous logic that may possibly throw the exception
// if the exception occurs return CFault, otherwise return A for normal processing
}
}
First of all, take a look at this: https://docs.spring.io/spring-ws/sites/2.0/reference/html/server.html#server-endpoint-exception-resolver
I am going to highlight some of it:
According to the documentation, you can create your own exception class to indicate the SOAP Fault that should be returned whenever that exception is thrown. Just annotate the class with the @SoapFault annotation.
import org.springframework.ws.soap.server.endpoint.annotation.FaultCode;
import org.springframework.ws.soap.server.endpoint.annotation.SoapFault;
@SoapFault(faultCode = FaultCode.SERVER)
public class MyCustomException extends Exception {
public MyCustomException(String message) {
super(message);
}
}
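For example (a hypothetical sketch based on the endpoint from the question; callDownstream and DownstreamException are placeholders, not real API), throwing the annotated exception from the endpoint method is enough for Spring-WS to turn it into a SOAP Fault instead of a normal A response:
@PayloadRoot(namespace = NAMESPACE, localPart = "handleBOperation")
@ResponsePayload
public A createConsumers(@RequestPayload B b) throws MyCustomException {
    try {
        return callDownstream(b); // placeholder for the dangerous downstream call
    } catch (DownstreamException e) { // placeholder exception type
        // Spring-WS resolves this into a SOAP Fault via the @SoapFault annotation
        throw new MyCustomException("Downstream call failed: " + e.getMessage());
    }
}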
If this doesn't suit you, you can mess with SoapFaultMappingExceptionResolver (https://docs.spring.io/spring-ws/site/apidocs/org/springframework/ws/soap/server/endpoint/SoapFaultMappingExceptionResolver.html). This resolver lets you map exception classes to SOAP Faults:
<beans>
<bean id="exceptionResolver" class="org.springframework.ws.soap.server.endpoint.SoapFaultMappingExceptionResolver">
<property name="defaultFault" value="SERVER"/>
<property name="exceptionMappings">
<value>
org.springframework.oxm.ValidationFailureException=CLIENT,Invalid request
</value>
</property>
</bean>
</beans>
You can use this to add SoapFaultDetail:
public class MySoapFaultDefinitionExceptionResolver extends SoapFaultMappingExceptionResolver {
private static final QName CODE = new QName("code");
private static final QName DESCRIPTION = new QName("description");
@Override
protected void customizeFault(Object endpoint, Exception ex, SoapFault fault) {
if (ex instanceof MyCustomException) {
SoapFaultDetail detail = fault.addFaultDetail();
detail.addFaultDetailElement(CODE).addText("SOMECODE");
detail.addFaultDetailElement(DESCRIPTION).addText(ex.getMessage());
}
}
}
I have once used EndpointInterceptor to mess with SOAPHeader. Perhaps you can use it to do the work with SOAPFault (https://docs.spring.io/spring-ws/site/apidocs/org/springframework/ws/server/EndpointInterceptor.html).
You can extract SOAPFault from MessageContext like this:
@Override
public boolean handleFault(MessageContext messageContext, Object o) throws Exception {
SaajSoapMessage soapResponse = (SaajSoapMessage) messageContext.getResponse();
SOAPMessage soapMessage = soapResponse.getSaajMessage();
SOAPBody body = soapMessage.getSOAPBody();
SOAPFault fault = body.getFault();
//do something with fault here
return true;
}
You can read about SOAPFault interface here https://docs.spring.io/spring-ws/site/apidocs/org/springframework/ws/soap/SoapFault.html

Read Application Object from GemFire using Spring Data GemFire. Data stored using SpringXD's gemfire-json-server

I'm using the gemfire-json-server module in Spring XD to populate a GemFire grid with JSON representations of “Order” objects. I understand the gemfire-json-server module saves data in PDX form in GemFire. I’d like to read the contents of the GemFire grid into an “Order” object in my application. I get a ClassCastException that reads:
java.lang.ClassCastException: com.gemstone.gemfire.pdx.internal.PdxInstanceImpl cannot be cast to org.apache.geode.demo.cc.model.Order
I’m using the Spring Data GemFire libraries to read contents of the cluster. The code snippet to read the contents of the Grid follows:
public interface OrderRepository extends GemfireRepository<Order, String>{
Order findByTransactionId(String transactionId);
}
How can I use Spring Data GemFire to convert data read from the GemFire cluster into an Order object?
Note: The data was initially stored in GemFire using SpringXD's gemfire-json-server-module
Still waiting to hear back from the GemFire PDX engineering team, specifically on Region.get(key), but, interestingly enough if you annotate your application domain object with...
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public class Order ... {
...
}
This works!
Under-the-hood I knew the GemFire JSONFormatter class (see here) used Jackson's API to un/marshal (de/serialize) JSON data to and from PDX.
However, the orderRepository.findOne(ID) and ordersRegion.get(key) still do not function as I would expect. See updated test class below for more details.
Will report back again when I have more information.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = GemFireConfiguration.class)
@SuppressWarnings("unused")
public class JsonToPdxToObjectDataAccessIntegrationTest {
protected static final AtomicLong ID_SEQUENCE = new AtomicLong(0l);
private Order amazon;
private Order bestBuy;
private Order target;
private Order walmart;
@Autowired
private OrderRepository orderRepository;
@Resource(name = "Orders")
private com.gemstone.gemfire.cache.Region<Long, Object> orders;
protected Order createOrder(String name) {
return createOrder(ID_SEQUENCE.incrementAndGet(), name);
}
protected Order createOrder(Long id, String name) {
return new Order(id, name);
}
protected <T> T fromPdx(Object pdxInstance, Class<T> toType) {
try {
if (pdxInstance == null) {
return null;
}
else if (toType.isInstance(pdxInstance)) {
return toType.cast(pdxInstance);
}
else if (pdxInstance instanceof PdxInstance) {
return new ObjectMapper().readValue(JSONFormatter.toJSON(((PdxInstance) pdxInstance)), toType);
}
else {
throw new IllegalArgumentException(String.format("Expected object of type PdxInstance; but was (%1$s)",
pdxInstance.getClass().getName()));
}
}
catch (IOException e) {
throw new RuntimeException(String.format("Failed to convert PDX to object of type (%1$s)", toType), e);
}
}
protected void log(Object value) {
System.out.printf("Object of Type (%1$s) has Value (%2$s)", ObjectUtils.nullSafeClassName(value), value);
}
protected Order put(Order order) {
Object existingOrder = orders.putIfAbsent(order.getTransactionId(), toPdx(order));
return (existingOrder != null ? fromPdx(existingOrder, Order.class) : order);
}
protected PdxInstance toPdx(Object obj) {
try {
return JSONFormatter.fromJSON(new ObjectMapper().writeValueAsString(obj));
}
catch (JsonProcessingException e) {
throw new RuntimeException(String.format("Failed to convert object (%1$s) to JSON", obj), e);
}
}
@Before
public void setup() {
amazon = put(createOrder("Amazon Order"));
bestBuy = put(createOrder("BestBuy Order"));
target = put(createOrder("Target Order"));
walmart = put(createOrder("Wal-Mart Order"));
}
@Test
public void regionGet() {
assertThat((Order) orders.get(amazon.getTransactionId()), is(equalTo(amazon)));
}
@Test
public void repositoryFindOneMethod() {
log(orderRepository.findOne(target.getTransactionId()));
assertThat(orderRepository.findOne(target.getTransactionId()), is(equalTo(target)));
}
@Test
public void repositoryQueryMethod() {
assertThat(orderRepository.findByTransactionId(amazon.getTransactionId()), is(equalTo(amazon)));
assertThat(orderRepository.findByTransactionId(bestBuy.getTransactionId()), is(equalTo(bestBuy)));
assertThat(orderRepository.findByTransactionId(target.getTransactionId()), is(equalTo(target)));
assertThat(orderRepository.findByTransactionId(walmart.getTransactionId()), is(equalTo(walmart)));
}
#Region("Orders")
#JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "#type")
public static class Order implements PdxSerializable {
protected static final OrderPdxSerializer pdxSerializer = new OrderPdxSerializer();
#Id
private Long transactionId;
private String name;
public Order() {
}
public Order(Long transactionId) {
this.transactionId = transactionId;
}
public Order(Long transactionId, String name) {
this.transactionId = transactionId;
this.name = name;
}
public String getName() {
return name;
}
public void setName(final String name) {
this.name = name;
}
public Long getTransactionId() {
return transactionId;
}
public void setTransactionId(final Long transactionId) {
this.transactionId = transactionId;
}
@Override
public void fromData(PdxReader reader) {
Order order = (Order) pdxSerializer.fromData(Order.class, reader);
if (order != null) {
this.transactionId = order.getTransactionId();
this.name = order.getName();
}
}
@Override
public void toData(PdxWriter writer) {
pdxSerializer.toData(this, writer);
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
}
if (!(obj instanceof Order)) {
return false;
}
Order that = (Order) obj;
return ObjectUtils.nullSafeEquals(this.getTransactionId(), that.getTransactionId());
}
@Override
public int hashCode() {
int hashValue = 17;
hashValue = 37 * hashValue + ObjectUtils.nullSafeHashCode(getTransactionId());
return hashValue;
}
@Override
public String toString() {
return String.format("{ #type = %1$s, id = %2$d, name = %3$s }",
getClass().getName(), getTransactionId(), getName());
}
}
public static class OrderPdxSerializer implements PdxSerializer {
@Override
public Object fromData(Class<?> type, PdxReader in) {
if (Order.class.equals(type)) {
return new Order(in.readLong("transactionId"), in.readString("name"));
}
return null;
}
@Override
public boolean toData(Object obj, PdxWriter out) {
if (obj instanceof Order) {
Order order = (Order) obj;
out.writeLong("transactionId", order.getTransactionId());
out.writeString("name", order.getName());
return true;
}
return false;
}
}
public interface OrderRepository extends GemfireRepository<Order, Long> {
Order findByTransactionId(Long transactionId);
}
@Configuration
protected static class GemFireConfiguration {
@Bean
public Properties gemfireProperties() {
Properties gemfireProperties = new Properties();
gemfireProperties.setProperty("name", JsonToPdxToObjectDataAccessIntegrationTest.class.getSimpleName());
gemfireProperties.setProperty("mcast-port", "0");
gemfireProperties.setProperty("log-level", "warning");
return gemfireProperties;
}
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
cacheFactoryBean.setProperties(gemfireProperties);
//cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
cacheFactoryBean.setPdxSerializer(new OrderPdxSerializer());
cacheFactoryBean.setPdxReadSerialized(false);
return cacheFactoryBean;
}
#Bean(name = "Orders")
public PartitionedRegionFactoryBean ordersRegion(Cache gemfireCache) {
PartitionedRegionFactoryBean regionFactoryBean = new PartitionedRegionFactoryBean();
regionFactoryBean.setCache(gemfireCache);
regionFactoryBean.setName("Orders");
regionFactoryBean.setPersistent(false);
return regionFactoryBean;
}
@Bean
public GemfireRepositoryFactoryBean orderRepository() {
GemfireRepositoryFactoryBean<OrderRepository, Order, Long> repositoryFactoryBean =
new GemfireRepositoryFactoryBean<>();
repositoryFactoryBean.setRepositoryInterface(OrderRepository.class);
return repositoryFactoryBean;
}
}
}
So, as you are aware, GemFire (and by extension, Apache Geode) stores JSON in PDX format (as a PdxInstance). This is so GemFire can interoperate with many different language-based clients (native C++/C#, web-oriented (JavaScript, Python, Ruby, etc.) using the Developer REST API, in addition to Java) and also to be able to use OQL to query the JSON data.
After a bit of experimentation, I am surprised GemFire is not behaving as I would expect. I created an example, self-contained test class (i.e. no Spring XD, of course) that simulates your use case... essentially storing JSON data in GemFire as PDX and then attempting to read the data back out as the Order application domain object type using the Repository abstraction, logical enough.
Given the use of the Repository abstraction and implementation from Spring Data GemFire, the infrastructure will attempt to access the application domain object based on the Repository generic type parameter (in this case "Order" from the "OrderRepository" definition).
However, the data is stored in PDX, so now what?
No matter, Spring Data GemFire provides the MappingPdxSerializer class to convert PDX instances back to application domain objects using the same "mapping meta-data" that the Repository infrastructure uses. Cool, so I plug that in...
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
cacheFactoryBean.setProperties(gemfireProperties);
cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
cacheFactoryBean.setPdxReadSerialized(false);
return cacheFactoryBean;
}
You will also notice, I set the PDX 'read-serialized' property (cacheFactoryBean.setPdxReadSerialized(false);) to false in order to ensure data access operations return the domain object and not the PDX instance.
However, this had no effect on the query method. In fact, it had no effect on the following operations either...
orderRepository.findOne(amazonOrder.getTransactionId());
ordersRegion.get(amazonOrder.getTransactionId());
Both calls returned a PdxInstance. Note, the implementation of OrderRepository.findOne(..) is based on SimpleGemfireRepository.findOne(key), which uses GemfireTemplate.get(key), which just performs Region.get(key), and so is effectively the same as ordersRegion.get(amazonOrder.getTransactionId()). The outcome should not be a PdxInstance, especially with Region.get() and read-serialized set to false.
With the OQL query (SELECT * FROM /Orders WHERE transactionId = $1) generated from the findByTransactionId(String id), the Repository infrastructure has a bit less control over what the GemFire query engine will return based on what the caller (OrderRepository) expects (based on the generic type parameter), so running OQL statements could potentially behave differently than direct Region access using get.
Next, I went on to try modifying the Order type to implement PdxSerializable, to handle the conversion during data access operations (direct Region access with get, OQL, or otherwise). This had no effect.
So, I tried to implement a custom PdxSerializer for Order objects. This had no effect either.
The only thing I can conclude at this point is that something is getting lost in translation between Order -> JSON -> PDX and then from PDX -> Order. Seemingly, GemFire needs the additional type meta-data required by PDX (something like @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")) in the JSON data that the JSONFormatter recognizes, though I am not certain it does.
Note, in my test class, I used Jackson's ObjectMapper to serialize the Order to JSON and then GemFire's JSONFormatter to serialize the JSON to PDX, which I suspect Spring XD is doing similarly under-the-hood. In fact, Spring XD uses Spring Data GemFire and is most likely using the JSON Region Auto Proxy support. That is exactly what SDG's JSONRegionAdvice object does (see here).
Anyway, I have an inquiry out to the rest of the GemFire engineering team. There are also things that could be done in Spring Data GemFire to ensure the PDX data is converted, such as making use of the MappingPdxSerializer directly to convert the data automatically on behalf of the caller if the data is indeed of type PdxInstance. Similar to how JSON Region Auto Proxying works, you could write an AOP interceptor for the Orders Region to automagically convert PDX to an Order.
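For illustration only, a rough sketch of that AOP idea (my own addition, not part of the original answer; it assumes the "Orders" Region bean can be proxied by Spring AOP, and it reuses the same JSONFormatter/Jackson round-trip as the fromPdx(..) helper in the test above):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.gemstone.gemfire.pdx.JSONFormatter;
import com.gemstone.gemfire.pdx.PdxInstance;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class OrdersRegionPdxToOrderAspect {

    private final ObjectMapper objectMapper = new ObjectMapper();

    // Intercept Region.get(..) on the proxied "Orders" Region bean and convert
    // any PdxInstance result back into an Order before handing it to the caller.
    @Around("execution(* com.gemstone.gemfire.cache.Region.get(..))")
    public Object convertPdxOnGet(ProceedingJoinPoint joinPoint) throws Throwable {
        Object result = joinPoint.proceed();
        if (result instanceof PdxInstance) {
            return objectMapper.readValue(JSONFormatter.toJSON((PdxInstance) result), Order.class);
        }
        return result;
    }
}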
Though, I don't think any of this should be necessary as GemFire should be doing the right thing in this case. Sorry I don't have a better answer right now. Let's see what I find out.
Cheers and stay tuned!
See subsequent post for test code.

Looking for a solution to extend Spring MVC with another Component/Annotation

Suppose I have a website that is used in normal mode (browser) and in some other mode, like a MobileView mode (inside a mobile app). For each controller I create, there might be a corresponding controller for MobileView, processing the same URL.
The easiest solution is to add ifs in all the controllers that have MobileView logic. Another solution would be to use a corresponding URL for MobileView (similar to the normal URL) and two separate controllers (possibly where one extends the other, or some other way to reuse common code).
But a more elegant solution would be to have some extra annotations, like @SupportsMobileView (to mark a controller and tell the app that it will have a corresponding MobileView controller) and @MobileViewController (to mark a second controller and tell the app that it needs to run immediately after the initial controller marked with @SupportsMobileView). The link between a normal controller and a MobileView controller would be the URL they process (defined with @RequestMapping).
Is it possible to extend Spring MVC like this (A)? Where would I inject new annotation scanners (B) and annotation handlers / component handlers (C)? How should the MobileView controller be executed (D)? (Right now I am thinking it could be executed through AOP, where the new handler of my new controller type programmatically creates a join point on the corresponding normal controller.)
Note that I did not mention how this MobileView mode is triggered and detected. Let's just say that there is a session boolean variable (flag) for that.
Critiques of any of the points (A), (B), (C) or (D) are welcome, as well as technical hints and alternative solutions to any point or to the whole solution.
A HandlerInterceptor can be used to intercept the @RequestMapping handling. This is a simple example of how to configure and implement one.
You can check for your session variable there, and you will have a bunch of methods that allow you to do custom processing or simply exchange the view produced by the normal controller handling with your mobile view.
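A minimal sketch of that idea (the session attribute name "mobileView" and the "mobile/" view prefix are assumptions for illustration):
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;
import org.springframework.web.servlet.ModelAndView;
import org.springframework.web.servlet.handler.HandlerInterceptorAdapter;

public class MobileViewInterceptor extends HandlerInterceptorAdapter {

    @Override
    public void postHandle(HttpServletRequest request, HttpServletResponse response,
            Object handler, ModelAndView modelAndView) throws Exception {
        HttpSession session = request.getSession(false);
        boolean mobile = session != null
                && Boolean.TRUE.equals(session.getAttribute("mobileView")); // assumed flag name
        if (mobile && modelAndView != null && modelAndView.getViewName() != null) {
            // swap the normal view for its mobile counterpart, e.g. "provaview" -> "mobile/provaview"
            modelAndView.setViewName("mobile/" + modelAndView.getViewName());
        }
    }
}
It would then be registered via <mvc:interceptors> (or a WebMvcConfigurer), so it runs after the normal controller without touching its code.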
OK, warnings: this is only a proof of concept of what I understood must be done, so:
- @MobileViewEnable and @MobileView annotated (and related) methods need to stay in the same controller
- there's no check for the HTTP method used
- the two methods must have the same signature
- the @MobileView annotation value and the @RequestMapping annotation value must be equal and unique
- the logic inside callYourLogic(..) decides which method is going to be called; at the moment there is a very simple check for the presence of the "mobile" parameter in the request, just for testing
- this code is not intended to be used as is (at all)
- I don't know if it works at all outside my PC (joke :D, ehm..)
SO:
Annotations:
@Retention(RetentionPolicy.RUNTIME)
public @interface MobileView {
String value() default "";
}
@Retention(RetentionPolicy.RUNTIME)
public @interface MobileViewEnable {
}
ExampleController:
@Controller
public class MainController extends BaseController {
private final static Logger logger = LoggerFactory.getLogger(MainController.class);
private final static String PROVA_ROUTE = "prova";
@MobileViewEnable
@RequestMapping(PROVA_ROUTE)
public String prova() {
logger.debug("inside prova!!!");
return "provaview";
}
@MobileView(PROVA_ROUTE)
public String prova2() {
logger.debug("inside prova2!!!");
return "prova2view";
}
}
Aspect definition:
<bean id="viewAspect" class="xxx.yyy.ViewAspect" />
<aop:config>
<aop:pointcut expression="#annotation(xxx.yyy.MobileViewEnable)" id="viewAspectPointcut" />
<aop:aspect ref="viewAspect" order="1">
<aop:around method="around" pointcut-ref="viewAspectPointcut" arg-names="viewAspectPointcut"/>
</aop:aspect>
</aop:config>
Aspect implementation:
public class ViewAspect implements BeforeAdvice, ApplicationContextAware {
private final static Logger logger = LoggerFactory.getLogger(ViewAspect.class);
private ApplicationContext applicationContext;
public Object around(ProceedingJoinPoint joinPoint) {
Method mobileViewAnnotatedMethod = null;
HttpServletRequest request = getCurrentHttpRequest();
String controllerName = getSimpleClassNameWithFirstLetterLowercase(joinPoint);
Object[] interceptedMethodArgs = getInterceptedMethodArgs(joinPoint);
String methodName = getCurrentMethodName(joinPoint);
Method[] methods = getAllControllerMethods(joinPoint);
Method interceptedMethod = getInterceptedMethod(methods, methodName);
String interceptedMethodRoute = getRouteFromInterceptedMethod(interceptedMethod);
if (callYourLogic(request)) {
mobileViewAnnotatedMethod = getMobileViewAnnotatedMethodWithRouteName(methods, interceptedMethodRoute);
if (mobileViewAnnotatedMethod != null)
return invokeMethod(mobileViewAnnotatedMethod, interceptedMethodArgs, controllerName);
}
return continueInterceptedMethodExecution(joinPoint, interceptedMethodArgs);
}
private Object continueInterceptedMethodExecution(ProceedingJoinPoint joinPoint, Object[] interceptedMethodArgs) {
try {
return joinPoint.proceed(interceptedMethodArgs);
} catch (Throwable e) {
logger.error("unable to proceed with intercepted method call: " + e);
}
return null;
}
private Object[] getInterceptedMethodArgs(JoinPoint joinPoint) {
return joinPoint.getArgs();
}
private boolean callYourLogic(HttpServletRequest request) {
// INSERT HERE YOUR CUSTOM LOGIC (e.g.: is the server accessed from a mobile device?)
// THIS IS A STUPID LOGIC USED ONLY FOR EXAMPLE
return request.getParameter("mobile")!= null;
}
private HttpServletRequest getCurrentHttpRequest() {
return ((ServletRequestAttributes) RequestContextHolder.currentRequestAttributes()).getRequest();
}
private String invokeMethod(Method method, Object[] methodArgs, String className) {
if (method != null) {
try {
Object classInstance = getInstanceOfClass(method, className);
return (String) method.invoke(classInstance, methodArgs);
} catch (Exception e) {
logger.error("unable to invoke method" + method + " - " + e);
}
}
return null;
}
private Object getInstanceOfClass(Method method, String className) {
return applicationContext.getBean(className);
}
private Method getMobileViewAnnotatedMethodWithRouteName(Method[] methods, String routeName) {
for (Method m : methods) {
MobileView mobileViewAnnotation = m.getAnnotation(MobileView.class);
if (mobileViewAnnotation != null && mobileViewAnnotation.value().equals(routeName))
return m;
}
return null;
}
private String getRouteFromInterceptedMethod(Method method) {
RequestMapping requestMappingAnnotation = method.getAnnotation(RequestMapping.class);
if (requestMappingAnnotation != null)
return requestMappingAnnotation.value()[0];
return null;
}
private String getCurrentMethodName(JoinPoint joinPoint) {
return joinPoint.getSignature().getName();
}
private Method[] getAllControllerMethods(JoinPoint joinPoint) {
return joinPoint.getThis().getClass().getSuperclass().getMethods();
}
private String getSimpleClassNameWithFirstLetterLowercase(JoinPoint joinPoint) {
String simpleClassName = joinPoint.getThis().getClass().getSuperclass().getSimpleName();
return setFirstLetterLowercase(simpleClassName);
}
private String setFirstLetterLowercase(String simpleClassName) {
String firstLetterOfTheString = simpleClassName.substring(0, 1).toLowerCase();
String restOfTheString = simpleClassName.substring(1);
return firstLetterOfTheString + restOfTheString;
}
private Method getInterceptedMethod(Method[] methods, String lookingForMethodName) {
for (Method m : methods)
if (m.getName().equals(lookingForMethodName))
return m;
return null;
}
@Override
public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
this.applicationContext = applicationContext;
}
}

custom converter in XStream

I am using XStream to serialize my objects to XML format. The formatted XML that I get is as below; node1, node2, node3 are attributes of the POJO DetailDollars.
I have a requirement where I have to calculate a percentage, for example 100 / 25, and add the new node to the existing ones. So, the final output should be:
<DetailDollars>
<node1>100 </node1>
<node2>25</node2>
<node3>10</node3>
<node4>4</node4>
</DetailDollars>
I wrote a custom converter and registered it with my XStream object.
public void marshal(Object obj, HierarchicalStreamWriter writer, MarshallingContext context) {
writer.startNode("node4");
writer.setValue(String.valueOf(getNode1() / getNode2()));
writer.endNode();
}
But, the xml stream I get has only the new node:
<DetailDollars>
<node4>4</node4>
</DetailDollars>
I am not sure which XStream API would get me the desired format. Could you please help me with this?
Here is the converter you need:
public class DetailDollarsConverter extends ReflectionConverter {
public DetailDollarsConverter(Mapper mapper,
ReflectionProvider reflectionProvider) {
super(mapper, reflectionProvider);
}
@Override
public void marshal(Object obj, HierarchicalStreamWriter writer,
MarshallingContext context) {
super.marshal(obj,writer,context);
DetailDollars dl = (DetailDollars) obj;
writer.startNode("node4");
writer.setValue(Double.toString(dl.getNode1() / dl.getNode2()));
writer.endNode();
}
@Override
public Object unmarshal(HierarchicalStreamReader reader,
UnmarshallingContext context) {
return super.unmarshal(reader,context);
}
#SuppressWarnings("unchecked")
#Override
public boolean canConvert(Class clazz) {
return clazz.equals(DetailDollars.class);
}
}
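To actually use it, the converter has to be registered with the XStream instance, roughly like this (the detailDollars variable is assumed to be a populated DetailDollars instance):
XStream xstream = new XStream();
xstream.alias("DetailDollars", DetailDollars.class);
// ReflectionConverter needs the mapper and reflection provider of the XStream instance
xstream.registerConverter(
        new DetailDollarsConverter(xstream.getMapper(), xstream.getReflectionProvider()));

String xml = xstream.toXML(detailDollars); // now contains node1..node3 plus the computed node4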
