I have written some routine Hadoop MapReduce jobs, and so far I have called the context.write() method just by copying examples from the Apache Hadoop source code. But copying like that doesn't help me understand the Hadoop API any more deeply.
Therefore, I recently started to read the Hadoop API documentation (https://hadoop.apache.org/docs/r2.7.0/api/) more carefully and tried to figure out whether there are any methods in Context other than context.write(). For instance, in the teragen example, context.getCounter() is used.
But to my surprise, I couldn't find the Context class documentation at all from the link above.
Where can I find the documentation for the Context class in Hadoop?
You can start to work out what's going on if you dig into the standard Mapper class source (around line 106).
public abstract class Context
implements MapContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT> {
}
So this is just an abstract class which implements the MapContext interface found here (Javadoc link).
The concrete implementation is MapContextImpl found here.
It looks like the ContextFactory (source) is responsible for creating the different implementations of Context.
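To see those Context methods in use, here is a small sketch (not taken from the question; the mapper class and counter enum names are invented) that calls both context.write() and context.getCounter() inside a mapper:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LineLengthMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    // Custom counters are reported alongside the built-in job counters
    public enum Stats { EMPTY_LINES }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        if (line.isEmpty()) {
            // getCounter() is inherited through the MapContext/TaskAttemptContext interfaces
            context.getCounter(Stats.EMPTY_LINES).increment(1);
            return;
        }
        // write() emits a key/value pair to the shuffle
        context.write(new Text(line), new IntWritable(line.length()));
    }
}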
I'm using Quartz Scheduler in one of my projects. There are two main ways to create a Quartz job:
implement the org.quartz.Job interface
extend org.springframework.scheduling.quartz.QuartzJobBean (which implements the org.quartz.Job interface)
The last part of the QuartzJobBean javadoc is confusing:
Note that the preferred way to apply dependency injection to Job instances is via a JobFactory: that is, to specify SpringBeanJobFactory as the Quartz JobFactory (typically via SchedulerFactoryBean's "jobFactory" property). This allows to implement dependency-injected Quartz Jobs without a dependency on Spring base classes.
For pure Spring (or Spring Boot) use, I suppose it is better to extend QuartzJobBean. Am I right?
First of all, since a QuartzJobBean is a Job, any API call that accepts a Job will accept a QuartzJobBean, but not vice versa. So if you need a QuartzJobBean because some API call expects you to pass it one, then there's your answer.
Otherwise, the answer depends on whether you want to make use of (and be tied to) the functionality provided by QuartzJobBean. If you look at the source code for that class, you'll see that the sole gain in subclassing QuartzJobBean over implementing Job is that QuartzJobBean performs this logic before passing control to your code:
// Copies scheduler-context and merged JobDataMap entries onto matching bean properties
try {
    BeanWrapper bw = PropertyAccessorFactory.forBeanPropertyAccess(this);
    MutablePropertyValues pvs = new MutablePropertyValues();
    pvs.addPropertyValues(context.getScheduler().getContext());
    pvs.addPropertyValues(context.getMergedJobDataMap());
    bw.setPropertyValues(pvs, true);   // true: ignore entries the job doesn't declare as properties
}
catch (SchedulerException ex) {
    throw new JobExecutionException(ex);
}
So if you extend the QuartzJobBean class and implement the executeInternal method, this code runs before your code. If you implement the Job interface and the execute method, it does not. That's the only difference between the two approaches in terms of what actually happens when your job runs.
So to answer your question, ask yourself "do I want to take advantage of the above code?". If the answer is Yes, then extend QuartzJobBean to take advantage of that functionality. If you don't need this added functionality, don't want it, and/or don't want to be locked into the dependencies implied by the above code, then you should implement Job to avoid this code and its dependencies. My personal approach would be to implement Job unless I had some reason to extend QuartzJobBean instead.
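To make the difference concrete, here is a brief sketch (class names and the "reportName" property are invented for illustration, not taken from the question). With QuartzJobBean, the code quoted above injects matching entries from the scheduler context and the merged JobDataMap into bean properties before executeInternal() runs; with a plain Job you read the map yourself:

import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.springframework.scheduling.quartz.QuartzJobBean;

// Variant 1: the JobDataMap entry "reportName" is injected into the setter
// by QuartzJobBean before executeInternal() is called.
public class ReportJob extends QuartzJobBean {

    private String reportName;

    public void setReportName(String reportName) {
        this.reportName = reportName;
    }

    @Override
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        System.out.println("Generating report: " + reportName);
    }
}

// Variant 2: a plain Job reads the same value from the merged JobDataMap itself.
class PlainReportJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        String reportName = context.getMergedJobDataMap().getString("reportName");
        System.out.println("Generating report: " + reportName);
    }
}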
For testing purposes I'm searching for an elegant and less error-prone way to build a hal+json data structure based on a Java list of certain objects.
Currently I'm using a rather huge, ugly String for mapping/defining the expected hal+json data structure. I could of course put this into a file instead, but IMHO it would still be a bit error-prone: as soon as an object/property changed, I would also need to change my hard-coded hal+json string/file...
Does anybody know of a helper class or something similar that could help build the hal+json from Java objects?
Spring HATEOAS helps you with the generation of hal+json responses.
You have to take care of the following configuration:
1. On any Spring configuration class, add @EnableHypermediaSupport(type = { HypermediaType.HAL }).
2. Make sure that you have the Jackson library on the classpath.
3. Let your Java object extend ResourceSupport, or wrap the Java object in a Resource.
4. This should generate a hal+json response.
5. Add specific links to the resource, such as a self link and links to other resources.
See the Spring HATEOAS documentation on Resource and Links for more details.
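A minimal sketch of steps 1, 3 and 5 (the controller, path and Person class are invented for illustration; this assumes the pre-1.0 Spring HATEOAS API with Resource/ResourceSupport):

import static org.springframework.hateoas.mvc.ControllerLinkBuilder.linkTo;
import static org.springframework.hateoas.mvc.ControllerLinkBuilder.methodOn;

import org.springframework.context.annotation.Configuration;
import org.springframework.hateoas.Resource;
import org.springframework.hateoas.config.EnableHypermediaSupport;
import org.springframework.hateoas.config.EnableHypermediaSupport.HypermediaType;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Step 1: enable HAL support on a configuration class
@Configuration
@EnableHypermediaSupport(type = { HypermediaType.HAL })
class HalConfig {
}

// A plain POJO serialized by Jackson (step 2: Jackson on the classpath)
class Person {
    private final long id;
    private final String name;
    Person(long id, String name) { this.id = id; this.name = name; }
    public long getId() { return id; }
    public String getName() { return name; }
}

@RestController
class PersonController {

    // Step 3: wrap the object in a Resource instead of extending ResourceSupport
    @RequestMapping(value = "/persons/{id}", produces = "application/hal+json")
    public Resource<Person> person(@PathVariable long id) {
        Person person = new Person(id, "Jane");            // stand-in for a real lookup
        Resource<Person> resource = new Resource<>(person);
        // Step 5: add a self link built from the controller mapping
        resource.add(linkTo(methodOn(PersonController.class).person(id)).withSelfRel());
        return resource;
    }
}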
I have a parent POM with a Dropwizard service. In that service I have some exception mapper declarations, for example:
@Provider
class ExceptionMapperForClassA {...}
Now in my child POM I'm extending this parent service, and I want to create a new exception mapper for ClassA that extends ExceptionMapperForClassA.
I couldn't find any information in the Jersey documentation describing Jersey's behaviour when two exception mappers are declared for the same class.
https://jersey.java.net/documentation/latest/representations.html
Actually, the question is: how can I override an exception mapper and be sure that only my exception mapper will be called?
I ran a quick test and the behaviour is as follows:
If you have two exception mappers mapping the same exception, it appears that the provider that sorts first by name is the one that gets called. E.g. I have a ResourceNotFound mapper that I extend with TestMapper; the latter is never called.
If I rename the latter to ATestMapper, the first is never called.
I would NOT rely on that behaviour, however, because I don't think it is defined anywhere (I might be wrong).
If you want to overwrite ExceptionMappers, I would recommend creating a base that is generic, for example:
public abstract class BaseExceptionMapper<T extends Exception> extends LoggingExceptionMapper<T> {
...
}
Define all your shared implementation in the base mapper, and extend it with specific mappers. That way you have exactly one mapper per Exception.
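For illustration, a concrete mapper built on that generic base might look roughly like this (ClassAException and the response details are placeholders, and this assumes the base class leaves toResponse overridable):

import javax.ws.rs.core.Response;
import javax.ws.rs.ext.Provider;

@Provider
public class ClassAExceptionMapper extends BaseExceptionMapper<ClassAException> {

    @Override
    public Response toResponse(ClassAException exception) {
        // Shared logging/formatting lives in BaseExceptionMapper; only the
        // exception-specific response is built here.
        return Response.status(Response.Status.BAD_REQUEST)
                       .entity(exception.getMessage())
                       .build();
    }
}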
If you cannot do the above and you really do have two mappers of the same type, you can explicitly override the first by programmatically registering yours. For this, in your Dropwizard application's starter class, in the run method, you can do:
JerseyEnvironment jersey = environment.jersey();
jersey.getResourceConfig().register(TestMapper.class);
This explicitly sets this mapper as the exception mapper to be used.
Unless that has changed, it's impossible to override an existing exception mapper. I don't remember where I read it, but it said that a randomly chosen mapper would respond if you register more than one.
As my answer here suggests, you need to unregister the previous exception mapper and register yours instead, but that too was made impossible after Dropwizard 0.8. Unless 0.9 offers a solution for that, you should focus on making your parent POM take a parameter from outside and decide whether or not to register ExceptionMapperForClassA.
Scenario: I have a web application that uses Spring 3 MVC. Using the powerful new annotations in Spring 3 (@Controller, @ResponseBody etc.), I have written some domain objects with JAXB @Xml annotations for marshalling Ajax calls to web clients. Everything works great. I declared my controller method with @ResponseBody and the root XML object as the return type; the payload gets marshalled correctly and sent to the client.
The problem is that some data in the content is breaking XML compliance. I need to wrap it with CDATA when necessary. I saw a post here, How to generate CDATA block using JAXB?, which recommends using a custom content handler. OK, fantastic!
public class CDataContentHandler extends (SAXHandler|XMLSerializer|Other...) {
    // see http://www.w3.org/TR/xml/#syntax
    private static final Pattern XML_CHARS = Pattern.compile("[<>&]");

    public void characters(char[] ch, int start, int length) throws SAXException {
        boolean useCData = XML_CHARS.matcher(new String(ch, start, length)).find();
        if (useCData) super.startCDATA();
        super.characters(ch, start, length);
        if (useCData) super.endCDATA();
    }
}
Using Spring MVC 3, how do I achieve this? Everything was "auto-magically" done for me with regard to the JAXB setup: Spring read the return type of the method, saw its annotations and picked up JAXB2 off the classpath to do the marshalling (object-to-XML conversion). So where on earth is the "hook" that lets a user register a custom content handler in the config?
Using the EclipseLink JAXB implementation, it is as easy as adding @XmlCDATA to the object attribute concerned. Is there some smart way Spring can help out here and abstract this problem away into a minor configuration detail?
I know Spring isn't tied to any particular implementation, but for the sake of this question, please can we assume I am using whatever the default implementation is. I tried the docs here http://static.springsource.org/spring-ws/site/reference/html/oxm.html but, from what I could understand, they barely helped with this question.
Thanks all for any replies; they'd be really appreciated.
Update:
Thanks for the suggested answer below, Akshay. It was sufficient to put me on the right track. Investigating further, I see there is a bit of history to this one between Spring 3.0.5 and 3.2. In Spring 3.0.5 it used to be quite difficult to register a custom MessageConverter (which is really the goal here).
This conversation pretty much explains the thinking behind the development changes requested:
https://jira.springsource.org/browse/SPR-7504
Here is a link to the class that typically needs to be overridden to build a custom solution:
http://static.springsource.org/spring/docs/3.1.0.M1/javadoc-api/org/springframework/http/converter/AbstractHttpMessageConverter.html
And the following question on Stack Overflow is very similar to what I was asking (except that its @ResponseBody discussion relates to JSON and Jackson); the goal is basically the same:
Spring 3.2 and Jackson 2: add custom object mapper
So it looks like overriding MarshallingHttpMessageConverter is needed, and registering it with AnnotationMethodHandlerAdapter. There is a recommended solution in the link above to get clever with this stuff and wrap the whole thing behind a custom-defined annotation.
I haven't yet developed a working solution, but since I asked the question I wanted to at least post something that may help others with the same sort of question get started. With all due respect, although this has all improved in Spring 3.2, it's still a bit of a dog's dinner to get a little customization working... I really was expecting a one-line config change.
Rather than twist and bend Spring, perhaps the easiest answer to my particular issue is just to change the JAXB2 implementation and use something like EclipseLink JAXB, which can do this out of the box.
Basically you need to create a custom HttpMessageConverter instead of relying on the Jaxb2RootElementHttpMessageConverter that Spring uses by default.
Unfortunately, customizing one converter means you are telling Spring that you will take care of loading all the converters you need, which is fairly involved and can get complicated depending on whether you use annotations, component scanning, Spring 3.1 or earlier, etc. The issue of how to add a custom converter is addressed here: Custom HttpMessageConverter with @ResponseBody to do Json things
In your custom message converter you are free to use any custom JAXB2 content handlers.
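For illustration, here is a rough sketch of wiring in your own converter with Spring 3.1+ Java config (class names such as WebConfig and MyRootElement are invented; note that registering converters this way replaces Spring's defaults, so add back anything else you need):

import java.util.List;

import org.springframework.context.annotation.Configuration;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.converter.xml.MarshallingHttpMessageConverter;
import org.springframework.oxm.jaxb.Jaxb2Marshaller;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

@Configuration
@EnableWebMvc
public class WebConfig extends WebMvcConfigurerAdapter {

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setClassesToBeBound(MyRootElement.class); // your @XmlRootElement types
        // A MarshallingHttpMessageConverter backed by your own marshaller replaces
        // the default Jaxb2RootElementHttpMessageConverter for XML responses.
        converters.add(new MarshallingHttpMessageConverter(marshaller, marshaller));
    }
}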
Another, simpler approach to solving your original problem would be to use a custom XmlJavaTypeAdapter. Create a custom implementation of javax.xml.bind.annotation.adapters.XmlAdapter to handle CDATA, and in the marshal method wrap the return value in the CDATA delimiters. Then, in your mapped POJO, use the @XmlJavaTypeAdapter annotation and pass it the class of your custom adapter, and you should be done.
I have not myself implemented the adapter approach, so couldn't provide sample code. But it should work, and won't be a lot of work.
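For what it's worth, an untested sketch of that adapter approach might look like the following (the class and field names are invented; depending on the JAXB implementation, the marshaller may still escape the characters returned by marshal(), so this needs verifying against your setup):

import javax.xml.bind.annotation.adapters.XmlAdapter;
import javax.xml.bind.annotation.adapters.XmlJavaTypeAdapter;

// Wraps string values in a CDATA section on the way out.
class CDataAdapter extends XmlAdapter<String, String> {

    @Override
    public String marshal(String value) {
        return "<![CDATA[" + value + "]]>";
    }

    @Override
    public String unmarshal(String value) {
        // Nothing special to do when reading the value back in
        return value;
    }
}

// Usage in the mapped POJO:
class Article {
    @XmlJavaTypeAdapter(CDataAdapter.class)
    private String body;
}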
Hope this helps.
First, I am a newbie in the JAXB and Spring world, so if I missed something very obvious I would really appreciate it if someone could point it out instead of not replying. :) I tried searching for a solution here but could not find a good answer.
I have a bunch of subclass DTOs (say A1, A2, A3) which inherit from the same abstract class A. I want the result of my REST query to return a list of the subclass type. I have the following class to represent the result:
@XmlRootElement(name="result")
@XmlSeeAlso({A1.class, A2.class, A3.class})
public class AResult<T>
{
    ...
}
Since AResult is generic, I would like the @XmlSeeAlso to also be generic and just write something like
@XmlSeeAlso({subclasses of A.class})
But from the research I did on this site and elsewhere, I do not think that is possible with JAXB.
1. Since we use the annotation-driven tag in the config, Spring automatically uses the Jaxb2RootElementHttpMessageConverter class. This message converter creates the JAXBContext using, among others, the classes defined in @XmlSeeAlso. The createMarshaller and getContext methods are final in a superclass.
2. Because of point 1, I cannot write a class in which I check whether a class is a subclass of A and then add it to the JAXBContext. I cannot use a custom Jaxb2RootElementHttpMessageConverter or a custom Marshaller.
How do I get around this? BTW, we are using Spring version 3.1.3.
Thanks for your help.
JAXB doesn't scan your classpath for classes that might just happen to be subclasses of A (that would be rather slow!) but rather relies on the context knowing about all the classes it might ever have to create instances of. All the @XmlSeeAlso annotation does is extend the context with the additional classes listed.
However, there are a number of other approaches. For example, you could create a class marked with @XmlRegistry that knows how to make the subclasses you care about. Or you could experiment with @XmlJavaTypeAdapter. Alas, I've only ever progressed as far as using the @XmlSeeAlso-based approach in my own code, so I can't really comment from experience.
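One small mitigation, sketched below under the assumption that @XmlSeeAlso is followed transitively when the context is built (worth verifying with your JAXB implementation): keep the subclass list on the abstract parent, so AResult only has to mention A and new subclasses are registered in a single place.

import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.bind.annotation.XmlSeeAlso;

// The abstract parent declares its own subclasses once
@XmlSeeAlso({A1.class, A2.class, A3.class})
public abstract class A {
    // common DTO properties ...
}

// AResult only refers to the parent; the subclasses are picked up through A's annotation
@XmlRootElement(name = "result")
@XmlSeeAlso(A.class)
public class AResult<T> {
    // existing content unchanged ...
}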