I need to use the setGroupingComparatorClass method in Job, and it takes an argument of type WritableComparable. I am unable to implement the WritableComparable class. Please help me solve this.
Regards, Bidyut
setGroupingComparatorClass(Class<? extends RawComparator> cls)
Define the comparator that controls which keys are grouped together for a single call to Reducer.reduce(Object, Iterable, org.apache.hadoop.mapreduce.Reducer.Context)
job.setGroupingComparatorClass(Customkey.GroupComparator.class);
In your Customkey class you can write a static nested comparator. Add the code below to your custom key class.
public class Customkey implements WritableComparable<Customkey> {

    public static class GroupComparator extends WritableComparator
            implements Serializable {
        private static final long serialVersionUID = -3385728040072507941L;

        public GroupComparator() {
            super(Customkey.class, true);
        }

        @SuppressWarnings("rawtypes")
        @Override
        public int compare(WritableComparable a, WritableComparable b) {
            Customkey w1 = (Customkey) a;
            Customkey w2 = (Customkey) b;
            return w1.compareGroup(w2);
        }
    }
}
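The compare method above delegates to compareGroup, which isn't shown. A minimal sketch of what it could look like, assuming Customkey holds two Text fields and groups on the first one (the field names are hypothetical):

private Text first;   // grouping part of the key (hypothetical name)
private Text second;  // secondary part of the key (hypothetical name)

// Group comparison: only the first field decides grouping.
public int compareGroup(Customkey other) {
    return first.compareTo(other.first);
}

// Full sort order: first field, then second.
@Override
public int compareTo(Customkey other) {
    int cmp = first.compareTo(other.first);
    return cmp != 0 ? cmp : second.compareTo(other.second);
}

// Writable plumbing so Hadoop can serialize the key.
@Override
public void write(DataOutput out) throws IOException {
    first.write(out);
    second.write(out);
}

@Override
public void readFields(DataInput in) throws IOException {
    first.readFields(in);
    second.readFields(in);
}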
Hope this helps.
I have this project:
public class A {
    @Valid
    private B b;
}

public class B {
    @Max(5)
    private int n;
}
public class ValidatorFactory extends AbstractGwtValidatorFactory {
    @GwtValidation({A.class, B.class})
    public interface GwtValidator extends Validator {}

    @Override
    public AbstractGwtValidator createValidator() {
        return GWT.create(GwtValidator.class);
    }
}
public class SomeWidget extends Widget {
    ...
    private A a;

    public void validate() {
        Validator validator = Validation.buildDefaultValidatorFactory().getValidator();
        Set<ConstraintViolation<A>> violations = validator.validate(a);
    }
    ...
}
After running SomeWidget.validate(), even though B.n is higher than 5, no violation is generated. I checked the generated code and saw that the generator didn't emit the snippet that would cascade validation into the child object.
I have a class like this
@Component
public class TestClass {
    public void testMethod() {
        FinalClass f = new FinalClass("string");
        somePrivateMethod(f.getSomeString());
    }

    private void somePrivateMethod(String s) {
    }
}
As you can see, it has a public method and a private method. The public method instantiates FinalClass, which is a final class from some third-party library. Let's say it looks like this:
public final class FinalClass {
    private final String someString;

    public FinalClass(final String s) {
        someString = s;
    }

    public String getSomeString() {
        return someString;
    }
}
Now I am writing a unit test for this class. Since I have to verify final classes and private methods, I am using PowerMockito. This is how my test class looks:
@RunWith(PowerMockRunner.class)
@PrepareForTest({TestClass.class, FinalClass.class})
public class TestClassTest {
    private TestClass testClass;
    private FinalClass finalClass;

    @Before
    public void setUp() {
        finalClass = PowerMockito.mock(FinalClass.class);
        testClass = spy(new TestClass());
    }

    @Test
    public void testSomething() throws Exception {
        whenNew(FinalClass.class).withAnyArguments().thenReturn(finalClass);
        testClass.testMethod();
        verifyNew(FinalClass.class);
        //verifyPrivate(testClass).invoke("testMethod");
    }
}
It works fine. But the problem is that the last two statements, verifyNew and verifyPrivate, only work mutually exclusively: when I comment out either one (it doesn't matter which), the test passes, but when both are enabled, the test fails.
Does anyone have any idea why this is happening?
I can't get Spring Data REST working with class inheritance. I'd like to have a single JSON endpoint which handles all my concrete classes.
Repo:
public interface AbstractFooRepo extends KeyValueRepository<AbstractFoo, String> {}
Abstract class:
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.PROPERTY, property = "type")
@JsonSubTypes({
    @JsonSubTypes.Type(value = MyFoo.class, name = "MY_FOO")
})
public abstract class AbstractFoo {
    @Id public String id;
    public String type;
}
Concrete class:
public class MyFoo extends AbstractFoo { }
Now when calling POST /abstractFoos with {"type":"MY_FOO"}, it tells me: java.lang.IllegalArgumentException: PersistentEntity must not be null!.
This seems to happen because Spring doesn't know about MyFoo.
Is there some way to tell Spring Data REST about MyFoo without creating a Repository and a REST Endpoint for it?
(I'm using Spring Boot 1.5.1 and Spring Data REST 2.6.0)
EDIT:
Application.java:
@SpringBootApplication
@EnableMapRepositories
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
I'm using Spring Boot 1.5.1 and Spring Data Release Ingalls.
KeyValueRepository doesn't work with inheritance. It uses the class name of every saved object to find the corresponding key-value store. E.g. save(new Foo()) will place the saved object within the Foo collection, while abstractFooRepo.findAll() will look within the AbstractFoo collection and won't find any Foo objects.
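A minimal sketch of that mismatch (keyspace resolution by simple class name, as described above):

abstractFooRepo.save(new Foo());  // stored under the "Foo" keyspace
abstractFooRepo.findAll();        // reads the "AbstractFoo" keyspace, so the saved Foo is never found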
Here's the working code using MongoRepository:
Application.java
Default Spring Boot Application Starter.
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
AbstractFoo.java
I've tested include = JsonTypeInfo.As.EXISTING_PROPERTY and include = JsonTypeInfo.As.PROPERTY. Both seem to work fine!
It's even possible to register the Jackson SubTypes with a custom JacksonModule.
IMPORTANT: @RestResource(path="abstractFoos") is highly recommended. Otherwise the _links.self links will point to /foos and /bars instead of /abstractFoos.
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME, include = JsonTypeInfo.As.EXISTING_PROPERTY, property = "type")
@JsonSubTypes({
    @JsonSubTypes.Type(value = Foo.class, name = "MY_FOO"),
    @JsonSubTypes.Type(value = Bar.class, name = "MY_BAR")
})
@Document(collection = "foo_collection")
@RestResource(path = "abstractFoos")
public abstract class AbstractFoo {
    @Id public String id;
    public abstract String getType();
}
AbstractFooRepo.java
Nothing special here
public interface AbstractFooRepo extends MongoRepository<AbstractFoo, String> { }
Foo.java & Bar.java
@Persistent
public class Foo extends AbstractFoo {
    @Override
    public String getType() {
        return "MY_FOO";
    }
}

@Persistent
public class Bar extends AbstractFoo {
    @Override
    public String getType() {
        return "MY_BAR";
    }
}
FooRelProvider.java
Without this part, the output of the objects would be separated into two arrays under _embedded.foos and _embedded.bars. The supports method ensures that for all classes which extend AbstractFoo, the objects will be placed within _embedded.abstractFoos.
@Component
@Order(Ordered.HIGHEST_PRECEDENCE)
public class FooRelProvider extends EvoInflectorRelProvider {
    @Override
    public String getCollectionResourceRelFor(final Class<?> type) {
        return super.getCollectionResourceRelFor(AbstractFoo.class);
    }

    @Override
    public String getItemResourceRelFor(final Class<?> type) {
        return super.getItemResourceRelFor(AbstractFoo.class);
    }

    @Override
    public boolean supports(final Class<?> delimiter) {
        return AbstractFoo.class.isAssignableFrom(delimiter);
    }
}
EDIT
Added @Persistent to Foo.java and Bar.java. (Adding it to AbstractFoo.java doesn't work.) Without this annotation I got NullPointerExceptions when trying to use JSR 303 validation annotations within inherited classes.
Example code to reproduce the error:
public class A {
    @Id public String id;
    @Valid public B b;

    // @JsonTypeInfo + @JsonSubTypes
    public static abstract class B {
        @NotNull public String s;
    }

    // @Persistent <- needed!
    public static class B1 extends B { }
}
Please see the discussion in this resolved Jira task for details of what is currently supported in spring-data-rest regarding JsonTypeInfo, and this Jira task for what is still missing.
To summarize: only @JsonTypeInfo with include=JsonTypeInfo.As.EXISTING_PROPERTY currently works for serialization and deserialization.
Also, you need spring-data-rest 2.5.3 (Hopper SR3) or later to get this limited support.
Please see my sample application - https://github.com/mduesterhoeft/spring-data-rest-entity-inheritance/tree/fixed-hopper-sr3-snapshot
With include=JsonTypeInfo.As.EXISTING_PROPERTY, the type information is extracted from a regular property. An example helps to make this way of adding type information clear:
The abstract class:
@Entity @Inheritance(strategy = SINGLE_TABLE)
@JsonTypeInfo(use = JsonTypeInfo.Id.NAME,
        include = JsonTypeInfo.As.EXISTING_PROPERTY,
        property = "type")
@JsonSubTypes({
    @Type(name = "DECIMAL", value = DecimalValue.class),
    @Type(name = "STRING", value = StringValue.class)})
public abstract class Value {
    @Id @GeneratedValue(strategy = IDENTITY)
    @Getter
    private Long id;

    public abstract String getType();
}
And the subclass:
@Entity @DiscriminatorValue("D")
@Getter @Setter
public class DecimalValue extends Value {
    @Column(name = "DECIMAL_VALUE")
    private BigDecimal value;

    public String getType() {
        return "DECIMAL";
    }
}
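The STRING subtype registered in @JsonSubTypes above isn't shown in the sample; it would mirror DecimalValue along these lines (the discriminator value "S" and the column name are assumptions):

@Entity @DiscriminatorValue("S")
@Getter @Setter
public class StringValue extends Value {
    @Column(name = "STRING_VALUE")
    private String value;

    public String getType() {
        return "STRING";
    }
}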
public class Partitioner_2 implements Partitioner<Text, Text> {
    @Override
    public int getPartition(Text key, Text value, int numPartitions) {
        int hashValue = 0;
        for (char c : key.toString().split("\\|\\|")[0].toCharArray()) {
            hashValue += (int) c;
        }
        return Math.abs(hashValue * 127) % numPartitions;
    }
}
That is my partitioner code, and the key is of the form "str1||str2". I would like to send all keys that have the same value for str1 to the same reducer.
My GroupComparator and KeyComparator are as follows:
public static class GroupComparator_2 extends WritableComparator {
    protected GroupComparator_2() {
        super(Text.class, true);
    }

    @Override
    public int compare(WritableComparable w1, WritableComparable w2) {
        Text kw1 = (Text) w1;
        Text kw2 = (Text) w2;
        String k1 = kw1.toString().split("\\|\\|")[0].trim();
        String k2 = kw2.toString().split("\\|\\|")[0].trim();
        return k1.compareTo(k2);
    }
}
public static class KeyComparator_2 extends WritableComparator {
    protected KeyComparator_2() {
        super(Text.class, true);
    }

    @Override
    public int compare(WritableComparable w1, WritableComparable w2) {
        Text key1 = (Text) w1;
        Text key2 = (Text) w2;
        String kw1_key1 = key1.toString().split("\\|\\|")[0];
        String kw1_key2 = key2.toString().split("\\|\\|")[0];
        int cmp = kw1_key1.compareTo(kw1_key2);
        if (cmp == 0) {
            String kw2_key1 = key1.toString().split("\\|\\|")[1].trim();
            String kw2_key2 = key2.toString().split("\\|\\|")[1].trim();
            cmp = kw2_key1.compareTo(kw2_key2);
        }
        return cmp;
    }
}
The error I am currently receiving is:

KeywordKeywordCoOccurrence_2.java:92: interface expected here
public class Partitioner_2 implements Partitioner<Text,Text>{
                                      ^
KeywordKeywordCoOccurrence_2.java:94: method does not override or implement a method from a supertype
@Override
^
KeywordKeywordCoOccurrence_2.java:147: setPartitionerClass(java.lang.Class<? extends org.apache.hadoop.mapreduce.Partitioner>) in org.apache.hadoop.mapreduce.Job cannot be applied to (java.lang.Class<KeywordKeywordCoOccurrence_2.Partitioner_2>)
job.setPartitionerClass(Partitioner_2.class);
But as far as I can tell, I have overridden the getPartition() method, which is the only method in the Partitioner interface. Any help in identifying what I am doing wrong and how to fix it would be much appreciated.
Thanks in advance!
Partitioner is an abstract class in the new mapreduce API (which you're apparently using).
So you should define it as:
public class Partitioner_2 extends Partitioner<Text, Text> {
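For completeness, a sketch of the question's partitioner reworked against the new API (the hashing logic is carried over unchanged from the question):

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;

public class Partitioner_2 extends Partitioner<Text, Text> {
    @Override
    public int getPartition(Text key, Text value, int numPartitions) {
        // Hash only the part before "||" so all keys sharing str1
        // land in the same partition.
        int hashValue = 0;
        for (char c : key.toString().split("\\|\\|")[0].toCharArray()) {
            hashValue += (int) c;
        }
        return Math.abs(hashValue * 127) % numPartitions;
    }
}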
I am writing my own custom Partitioner (old API). Below is the code where I am extending the Partitioner class:
public static class WordPairPartitioner extends Partitioner<WordPair, IntWritable> {
    @Override
    public int getPartition(WordPair wordPair, IntWritable intWritable, int numPartitions) {
        return wordPair.getWord().hashCode() % numPartitions;
    }
}
Setting the JobConf:
conf.setPartitionerClass(WordPairPartitioner.class);
The WordPair class contains:

private Text word;
private Text neighbor;
Questions:
1. I am getting the error: "actual argument class (WordPairPartitioner) cannot convert to Class (? extends Partitioner)".
2. Is this the right way to write a custom partitioner, or do I need to override some other functionality as well?
I believe you are mixing up the old API (classes from org.apache.hadoop.mapred.*) and the new API (classes from org.apache.hadoop.mapreduce.*).
Using the old API, you may do as follows:
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.Partitioner;

public static class WordPairPartitioner implements Partitioner<WordPair, IntWritable> {
    @Override
    public int getPartition(WordPair wordPair, IntWritable intWritable, int numPartitions) {
        return wordPair.getWord().hashCode() % numPartitions;
    }

    @Override
    public void configure(JobConf arg0) {
    }
}
In addition to Amar's answer, you should handle the eventuality of hashCode() returning a negative number by masking off the sign bit before taking the modulo (masking after the modulo would not keep the result within [0, numPartitions)):

@Override
public int getPartition(WordPair wordPair, IntWritable intWritable, int numPartitions) {
    // Clear the sign bit first, then reduce into the partition range.
    return (wordPair.getWord().hashCode() & Integer.MAX_VALUE) % numPartitions;
}