Error when using the $within operator with $elemMatch - Spring

I'm using spring-data-mongodb (version 1.0.2.RELEASE) and mongodb (version 2.2).
I have an object A that contains a list of Place objects. The classes are the following:
public class A {
@Id
private ObjectId id;
private List<Place> places;
//GETTER AND SETTER
}
public class Place {
private String name;
private String description;
@GeoSpatialIndexed
private double[] location;
//GETTER AND SETTER
}
I need to find all objects A with a specific location.
I tried to use the $within and $elemMatch operators together, as follows:
@Query(value = "{'places' : { $elemMatch: { location: {'$within' : {'$center' : [?0, ?1]} } }}}")
public List<A> findByLocation(Point location, double radius);
When I run this query, I receive the following exception:
org.springframework.data.mongodb.UncategorizedMongoDbException: can't find special index: 2d for: { places: { $elemMatch: { location: { $within: { $center: [ { x: 41.904159, y: 12.549132 }, 0.07000000000000001 ] } } } } }; nested exception is com.mongodb.MongoException: can't find special index: 2d for: { places: { $elemMatch: { location: { $within: { $center: [ { x: 41.904159, y: 12.549132 }, 0.07000000000000001 ] } } } } }
Any suggestions?
Regards

Apparently there's no 2d index on the location attribute, and MongoDB requires such an index to be present. I don't know whether Spring should be creating that index by itself, but if it doesn't, you can create the index manually via
db.collection.ensureIndex({"location":"2d"})
where collection is replaced by the name of your collection.
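If you prefer to create the index from the application instead of the mongo shell, here is a minimal sketch using Spring Data MongoDB's index operations; the class name GeoIndexSetup and the injection style are assumptions, not part of the original question:
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.GeospatialIndex;

public class GeoIndexSetup {
    // Assumes a configured MongoTemplate is available (e.g. injected by Spring).
    private final MongoTemplate mongoTemplate;

    public GeoIndexSetup(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public void createPlacesLocationIndex() {
        // Creates a 2d index on the embedded places.location field of the collection mapped to A.
        mongoTemplate.indexOps(A.class).ensureIndex(new GeospatialIndex("places.location"));
    }
}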

I solved my problem by changing the A class. I replaced the list of Place with a list of org.springframework.data.mongodb.core.geo.Point as follows:
public class A {
@Id
private ObjectId id;
@GeoSpatialIndexed
private List<Point> places;
//GETTER AND SETTER
}
And I replaced the old query with a new one as follows:
@Query(value = "{'places' : {'$within' : {'$center' : [?0, ?1]} } }")
public List<A> findByLocation(Point location, double radius);
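For reference, here is a minimal sketch of how this finder might be called, reusing the coordinates and radius that appear in the exception earlier in the question; the repository interface name ARepository and its wiring are assumptions:
import java.util.List;
import org.springframework.data.mongodb.core.geo.Point;

public class PlaceSearchExample {
    // ARepository is a hypothetical Spring Data repository declaring findByLocation.
    private final ARepository repository;

    public PlaceSearchExample(ARepository repository) {
        this.repository = repository;
    }

    public List<A> findNearby() {
        // Center point and radius taken from the exception in the question.
        Point center = new Point(41.904159, 12.549132);
        return repository.findByLocation(center, 0.07);
    }
}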
And now it works!
Thanks to all!

Related

Android Room + AutoValue breaks schema generation

I have a rather complicated (though not entirely unusual) scenario that seems to break with Android Room version 2.4.0 (specifically 2.4.0-beta01; it works on 2.4.0-alpha05).
I'll put the code down below, but I'll attempt to describe my situation in plain English for now.
Basically, I have two databases that reference the same table/entity. That entity is created using AutoValue, which, up until 2.4.0-alpha05, worked fine with the @AutoValue.CopyAnnotations annotation. However, once I upgraded to 2.4.0-beta01, the table is no longer detected:
There is a problem with the query: [SQLITE_ERROR] SQL error or missing database (no such table: word_table)
Without further ado, here is some sample code (adapted from the developer notes):
@AutoValue
@Entity(tableName = "word_table")
public abstract class Word {
@AutoValue.CopyAnnotations
@PrimaryKey
@NonNull
@ColumnInfo(name = "word")
public abstract String word();
@NonNull
public static Word create(@NonNull String word) {
return new AutoValue_Word.Builder().word(word).build();
}
@AutoValue.Builder
public abstract static class Builder {
abstract Builder word(String word);
public abstract Word build();
}
}
@Dao
@TypeConverters(Converters.class)
public interface WordDao {
@Query("SELECT * FROM word_table")
LiveData<List<Word>> getAlphabetizedWords();
}
@Dao
@TypeConverters(Converters.class)
public interface WordDao2 {
@Query("SELECT * FROM word_table")
LiveData<List<Word>> getAlphabetizedWords();
}
@Database(entities = {Word.class}, version = 1)
public abstract class WordRoomDatabase extends RoomDatabase {
public abstract WordDao wordDao();
public static WordRoomDatabase createDatabase(final Context context) {
return Room.databaseBuilder(context.getApplicationContext(), WordRoomDatabase.class, "word_database").build();
}
}
@Database(entities = {Word.class}, version = 1)
public abstract class WordRoomDatabase2 extends RoomDatabase {
public abstract WordDao2 wordDao2();
public static WordRoomDatabase2 createDatabase(final Context context) {
return Room.databaseBuilder(context.getApplicationContext(), WordRoomDatabase2.class, "word_database").build();
}
}
Update:
When I look at the generated schema, only the schema for WordRoomDatabase (the first database) is populated, and even that is pretty messed up:
{
"formatVersion": 1,
"database": {
"version": 1,
"identityHash": "a67e5757cf2fc0be1d7cee0b7192312f",
"entities": [
{
"tableName": "word_table",
"createSql": "CREATE TABLE IF NOT EXISTS `${TABLE_NAME}` ()",
"fields": [],
"primaryKey": {
"columnNames": [],
"autoGenerate": false
},
"indices": [],
"foreignKeys": []
}
],
"views": [],
"setupQueries": [
"CREATE TABLE IF NOT EXISTS room_master_table (id INTEGER PRIMARY KEY,identity_hash TEXT)",
"INSERT OR REPLACE INTO room_master_table (id,identity_hash) VALUES(42, 'a67e5757cf2fc0be1d7cee0b7192312f')"
]
}
}
Update 2:
If you change Word.java to not use AutoValue, it works great:
@Entity(tableName = "word_table")
public class Word {
@PrimaryKey
@NonNull
@ColumnInfo(name = "word")
private String word;
public Word(@NonNull String word) {this.word = word;}
public String getWord(){return this.word;}
}

How to use Jackson to parse an object according to its JSON type?

I have two JSON objects like:
Object 1
{
"value": {
"data": [
"John",
"Justin",
"Tom"
],
"isGraduated": false
}
}
Object 2
{
"value": {
"data": {
"info": {
"background": {
"primarySchool" : "A school",
"univeristy": "X univeristy"
},
"name": "John",
"gender": "male",
"dayOfBirth": "1995-04-24"
}
},
"isGraduated": false
}
}
How can I deserialize the data field into either a list of strings or a class (which I've already declared) using Jackson?
Edit
Added the Info class declaration.
public class Info {
@JsonProperty("background")
private BackGround backGround;
@JsonProperty("name")
private String name;
@JsonProperty("gender")
private String gender;
@JsonProperty("dayOfBirth")
private String dayOfBirth;
public static class BackGround {
@JsonProperty("primarySchool")
private String primarySchool;
@JsonProperty("univeristy")
private String univeristy;
}
}
Looking at your JSON objects, there is no way to know in advance what will be in the data field, so you can use JsonNode as the type of the data field.
Note: this is the object hierarchy I have created to represent the JSON objects.
@ToString
class Wrapper {
private Value value;
// getter & setter
}
@ToString
class Value {
private JsonNode data;
private Boolean isGraduated;
// getter & setter
}
@ToString
class Data {
private Info info;
// getter & setter
}
@ToString
class Info {
private Background background;
private String name;
private String gender;
private String dayOfBirth;
// getter & setter
@ToString
static class Background {
private String primarySchool;
private String univeristy;
// getter & setter
}
}
Then you can check the node type before deciding whether to deserialize into List<String> or into your Info class, like this:
JsonNodeType type = wrapper.getValue().getData().getNodeType();
You will see type = JsonNodeType.ARRAY if the JSON object has the first form and type = JsonNodeType.OBJECT if it has the second form.
Check this example:
import java.io.IOException;
import java.util.List;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.JsonNodeType;

public class Main {
public static void main(String[] args) throws IOException {
// String s = "{\"value\":{\"data\":[\"John\",\"Justin\",\"Tom\"],\"isGraduated\":false}}";
String s = "{\"value\":{\"data\":{\"info\":{\"background\":{\"primarySchool\":\"A school\",\"univeristy\":\"X univeristy\"},\"name\":\"John\",\"gender\":\"male\",\"dayOfBirth\":\"1995-04-24\"}},\"isGraduated\":false}}";
ObjectMapper om = new ObjectMapper();
Wrapper wrapper = om.readValue(s, Wrapper.class);
JsonNodeType type = wrapper.getValue().getData().getNodeType();
if (type == JsonNodeType.ARRAY) {
List<String> data = om.convertValue(wrapper.getValue().getData(), new TypeReference<List<String>>() {});
System.out.println(data);
} else if (type == JsonNodeType.OBJECT) {
Data data = om.convertValue(wrapper.getValue().getData(), Data.class);
System.out.println(data);
}
}
}
This is not the most general approach, but it works for your specific case:
ObjectMapper mapper = new ObjectMapper();
ObjectNode root = (ObjectNode) mapper.readTree(jsonContent);
JsonNode data = root.get("value").get("data");
if (data.has("info")) {
Info result = mapper.convertValue(data.get("info"), Info.class);
// handle result as Info instance
} else {
List<String> result = mapper.convertValue(data, new TypeReference<List<String>>() {});
// handle result as list of strings
}

Get properties from subclass object using BeanWrapperFieldExtractor

In my Spring batch application I have the following POJO classes:
public class School {
private String schoolName;
private String schoolAddress;
private ClassDetails classDetails;
}
public class ClassDetails {
private String className;
private String totalCountStudents;
private SectionDetails sectionDetails;
}
public class SectionDetails {
private String sectionName;
private String totalSubjects;
}
I have written the following FlatFileItemWriter to get the properties from the School object.
public FlatFileItemWriter<School> write() throws Exception {
FlatFileItemWriter<School> flatFileWriter = new FlatFileItemWriter<School>();
flatFileWriter.setResource(new FileSystemResource("C:\\u01\\SchoolDetails.txt"));
flatFileWriter.setName("School-File-Writer");
flatFileWriter.setAppendAllowed(true);
flatFileWriter.setLineSeparator("\n");
flatFileWriter.setHeaderCallback(writer -> writer.write(columnHeaders()));
flatFileWriter.setLineAggregator(new DelimitedLineAggregator<School>() {
{
setDelimiter("^");
setFieldExtractor((FieldExtractor<School>) schoolFieldExtractor());
}
});
return flatFileWriter;
}
private BeanWrapperFieldExtractor<School> schoolFieldExtractor() {
return new BeanWrapperFieldExtractor<School>() {
{
String[] columnValuesMapper = new String[] {
"schoolName", "schoolAddress"
};
setNames(columnValuesMapper);
}
};
}
Currently the file I am sending out has schoolName and schoolAddress, but I want the BeanWrapperFieldExtractor to extract all the properties from the nested classes along with the School object. The final output file should have schoolName, schoolAddress, className, totalCountStudents, sectionName, totalSubjects.
I am not sure how to do that. Any help would be appreciated. Thanks in advance!
The BeanWrapperFieldExtractor supports the dotted notation for nested properties, so you can define your schoolFieldExtractor as follows:
private BeanWrapperFieldExtractor<School> schoolFieldExtractor() {
return new BeanWrapperFieldExtractor<School>() {
{
String[] columnValuesMapper = new String[] {
"schoolName", "schoolAddress",
"classDetails.className", "classDetails.totalCountStudents",
"classDetails.sectionDetails.sectionName", "classDetails.sectionDetails.totalSubjects",
};
setNames(columnValuesMapper);
}
};
}
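Since the writer in the question also sets a headerCallback, the header line can be kept in sync with this extended field order. A minimal sketch of such a columnHeaders() helper, using the same "^" delimiter as the line aggregator (the header labels themselves are an assumption):
private String columnHeaders() {
    // Header labels matching the extractor's field order, joined with the same "^" delimiter.
    return String.join("^",
            "schoolName", "schoolAddress",
            "className", "totalCountStudents",
            "sectionName", "totalSubjects");
}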

Find all embedded documents from manual reference in MongoDB

I use MongoDB and Spring Boot in a project. I used a manual reference to point to another collection. My structure is as follows:
Reel collection
{
_id : "reel_id_1",
name: "reel 1",
category :[
{
_id : "category_id_1",
name: "category 1",
videos: ["video_id_1","video_id_2"]
}
]
}
Video collection
{
_id: "video_id_1", // first document
name: "mongo"
}
{
_id: "video_id_2", // seconddocument
name: "java"
}
The Java classes are:
@Document
@Data
public class Reel {
@Id
private ObjectId _id;
private String name;
List<Category> category;
}
@Data
public class Category {
@Id
private ObjectId _id=new ObjectId();
private String name;
Video videos;
}
@Document
@Data
public class Video {
@Id
private ObjectId _id = new ObjectId();
private String name;
}
I tried to join both documents via mongoTemplate:
public List<Reel> findById(ObjectId _id) {
LookupOperation lookupOperation = LookupOperation.newLookup()
.from("video")
.localField("category.videos")
.foreignField("_id")
.as("category.videos");
UnwindOperation unwindOperation = Aggregation.unwind("category");
Aggregation agg = newAggregation(unwindOperation,match(Criteria.where("_id").is(_id)),lookupOperation);
Aggregation aggregation = newAggregation(lookupOperation);
List<Reel> results = mongoTemplate.aggregate(aggregation, "reel", Reel.class).getMappedResults();
return results;
}
But it throws an error.
Failed to instantiate java.util.List using constructor NO_CONSTRUCTOR with arguments
But since I use "unwind", I created a new entity UnwindReel with a Category category field instead of List<Category> category, and used
List<UnwindReel> results = mongoTemplate.aggregate(aggregation, "reel", UnwindReel.class).getMappedResults();
It combines only the first video (video_id_1) object. How can I get all the objects inside the videos array? Is there any method to fetch them?
The JSON stored in your database has the wrong structure for this mapping: your Reel class expects a list of Category, but in the database it is stored as a nested object.
You need to add this stage just after the $lookup:
{
"$addFields": {
"category": {
"$map": {
"input": "$category.videos",
"in": {
"videos": "$$this"
}
}
}
}
}
Java code
public List<Reel> findById(String _id) {
Aggregation aggregation = Aggregation.newAggregation(
Aggregation.match(Criteria.where("_id").is(_id)),
Aggregation.lookup(mongoTemplate.getCollectionName(Video.class), "category.videos", "_id", "category.videos"),
new AggregationOperation() {
@Override
public Document toDocument(AggregationOperationContext context) {
return new Document("$addFields",
new Document("category", new Document("$map", new Document("input", "$category.videos")
.append("in", new Document("videos", "$$this")))));
}
})
.withOptions(AggregationOptions.builder().allowDiskUse(Boolean.TRUE).build());
LOG.debug(
aggregation.toString().replaceAll("__collection__", mongoTemplate.getCollectionName(Reel.class)));
return mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(Reel.class), Reel.class)
.getMappedResults();
}
Recommendations
Do not hard-code the collection name; it is better to use the mongoTemplate.getCollectionName method.
Always log the aggregation pipeline before running it; it helps with debugging.
If your collection will grow in the future, use the {allowDiskUse: true} MongoDB aggregation option.

Spring Boot - MongoDB - Inheritance

I'm running into something odd with inheritance and MongoDB repositories.
I have the following:
@Document
public class Base {
public String fieldA;
}
public class Derived extends Base {
public String fieldB;
}
public interface DerivedRepository extends MongoRepository<Base, String> {
List<Derived> findByFieldA(String fieldA);
}
When inserting I get:
Inserting DBObject containing fields: [_class, _id, fieldA, fieldB ]
in collection: base
When I do findByFieldA('some value') on the repository I get the following:
find using query: { "fieldA" : "some value" } fields: null for class:
class Derived in collection: derived
Any idea what is going on here? And how can I fix this, either by saving to the proper derived collection or by querying from the base collection?
Regards,
First, I would make the Derived class the document, since the parent is going to be shared among many implementations.
public class Base {
public String fieldA;
}
@Document
public class Derived extends Base {
public String fieldB;
@Override
public String toString() {
return "{fieldA: " + getFieldA() + ", fieldB: " + fieldB + "}";
}
}
Second, change the repository declaration to use the document type (the class marked with @Document):
public interface DerivedRepository extends MongoRepository<Derived, String> {
List<Derived> findByFieldA(String fieldA);
List<Derived> findByFieldB(String fieldB);
}
I added an extra method, findByFieldB(String fieldB), to illustrate further.
With these changes, you should be able to query by either fieldA or fieldB, as below:
public class SpringBootMongoApplication {
@Autowired
private DerivedRepository derivedRepository;
public void testMethod() throws Exception {
Derived derived1 = new Derived();
derived1.setFieldB("fieldB1");
derived1.setFieldA("fieldA1");
Derived derived2 = new Derived();
derived2.setFieldB("fieldB2");
derived2.setFieldA("fieldA2");
this.derivedRepository.save(Arrays.asList(derived1, derived2));
List<Derived> deriveds = this.derivedRepository.findByFieldA("fieldA1");
System.out.println(deriveds);
List<Derived> deriveds1 = this.derivedRepository.findByFieldB("fieldB2");
System.out.println(deriveds1);
}
}
The output should be:
[{fieldA: fieldA1, fieldB: fieldB1}]
[{fieldA: fieldA2, fieldB: fieldB2}]
You can also verify the persisted objects and their types (via the _class field) by querying the derived collection directly in the mongo shell.
I have created a Spring Boot sample app, which you can find on GitHub.
