Spring Boot Mongo findById returns null

I have a collection with documents that have a field named _id of type String, not generated manually.
I have been trying to get a document using its id.
val criteria = Criteria.where("_id").`is`("a2z3e44R")
val document = mongoTemplate.findOne(Query.query(criteria), MyDocument::class.java) // returns null
val criteria = Criteria.where("_id").`is`(ObjectId("a2z3e44R"))
val document = mongoTemplate.findOne(Query.query(criteria), MyDocument::class.java) // returns null
val document = mongoTemplate.findById("a2z3e44R", MyDocument::class.java) // returns null
mongoTemplate.findAll(MyDocument::class.java).first { myDocument ->
    myDocument._id == "a2z3e44R"
} // OK...
MyDocument is
data class MyDocument(val _id: String, val name: String)
Trying to find a document by another field works.
Any idea of what I could be missing, or a workaround?

You should indicate the type of the id, like this:
public class Article {
    @MongoId(value = FieldType.OBJECT_ID)
    private String id;
    private String title;
    private String desc;
}
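For the Kotlin data class from the question, a rough equivalent could look like the sketch below. This is hedged: since the stored _id here is a plain string such as "a2z3e44R" rather than a 24-character ObjectId hex value, FieldType.STRING would be the matching hint; FieldType.OBJECT_ID only applies when the stored value really is an ObjectId.
import org.springframework.data.mongodb.core.mapping.FieldType
import org.springframework.data.mongodb.core.mapping.MongoId

// sketch: tell the mapping layer that the id is stored as a plain string
data class MyDocument(
    @MongoId(FieldType.STRING) val _id: String,
    val name: String
)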

Try marking _id with the @Id annotation. The @Id annotation is used to tell Spring which field is the document identifier.
data class MyDocument(@Id val _id: String, val name: String)
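With the identifier mapped, the lookups from the question should resolve. A minimal usage sketch, assuming the stored _id really is the plain string "a2z3e44R":
import org.springframework.data.mongodb.core.query.Criteria
import org.springframework.data.mongodb.core.query.Query

val byId = mongoTemplate.findById("a2z3e44R", MyDocument::class.java)
val byCriteria = mongoTemplate.findOne(
    Query.query(Criteria.where("_id").`is`("a2z3e44R")),
    MyDocument::class.java
)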

You could define this in your repository:
public interface MyDocumentRepository extends MongoRepository<MyDocument, String> {
    MyDocument findBy_id(String _id);
}
and use it:
myDocumentRepository.findBy_id("a2z3e44R");
Or build a Criteria query on _id directly:
// note: new ObjectId(...) only works when the stored _id actually is an ObjectId
ObjectId objID = new ObjectId("a2z3e44R");
query.addCriteria(Criteria.where("_id").is(objID));
as shown in this other answer.
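With a correct id mapping, the built-in repository method should also work; a minimal sketch (findById returns an Optional):
val document = myDocumentRepository.findById("a2z3e44R").orElse(null)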

Related

Why is Spring Data MongoDB insert/save operation not returning the ObjectId

I have tried this several ways (you can assume my MongoConfiguration is correct).
Using implementation 'org.springframework.boot:spring-boot-starter-data-mongodb:2.5.3'
My class:
@Document(collection = "address")
class Address {
    @Id
    var id: String? = null
    var label: String? = null
    var number: String? = null
    var street: String? = null
    ...
My repository:
@Repository
interface AddressRepository : MongoRepository<Address, String> {
    fun findByLabel(label: String): Address
}
In my @RestController I call save (or insert):
val savedAddress = addressRepository.insert(address)
According to the docs for org.springframework.data:spring-data-mongodb:3.1.6 as well as org.springframework.data:spring-data-commons:2.4.6:
Returns: the saved entity; will never be null.
However, it does create a record! I inspected the result using Compass.
But it only echoes the address content I used to create it, without the ID!
If I query the record, e.g.
returnedAddress= savedAddress.label?.let { addressRepository.findByLabel(it) }!!
I get the record returned WITH the Id!
I have noticed this behavior for some time now, and it is not always possible to locate the correct record again if the Id is the only unique key in the collection!
Is there a mistake, a configuration, or any other way I can get the ObjectId / _id emitted properly?
Note: <S extends T> S save(S entity) calls insert when the entity record is new

How to tell Spring Data MongoDB to store nested fields of a document that are also documents in their own collection?

I have two collections called persons and addresses. The idea is to have a person hold an address in the field address. I use Spring Data MongoDB to persist these documents.
My usual way of crafting the "relation" between Person > Address was to store the ID of the address and give it to the person object. Later, when I find() a person, I resolve the address object by its id, and voila, I have my person + address.
However, I find this somewhat cumbersome, since in my code I just want to add the Address object as a whole and not only its ID, so I can work with it while also saving it to the repository at any point in time.
I therefore started a little unit test to see how Spring Data MongoDB saves the Address object if it's just a field of Person and is not saved by its own Repository.
This is what I came up with:
import org.springframework.data.mongodb.core.mapping.Document
import org.springframework.data.mongodb.repository.MongoRepository
import org.springframework.stereotype.Repository
#Document("person")
data class Person(
val id: String,
val name: String,
val age: Int,
var address: Address
)
#Document("addresses")
data class Address(
val id: String,
val street: String?,
val number: Int?
)
#Repository
interface PersonRepository : MongoRepository<Person, String>
#Repository
interface AddressRepository : MongoRepository<Address, String>
And this is the unit test, which fails at the last steps, as I was expecting:
internal class FooTest @Autowired constructor(
    private val personRepository: PersonRepository,
    private val addressRepository: AddressRepository
) {
    @Test
    fun `some experiment`() {
        val testPerson = Person("001", "Peter", 25, Address("011", "Lumberbumber", 12))
        personRepository.save(testPerson)
        val person = personRepository.findAll()[0]
        assertThat(person).isNotNull
        assertThat(person.address).isNotNull
        assertThat(person.address.street).isEqualTo("Lumberbumber")
        assertThat(person.address.number).isEqualTo(12)
        // works because address was just copied into the object structure
        // of `person` and was not seen as a standalone document
        val address = addressRepository.findAll()[0]
        assertThat(address.street).isEqualTo("Lumberbumber") // fails
        assertThat(address.number).isEqualTo(12) // fails
        // As expected `address` was not persisted alongside the `person` document.
    }
}
So I thought about using an AbstractMongoEventListener<Person> to intercept the saving process, pick the Address object out of the Person there, do an addressRepository.save(addressDocument), and put a lightweight address object (holding only the ID) back into the Person document.
The same I'd do in reverse when doing a find for Person, assembling Person and Address together again.
@Component
class MongoSaveInterceptor(
    val addressRepository: AddressRepository
) : AbstractMongoEventListener<Person>() {
    override fun onBeforeConvert(event: BeforeConvertEvent<Person>) {
        val personToSave = event.source
        val extractedAddress = personToSave.address
        val idOfAddress = addressRepository.save(extractedAddress).id
        personToSave.address = Address(idOfAddress, null, null)
    }
    override fun onAfterConvert(event: AfterConvertEvent<Person>) {
        val person = event.source
        val idOfAddress = person.address.id
        val foundAddress = addressRepository.findById(idOfAddress)
        foundAddress.ifPresent {
            person.address = it
        }
    }
}
It works that way and might be a workaround solution for my requirement.
BUT
I feel that there has to be something like that already working and I might just need to find the proper configuration for that.
That's where I am stuck atm and need some guidance.
Further research showed me that it's @DBRef (https://www.baeldung.com/cascading-with-dbref-and-lifecycle-events-in-spring-data-mongodb) I have to use. This way Spring Data MongoDB stores a reference to the embedded document and resolves it when loading the parent document object from the database.
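A minimal sketch of that mapping, assuming the Person/Address classes from above. Note that @DBRef only stores and resolves the reference; cascading the save of the Address still has to be handled separately, e.g. with the lifecycle events described in the linked article.
import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.DBRef
import org.springframework.data.mongodb.core.mapping.Document

@Document("person")
data class Person(
    @Id val id: String,
    val name: String,
    val age: Int,
    @DBRef var address: Address // persisted as a reference into the addresses collection
)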

What is the most convenient way to deal with nested objects in Room?

I want to save the server's response in the database (class Parent). The JSON has a nested object, which should also be saved in the database in a new table (class Nested). The problem is that I don't know how to write class Parent and ParentDao to make them use NestedDao.
@Entity
data class Parent(
    @PrimaryKey(autoGenerate = true)
    var id: Long? = null,
    @SerializedName("nested")
    val homeTeam: Nested,
    //other fields
)
@Entity
data class Nested(
    @PrimaryKey(autoGenerate = true)
    var nestedId: Long? = null,
    @SerializedName("name")
    val name: String,
    //other fields
)
@Dao
interface ParentDao {
    @Query("SELECT * FROM parent")
    fun getData(): Single<List<Parent>>
    @Insert
    fun insert(matches: List<Parent>)
}
This gives me an error: Cannot figure out how to save this field into database. You can consider adding a type converter for it.
So, what should I do to save and query Parent with Nested at once?
I don't know if you've succeeded or not, but here is my answer; I hope it'll help you.
This is what I used for my project, and it is what is recommended for Room in the Android docs.
@Entity
data class Parent(
    @PrimaryKey(autoGenerate = true)
    var id: Long? = null,
    @Embedded(prefix = "nested_") // Nested's fields are flattened into the parent table with a "nested_" prefix
    val homeTeam: Nested,
    //other fields
)
data class Nested(
    var nestedId: Long? = null,
    @ColumnInfo(name = "name")
    val name: String,
    //other fields
)
@Dao
interface ParentDao {
    @Query("SELECT * FROM parent")
    fun getData(): Single<List<Parent>>
    @Insert
    fun insert(matches: List<Parent>)
}
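For reference, a hedged usage sketch of writing and reading through the DAO above; parentDao is assumed to be obtained from the Room database, and only the fields shown are constructed. With @Embedded the nested fields are stored as extra columns of the parent row (the prefix only affects the generated column names, not the Kotlin properties), so a single insert covers both:
// sketch: the nested object travels inside the parent row
val parent = Parent(
    id = null,
    homeTeam = Nested(nestedId = null, name = "Home FC")
)
parentDao.insert(listOf(parent))
parentDao.getData()
    .subscribe { parents -> println(parents.first().homeTeam.name) }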

Spring data elasticSearch : Update entity using alias

I'm currently fighting with the spring-data-elasticsearch API. I need it to work on an alias with several indexes pointing to it.
Each index stores the same types, but they are just day-to-day storage (the first index holds Monday's results, the second Tuesday's results, and so on).
Some of the ElasticsearchRepository methods don't work because of the alias. I currently manage to do a search (a findOne() equivalent), but I am not able to update an entity.
I don't know how to achieve this; I looked at the documentation and samples, but I'm stuck.
My repository:
public interface EsUserRepository extends ElasticsearchRepository<User, String>
{
    @Query("{\"bool\" : {\"must\" : {\"term\" : {\"id_str\" : \"?0\"}}}}")
    User findByIdStr(String idStr);
}
My entity:
@Document(indexName = "osintlab", type = "users")
public class User
{
    // Elasticsearch internal id
    @Id
    private String id;
    // Just a test to get the real object index (_index field), in order to save it
    @Field(index = FieldIndex.analyzed, type = FieldType.String)
    private String indexName;
    // Real id, saved under the "id_str" field
    @Field(type = FieldType.String)
    private String id_str;
    @Field(type = FieldType.String)
    private List<String> tag_user;
}
What I tested
final IndexQuery indexQuery = new IndexQuery();
indexQuery.setId(user.getId());
indexQuery.setObject(user);
esTemplate.index(indexQuery);
userRepository.index(user);
userRepository.save(user);

Spring Data MongoDB: Accessing and updating sub documents

First experiments with Spring Data and MongoDB were great. Now I've got the following structure (simplified):
public class Letter {
    @Id
    private String id;
    private List<Section> sections;
}
public class Section {
    private String id;
    private String content;
}
Loading and saving entire Letter objects/documents works like a charm. (I use ObjectId to generate unique IDs for the Section.id field.)
Letter letter1 = mongoTemplate.findById(id, Letter.class)
mongoTemplate.insert(letter2);
mongoTemplate.save(letter3);
As documents are big (200K) and sometimes only sub-parts are needed by the application: Is there a possibility to query for a sub-document (section), modify and save it?
I'd like to implement a method like
Section s = findLetterSection(letterId, sectionId);
s.setText("blubb");
replaceLetterSection(letterId, sectionId, s);
And of course methods like:
addLetterSection(letterId, s); // add after last section
insertLetterSection(letterId, sectionId, s); // insert before given section
deleteLetterSection(letterId, sectionId); // delete given section
I see that the last three methods are somewhat "strange", i.e. loading the entire document, modifying the collection and saving it again may be the better approach from an object-oriented point of view; but the first use case ("navigating" to a sub-document/sub-object and working in the scope of this object) seems natural.
I think MongoDB can update sub-documents, but can Spring Data be used for the object mapping? Thanks for any pointers.
I figured out the following approach for slicing and loading only one subobject. Does it seem ok? I am aware of problems with concurrent modifications.
Query query1 = Query.query(Criteria.where("_id").is(instance));
query1.fields().include("sections._id");
LetterInstance letter1 = mongoTemplate.findOne(query1, LetterInstance.class);
LetterSection emptySection = letter1.findSectionById(sectionId);
int index = letter1.getSections().indexOf(emptySection);
Query query2 = Query.query(Criteria.where("_id").is(instance));
query2.fields().include("sections").slice("sections", index, 1);
LetterInstance letter2 = mongoTemplate.findOne(query2, LetterInstance.class);
LetterSection section = letter2.getSections().get(0);
This is an alternative solution loading all sections, but omitting the other (large) fields.
Query query = Query.query(Criteria.where("_id").is(instance));
query.fields().include("sections");
LetterInstance letter = mongoTemplate.findOne(query, LetterInstance.class);
LetterSection section = letter.findSectionById(sectionId);
This is the code I use for storing only a single collection element:
MongoConverter converter = mongoTemplate.getConverter();
DBObject newSectionRec = (DBObject)converter.convertToMongoType(newSection);
Query query = Query.query(Criteria.where("_id").is(instance).and("sections._id").is(new ObjectId(newSection.getSectionId())));
Update update = new Update().set("sections.$", newSectionRec);
mongoTemplate.updateFirst(query, update, LetterInstance.class);
It is nice to see how Spring Data can be used with "partial results" from MongoDB.
Any comments highly appreciated!
I think Matthias Wuttke's answer is great; for anyone looking for a generic version of it, see the code below:
@Service
public class MongoUtils {
    @Autowired
    private MongoTemplate mongo;
    public <D, N extends Domain> N findNestedDocument(Class<D> docClass, String collectionName, UUID outerId, UUID innerId,
            Function<D, List<N>> collectionGetter) {
        // get index of subdocument in array
        Query query = new Query(Criteria.where("_id").is(outerId).and(collectionName + "._id").is(innerId));
        query.fields().include(collectionName + "._id");
        D obj = mongo.findOne(query, docClass);
        if (obj == null) {
            return null;
        }
        List<UUID> itemIds = collectionGetter.apply(obj).stream().map(N::getId).collect(Collectors.toList());
        int index = itemIds.indexOf(innerId);
        if (index == -1) {
            return null;
        }
        // retrieve subdocument at index using slice operator
        Query query2 = new Query(Criteria.where("_id").is(outerId).and(collectionName + "._id").is(innerId));
        query2.fields().include(collectionName).slice(collectionName, index, 1);
        D obj2 = mongo.findOne(query2, docClass);
        if (obj2 == null) {
            return null;
        }
        return collectionGetter.apply(obj2).get(0);
    }
    public void removeNestedDocument(UUID outerId, UUID innerId, String collectionName, Class<?> outerClass) {
        Update update = new Update();
        update.pull(collectionName, new Query(Criteria.where("_id").is(innerId)));
        mongo.updateFirst(new Query(Criteria.where("_id").is(outerId)), update, outerClass);
    }
}
This could for example be called using
mongoUtils.findNestedDocument(Shop.class, "items", shopId, itemId, Shop::getItems);
mongoUtils.removeNestedDocument(shopId, itemId, "items", Shop.class);
The Domain interface looks like this:
public interface Domain {
    UUID getId();
}
Notice: if the nested document's constructor contains parameters with primitive datatypes, it is important for the nested document to have a default (empty) constructor, which may be protected, so that the class can be instantiated with null arguments.
Solution
That's my solution for this problem:
The object that should be updated:
@Getter
@Setter
@Document(collection = "projectchild")
public class ProjectChild {
    @Id
    private String _id;
    private String name;
    private String code;
    @Field("desc")
    private String description;
    private String startDate;
    private String endDate;
    @Field("cost")
    private long estimatedCost;
    private List<String> countryList;
    private List<Task> tasks;
    @Version
    private Long version;
}
Coding the Solution
public Mono<ProjectChild> UpdateCritTemplChild(
        String id, String idch, String ownername) {
    Query query = new Query();
    query.addCriteria(Criteria.where("_id")
            .is(id)); // find the parent
    query.addCriteria(Criteria.where("tasks._id")
            .is(idch)); // find the child which will be changed
    Update update = new Update();
    update.set("tasks.$.ownername", ownername); // change the field inside the child that must be updated
    return template
            // findAndModify:
            // Find/modify/get the "new object" from a single operation.
            .findAndModify(
                    query, update,
                    new FindAndModifyOptions().returnNew(true), ProjectChild.class
            );
}
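The same positional update can also be written against the blocking MongoTemplate; a hedged sketch, assuming a mongoTemplate bean and the ProjectChild mapping above (id, idch and ownername are the same parameters as in the method, and the Spring Data classes are the ones already used there):
val query = Query()
    .addCriteria(Criteria.where("_id").`is`(id))         // match the parent document
    .addCriteria(Criteria.where("tasks._id").`is`(idch)) // match the child inside tasks
val update = Update().set("tasks.\$.ownername", ownername) // update only the matched child
val updated = mongoTemplate.findAndModify(
    query, update,
    FindAndModifyOptions.options().returnNew(true),
    ProjectChild::class.java
)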
