Spring Boot + MongoDB: create collection

I have a problem. After creating a Spring Boot project with Eclipse and configuring the application.properties file, my collections are not created, even though after execution the Eclipse console reports that the connection to MongoDB was established normally. I don't understand what's going on. With MySQL the tables were created, so I expected the collections to be created too, but nothing.
In summary: I don't see my collection (a class annotated with @Document) in MongoDB after deployment.

A new collection won't be created until you insert at least one document. Refer to the MongoDB documentation on Create Collection.
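If you want the collection to exist before any real data arrives, one alternative is to create it explicitly at startup. A minimal sketch, assuming Spring Data MongoDB's MongoTemplate is available; the collection name "myCollection" is a placeholder:

import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;

@Configuration
public class CollectionInitializer {

    @Bean
    CommandLineRunner createCollections(MongoTemplate mongoTemplate) {
        return args -> {
            // Guard so repeated startups don't fail on an existing collection
            if (!mongoTemplate.collectionExists("myCollection")) {
                mongoTemplate.createCollection("myCollection");
            }
        };
    }
}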

You could do this in two ways through Spring. I tested the instructions below in Spring 2.1.7.
With just a @Document class, Spring will not create a collection in Mongo. It will, however, create a collection if you do the following:
You have a field you want to index in the collection, and you annotate it as such in the Java class. E.g.
@Indexed(unique = true)
private String indexedData;
Create a repository for the collection:
public interface MyClassRepository extends MongoRepository<MyClass, String> {
}
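For reference, a minimal sketch of the @Document class these two steps assume; MyClass and indexedData are the placeholder names from above, and note that newer Spring Boot versions additionally require spring.data.mongodb.auto-index-creation=true for indexes to be created automatically:

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.index.Indexed;
import org.springframework.data.mongodb.core.mapping.Document;

@Document(collection = "myclass")
public class MyClass {

    @Id
    private String id;            // maps to Mongo's _id field

    @Indexed(unique = true)
    private String indexedData;   // creating the index forces the collection into existence
}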
If you don't need/want an index, the second way of doing this is to add some code that runs at startup, inserts a dummy document into the collection, and deletes it again.
@Configuration
public class LoadDatabase {

    @Bean
    CommandLineRunner initDb(MyClassRepository repository) {
        return args -> {
            // create an instance of your @Document annotated class
            MyClass myDocument = new MyClass();
            myDocument = repository.insert(myDocument);
            repository.delete(myDocument);
        };
    }
}
Make sure your document class has a field of the correct type (String by default), annotated with @Id, to map Mongo's _id field.

Related

How to avoid Spring Repository<T, ID> to leak persistence information into service tier

I'm using spring-data-mongodb at the moment, so this question is primarily in the context of MongoDB, but I suspect it applies to repository code in general.
Out of the box, when using a MongoRepository<T, ID> interface (or any other Repository<T, ID> descendant), the entity type T is expected to be the document type (the type that defines the document schema).
As a result, injecting such a repository into a service component means the repository leaks database schema information into the service tier (highly pseudo-code):
class MyModel {
    UUID id;
}

@Document
class MyDocument {
    @Id
    String id;
}

interface MyRepository extends MongoRepository<MyDocument, String> {
}

class MyService {
    MyRepository repository;

    MyModel getById(UUID id) {
        var documentId = convert(id, ...);
        var matchingDocument = repository.findById(documentId).orElse(...);
        var model = convert(matchingDocument, ...);
        return model;
    }
}
Whilst ideally I'd want to do this:

class MyModel {
    UUID id;
}

@Document
class MyDocument {
    @Id
    String id;
}

@Configuration
class MyMagicConversionConfig {
    ...
}

class MyDocumentToModelConverter implements Converter<MyDocument, MyModel> {
    ...
}

class MyModelToDocumentConverter implements Converter<MyModel, MyDocument> {
    ...
}

// Note that the model and the model's ID type are used in the repository declaration
interface MyRepository extends MongoRepository<MyModel, UUID> {
}

class MyService {
    MyRepository repository;

    MyModel getById(UUID id) {
        // Repository now returns the model because it was converted upstream
        // by the mongo persistence layer.
        var matchingModel = repository.findById(id).orElse(...);
        return matchingModel;
    }
}
Defining this conversion once seems significantly more practical than having to do it consistently throughout your service code, so I suspect I'm just missing something.
But of course this requires some way to inform the mongo mapping layer of what conversion has to be applied to move between MyModel and MyDocument, and to use the latter as its actual source of mapping metadata (e.g. @Document, @Id, etc.).
I've been fiddling with custom converters but I just can't seem to make the MongoDB mapping component do the above.
My two questions are:
Is it currently possible to define custom converters or implement callbacks that allow me to define and implement this model <-> document conversion once and abstract it away from my service tier?
If not, what is the idiomatic way to clean this up so that the service layer can stay blissfully unaware of how, or with what schema, an entity is persisted? A lot of Spring Boot codebases appear to be fine with using the type that defines the database schema as their model, but that seems suboptimal. Suggestions welcome!
Thanks!
I think you're blowing things a bit out of proportion. The service layer is not aware of the schema. It is aware of the types returned by the repository. How the properties of those are mapped onto the schema depends on the object-document mapping. This, by default, uses the property name, as that's the most straightforward thing to do. That translation can be customized either with annotations on the document type or by registering a FieldNamingStrategy with Spring Data MongoDB.
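For illustration, a hedged sketch of that second option: registering the SnakeCaseFieldNamingStrategy that ships with Spring Data, with a hypothetical database name:

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mapping.model.FieldNamingStrategy;
import org.springframework.data.mapping.model.SnakeCaseFieldNamingStrategy;
import org.springframework.data.mongodb.config.AbstractMongoClientConfiguration;

@Configuration
public class MongoNamingConfig extends AbstractMongoClientConfiguration {

    @Override
    protected String getDatabaseName() {
        return "mydb"; // hypothetical database name
    }

    // Persist a Java property "firstName" as the field "first_name"
    // without annotating every field individually
    @Override
    protected FieldNamingStrategy fieldNamingStrategy() {
        return new SnakeCaseFieldNamingStrategy();
    }
}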
Spring Data MongoDB's object-document mapping subsystem provides a lot of customization hooks that allow transforming arbitrary MongoDB documents into entities. The types which the repositories return are your domain objects that, again only by default, are mapped onto a MongoDB document 1:1, simply because that's the most reasonable thing to do in the first place.
If really in doubt, you can manually implement individual repository methods using the MongoTemplate API, which lets you explicitly define the type the data should be projected into.
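A hedged sketch of that last suggestion: a hand-written repository method that queries MyDocument via MongoTemplate but returns MyModel, so the service tier never sees the document type (the field mapping in toModel is an assumption):

import java.util.Optional;
import java.util.UUID;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Repository;

@Repository
public class MyModelRepository {

    private final MongoTemplate mongoTemplate;

    public MyModelRepository(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    public Optional<MyModel> findById(UUID id) {
        // Load the schema-bound document type, then convert before returning
        MyDocument document = mongoTemplate.findById(id.toString(), MyDocument.class);
        return Optional.ofNullable(document).map(this::toModel);
    }

    private MyModel toModel(MyDocument document) {
        MyModel model = new MyModel();
        model.id = UUID.fromString(document.id); // assumed field mapping
        return model;
    }
}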
You can use something like MapStruct or write your own Singleton Mapper.
Then create default methods in your repository:
interface DogRepository extends MongoRepository<DogDocument, String> {

    default DogModel dogById(String id) {
        // findById comes from CrudRepository and returns Optional<DogDocument>
        return findById(id)
                .map(DogMapper.INSTANCE::toModel)
                .orElse(null);
    }
}
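For completeness, a minimal sketch of the DogMapper referenced above, written with MapStruct and assuming DogDocument and DogModel properties line up by name:

import org.mapstruct.Mapper;
import org.mapstruct.factory.Mappers;

@Mapper
public interface DogMapper {

    // Singleton instance for use outside of dependency injection
    DogMapper INSTANCE = Mappers.getMapper(DogMapper.class);

    DogModel toModel(DogDocument document);
}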

Spring Mongo - set custom collection on entity implementing interfaces

Given a mongo entity class:
@Data
@Document(collection = "#{ T(de.axa.services.ecm.contentmanagement.service.archivbearbeiten.util.ServiceUtils).getMongoCollectionPreFix() }_allgarchiv")
public class AllgArchivReference implements Kleinsparte {
My aim is to store this entity in a collection "xxx_allgarchiv", where "xxx" is based on the runtime environment.
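(For context, a hypothetical sketch of the static helper the SpEL expression calls; the environment variable name is an assumption:)

public final class ServiceUtils {

    private ServiceUtils() {
    }

    public static String getMongoCollectionPreFix() {
        // e.g. "dev", "test" or "prod", resolved from the runtime environment
        return System.getenv().getOrDefault("STAGE", "dev");
    }
}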
I have one CRUD repository:
@Repository
public interface ReferenceRepository extends MongoRepository<Kleinsparte, String> {
}
and the simple implementation:
result = referenceRepository.save(kleinsparte);
Now I want to store the AllgArchivReference in the collection above ("xxx_allgarchiv").
Unfortunately, the entity is stored in a newly created collection "kleinsparte", which is not what I want.
How can I force Spring Data to store AllgArchivReference in the eponymous collection without creating a new CRUD repository of its own?
By the way, there are more than 20 other Reference entities which also implement the interface "Kleinsparte", and I don't want to create a separate CRUD repository interface for each entity.
Is this possible? Any help is really appreciated.
Kind regards,
Bodo

@Autowired beans and @Value properties after ObjectMapper deserializes JSON

I am using the Spring framework.
I am using ObjectMapper to deserialize the store.json file:
service:
objectMapper.readValue(new File(jsonFilePath), Store.class)
store.json:
{
    "type": "Store",
    "name": "myStore"
}
Store.class:
@Value("${store.size:1000}")
private Integer storeSize;

@Autowired
private StorePersistency storePersistency;

public Store(@JsonProperty("name") String name) {
    super(name);
}
I am trying to find out how to autowire beans and inject @Value properties in Store.class, using beans and properties that exist in the applicationContext.
In the current example, storeSize and storePersistency are still null.
I know that I can inject fields into the object mapper and then use the @JacksonInject annotation, but I have a lot of fields to inject, so that is not a good option for me.
A custom deserializer is also not a good option for me.
Is there any way to avoid a custom deserializer and avoid injecting every single bean/property that I need in Store.class?
Something that injects all the beans and properties so that I can simply use them in Store.class.
So you want some Store fields, like storePersistency and storeSize, to be initialized once at application startup (which is when Spring sets up the application context), and then at runtime to create multiple different Store objects, differing in fields such as name, that are initialized by Jackson.
I suggest annotating Store with @Component to get Spring to initialize the @Value and @Autowired fields. The @Scope annotation will cause a new independent Store instance to be created each time. Simplified example:
@Component
@Scope(SCOPE_PROTOTYPE)
class Store {

    private String name;

    @Value("${store.size:1000}")
    private Integer storeSize;
}
Then the key is the readerForUpdating method, where you can pass an existing instance of Store, and Jackson will update it instead of creating a new one as usual:
Store store = context.getBean(Store.class);
objectMapper.readerForUpdating(store).readValue("{\"name\":\"myStore\"}");
Where context is a Spring ApplicationContext reference that I autowired in a test class. You don't need to use the return value of readValue in this case; just inspect the existing store variable, and name will be updated.

How to retrieve objects when using inheritance in Spring Data

Say I have a class structure as follows; it is pretty basic inheritance:

class Manager extends Person {
    private String name;

    Manager() {
    }
}

class Clerk extends Person {
    private String salary;
}
In Spring Data, if I store these in Mongo, is it possible to configure it to map to the correct class when I do a getById? I assume I will have to store some class info?
What I don't want to do is create separate repository classes if I can avoid it; also, I don't know what the object will be when I do a getById.
If you are using a spring-data-mongodb MongoRepository to write data to your database according to your entity model, a _class field will be added to document roots and to complex property types (see this section). This field stores the fully qualified name of the Java class and allows disambiguation when mapping from a MongoDB Document to the Spring data model.
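For illustration, a saved Manager would end up stored roughly as { "_id": ObjectId("..."), "name": "Alice", "_class": "com.example.Manager" } (package name assumed); the _class value is what lets Spring Data instantiate the correct subclass on read.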
However, if you only use MongoRepository to read from your database, you need to tell Spring Data how to map your entities explicitly. You will need to Override Mapping with Explicit Converters.
PersonReadConverter.class
public class PersonReadConverter implements Converter<Document, Person> {

    @Override
    public Person convert(Document source) {
        if (source.get("attribute_specific_to_Clerk") != null) {
            Clerk clerk = new Clerk();
            // Set attributes using setters or a defined constructor
            return clerk;
        } else {
            Manager manager = new Manager();
            // Set attributes using setters or a defined constructor
            return manager;
        }
    }
}
Then, you have to Register Spring Converters with the MongoConverter.
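A hedged sketch of that registration step; in recent Spring Data MongoDB versions, exposing a MongoCustomConversions bean is typically enough:

import java.util.List;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.convert.MongoCustomConversions;

@Configuration
public class MongoConversionConfig {

    @Bean
    public MongoCustomConversions customConversions() {
        // Register the read converter defined above
        return new MongoCustomConversions(List.of(new PersonReadConverter()));
    }
}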
You can find an example of my own at: Spring Data Mongo - How to map inherited POJO entities?

How to set @CreatedDate in the past (for testing)

My spring-data-jpa backend has a class that populates the (test) database with a lot of test data. The class uses the Spring Data repositories to create entities. All my entities have a field annotated with @CreatedDate and the corresponding @EntityListeners(AuditingEntityListener.class) annotation on the model class. This works fine so far; dateCreated is automatically set correctly.
But when running JUnit tests I sometimes need to create a (test) object with a dateCreated in the past. How can I achieve this? Only via plain JDBC?
In case you are using Spring Boot, you can mock the dateTimeProvider bean referenced in the EnableJpaAuditing annotation. Spring Data uses this bean to obtain the current time at entity creation/modification.
@Import({TestDateTimeProvider.class})
@DataJpaTest
@EnableJpaAuditing(dateTimeProviderRef = "testDateTimeProvider")
public class SomeTest {

    @MockBean
    DateTimeProvider dateTimeProvider;

    ...
It is necessary to define the actual testDateTimeProvider bean, but it won't be used at all, as the mock is used instead.
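A minimal sketch of what that placeholder bean definition might look like; the class name matches the @Import above, and the returned value is irrelevant because the mock replaces it:

import java.time.LocalDateTime;
import java.util.Optional;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.data.auditing.DateTimeProvider;

@TestConfiguration
public class TestDateTimeProvider {

    @Bean
    public DateTimeProvider testDateTimeProvider() {
        // Never actually invoked; @MockBean substitutes this bean in tests
        return () -> Optional.of(LocalDateTime.now());
    }
}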
You can then write Mockito stubs as usual:
@Test
public void shouldUseMockDate() {
    when(dateTimeProvider.getNow()).thenReturn(Optional.of(LocalDateTime.of(2020, 2, 2, 0, 0, 0)));
    // ... actual test assertions ...
}
I found a way that works for me (using plain JDBC):
First I create my domain objects for testing with spring-data-jpa:
MyModel savedModel = myRepo.save(myModel);
That automatically fills dateCreated with a timestamp of "now". Since I need creation dates in the past for testing, I manually tweak them with plain JDBC:
@Autowired
JdbcTemplate jdbcTemplate;

[...]

// This syntax is for H2 DB. For MySQL you need to use DATE_ADD
String sql = "UPDATE myTable SET created_at = DATEADD('DAY', -" + ageInDays + ", NOW()) WHERE id='" + savedModel.getId() + "'";
jdbcTemplate.execute(sql);
// Use a long literal to avoid int overflow for larger ageInDays values
savedModel.setCreatedAt(new Date(System.currentTimeMillis() - ageInDays * 3600L * 24 * 1000));
Do not forget to also call setCreatedAt on the returned model object (last line above), so the in-memory instance matches the database.
I do not know the exact test case you are using, but I can see a few solutions:
- create a mock object; once dateCreated is called, return a date from a month ago
- maybe use an in-memory DB, populated with the dates before the test
- go with AOP, from the link provided in the comments
