MappingInstantiationException upon retrieving multi-dim array of doubles from mongo in spring

I am building a Spring MVC app with MongoDB. How can I read matrices from Mongo in Spring? I have a model which persists to Mongo just fine using the MongoTemplate class:
Matrix m = new Matrix();
m.setId(UUID.randomUUID().toString());
m.setValues(values);
mongoTemplate.insert(m, "matrix");
The above code works just fine. values is a double[][] and it is persisted. I am using an extension of the MongoRepository interface to make a findAll() call for a list of matrices.
public interface MatrixRepository extends MongoRepository<Matrix, String> {
    Matrix findById(String id);
}
And in my service class:
public List<Matrix> readAll() {
    return matrixRepository.findAll();
}
Calling this causes the following stack trace:
org.springframework.data.mapping.model.MappingInstantiationException: Could not instantiate bean class [java.lang.Double]: No default constructor found; nested exception is java.lang.NoSuchMethodException: java.lang.Double.<init>()
org.springframework.data.mapping.model.BeanWrapper.<init>(BeanWrapper.java:105)
org.springframework.data.mapping.model.BeanWrapper.create(BeanWrapper.java:73)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:239)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:187)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.readCollectionOrArray(MappingMongoConverter.java:736)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.getValueInternal(MappingMongoConverter.java:695)
org.springframework.data.mongodb.core.convert.MappingMongoConverter$2.doWithPersistentProperty(MappingMongoConverter.java:252)
org.springframework.data.mongodb.core.convert.MappingMongoConverter$2.doWithPersistentProperty(MappingMongoConverter.java:242)
org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:173)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:242)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:187)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:151)
org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:73)
org.springframework.data.mongodb.core.MongoTemplate$ReadDbObjectCallback.doWith(MongoTemplate.java:1693)
org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:1444)
org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1259)
org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1248)
org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:471)
org.springframework.data.mongodb.repository.support.SimpleMongoRepository.findAll(SimpleMongoRepository.java:255)
org.springframework.data.mongodb.repository.support.SimpleMongoRepository.findAll(SimpleMongoRepository.java:192)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

This happens if your Mongo entity class has Double or double fields when using Spring Data MongoDB 1.0.0.M5.
You can fix it by upgrading Spring Data MongoDB to a newer version, such as 1.3.3.RELEASE, in your pom.xml.
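For illustration, the dependency entry in pom.xml would look roughly like this (the group and artifact IDs below are the standard Spring Data MongoDB Maven coordinates; adjust the version to whatever release you target):
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-mongodb</artifactId>
    <version>1.3.3.RELEASE</version>
</dependency>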

Related

Spring Boot + MongoDB + create collection

I have a problem. After creating a Spring Boot project with Eclipse and configuring the application.properties file, my collections are not created, even though after execution the Eclipse console shows that the connection to MongoDB was established normally. I don't understand what's going on. With MySQL the tables were created, so I expected the collections to be created too, but nothing.
In summary, I don't see my collection (a class annotated with @Document) in MongoDB after deployment.
A new collection won't be created until you insert at least one document. Refer to the documentation: Create Collection.
You could do this in two ways through Spring. I tested the instructions below in Spring 2.1.7.
With just a @Document class, Spring will not create a collection in Mongo. It will, however, create a collection if you do the following:
You have a field you want to index in the collection, and you annotate it as such in the Java class, e.g.
@Indexed(unique = true)
private String indexedData;
Create a repository for the collection:
public interface MyClassRepository extends MongoRepository<MyClass, String> {
}
If you don't need/want an index, the second way of doing this would be to add some code that runs at startup, adds a dummy value in the collection and deletes it again.
@Configuration
public class LoadDatabase {

    @Bean
    CommandLineRunner initDb(MyClassRepository repository) {
        // insert and immediately delete a dummy document so the collection gets created
        return args -> {
            // create an instance of your @Document annotated class
            MyClass myDocument = new MyClass();
            myDocument = repository.insert(myDocument);
            repository.delete(myDocument);
        };
    }
}
Make sure your document class has a field of the correct type (String by default), annotated with @Id, to map Mongo's _id field.
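For illustration, a minimal sketch of such a document class combining the points above (class and field names are just the placeholders used in the snippets):
@Document
public class MyClass {

    @Id
    private String id; // maps to Mongo's _id field

    @Indexed(unique = true)
    private String indexedData;

    // getters and setters omitted
}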

Spring Boot WebFlux Converter

I am trying to migrate my project from Spring MVC to Spring WebFlux.
The repository I am currently using is ReactiveCrudRepository.
In order to achieve the post-redirect-get pattern, which I have used within Spring MVC, I need to rewrite the current converter to work with ReactiveCrudRepository.
I was trying to do that with this approach:
@Component
public class ObjByIdConverter implements Converter<String, Obj> {

    @Autowired
    private IObjRepository objRepository;

    @Override
    public Obj convert(String id) {
        return objRepository.findById(id).block();
    }
}
When I implement converter in this way, I am getting the following error:
block()/blockFirst()/blockLast() are blocking, which is not supported in thread reactor-http-xxx.
When I was using CrudRepository instead of ReactiveCrudRepository, everything worked fine.
Is there a way to implement the converter to work with ReactiveCrudRepository?
~~~ Edit 1 ~~~
The controller class:
@PostMapping
public Mono<String> processOrder(@ModelAttribute("newCar") Car car) {
    webDataBinder.validate();
    BindingResult bindingResult = webDataBinder.getBindingResult();
    if (bindingResult.hasErrors()) {
        return Mono.just("orderForm");
    }
    return this.carRepository.save(car).thenReturn("redirect:/");
}
The model class:
@Document(collection = "cars")
@ToString
@EqualsAndHashCode
public class Car {

    @Id
    private String id;
    private List<Obj> objs = new ArrayList<>();
    // constructor, getters, setters, ...
}
I am using the Thymeleaf view technology.
I have to provide the implementation for ObjByIdConverter because I am getting the following error message: [Failed to convert property value of type 'java.lang.String' to required type 'java.util.List' for property 'objs'; nested exception is java.lang.IllegalStateException: Cannot convert value of type 'java.lang.String' to required type 'com.example.app.model.Obj' for property 'objs[0]': no matching editors or conversion strategy found]
You should not use block() anywhere in reactive development. If you have a reactive repository and Spring WebFlux, use them together, passing Mono/Flux from the repository all the way to the controller, to leverage the reactive way of doing things.
But I think the main reason you are trying to convert the result to a plain type is the post-redirect-get pattern; could you detail this in the Spring controller context?
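As a rough sketch of that suggestion (assuming a String id and a hypothetical objId form field; not a drop-in replacement for the form binding above), the lookup can stay non-blocking by resolving the id inside the controller with the reactive repository instead of a blocking Converter:
@Autowired
private IObjRepository objRepository;

@PostMapping("/order")
public Mono<String> processOrder(ServerWebExchange exchange) {
    return exchange.getFormData()
            // "objId" is a hypothetical form field name, used only for illustration
            .map(form -> form.getFirst("objId"))
            // reactive lookup instead of a blocking Converter
            .flatMap(objRepository::findById)
            // ... apply the resolved Obj to the model object here ...
            .thenReturn("redirect:/");
}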

StateMachineRuntimePersister instantiation failing because Spring cannot find its dependent bean MongoDbStateMachineRepository

I am new to Spring state machines. I am trying to set up a state machine for my transaction data and externalize it to a Mongo database, but I am getting an error while creating the "StateMachineRuntimePersister" bean.
Error says - Parameter 0 of method mongoPersist in com.pws.funder.config.PersistConfig required a bean of type 'org.springframework.statemachine.data.mongodb.MongoDbStateMachineRepository' that could not be found
@Configuration
public class PersistConfig {

    @Bean(name = "runtime")
    public StateMachineRuntimePersister<WalletGatewayStates, WalletGatewayEvents, UUID> mongoPersist(
            MongoDbStateMachineRepository mongoRepository) {
        return new MongoDbPersistingStateMachineInterceptor<WalletGatewayStates, WalletGatewayEvents, UUID>(mongoRepository);
    }
}
Any leads would be helpful.
Just create an interface like this:
public interface StateMachineRepository extends MongoDbStateMachineRepository {
}
and pass it into the mongoPersist method.
Spring automatically creates an implementation from your repository interface and puts the bean in the context.
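Putting that together, a rough sketch of the adjusted configuration (assuming the new interface is picked up by Spring Data's repository scanning):
@Configuration
public class PersistConfig {

    @Bean(name = "runtime")
    public StateMachineRuntimePersister<WalletGatewayStates, WalletGatewayEvents, UUID> mongoPersist(
            StateMachineRepository mongoRepository) {
        // StateMachineRepository extends MongoDbStateMachineRepository,
        // so it can be passed straight to the Mongo persisting interceptor
        return new MongoDbPersistingStateMachineInterceptor<>(mongoRepository);
    }
}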

How to fix java.lang.IllegalStateException when using spring-data-neo4j

I have a simple test project where checking spring-data-neo4j with spring boot version: 2.1.0.RELEASE (https://github.com/tomkasp/neo4j-playground/blob/master/src/main/java/com/athleticspot/neo4jplayground/domain/AthleteRepository.java)
spring-data-neo4j (version: 5.1.4.RELEASE) dependency is injected by spring-boot-starter-data-neo4j.
My goal was to create a repository method which fetches data with Containing and IgnoreCase functionality. In order to do that, I've created the below method within the repository:
public interface AthleteRepository extends CrudRepository<Athlete, Long> {
    List<Athlete> findByNameContainingIgnoreCase(String name);
}
When I run the above method I'm getting:
java.lang.IllegalStateException: Unable to ignore case of java.lang.String types, the property 'name' must reference a String
at org.springframework.util.Assert.state(Assert.java:73) ~[spring-core-5.1.2.RELEASE.jar:5.1.2.RELEASE]
at org.springframework.data.neo4j.repository.query.filter.PropertyComparisonBuilder.applyCaseInsensitivityIfShouldIgnoreCase(PropertyComparisonBuilder.java:101) ~[spring-data-neo4j-5.1.2.RELEASE.jar:5.1.2.RELEASE]
Doesn't spring-data-neo4j support Containing and IgnoreCase together? Am I missing something?
At the moment it seems not possible, because the referenced org.springframework.data.neo4j.repository.query.filter.PropertyComparisonBuilder appears to allow ignoring case only for SIMPLE_PROPERTY (is, or equals). See the canIgnoreCase method in the same class:
private boolean canIgnoreCase(Part part) {
    return part.getType() == SIMPLE_PROPERTY && String.class.equals(part.getProperty().getLeafType());
}
A fix is coming with Spring Data Neo4j 5.2 (Moore): https://jira.spring.io/browse/DATAGRAPH-1190
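To illustrate what that check implies for derived queries (a sketch based on the canIgnoreCase snippet above): IgnoreCase is accepted on a plain equality property, but not in combination with Containing:
public interface AthleteRepository extends CrudRepository<Athlete, Long> {

    // SIMPLE_PROPERTY (equality) - IgnoreCase is accepted here
    List<Athlete> findByNameIgnoreCase(String name);

    // CONTAINING - IgnoreCase is rejected with the IllegalStateException above
    List<Athlete> findByNameContainingIgnoreCase(String name);
}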

java.lang.NullPointerException in Spring Boot [duplicate]

This question already has answers here:
Why is my Spring @Autowired field null?
(21 answers)
Closed 6 years ago.
I was able to use RestTemplate and autowire it. However, I want to move the RestTemplate-related part of my code into another class as follows:
public class Bridge {

    private final String BASE_URL = "http://localhost:8080/u";

    @Autowired
    RestTemplate restTemplate;

    public void addW() {
        Map<String, String> x = new HashMap<String, String>();
        W c = restTemplate.getForObject(BASE_URL + "/device/yeni", W.class, x);
        System.out.println("Here!");
    }
}
And in another class I call it:
...
Bridge wb = new Bridge();
wb.addW();
...
I am new to Spring and dependency injection. My restTemplate variable is null and throws an exception. What can I do to solve it? (I don't know whether it is related to my use of the new keyword.)
Using Bridge wb = new Bridge() does not work with dependency injection. Your restTemplate is not injected, because wb is not managed by Spring.
You have to make your Bridge a Spring bean itself, e.g. by annotation:
@Service
public class Bridge {
    // ...
}
or by bean declaration:
<bean id="bridge" class="Bridge"/>
Just to add further to Jeha's correct answer.
Currently, by doing
Bridge wb = new Bridge();
that object instance is not "Spring managed", i.e. Spring does not know anything about it. So how can it inject a dependency into an object it knows nothing about?
So, as Jeha said, add the @Service annotation, or specify the bean in your application context XML config file (or, if you are using Spring 3, in your @Configuration class).
Then, when the Spring context starts up, there will be a singleton (the default scope) instance of Bridge in the BeanFactory. Either inject it into your other Spring-managed objects, or pull it out manually, e.g.
Bridge wb = (Bridge) applicationContext.getBean("bridge"); // Name comes from the default of the class
Now it will have the dependencies wired in.
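For example, a minimal sketch of injecting the Spring-managed Bridge into another bean instead of creating it with new (the consuming class name is just a placeholder):
@Service
public class DeviceService {

    private final Bridge bridge;

    // constructor injection: Spring supplies the managed Bridge instance,
    // which already has its RestTemplate wired in
    public DeviceService(Bridge bridge) {
        this.bridge = bridge;
    }

    public void registerDevice() {
        bridge.addW();
    }
}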
If you want to use the new operator and still have all dependencies injected, then rather than making this a Spring component (by annotating it with @Service), make it a @Configurable class.
This way, even though the object is instantiated with the new operator, its dependencies will be injected.
Some configuration is also required. A detailed explanation and sample project are here:
http://spring-framework-interoperability.blogspot.in/2012/07/spring-managed-components.html
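For illustration, a minimal sketch of the @Configurable approach; note it only takes effect once AspectJ weaving is set up (e.g. @EnableSpringConfigured plus load-time or compile-time weaving), which the linked post covers:
@Configurable
public class Bridge {

    @Autowired
    RestTemplate restTemplate;

    // instances created with `new Bridge()` get restTemplate injected,
    // provided AspectJ weaving and @EnableSpringConfigured are configured
}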
