I'm using Jackson2RepositoriesPopulatorFactoryBean to populate my database from JSON files.
It works perfectly, but it fails to find a repository for objects that are subclasses of the repository's domain type (I have some objects inheriting from an abstract one).
'An exception occurred while running. null: InvocationTargetException:
No repository found for domain type: x.y.z'
I investigated and found that the populator looks up the repository by the concrete class of the object.
My question is: is it possible to change that? (And have it fall back to the parent's repository if the lookup with the actual class fails?)
EDIT:
A solution could be to add a repository for each class in a package to the list of repositories in the Spring context...
How can I do that without adding a @RepositoryRestResource interface for each of them?
Well, it seems I found a solution: put @Document on the parent class instead of on each child.
This avoids creating one collection per child.
On top of that, I added one repository (@Repository) per child... not the best way to do it, but it is a solution.
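For illustration, a minimal sketch of that arrangement, assuming Spring Data MongoDB (which the @Document annotation suggests) and purely made-up class names (Parent, ChildA):

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import org.springframework.stereotype.Repository;

// @Document only on the abstract parent, so all children share one collection.
@Document(collection = "parents")
public abstract class Parent {
    @Id
    private String id;
    // common fields...
}

public class ChildA extends Parent {
    // child-specific fields...
}

// One repository per concrete child, so a lookup by the object's class succeeds.
@Repository
public interface ChildARepository extends MongoRepository<ChildA, String> { }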
I have a project that uses Spring. The project consists of two different parts, the generic part and the specific one. The generic part is compiled as a .jar, defines a set of traits, and is used as a dependency by the specific part, which is the one that implements the methods.
In order to test the generic part, I created a "fake" implementation of one of the traits (let's say "fakeMethodA") under the test directory of the generic project, and I annotated this fake implementation with @Component. I'm getting the beans from the application context.
The problem comes when I try to use this generic part in the specific project. Since my actual implementation of this trait (let's say "methodAImplementation") also has a @Component annotation, when I run my tests I get:
org.springframework.beans.factory.NoUniqueBeanDefinitionException
expected single matching bean but found 2:
It finds the fakeMethodA from the generic part and methodAImplementation from the implementation. Is there any way to exclude this "fake" implementation from the execution? Is there a better way to define this?
Any help would be greatly appreciated.
The problem was solved by using the @Profile annotation.
I annotated the fake implementation in the tests with:
@Profile(value = Array("Test"))
and the real implementation with another profile value. After that, when I select the bean from the context, I can select the correct profile.
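For reference, a minimal Java sketch of the same mechanism (the code above is Scala, but the annotations work the same way); the MethodA interface and the class bodies are made up here:

import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;

// Assumed trait/interface from the generic part (each type in its own file).
public interface MethodA {
    String methodA();
}

// Fake implementation, only active when the "Test" profile is enabled.
@Component
@Profile("Test")
public class FakeMethodA implements MethodA {
    public String methodA() { return "fake"; }
}

// Real implementation, active under a different profile.
@Component
@Profile("Production")
public class MethodAImplementation implements MethodA {
    public String methodA() { return "real"; }
}

A test can then activate the profile with @ActiveProfiles("Test") on the test class, or via the spring.profiles.active property when running the application.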
Is there a way to determine which repository method is being called?
I have a Parent entity with a @OneToMany association to a Child entity. In the HATEOAS output there is the /parent/{id}/child link, through which I can reach the children. I want to know which method returns the children so I can override it in the ChildRepository. I am having difficulty determining that and have tried several ways to do so.
Is /parent/{id}/child served by the ChildRepository, or is it handled by the ParentRepository?
I have been at this for several hours and cannot find an answer on Google or SOF; maybe someone can help me with the answer or with asking the right question.
Thank you in advance!
If you need to implement custom logic in Spring Data REST, you can try to use:
Projections and Excerpts, even in the output of repository query methods and for building DTO objects (link); a sketch follows this list
Custom handlers
HATEOAS resource processors (example1, example2, example3)
RepositoryRestControllers
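For the first option, a minimal sketch of an excerpt projection, assuming a hypothetical Child entity, field, and repository (none of these names come from the question):

import org.springframework.data.repository.CrudRepository;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;
import org.springframework.data.rest.core.config.Projection;

// Projection exposing only selected fields of Child in REST responses.
@Projection(name = "childSummary", types = { Child.class })
public interface ChildSummary {
    String getName();   // assumed field on Child
}

// Registering the projection as the repository's excerpt makes collection
// listings of Child render through it by default.
@RepositoryRestResource(excerptProjection = ChildSummary.class)
public interface ChildRepository extends CrudRepository<Child, Long> { }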
I'm hoping to make some calls to Solr using Alfresco's org.alfresco.repo.search.impl.solr.SolrAdminHTTPClient class. However, that bean, search.solrAdminHTTPCLient, does not seem to be accessible to me from the standard application context. Attempting to add a dependency and property reference for my own bean (via XML) has failed as well. Any reason this is not accessible?
public class MyClass extends DeclarativeWebScript implements ApplicationContextAware{
...
SolrAdminHTTPClient adminClient = (SolrAdminHTTPClient) appContext.getBean("search.solrAdminHTTPCLient");
Would like to avoid creating my own clients for standard solr admin queries.
Judging by the folder tree leading to this file, I would say that bean is defined in the search subsystem, which means it lives in a completely different context (a child context, in fact).
So you need to look up that context first, before trying to retrieve your bean!
UPDATE: I have done some digging, and I guess that your window into that child context is this particular bean.
So I think you can do the following:
SwitchableApplicationContextFactory search = (SwitchableApplicationContextFactory)applicationContext.getBean("Search");
ApplicationContext searchCtx = search.getApplicationContext();
SolrAdminHTTPClient adminClient = (SolrAdminHTTPClient) searchCtx.getBean("search.solrAdminHTTPCLient");
A friend from the IRC channel has, however, suggested an alternative solution:
Set up a separate ChildApplicationContextFactory for each and every bean you wish to access in your child context; he suggested you get some inspiration from this.
I have two Maven projects, one called project-data and the other one called project-rest, which has a dependency on the project-data project.
The Maven build is successful in the project-data project but it fails in the project-rest project, with the exception:
Caused by: org.hibernate.DuplicateMappingException: duplicate import: TemplatePageTag refers to both com.thalasoft.learnintouch.data.jpa.domain.TemplatePageTag and com.thalasoft.learnintouch.data.dao.domain.TemplatePageTag (try using auto-import="false")
I could see some explanation here: http://isolasoftware.it/2011/10/14/hibernate-and-jpa-error-duplicate-import-try-using-auto-importfalse/
What I don't understand is why this message does not occur when building the project-data project, but does occur when building the project-rest project.
I looked in the pom.xml files to see if there was something there that could explain the issue.
I also looked at the way the tests are configured and run in the project-rest project.
But I haven't found anything yet.
The error is basically due to the fact that the sessionFactory bean is backed by two entities with the same logical name, TemplatePageTag:
One lies under the com.thalasoft.learnintouch.data.jpa.domain package.
The other under the com.thalasoft.learnintouch.data.dao.domain.
Since this is an unusual situation, Hibernate complains about it, mostly because you could run into problems when executing HQL queries (which are entity-oriented queries) and get inconsistent results.
As a solution, you need to do one of the following:
Rename your entity classes with different names to avoid the clash, which I assume is not suitable in your case since it would require a lot of refactoring and could hurt your project's compatibility.
Configure your entities to be resolved under different logical names. Since one entity is mapped through XML and the other through annotations, the way to define the entity name differs:
For the com.thalasoft.learnintouch.data.jpa.domain.TemplatePageTag entity, you will need to add the name attribute to the @Entity annotation, as below:
@Entity(name = "TemplatePageTag_1")
public class TemplatePageTag extends AbstractEntity {
    // ...
}
For com.thalasoft.learnintouch.data.dao.domain.TemplatePageTag, since it is mapped via an hbm.xml declaration, you will need to add the entity-name attribute to your class element, as follows:
<hibernate-mapping>
    <class name="com.thalasoft.learnintouch.data.dao.domain.TemplatePageTag"
           table="template_page_tag"
           entity-name="TemplatePageTag_2"
           dynamic-insert="true"
           dynamic-update="true">
        <!-- other attributes declaration -->
    </class>
</hibernate-mapping>
Having taken a deeper look at your project structure, you may also need to fix the entity names of other beans, since you have followed the same scheme for many other classes, such as com.thalasoft.learnintouch.data.jpa.domain.AdminModule and com.thalasoft.learnintouch.data.dao.domain.AdminModule.
This issue can also be fixed by using a combination of the @Entity and @Table annotations. The link below gives a good explanation of the difference between the two.
difference between name-attribute-in-entity-and-table
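A short sketch of that combination for the annotation-mapped entity (the logical name here is just an example):

import javax.persistence.Entity;
import javax.persistence.Table;

// The name attribute sets the logical entity name used in HQL/JPQL queries,
// while @Table keeps the physical table name unchanged.
@Entity(name = "JpaTemplatePageTag")
@Table(name = "template_page_tag")
public class TemplatePageTag extends AbstractEntity {
    // ...
}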
I want to be able to set default values for some fields in my domain classes.
Until now I had a class which stored a Map of settings for my whole project, with the plan to move this map into a Redis database.
The day has come: I moved all the data to Redis and created a nice Spring bean to get/set the values.
However...
it seems that the default values are set on the domain class instance before the bean is injected.
This kind of breaks the whole process.
Also... there's an issue with unit tests.
I've created a class which implements the same interface as the Spring bean and holds test values. I wanted to inject it into the domain classes, but this fails as well.
So right now I'm trying to find a good way to handle externally stored default values for my domain classes, with the ability to run unit tests.
Any thoughts?
There are a few different approaches you could take:
Introduce a separate bean with the default values so that those are supplied in the same way as they were before. In a separate, higher-level context, or later on in application startup, you could then override the bean definition with one that pulls from the database.
Use a BeanPostProcessor or BeanFactoryPostProcessor to supply the default values, then use your new bean for retrieving the current values (see the sketch after this list).
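A rough sketch of the BeanPostProcessor idea, assuming a hypothetical HasDefaults interface that your classes implement; none of these names come from your code:

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.stereotype.Component;

// Hypothetical marker interface for anything that should receive default values.
public interface HasDefaults {
    void applyDefaults();
}

// Applies defaults to any bean implementing HasDefaults before it finishes initializing.
@Component
public class DefaultValuesPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        if (bean instanceof HasDefaults) {
            ((HasDefaults) bean).applyDefaults();
        }
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }
}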
If neither of these answers is helpful, please post your setup and example code so I can get a clearer picture of what you're trying to do.
What I did in the end:
I've created a class which connects to Redis and gets me all the data I require.
For unit testing I've created a copy of this class; it implements the same interface, but instead of getting the data from Redis it has a simple Map inside and gets the data from there. In the end it acts the same, but the data is stored internally. So in my unit tests I just inject this unit-test version of the class where appropriate.
Probably not the best solution there is but it worked for me for the last few months.
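To make that concrete, a sketch of what the pair can look like; the interface and class names are made up, and the Redis access uses Spring Data Redis' StringRedisTemplate:

import java.util.HashMap;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.StringRedisTemplate;
import org.springframework.stereotype.Component;

// Shared interface for looking up settings/default values.
public interface SettingsProvider {
    String get(String key);
}

// Production implementation backed by Redis.
@Component
public class RedisSettingsProvider implements SettingsProvider {

    private final StringRedisTemplate redis;

    @Autowired
    public RedisSettingsProvider(StringRedisTemplate redis) {
        this.redis = redis;
    }

    @Override
    public String get(String key) {
        return redis.opsForValue().get(key);
    }
}

// Unit-test double: same interface, values kept in a plain in-memory Map.
public class InMemorySettingsProvider implements SettingsProvider {

    private final Map<String, String> values = new HashMap<>();

    public void put(String key, String value) {
        values.put(key, value);
    }

    @Override
    public String get(String key) {
        return values.get(key);
    }
}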