I save my data to the database using Spring:
@RepositoryRestResource(collectionResourceRel = "operators", path = "operators")
public interface OperatorsRepository extends MongoRepository<Operator, String> {
}
and in main\resources\application.properties I have:
spring.data.mongodb.uri=mongodb://admin:password@myclusterurl/test?retryWrites=true&w=majority
In my config class I use:
@Bean
CommandLineRunner commandLineRunner(OperatorsRepository operatorsRepository) {
    return args -> operatorsRepository.save(myobjToSave);
}
Everything works fine and I can read the saved data over REST. But my problem is that in MongoDB Compass I don't see the created collections and data. Why? It is the same with the mongo shell and MongoDB Atlas.
Declaring a bean doesn't mean it is automatically executed. If you want to create a new collection from, let's say, a JSON file in src/main/resources (or test), then you have to trigger the call of this method somehow.
I suggest using the @PostConstruct annotation, which triggers once upon object creation. Since you want to create data using the OperatorsRepository, I'd use it in a @Service class that injects that repository:
@PostConstruct
void createData() {
    this.operatorsRepository.save(myobjToSave);
}
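For instance, a minimal sketch of such a service, assuming constructor injection of the repository from the question (the class name and the saved Operator instance are illustrative):
@Service
public class OperatorDataInitializer {

    private final OperatorsRepository operatorsRepository;

    OperatorDataInitializer(OperatorsRepository operatorsRepository) {
        this.operatorsRepository = operatorsRepository;
    }

    @PostConstruct
    void createData() {
        // Runs once, after the bean has been constructed and its dependencies injected
        this.operatorsRepository.save(new Operator(/* illustrative: set your fields here */));
    }
}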
Try this
spring.data.mongodb.uri=mongodb://xxxxxxxxxxxxxxxxxxx
spring.autoconfigure.exclude=org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration
I'm migrating a legacy application from Spring Core 4 to Spring Boot 2.5.2.
The application is using spring-data-rest (SDR) alongside spring-data-mongodb to handle our entities.
The legacy code was overriding SDR configuration by extending the RepositoryRestMvcConfiguration and overriding the bean definition for persistentEntityJackson2Module to remove serializerModifier and deserializerModifier.
@EnableWebMvc
@EnableSpringDataWebSupport
@Configuration
class RepositoryConfiguration extends RepositoryRestMvcConfiguration {
    ...
    ...
    @Bean
    @Override
    protected Module persistentEntityJackson2Module() {
        // Remove the existing Ser/DeserializerModifier because Spring Data REST expects linked resources to be in href form. Our platform is not tailored for it yet
        return ConverterHelper.configureSimpleModule((SimpleModule) super.persistentEntityJackson2Module())
                .setDeserializerModifier(null)
                .setSerializerModifier(null);
    }
}
This was done to avoid having to pass DBRefs as href links when posting entities: we pass the plain POJO instead of the href and persist it manually before the owning entity.
Following the migration there is no longer a way to apply the same override, but to avoid altering all our creation processes we would like to keep passing the POJO even for DBRef properties.
I will add an example of what was working before.
We have the entity we want to persist:
public class EntityWithDbRefRelation {
    ....
    @Valid
    @CreateOnTheFly // Custom annotation to create the dbrefEntity before persisting the current entity
    @DBRef
    private MyDbRefEntity myDbRefEntity;
}
the DbRefEntity
public class MyDbRefEntity {
...
private String name;
}
and the JSON POST request we are doing:
POST base-api/entityWithDbRefRelations
{
...
"myDbRefEntity": {
"name": "My own dbRef entity"
}
}
In our database this request creates the myDbRefEntity and then creates the target entityWithDbRefRelation with a DBRef linked to the other entity.
Following the migration the DBRef is never created: when deserializing the JSON into the persisted entity, the myDbRefEntity is ignored because an href is expected instead of a complex object.
I see 3 solutions:
1. Modify all our processes to first create the DBRef entity through one request, then create our entity with the link to that DBRef. Very costly, as we have a lot of services creating entities through this backend, but compliant with SDR.
2. Define our own REST MVC controllers for these operations, bypassing the SDR mapping mechanism.
3. Add AOP into the RepositoryRestMvcConfiguration around persistentEntityJackson2Module to set the serializerModifier and deserializerModifier to null (a rough sketch follows this list). I would really prefer to avoid this solution, as Spring Boot must have removed the way to configure it on purpose, and it could break when migrating to a newer version.
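For completeness, a very rough sketch of option 3 without subclassing the configuration, using a plain BeanPostProcessor. It relies on the assumption that SDR still exposes the module as a SimpleModule bean named persistentEntityJackson2Module; if that is not the case in your Spring Boot version, this code will simply never fire:
@Component
public class PersistentEntityModuleCustomizer implements BeanPostProcessor {

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        // Assumption: the SDR Jackson module is still registered under this bean name
        if ("persistentEntityJackson2Module".equals(beanName) && bean instanceof SimpleModule) {
            ((SimpleModule) bean)
                    .setSerializerModifier(null)
                    .setDeserializerModifier(null);
        }
        return bean;
    }
}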
Does anyone know a way to keep treating the property as a complex object instead of an href link, other than my 3 previous points?
Tell me if you need more information and thanks in advance for your help!
I started a project on Spring Boot exposing a REST web service. When I shared it with my team, they put some comments on it:
get method need to be grouped Ex : get/users & get/users/{id} will be get/users/{id}
remove put method & just use post Ex: post/users/0 add | post/users/{id} update
make a helper class for Jdbc Template and call it in the repository classes to centralize the code
Please help me make sense of this, I'm so confused. Thank you.
get method need to be grouped Ex : get/users & get/users/{id} will be
get/users/{id}
I do not agree with this. GET /users will return a List<User> and GET /users/{id} will return the single User that matches {id}.
remove put method & just use post Ex: post/users/0 add |
post/users/{id} update
POST should be used when you create a new resource. POST is not idempotent: each time you call it, a new resource is created. E.g. calling POST /users will create a new User every time.
PUT, on the other hand, works like an upsert: create the resource if it is not present, and update/replace it if it is. PUT is idempotent, so calling it multiple times with the same payload leaves the resource in the same state as calling it once.
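To make the difference concrete, here is a rough Spring MVC sketch (UserController, UserService and their methods are made-up names for this illustration):
@RestController
@RequestMapping("/users")
public class UserController {

    private final UserService userService; // hypothetical service

    public UserController(UserService userService) {
        this.userService = userService;
    }

    // POST /users: creates a new user on every call (not idempotent)
    @PostMapping
    @ResponseStatus(HttpStatus.CREATED)
    public User create(@RequestBody User user) {
        return userService.create(user);
    }

    // PUT /users/{id}: creates or replaces the user with that id (idempotent)
    @PutMapping("/{id}")
    public User upsert(@PathVariable String id, @RequestBody User user) {
        return userService.upsert(id, user);
    }
}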
make a helper class for Jdbc Template and call it in the repository
classes to centralize the code
Helper classes help to separate the concerns and achieve single responsibility.
However, JdbcTemplate is already a ready-to-use abstraction over JDBC. I don't see any point in creating a helper for it. You can create a DataAccessObject (DAO) or Repository which has-a JdbcTemplate, like the two DAOs shown below:
public class UserDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public User findUserById(String id) {}
    public void addUser(User user) {}
}
// -------
public class BooksDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    public List<Book> getAllBooksByType(String type) {}
    public Book getBookByName(String name) {}
}
Now your DAO objects can be called from a Controller, or, if you need to modify data before/after the DB operation, it is best to have a Service layer between the Controller and the DAO, as sketched below.
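As a loose illustration of that layering (UserService is a made-up name; it simply delegates to the UserDao above):
@Service
public class UserService {

    @Autowired
    private UserDao userDao;

    public User getUser(String id) {
        // Validation, enrichment or transaction boundaries go here, around the DAO call
        return userDao.findUserById(id);
    }
}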
Don't worry too much about recommendations or rules. Stick to the basic OOP concepts; those are really easy to understand and implement.
Always:
Encapsulate data variables and methods working on those variables together
Make sure your class has a Single Responsibility
Write smaller and testable methods (if you can't write tests to cover your method, then something is wrong with your method)
Always keep the concerns separate
Make sure your objects are loosely coupled. (You are already using spring so just use the spring's auto-wiring)
I have a mapper class for an incoming Event message.
Once the event message comes to the application, the mapper class sets the values in the entity object and saves it in the Database.
I have Autowired the entity object in my mapper class.
Whenever a new event comes in, the autowired entity object still has the old/previous values.
Is autowiring a domain/entity object even possible in this case, or should I create it with the new keyword instead of autowiring it as a Spring bean?
I have seen some posts about using @Configurable. I am not sure which is the best coding practice in this case.
@Service
public class LegacyEventMapper {

    @Autowired
    private LegacyEvent legacyEvent;

    @Autowired
    private LegacyEntity legacyEntity;

    public void mapLegacyNotificationDetails(LegacyScheduleEvent body) throws Exception {
        // Setting the values into the Entity object
Thanks
I have no idea why you actually want to @Autowire an @Entity and make it Spring-aware. This is wrong. You can do it, but it makes absolutely no sense.
What you actually want to do is create a new LegacyEntity (via new LegacyEntity()) and save that instance to the DB.
What you have read about @Configurable is the other way around: it injects a Spring bean/service into an entity.
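In practice that usually looks something like the sketch below; LegacyEntityRepository is a hypothetical Spring Data repository, so substitute whatever persistence mechanism you actually use:
@Service
public class LegacyEventMapper {

    @Autowired
    private LegacyEntityRepository legacyEntityRepository; // hypothetical repository for LegacyEntity

    public void mapLegacyNotificationDetails(LegacyScheduleEvent body) {
        // Fresh instance per incoming event instead of a shared, autowired one
        LegacyEntity legacyEntity = new LegacyEntity();
        // ... copy the values from body onto legacyEntity ...
        legacyEntityRepository.save(legacyEntity);
    }
}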
I think we can @Autowire an @Entity class, but then we need to declare in the entity class that it is request scoped:
@Entity
@Scope(scopeName = WebApplicationContext.SCOPE_REQUEST, proxyMode = ScopedProxyMode.TARGET_CLASS)
public class LegacyEntity {
I am not sure if using the new keyword is the better approach instead of autowiring an entity class.
I've been struggling for the past week to successfully integrate Spring Data MongoDB into our application. We use the fairly common practice of having a separate database for each collection that we rely on. For instance, the TenantConfiguration database contains only the TenantConfigurations collection.
I've read through the documentation several times and trawled through the code for a solution but have turned up nothing. Surely such a widely adopted project has some solution for this issue? My current attempt looks like this:
@Configuration
@EnableMongoRepositories(basePackages = "com.whatever.service.repository",
        basePackageClasses = TenantConfigurationRepository.class,
        mongoTemplateRef = "tenantConfigurationTemplate")
public class TenantConfigurationRepositoryConfig {

    @Value("${mongo.hosts}")
    private List<String> mongoHosts;

    @Bean
    public MongoTemplate tenantConfigurationTemplate() throws Exception {
        final List<ServerAddress> serverAddresses = new ArrayList<>();
        for (String host : mongoHosts) {
            serverAddresses.add(new ServerAddress(host, 27017));
        }

        final MongoClientOptions clientOptions = new MongoClientOptions.Builder()
                .connectTimeout(25000)
                .readPreference(ReadPreference.primaryPreferred())
                .build();

        final MongoClient client = new MongoClient(serverAddresses, clientOptions);
        return new MongoTemplate(client, "TenantConfiguration");
    }
}
Here is one of the other individual repository configurations:
@Configuration
@EnableMongoRepositories(basePackages = "com.whatever.service.repository",
        basePackageClasses = RegisteredCardRepository.class,
        mongoTemplateRef = "registeredCardTemplate")
public class RegisteredCardRepositoryConfig {

    @Value("${mongo.hosts}")
    private List<String> mongoHosts;

    @Bean
    public MongoTemplate registeredCardTemplate() throws Exception {
        final List<ServerAddress> serverAddresses = new ArrayList<>();
        for (String host : mongoHosts) {
            serverAddresses.add(new ServerAddress(host, 27017));
        }

        final MongoClientOptions clientOptions = new MongoClientOptions.Builder()
                .connectTimeout(25000)
                .readPreference(ReadPreference.primaryPreferred())
                .build();

        final MongoClient client = new MongoClient(serverAddresses, clientOptions);
        return new MongoTemplate(client, "RegisteredCard");
    }
}
Now here is the actual repository definition for the RegisteredCard repository:
@Repository
public interface RegisteredCardRepository extends MongoRepository<RegisteredCard, Guid>,
        QueryDslPredicateExecutor<RegisteredCard> { }
This all makes perfect sense to me: the individual configurations uniquely identify the specific repository interfaces they configure and the specific template bean to use with that repository, via the mongoTemplateRef parameter of the annotation. At least, this is how the documentation seems to imply it should work.
In reality, when I start up the application, the RegisteredCard repository resolves to a MongoDB repository instance with an associated MongoDbFactory that is bound to the TenantConfiguration database. In fact, every single repository receives the same, incorrect MongoOperations object. Despite each repository having its own unique configuration, it appears that whatever database is accessed first remains the target database for every repository.
Are there any solutions available to this problem?
It's taken me almost a week, but I've actually found a passable solution to this issue. Here's a quick run-down of facts I've picked up while researching this issue:
@EnableMongoRepositories(basePackageClasses = Whatever.class) simply uses a qualified class name to indicate what package it should scan for all of your defined data models. This is entirely equivalent to doing @EnableMongoRepositories(basePackages = "com.mypackage.whatevers") if Whatever.class resides in that package.
@EnableMongoRepositories is not repeatable but can be used to annotate several classes. This has been covered in other SO conversations but bears repeating here. You will need to define several repository configuration classes, one for each database you intend to interact with.
Each of your individual repository configurations must specify its own MongoTemplate instance in the @EnableMongoRepositories annotation. You can get away with providing only a single Mongo bean, but the MongoTemplate relies on a specific MongoMappingContext.
The @EnableMongoRepositories annotation helps define your mapping context, which understands the structure of your data models and how to serialize them. It also understands the @Document and @Field annotations and does the heavy lifting of persisting your objects. The MongoTemplate instances are where you specify what database you want to interact with. So by providing the @EnableMongoRepositories annotation with both a basePackages attribute and a mongoTemplateRef attribute, you can tell Spring Data MongoDB to "take these models and persist them in this specific database".
The unfortunate requirement for this solution is that you must organize your data models into separate packages depending on what database they belong in. If, like me, you are using a Mongo database structure that allocates a single collection to each database (this is fairly common for heavily accessed collections), this means that each of your data models must reside in its own package. Each of these packages must be pointed to by an @EnableMongoRepositories annotation that also contains a mongoTemplateRef attribute referring to a unique MongoTemplate bean.
I hope this helps someone avoid the trouble I've gone through trying to accomplish what should be a fairly run-of-the-mill Mongo integration.
PS: Abandon all hope, those who seek to combine auditing with this configuration.
I know this is old but for those who are looking for a short solution like me:
@Autowired
@Qualifier("registeredCardTemplate")
private MongoTemplate template;
The qualifier name is whatever you set as mongoTemplateRef = "XXX".
I'm using MyBatis with the second-level cache activated via <cache/> in the XML mapper files.
Suppose I want to interact with the underlying DB/DataSource decoupled from MyBatis, for instance via a direct JdbcTemplate.
How can I ensure that the MyBatis cache gets flushed appropriately when I insert/update/delete via JdbcTemplate on a table for which MyBatis holds cached query results?
In other words, how can I force MyBatis to flush its cache for a certain cache namespace from outside of the MyBatis mappers?
I'm aware of the @Options(flushCache = true) annotation, but it seems not to work outside of mapper interfaces.
You can get the cache from the Configuration, look it up by namespace and clear it:
@Resource
SqlSessionFactory sqlSessionFactory;

public void clearCacheByNamespace() {
    Configuration config = sqlSessionFactory.getConfiguration();
    Cache cache = config.getCache("com.persia.dao.UserInfoMapper");
    if (cache != null) {
        cache.clear();
    }
}
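As a rough sketch of how that could be combined with a direct JdbcTemplate write (table and column names are made up), the write that bypasses MyBatis is followed by an explicit flush of the affected namespace:
@Resource
JdbcTemplate jdbcTemplate;

public void updateUserInfoOutsideMyBatis(String id, String name) {
    // This update bypasses MyBatis, so its second-level cache does not see the change
    jdbcTemplate.update("UPDATE user_info SET name = ? WHERE id = ?", name, id);
    // Invalidate the now-stale cached results for the mapper namespace touched above
    clearCacheByNamespace();
}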
Hi, I have used another approach because we use Spring: autowire the SqlSessionTemplate and call the appropriate method.
public class SomeServerClass {

    @Autowired
    private org.mybatis.spring.SqlSessionTemplate sqlSessionTemplate;

    private void someClearMethod() {
        sqlSessionTemplate.clearCache();
    }
}
If I use the org.apache.ibatis.session.SqlSession interface instead, it refers to the same instance.