I have a database, and in that database there are many tables of data. I want to fetch the data from any one of those tables by entering a query from the front-end application. I'm not doing any manipulation of the data; I'm just retrieving it from the database.
Also, mapping the data would require writing many entity or POJO classes, so I don't want to map the data to any object. How can I achieve this?
In this case, assuming the mapping of tables is not relevant, you don't need to use JPA/Hibernate at all.
You can use the old, battle-tested JdbcTemplate to execute a query of your choice (which you'll pass from the client), serialize the result to a JSONObject, and return it as the response from your controller.
The client side will be responsible for rendering the result.
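A minimal sketch of that idea, assuming Spring Boot with spring-web and spring-jdbc on the classpath; the endpoint path and class names here are illustrative, not prescribed:

```java
import java.util.List;
import java.util.Map;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Illustrative controller: runs the SQL sent by the client and returns the rows
// as a list of column -> value maps, which Spring serializes to JSON automatically.
@RestController
public class QueryController {

    private final JdbcTemplate jdbcTemplate;

    public QueryController(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @PostMapping("/query")
    public List<Map<String, Object>> runQuery(@RequestBody String sql) {
        // queryForList returns one Map per row, keyed by column name
        return jdbcTemplate.queryForList(sql);
    }
}
```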
You might also query the database metadata to obtain information about column names, types, etc., so that the client side gets this information too and can show the results in a more convenient / "advanced" way.
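For example (again just a sketch, reusing the jdbcTemplate and sql from the snippet above; imports for ResultSetExtractor, ResultSetMetaData, and the collection classes assumed):

```java
// Illustrative only: collect column name/type pairs from the result set metadata.
List<Map<String, String>> columns = jdbcTemplate.query(sql,
        (ResultSetExtractor<List<Map<String, String>>>) rs -> {
            ResultSetMetaData meta = rs.getMetaData();
            List<Map<String, String>> cols = new ArrayList<>();
            for (int i = 1; i <= meta.getColumnCount(); i++) {
                Map<String, String> col = new HashMap<>();
                col.put("name", meta.getColumnLabel(i));
                col.put("type", meta.getColumnTypeName(i));
                cols.add(col);
            }
            return cols;
        });
```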
Beware of the security implications, though. Basically, this means the client will be able to delete all the records in the database with a simple query, and you won't be able to prevent it :)
Does the connector generate a generic schema, such as a record with a map of column -> value in it, or does each table get its own schema that maps to a different class through binding? That is, in the first scenario, through binding, all records across all tables would bind to a single record class that contains a map.
I implemented some similar functionality in the past, and what I did was create a generic record class containing a map of field to value, which is consistent with what the underlying JDBC API returns.
Although this seems to defeat the purpose of a schema, hence I wonder how it works.
If anyone could give me a hint and some documentation to investigate, that would be much appreciated.
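For reference, a bare-bones version of the generic record class I described above looked roughly like this (the names are mine, not from any connector):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative generic record: one class for rows of any table,
// holding column name -> value pairs, mirroring what JDBC returns.
public class GenericRecord {

    private final String table;
    private final Map<String, Object> fields;

    public GenericRecord(String table, Map<String, Object> fields) {
        this.table = table;
        this.fields = new LinkedHashMap<>(fields);
    }

    public String getTable() {
        return table;
    }

    public Object get(String column) {
        return fields.get(column);
    }

    public Map<String, Object> asMap() {
        return Collections.unmodifiableMap(fields);
    }
}
```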
I'm looking for a solution with Spring / Camel to consume multiple REST services at runtime, create tables to store the data from the REST APIs, and compare the data dynamically. I don't know the schema of the JSON APIs in advance, so I can't generate the Java client classes or create JPA persistent entity classes at runtime.
You'll need to think through this differently. I'd forget about Java POJO classes that you don't have and can't create, since the class structure isn't known in advance. So anything with POJO->Entity binding would be pretty useless.
One solution is to parse the XML or JSON body manually with an event-based parser (like SAX for XML) and simply build an SQL CREATE string as you go through the document. Your field and table names would correspond to the tags in the document. Without access to an XSD or other structure description, no metadata is available for field lengths or types. Make everything a really long VARCHAR? Also, perhaps an XML or another kind of database might suit your problem domain better. In any case, you could include such a thing right in your Camel route as a Processor that processes the body and creates the necessary tables if they don't already exist. You could even alter a table's column lengths in the process when you encounter a field value that is longer than what's currently defined.
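A rough sketch of that idea for a JSON body, assuming Jackson is on the classpath; the header names, table naming, and the blanket VARCHAR(4000) are arbitrary choices for illustration:

```java
import java.util.StringJoiner;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// Illustrative Camel Processor: derives a CREATE TABLE statement from the
// top-level fields of a JSON payload, defaulting every column to a long VARCHAR.
public class CreateTableProcessor implements Processor {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public void process(Exchange exchange) throws Exception {
        String body = exchange.getIn().getBody(String.class);
        JsonNode root = mapper.readTree(body);

        // Table name passed along the route as a header (made-up header name).
        String tableName = exchange.getIn().getHeader("tableName", String.class);

        StringJoiner columns = new StringJoiner(", ");
        root.fields().forEachRemaining(field ->
                columns.add(field.getKey() + " VARCHAR(4000)"));

        String ddl = "CREATE TABLE IF NOT EXISTS " + tableName + " (" + columns + ")";

        // Hand the DDL to the next step in the route (e.g. a jdbc: endpoint).
        exchange.getIn().setHeader("ddl", ddl);
    }
}
```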
I have a working DAL based on JPA repositories. Now we've decided to expose them as a REST service with Spring Data REST. The migration was fine, but I noticed that it produces a lot of additional queries to the DB in order to retrieve related objects. I used projections to override this behavior, but the issue remains.
The main question is: why are related objects still being rendered inside the _embedded field in addition to the fields described in the projections? This behavior adds unnecessary objects to the JSON as well as additional queries to the DB.
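To make the setup concrete, here is a stripped-down version of what I mean; the entity and field names are placeholders for my real model, using javax.persistence and Spring Data REST's @Projection:

```java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

import org.springframework.data.rest.core.config.Projection;

// Placeholder entities standing in for the real model.
@Entity
class Customer {

    @Id
    @GeneratedValue
    Long id;

    String name;
}

@Entity
class PurchaseOrder {

    @Id
    @GeneratedValue
    Long id;

    String number;

    String status;

    @ManyToOne
    Customer customer; // the related object that still shows up under _embedded

    public String getNumber() { return number; }

    public String getStatus() { return status; }
}

// The projection lists only scalar fields, yet the association is still
// rendered in the response and still triggers extra queries.
@Projection(name = "summary", types = PurchaseOrder.class)
interface PurchaseOrderSummary {

    String getNumber();

    String getStatus();
}
```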
For application developers, I suppose the traditional paradigm for writing an application with domain objects that can be persisted to an underlying data store (an SQL database, for argument's sake) is to write the domain objects and then write (or generate) the table structure. There is a tight coupling between what the domain object looks like and what the structure of the underlying data store looks like. So if you want to add a piece of information to your domain object, you add the field to your code and then add a column to the appropriate database table. All familiar?
This is all well and good for data stores that have a well defined structure (I'm mainly talking about SQL databases whereby the tables and columns are pre-defined and fixed), but now a number of alternatives to the ubiquitous SQL database exist and these often do not constrain the data in this way. For instance, MongoDB is a NoSQL database whereby you divide data into collections but aside from that there is no structuring of the data. You don't define new columns when you want to add a new field.
Now to the question: given the flexibility of a data store like MongoDB, how would one go about achieving a similar kind of flexibility in the domain objects that represent this data? So, for instance, if I'm using Spring and creating my own domain objects, when I add a "middleName" field to my data, how can I avoid having to add a "middleName" field to my domain object? I'm looking for some kind of mechanism/approach/framework to dynamically inspect the data and have access to it in my domain object without having to make a code change every time. All ideas welcome.
I think you have a couple of choices:
You can use a dynamic programming language and not have domain objects at all (Clojure, for example)
If you're fixed on using Java, the Mongo Java driver returns data as a DBObject, which is essentially a Map. So the default behavior already provides what you want (see the sketch after this answer). It's only when you map the DBObject into domain objects, using a library like Morphia (or Spring Data), that you even have to worry about domain objects at all.
But if I were using Java, I would stick with the standard convention of domain objects mapped via Morphia, because I think adding a field is a very minor inconvenience compared with the benefits.
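A minimal sketch of the second option, assuming the legacy MongoClient/DBObject API; the database, collection, and field names are made up:

```java
import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoClient;

// Illustrative "no domain object" access: read whatever fields happen to be
// in the document, without a mapped class.
public class DynamicReadExample {

    public static void main(String[] args) {
        MongoClient mongo = new MongoClient(); // localhost:27017
        DBCollection users = mongo.getDB("mydb").getCollection("users");

        DBObject user = users.findOne(new BasicDBObject("lastName", "Smith"));

        // No code change needed when a new field such as "middleName" appears:
        Object middleName = user.get("middleName"); // null if the field isn't there
        System.out.println(middleName);
    }
}
```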
I think the question is inherently paradoxical:
On one hand, you want to have domain objects, i.e. objects that represent the data (and behaviour) of your problem domain.
On the other hand, you say that you don't want your domain objects to be explicitly influenced by changes to the data.
But when you have objects that represent your problem domain, you want to do just that: represent your problem domain.
So if, for example, a middle name is added, then your representation of the real-life 'User' entity should change to accommodate this change to the real-life user; perhaps not only by adding this piece of data to your object, but also by adding some related behaviour (validation of the middle name, or some functionality related to it).
In essence, what I'm trying to say here is that when you have (classic OO) domain objects, you may need to change your behaviour / functionality along with your data, and since you don't have any automatic way of changing your behaviour, the question of automatically changing your data becomes irrelevant.
If you don't want behaviour associated with your data, then you essentially have DTOs, and #Kevin's answer is what you're looking for.
Honestly, it sounds more like you're looking for some kind of blackbox DTO where, as you describe, fields are added or removed "arbitrarily" depending on the data. This makes me inclined to suggest a simple Map to do the job. You can't really have a domain-driven design if your domain model is constantly changing.