Generic JPA repository for runtime-generated entities - Spring

In my scenario I'm generating Hibernate entity classes at runtime under the "com.mrg.domain" package, and in my generic REST controller I can create instances of these entities according to the @PathVariable. The code below works fine for this:
@RequestMapping( value = "/{entity}", method = RequestMethod.POST)
public @ResponseBody RestResponse createEntity(@PathVariable String entity, @RequestBody String requestBody) {
Object model = null;
ObjectMapper mapper = new ObjectMapper();
try {
// ex: if the {entity} param equals "post", modelName will be "Post"
String modelName = Character.toUpperCase(entity.charAt(0)) + entity.substring(1);
// Creating a new instance according to modelName
Class<?> clazz = Class.forName("com.mrg.domain." + modelName);
model = clazz.newInstance();
// Converting @RequestBody json String to domain object..
model = mapper.readValue(requestBody, clazz);
} catch(Exception ex){
// TODO handle exceptions & inform user..
}
return new RestResponse(model.toString());
}
Now the next step I am trying to implement is a generic JPA repository (something like below) so that I can persist runtime-generated models without implementing a repository for each entity. But I couldn't find a solution yet.
@Repository
public interface GenericRepository<T> extends PagingAndSortingRepository<T, Long>{ }
The topic below and many other topics implement generic repositories, but they still declare a repository per entity that uses the generic one. Since my entities are generated at runtime, a repository implementation per entity doesn't work for me:
How to make generic jpa repository? Should I do this? Why?
Any suggestion or a way of achieving this? I'm new to generics and reflection, so if what I'm trying to accomplish is not possible, please tell me the reasons and I would appreciate it.
Thanks and regards,

You could use this pattern. This one uses EJB but can be used in Spring etc.
@Stateless
public abstract class AbstractRepository<T> {
@PersistenceContext
protected EntityManager em;
public abstract Class<T> getActualClass();
public T getSingleResult(Map<String, String> params) {
// build querytext based on params
TypedQuery<T> query = em.createQuery(queryText.toString(), getActualClass());
............
}
}
Now for the implementation class:
@Stateless
public class InputStreamRepository extends AbstractRepository<InputDataStream> {
@Override
public Class<InputDataStream> getActualClass() {
return InputDataStream.class;
}
}
The getActualClass method gives the abstract repository the concrete entity class it operates on.
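If the entity classes only exist at runtime, one way to adapt this idea is to drop the per-entity subclass entirely and let a single Spring bean work directly with the EntityManager and the Class<?> you resolved via Class.forName. The sketch below is untested and assumes the generated classes are registered with the persistence unit; the class and method names are illustrative:
import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.criteria.CriteriaQuery;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;
@Repository
public class RuntimeEntityRepository {
    @PersistenceContext
    private EntityManager em;
    // Persist an instance whose class was resolved at runtime (e.g. via Class.forName)
    @Transactional
    public Object save(Object entity) {
        em.persist(entity);
        return entity;
    }
    // Load a single entity using the runtime-resolved class instead of a compile-time type parameter
    public <T> T find(Class<T> entityClass, Long id) {
        return em.find(entityClass, id);
    }
    // List all rows of the runtime-resolved entity type
    public <T> List<T> findAll(Class<T> entityClass) {
        CriteriaQuery<T> query = em.getCriteriaBuilder().createQuery(entityClass);
        query.select(query.from(entityClass));
        return em.createQuery(query).getResultList();
    }
}
In the controller above you would then call something like runtimeEntityRepository.save(model) after mapper.readValue(requestBody, clazz), so no per-entity repository interface is needed.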

I had a React application where different kinds of data are defined in JSON, and on the server side I need to store them in the DB. My initial approach was to create entities, repositories and controllers for each of them separately. But another possible approach for generic CRUD operations is to use MongoDB with MongoTemplate. Here is the idea:
import java.util.List;
import org.bson.Document;
import org.json.JSONObject;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
@RestController
@RequestMapping("/api/generic")
@CrossOrigin(origins = { "*" })
public class GenericController {
@Autowired
private MongoTemplate mongoTemplate;
@PostMapping
public ResponseEntity<Document> addData(@RequestBody String data) {
JSONObject jsonObject = new JSONObject(data);
String documentName = jsonObject.getString("documentName");
Document doc = Document.parse(data);
Document insertedDoc = mongoTemplate.insert(doc, documentName);
return new ResponseEntity<>(insertedDoc, HttpStatus.CREATED);
}
@GetMapping("/{documentName}")
public List<Document> getData(@PathVariable String documentName) {
List<Document> allData = mongoTemplate.findAll(Document.class, documentName);
return allData;
}
}
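With the controller above, a POST to /api/generic whose body carries a documentName field ends up stored in a MongoDB collection of that name, and GET /api/generic/{documentName} reads it back. A request body might look like this (the fields other than documentName are just an example):
{
"documentName": "post",
"title": "Hello world",
"body": "First post"
}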

Related

Is it possible to set the default date format in JSON-B (Yasson) globally, instead of adding an annotation on every property?

I have been using Jersey so far and I am doing my first implementation with JSON-B.
I am using Payara, so I am working with Jersey and Yasson. I had an issue because the serialized dates would always contain the "[UTC]" suffix.
I have managed to use an annotation on my date property in my DTO. But I would like to configure that globally (in the JAX-RS application config?) instead of repeating myself on every date property. Is that possible? I haven't found anything so far...
Side question: I assume that it is possible to get rid of this "[UTC]" suffix, since it breaks all clients trying to parse the date. Any idea?
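For reference, the per-property annotation mentioned above presumably looks something like this (the DTO and field name are illustrative):
import java.util.Date;
import javax.json.bind.annotation.JsonbDateFormat;
public class MyDto {
    // Formats only this property; the point of the question is to avoid repeating this on every date field
    @JsonbDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSXXX")
    private Date createdAt;
    // getters and setters omitted
}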
Thanks to this Github issue, I was able to solve my problem. Here is what I ended up writing in my code:
JSONConfigurator.java:
import javax.json.bind.Jsonb;
import javax.json.bind.JsonbBuilder;
import javax.json.bind.JsonbConfig;
import javax.json.bind.config.PropertyNamingStrategy;
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;
@Provider
public class JSONConfigurator implements ContextResolver<Jsonb> {
@Override
public Jsonb getContext(Class<?> type) {
JsonbConfig config = getJsonbConfig();
return JsonbBuilder
.newBuilder()
.withConfig(config)
.build();
}
private JsonbConfig getJsonbConfig() {
return new JsonbConfig()
.withDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSXXX", null);
}
}
And:
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.core.Application;
import java.util.HashSet;
import java.util.Set;
@ApplicationPath("/api")
public class ApplicationConfig extends Application {
@Override
public Set<Class<?>> getClasses() {
Set<Class<?>> resources = new HashSet<Class<?>>();
addRestResourceClasses(resources);
resources.add(JSONConfigurator.class);
return resources;
}
private void addRestResourceClasses(Set<Class<?>> resources) {
...
}
}

Mapping a RestTemplate response to a Java object

I am using RestTemplate to get data from a remote REST service, and my code is like this:
ResponseEntity<List<MyObject >> responseEntity = restTemplate.exchange(request, responseType);
But the REST service returns just a text message saying "no record found" if there are no results, and my above line of code then throws an exception.
I could map the result to a String first and later use the Jackson 2 ObjectMapper to map it to MyObject:
ResponseEntity<String> responseEntity = restTemplate.exchange(request, responseType);
String jsonInput = responseEntity.getBody();
List<MyObject> myObjects = objectMapper.readValue(jsonInput, new TypeReference<List<MyObject>>(){});
But I don't like this approach. Is there any better solution for this?
First of all, you could write a wrapper for the whole API. Annotate it with @Component and you can use it wherever you want through Spring's dependency injection. Have a look at this example project, which shows generated code for a RestTemplate client produced by Swagger Codegen.
Since you said you tried implementing a custom ResponseErrorHandler without success, I assume that the API returns the response body "no record found" while the status code is 200.
Therefore you could create a custom AbstractHttpMessageConverter as mentioned in my second answer. Because you are using Spring's RestTemplate, which uses Jackson's ObjectMapper, we don't even have to use that very general super class to create our own. We can use and extend the more suitable AbstractJackson2HttpMessageConverter class.
An implementation for your specific use case could look as follows:
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.http.HttpInputMessage;
import org.springframework.http.MediaType;
import org.springframework.http.converter.HttpMessageNotReadableException;
import org.springframework.http.converter.json.AbstractJackson2HttpMessageConverter;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.lang.reflect.Type;
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
import java.util.stream.Collectors;
public class WeirdAPIJackson2HttpMessageConverter extends AbstractJackson2HttpMessageConverter {
public static final String NO_RECORD_FOUND = "no record found";
public WeirdAPIJackson2HttpMessageConverter() {
// Create another constructor if you want to pass an already existing ObjectMapper
// Currently this HttpMessageConverter is applied for every MediaType, this is application-dependent
super(new ObjectMapper(), MediaType.ALL);
}
@Override
public Object read(Type type, Class<?> contextClass, HttpInputMessage inputMessage) throws IOException, HttpMessageNotReadableException {
try (BufferedReader br = new BufferedReader(new InputStreamReader(inputMessage.getBody(), DEFAULT_CHARSET))) {
String responseBodyStr = br.lines().collect(Collectors.joining(System.lineSeparator()));
if (NO_RECORD_FOUND.equals(responseBodyStr)) {
JavaType javaType = super.getJavaType(type, contextClass);
if(Collection.class.isAssignableFrom(javaType.getRawClass())){
return Collections.emptyList();
} else if( Map.class.isAssignableFrom(javaType.getRawClass())){
return Collections.emptyMap();
}
return null;
}
}
return super.read(type, contextClass, inputMessage);
}
}
The custom HttpMessageConverter checks the response body for your specific "no record found" text. If it matches, we return a default value depending on the generic return type: an empty list if the return type is a subtype of Collection, an empty map for Map, and null for all other types.
Furthermore, I created a RestClientTest using a MockRestServiceServer to demonstrate how you can use your RestTemplate within the aforementioned API wrapper component and how to set it up to use our custom AbstractJackson2HttpMessageConverter.
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.client.RestClientTest;
import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.core.ParameterizedTypeReference;
import org.springframework.http.HttpMethod;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.client.ExpectedCount;
import org.springframework.test.web.client.MockRestServiceServer;
import org.springframework.web.client.RestTemplate;
import java.util.List;
import java.util.Optional;
import static org.junit.Assert.*;
import static org.springframework.test.web.client.match.MockRestRequestMatchers.method;
import static org.springframework.test.web.client.match.MockRestRequestMatchers.requestTo;
import static org.springframework.test.web.client.response.MockRestResponseCreators.withStatus;
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {RestTemplateResponseErrorHandlerIntegrationTest.MyObject.class})
@RestClientTest
public class RestTemplateResponseErrorHandlerIntegrationTest {
static class MyObject {
// This just refers to your MyObject class which you mentioned in your question
}
private final static String REQUEST_API_URL = "/api/myobjects/";
private final static String REQUEST_API_URL_SINGLE = "/api/myobjects/1";
@Autowired
private MockRestServiceServer server;
@Autowired
private RestTemplateBuilder builder;
@Test
public void test_custom_converter_on_weird_api_response_list() {
assertNotNull(this.builder);
assertNotNull(this.server);
RestTemplate restTemplate = this.builder
.messageConverters(new WeirdAPIJackson2HttpMessageConverter())
.build();
this.server.expect(ExpectedCount.once(), requestTo(REQUEST_API_URL))
.andExpect(method(HttpMethod.GET))
.andRespond(withStatus(HttpStatus.OK).body(WeirdAPIJackson2HttpMessageConverter.NO_RECORD_FOUND));
this.server.expect(ExpectedCount.once(), requestTo(REQUEST_API_URL_SINGLE))
.andExpect(method(HttpMethod.GET))
.andRespond(withStatus(HttpStatus.OK).body(WeirdAPIJackson2HttpMessageConverter.NO_RECORD_FOUND));
ResponseEntity<List<MyObject>> response = restTemplate.exchange(REQUEST_API_URL,
HttpMethod.GET,
null,
new ParameterizedTypeReference<List<MyObject>>() {
});
assertNotNull(response.getBody());
assertTrue(response.getBody().isEmpty());
Optional<MyObject> myObject = Optional.ofNullable(restTemplate.getForObject(REQUEST_API_URL_SINGLE, MyObject.class));
assertFalse(myObject.isPresent());
this.server.verify();
}
}
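Outside of the test, the converter can be registered wherever the RestTemplate is built. A minimal sketch (the bean and class names are illustrative) that mirrors the builder setup used in the test:
import org.springframework.boot.web.client.RestTemplateBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.client.RestTemplate;
@Configuration
public class WeirdApiClientConfig {
    // messageConverters(...) replaces the default converters with the custom one, exactly like the test above
    @Bean
    public RestTemplate weirdApiRestTemplate(RestTemplateBuilder builder) {
        return builder
                .messageConverters(new WeirdAPIJackson2HttpMessageConverter())
                .build();
    }
}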
What I usually do in my projects with RestTemplate is save the response in a java.util.Map and create a method that converts that Map into the object I want. Maybe saving the response in a generic structure like a Map helps you with that exception problem.
For example, I make the request like this:
List<Map> list = null;
List<MyObject> listObjects = new ArrayList<MyObject>();
HttpHeaders headers = new HttpHeaders();
HttpEntity<String> entity = new HttpEntity<>(headers);
ResponseEntity<Map> response = restTemplate.exchange(url, HttpMethod.GET, entity, Map.class);
if (response != null && response.getStatusCode().value() == 200) {
list = (List<Map>) response.getBody().get("items"); // this depends on the response
for (Map item : list) { // we iterate for each one of the items of the list transforming it
MyObject myObject = transform(item);
listObjects.add(myObject);
}
}
The transform() function is a custom method of mine, MyObject transform(Map item), that receives a Map and returns the object I want. You can also check first whether no records were found instead of calling transform.
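A transform() of that shape might look like the following sketch (the MyObject fields and map keys are assumptions, since they depend on the actual response):
private MyObject transform(Map item) {
    MyObject myObject = new MyObject();
    // Pull each value out of the generic Map; the keys depend on the remote API's JSON
    myObject.setId((String) item.get("id"));
    myObject.setName((String) item.get("name"));
    return myObject;
}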

Spring Data JPA/Hibernate: complex cross-table SQL which doesn't map to a table entity

* Edited with a possible solution - any comments? *
Spring 4.2.5.RELEASE
I'm starting to create Java web services on top of a legacy database.
Following the Spring Data JPA repository pattern of creating entities which map to tables, a repository extending CrudRepository is working well,
as described in this great tutorial.
All the examples I've seen assume a simple mapping of a table to an entity: Order -> OrderEntity, OrderLine, Customer etc.
How would you deal with read-only, reporting-type queries which do not fit this pattern, where the query result contains columns from many tables and uses complex cross-table joins?
I'm just struggling to get my head around how to deal with this scenario.
Possible Solution
I've managed to run native SQL using the NamedParameterJdbcTemplate and map the results onto a POJO using a BeanPropertyRowMapper
ApplicationContext class
The NamedParameterJdbcTemplate bean is defined here (the rest of the beans, such as HikariCP, the JPA session factory, the JPA transaction manager and the Dozer bean mapper, have been left out for brevity).
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = {"com.savant.test.spring.donorservicejpa.dao.repository"},
repositoryBaseClass = com.savant.test.spring.donorservicejpa.dao.repository.BaseRepositoryImpl.class )
@ComponentScan(
{"com.savant.test.spring.donorservicejpa.dao.jdbc.repository",
"com.savant.test.spring.donorservicejpa.dao.query.objects"})
public class ApplicationContext {
@Bean
NamedParameterJdbcTemplate jdbcTemplate(DataSource dataSource) {
return new NamedParameterJdbcTemplate(dataSource);
}
}
POJO for search results. No Spring annotations, just a simple class
package com.savant.test.spring.donorservicejpa.dao.query.objects;
public class SessionSearchResult {
private String sessno;
private String sesdate;
// etc
// setters/getters
}
'Repository'. It's not actually a repository in Spring terms, just an interface and its implementation class.
package com.savant.test.spring.donorservicejpa.dao.jdbc.repository;
public interface SessionSearchRepository{
List<SessionSearchResult> findByCriteria(String searchCriteria);
}
Base implementation
package com.savant.test.spring.donorservicejpa.dao.jdbc.repository;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
public class BaseJdbcRepositoryImpl {
protected final NamedParameterJdbcTemplate jdbcTemplate;
BaseJdbcRepositoryImpl(NamedParameterJdbcTemplate jdbcTemplate) {
this.jdbcTemplate = jdbcTemplate;
}
}
Simple test implementation of the search.
package com.savant.test.spring.donorservicejpa.dao.jdbc.repository;
import com.savant.test.spring.donorservicejpa.dao.query.objects.SessionSearchResult;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.BeanPropertyRowMapper;
import org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
@Component
public class SessionSearchRepositoryImpl extends BaseJdbcRepositoryImpl implements SessionSearchRepository {
private static final String SESSION_SEARCH_SQL
= "SELECT sesdet.sessno, sessdays.sesdate "
+ "FROM sesdet, sessdays "
+ "WHERE sessdays.sessno = sesdet.sessno "
+ "AND sesdet.sessno = :sessno";
@Autowired
public SessionSearchRepositoryImpl(NamedParameterJdbcTemplate jdbcTemplate) {
super(jdbcTemplate);
}
@Transactional(readOnly = true)
@Override
public List<SessionSearchResult> findByCriteria(String searchCriteria) {
Map<String, String> queryParams = new HashMap<>();
queryParams.put("sessno", searchCriteria);
List<SessionSearchResult> searchResults = jdbcTemplate.query(SESSION_SEARCH_SQL, queryParams,
new BeanPropertyRowMapper<>(SessionSearchResult.class));
return searchResults;
}
}
And a simple test just to run the SQL
@Autowired
SessionSearchRepository sessionSearchRepository;
@Test
public void a_testSessionSearch() throws Exception, Throwable {
List<SessionSearchResult> sl = sessionSearchRepository.findByCriteria("CA04AS");
for (SessionSearchResult sessionSearchEntity : sl) {
}
}

Spring beans are not injected in flyway java based migration

I'm trying to inject a configuration properties component into Flyway's Java-based migration code, but it is always null.
I'm using Spring Boot with Flyway.
@Component
@ConfigurationProperties(prefix = "code")
public class CodesProp {
private String codePath;
}
Then inside the Flyway migration code, I am trying to autowire this component as follows:
public class V1_4__Migrate_codes_metadata implements SpringJdbcMigration {
@Autowired
private CodesProp codesProp ;
public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
codesProp.getCodePath();
}
Here, codesProp is always null.
Is there any way to inject Spring beans into Flyway migrations, or to make this bean initialized before the Flyway bean?
Thank You.
Flyway doesn't support dependency injection into SpringJdbcMigration implementations. It simply looks for classes on the classpath that implement SpringJdbcMigration and creates a new instance using the default constructor. This is performed in SpringJdbcMigrationResolver. When the migration is executed, SpringJdbcMigrationExecutor creates a new JdbcTemplate and then calls your migration implementation's migrate method.
If you really need dependencies to be injected into your Java-based migrations, I think you'll have to implement your own MigrationResolver that retrieves beans of a particular type from the application context and creates and returns a ResolvedMigration instance for each.
If like me, you don't want to wait for Flyway 4.1, you can use Flyway 4.0 and add the following to your Spring Boot application:
1) Create an ApplicationContextAwareSpringJdbcMigrationResolver class in your project:
import org.flywaydb.core.api.FlywayException;
import org.flywaydb.core.api.MigrationType;
import org.flywaydb.core.api.MigrationVersion;
import org.flywaydb.core.api.configuration.FlywayConfiguration;
import org.flywaydb.core.api.migration.MigrationChecksumProvider;
import org.flywaydb.core.api.migration.MigrationInfoProvider;
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.flywaydb.core.api.resolver.ResolvedMigration;
import org.flywaydb.core.internal.resolver.MigrationInfoHelper;
import org.flywaydb.core.internal.resolver.ResolvedMigrationComparator;
import org.flywaydb.core.internal.resolver.ResolvedMigrationImpl;
import org.flywaydb.core.internal.resolver.spring.SpringJdbcMigrationExecutor;
import org.flywaydb.core.internal.resolver.spring.SpringJdbcMigrationResolver;
import org.flywaydb.core.internal.util.ClassUtils;
import org.flywaydb.core.internal.util.Location;
import org.flywaydb.core.internal.util.Pair;
import org.flywaydb.core.internal.util.StringUtils;
import org.flywaydb.core.internal.util.scanner.Scanner;
import org.springframework.context.ApplicationContext;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.Map;
/**
* Migration resolver for {@link SpringJdbcMigration}s which are registered in the given {@link ApplicationContext}.
* This resolver provides the ability to use other beans registered in the {@link ApplicationContext} and reference
* them via Spring's dependency injection facility inside the {@link SpringJdbcMigration}s.
*/
public class ApplicationContextAwareSpringJdbcMigrationResolver extends SpringJdbcMigrationResolver {
private final ApplicationContext applicationContext;
public ApplicationContextAwareSpringJdbcMigrationResolver(Scanner scanner, Location location, FlywayConfiguration configuration, ApplicationContext applicationContext) {
super(scanner, location, configuration);
this.applicationContext = applicationContext;
}
@SuppressWarnings("unchecked")
@Override
public Collection<ResolvedMigration> resolveMigrations() {
// get all beans of type SpringJdbcMigration from the application context
Map<String, SpringJdbcMigration> springJdbcMigrationBeans =
(Map<String, SpringJdbcMigration>) this.applicationContext.getBeansOfType(SpringJdbcMigration.class);
ArrayList<ResolvedMigration> resolvedMigrations = new ArrayList<ResolvedMigration>();
// resolve the migration and populate it with the migration info
for (SpringJdbcMigration springJdbcMigrationBean : springJdbcMigrationBeans.values()) {
ResolvedMigrationImpl resolvedMigration = extractMigrationInfo(springJdbcMigrationBean);
resolvedMigration.setPhysicalLocation(ClassUtils.getLocationOnDisk(springJdbcMigrationBean.getClass()));
resolvedMigration.setExecutor(new SpringJdbcMigrationExecutor(springJdbcMigrationBean));
resolvedMigrations.add(resolvedMigration);
}
Collections.sort(resolvedMigrations, new ResolvedMigrationComparator());
return resolvedMigrations;
}
ResolvedMigrationImpl extractMigrationInfo(SpringJdbcMigration springJdbcMigration) {
Integer checksum = null;
if (springJdbcMigration instanceof MigrationChecksumProvider) {
MigrationChecksumProvider version = (MigrationChecksumProvider) springJdbcMigration;
checksum = version.getChecksum();
}
String description;
MigrationVersion version1;
if (springJdbcMigration instanceof MigrationInfoProvider) {
MigrationInfoProvider resolvedMigration = (MigrationInfoProvider) springJdbcMigration;
version1 = resolvedMigration.getVersion();
description = resolvedMigration.getDescription();
if (!StringUtils.hasText(description)) {
throw new FlywayException("Missing description for migration " + version1);
}
} else {
String resolvedMigration1 = ClassUtils.getShortName(springJdbcMigration.getClass());
if (!resolvedMigration1.startsWith("V") && !resolvedMigration1.startsWith("R")) {
throw new FlywayException("Invalid Jdbc migration class name: " + springJdbcMigration.getClass()
.getName() + " => ensure it starts with V or R," + " or implement org.flywaydb.core.api.migration.MigrationInfoProvider for non-default naming");
}
String prefix = resolvedMigration1.substring(0, 1);
Pair info = MigrationInfoHelper.extractVersionAndDescription(resolvedMigration1, prefix, "__", "");
version1 = (MigrationVersion) info.getLeft();
description = (String) info.getRight();
}
ResolvedMigrationImpl resolvedMigration2 = new ResolvedMigrationImpl();
resolvedMigration2.setVersion(version1);
resolvedMigration2.setDescription(description);
resolvedMigration2.setScript(springJdbcMigration.getClass().getName());
resolvedMigration2.setChecksum(checksum);
resolvedMigration2.setType(MigrationType.SPRING_JDBC);
return resolvedMigration2;
}
}
2) Add a new configuration class to post-process the Spring Boot-generated Flyway instance:
import org.flywaydb.core.Flyway;
import org.flywaydb.core.internal.dbsupport.DbSupport;
import org.flywaydb.core.internal.dbsupport.h2.H2DbSupport;
import org.flywaydb.core.internal.dbsupport.mysql.MySQLDbSupport;
import com.pegusapps.zebra.infrastructure.repository.flyway.ApplicationContextAwareSpringJdbcMigrationResolver;
import org.flywaydb.core.internal.resolver.sql.SqlMigrationResolver;
import org.flywaydb.core.internal.util.Location;
import org.flywaydb.core.internal.util.PlaceholderReplacer;
import org.flywaydb.core.internal.util.scanner.Scanner;
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import javax.sql.DataSource;
import java.sql.SQLException;
@Configuration
@ComponentScan("db.migration")
public class FlywayConfiguration {
@Bean
public BeanPostProcessor postProcessFlyway(ApplicationContext context) {
return new BeanPostProcessor() {
@Override
public Object postProcessBeforeInitialization(Object o, String s) throws BeansException {
return o;
}
@Override
public Object postProcessAfterInitialization(Object o, String s) throws BeansException {
if (o instanceof Flyway) {
Flyway flyway = (Flyway) o;
flyway.setSkipDefaultResolvers(true);
ApplicationContextAwareSpringJdbcMigrationResolver resolver = new ApplicationContextAwareSpringJdbcMigrationResolver(
new Scanner(Thread.currentThread().getContextClassLoader()),
new Location("classpath:db/migration"),
context.getBean(org.flywaydb.core.api.configuration.FlywayConfiguration.class),
context);
SqlMigrationResolver sqlMigrationResolver = null;
try {
sqlMigrationResolver = new SqlMigrationResolver(
getDbSupport(),
new Scanner(Thread.currentThread().getContextClassLoader()),
new Location("classpath:db/migration"),
PlaceholderReplacer.NO_PLACEHOLDERS,
"UTF-8",
"V",
"R",
"__",
".sql");
} catch (SQLException e) {
e.printStackTrace();
}
flyway.setResolvers(sqlMigrationResolver, resolver);
}
return o;
}
private DbSupport getDbSupport() throws SQLException {
DataSource dataSource = context.getBean(DataSource.class);
if( ((org.apache.tomcat.jdbc.pool.DataSource)dataSource).getDriverClassName().equals("org.h2.Driver"))
{
return new H2DbSupport(dataSource.getConnection());
}
else
{
return new MySQLDbSupport(dataSource.getConnection());
}
}
};
}
}
Note that I have some hardcoded dependencies on the Tomcat JDBC pool, H2 and MySQL. If you are using something else, you will need to change the code there (if anybody knows how to avoid this, please comment!).
Also note that the @ComponentScan package needs to match where you will put the Java migration classes.
Also note that I had to add the SqlMigrationResolver back in since I want to support both the SQL and the Java flavor of the migrations.
3) Create a Java class in the db.migration package that does the actual migration:
@Component
public class V2__add_default_surveys implements SpringJdbcMigration {
private final SurveyRepository surveyRepository;
@Autowired
public V2__add_default_surveys(SurveyRepository surveyRepository) {
this.surveyRepository = surveyRepository;
}
@Override
public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
surveyRepository.save(...);
}
}
Note that you need to make the class a @Component and it needs to implement SpringJdbcMigration. In this class, you can use Spring constructor injection for any Spring bean from your context that you might need to do the migration(s).
Note: be sure to disable Hibernate's DDL validation, because the validation seems to run before Flyway runs:
spring.jpa.hibernate.ddl-auto=none
In short, do not autowire beans in your DB migrations, or even reference classes from your application!
If you refactor, delete or change classes referenced in a migration, it may no longer compile or, worse, your migrations may become corrupted.
Avoiding the small overhead of writing migrations against the plain JdbcTemplate is not worth the risk.
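Following that advice, a migration that relies only on the JdbcTemplate handed in by Flyway would look roughly like this (the version number, table and column names are illustrative):
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.springframework.jdbc.core.JdbcTemplate;
public class V3__insert_default_survey implements SpringJdbcMigration {
    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        // Plain SQL only: no Spring beans or application classes are referenced,
        // so later refactorings cannot break or silently change this migration
        jdbcTemplate.update("INSERT INTO survey (name) VALUES (?)", "default-survey");
    }
}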
If you are using DeltaSpike you can use BeanProvider to get a reference to your class. Here is a DAO example, but it should work fine with your class too.
Change your DAO code:
public static UserDao getInstance() {
return BeanProvider.getContextualReference(UserDao.class, false, new DaoLiteral());
}
Then in your migration method:
UserDao userdao = UserDao.getInstance();
And there you've got your reference.
(referenced from: Flyway Migration with java)
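Putting the two pieces together, the migration itself would then presumably look like this (the class name and the work done inside migrate are illustrative):
import org.flywaydb.core.api.migration.spring.SpringJdbcMigration;
import org.springframework.jdbc.core.JdbcTemplate;
public class V5__backfill_users implements SpringJdbcMigration {
    @Override
    public void migrate(JdbcTemplate jdbcTemplate) throws Exception {
        // Look up the DAO through DeltaSpike's BeanProvider instead of field injection,
        // which Flyway cannot perform on migration instances it creates itself
        UserDao userDao = UserDao.getInstance();
        // ... use userDao (and/or the jdbcTemplate) to perform the migration ...
    }
}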

Override default dispatcherServlet when a custom REST controller has been created

Following my question here, I have succeeded in creating a custom REST controller to handle different kinds of requests to /api/urls and operate accordingly.
However, there is still a default controller handling requests at /urls, which affects my application: when receiving a request that is not /api/something, it should fetch my database for the URL linked to said /whatever and redirect the user there. Moreover, under /api/urls I've developed certain validation rules to ensure the integrity and optimization of the requests, which does not happen at /urls, so anyone could insert any kind of data into my database.
What would be a possible way to disable this default handler? Looking at the logs, I headed off to register my own ServletRegistrationBean as instructed here, but that is for having two isolated environments, as far as I understand.
My goal is simply to "disconnect" the /urls URL from the default REST controller (which is no longer of any use to me now that I have my own) and just use the custom one that I implemented at /api/urls (or whatever other URL I may decide to use, such as "/service/shortener", if possible).
Below are my Java classes:
Url.java (getters and setters omitted for brevity):
@Document
public class Url {
@Id private String id;
private String longURL;
private String hash;
private String originalUrl;
private String shortUri;
private Date creationDate;
}
UrlRepository.java
import org.springframework.data.mongodb.repository.MongoRepository;
public interface UrlRepository extends MongoRepository<Url, String> {
// Empty
}
UrlController.java:
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;
@RestController
@RequestMapping("/api/urls")
public class UrlController {
@Autowired
private UrlRepository repo;
@RequestMapping(method=RequestMethod.GET)
public List<Url> getAll() {
System.out.println("Showing all stored links");
List<Url> results = repo.findAll();
return results;
}
@RequestMapping(method=RequestMethod.GET, value="{id}")
public Url getUrl(@PathVariable String id) {
System.out.println("Looking for URL " + id);
return null;
}
@RequestMapping(method=RequestMethod.POST)
public Url create(@RequestBody Url url) {
System.out.println("Received POST " + url);
return null;
}
@RequestMapping(method=RequestMethod.DELETE, value="{id}")
public void delete(@PathVariable String id) {
//TBD
}
@RequestMapping(method=RequestMethod.PUT, value="{id}")
public Url update(@PathVariable String id, @RequestBody Url url) {
//TBD
}
}
Application.java:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
Instead of trying to hack your way around Spring Boot and Spring Data REST, I strongly suggest working WITH the frameworks instead of around them.
To change the default context-path from / to /api simply add a property to your application.properties file.
server.context-path=/api
Now you would need to change your controller mapping to /urls instead of /api/urls.
If you only want /api for Spring Data REST endpoints use the following property
spring.data.rest.base-uri=/api
This will make all Spring Data REST endpoints available under /api. You want to override /urls, so instead of using @Controller use @RepositoryRestController; this will make your controller override the one registered by default.
@RepositoryRestController
@RequestMapping("/urls")
public class UrlController { ... }
