OData (Olingo) "inhibit" endpoint - spring

My question is: what is the best way to inhibit an endpoint that is automatically provided by Olingo?
I am playing with a simple app based on Spring Boot and Apache Olingo. In short, this is my servlet registration:
@Configuration
public class CxfServletUtil {

    @Bean
    public ServletRegistrationBean getODataServletRegistrationBean() {
        ServletRegistrationBean odataServletRegistrationBean =
                new ServletRegistrationBean(new CXFNonSpringJaxrsServlet(), "/user.svc/*");
        Map<String, String> initParameters = new HashMap<String, String>();
        initParameters.put("javax.ws.rs.Application", "org.apache.olingo.odata2.core.rest.app.ODataApplication");
        initParameters.put("org.apache.olingo.odata2.service.factory", "com.olingotest.core.CustomODataJPAServiceFactory");
        odataServletRegistrationBean.setInitParameters(initParameters);
        return odataServletRegistrationBean;
    } ...
where my ODataJPAServiceFactory is
@Component
public class CustomODataJPAServiceFactory extends ODataJPAServiceFactory implements ApplicationContextAware {

    private static ApplicationContext context;
    private static final String PERSISTENCE_UNIT_NAME = "myPersistenceUnit";
    private static final String ENTITY_MANAGER_FACTORY_ID = "entityManagerFactory";

    @Override
    public ODataJPAContext initializeODataJPAContext()
            throws ODataJPARuntimeException {
        ODataJPAContext oDataJPAContext = this.getODataJPAContext();
        try {
            EntityManagerFactory emf = (EntityManagerFactory) context.getBean(ENTITY_MANAGER_FACTORY_ID);
            oDataJPAContext.setEntityManagerFactory(emf);
            oDataJPAContext.setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
            return oDataJPAContext;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
...
My entity is quite simple ...
@Entity
public class User {

    @Id
    private String id;

    @Basic
    private String firstName;

    @Basic
    private String lastName;
    ....
Olingo is doing its job perfectly and it helps me with the generation of all the endpoints around CRUD operations for my entity.
My question is: how can I "inhibit" some of them? Let's say, for example, that I don't want to allow deleting my entity.
I could try to use a Filter - but this seems a bit harsh. Are there any other, better ways to solve my problem?
Thanks for the help.

As you have said, you could use a filter, but then you are tightly coupled to the URI schema used by Olingo. Also, things will become complicated when you have multiple, related entity sets (because you can navigate from one to the other, making the URIs more complex).
There are two things that you can do, depending on what you want to achieve:
If you want fine-grained control over which operations are allowed, you can create a wrapper for the ODataSingleProcessor and throw ODataExceptions where you want to disallow an operation. You can either always throw exceptions (i.e. completely disabling an operation type) or you can use the URI info parameters to obtain the target entity set and decide whether to throw an exception or call the standard single processor. I have used this approach to create a read-only OData service here (basically, I just created an ODataSingleProcessor which delegates some calls to the standard one, plus overrode a method in the service factory to wrap the standard single processor in my wrapper).
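Roughly, such a wrapper could look like the sketch below. It is written from memory against the Olingo 2.x API, so treat the package, class and method names as assumptions to verify against your version; only one read method and the delete override are shown, the remaining methods would delegate the same way. The wrapper itself is plugged in from the service factory, as described above (the exact override point depends on your Olingo version).

import org.apache.olingo.odata2.api.exception.ODataException;
import org.apache.olingo.odata2.api.exception.ODataNotImplementedException;
import org.apache.olingo.odata2.api.processor.ODataResponse;
import org.apache.olingo.odata2.api.processor.ODataSingleProcessor;
import org.apache.olingo.odata2.api.uri.info.DeleteUriInfo;
import org.apache.olingo.odata2.api.uri.info.GetEntitySetUriInfo;

// Wraps the standard (JPA) single processor and rejects DELETE requests.
public class NoDeleteODataSingleProcessor extends ODataSingleProcessor {

    private final ODataSingleProcessor delegate;

    public NoDeleteODataSingleProcessor(ODataSingleProcessor delegate) {
        this.delegate = delegate;
    }

    @Override
    public ODataResponse readEntitySet(GetEntitySetUriInfo uriInfo, String contentType) throws ODataException {
        return delegate.readEntitySet(uriInfo, contentType);
    }

    @Override
    public ODataResponse deleteEntity(DeleteUriInfo uriInfo, String contentType) throws ODataException {
        // Inspect uriInfo.getTargetEntitySet() here if you only want to block deletes for some entity sets.
        throw new ODataNotImplementedException();
    }

    // ... delegate the other read/write methods to 'delegate' in the same fashion
}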
If you want to completely un-expose / ignore a given entity or some properties, then you can use a JPA-EDM mapping model and exclude the desired components. You can find an example of such a mapping here: github. The mapping model is just an XML file which maps the JPA entities / properties to EDM entity types / properties. In order for Olingo to pick it up, you can pass the name of the file to the setJPAEdmMappingModel method of the ODataJPAContext in your initialize method.
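Hooking the mapping file into the factory from the question is then a one-liner; the file name jpa-edm-mapping.xml is an assumption, use whatever name and classpath location your setup expects:

@Override
public ODataJPAContext initializeODataJPAContext() throws ODataJPARuntimeException {
    ODataJPAContext oDataJPAContext = this.getODataJPAContext();
    // ... set the EntityManagerFactory and persistence unit name as before ...
    oDataJPAContext.setJPAEdmMappingModel("jpa-edm-mapping.xml"); // entities/properties excluded in the file disappear from the EDM
    return oDataJPAContext;
}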

Related

How to mock findByPrincipalName in spring\mockito

I am trying to mock findByPrincipalName because my test context does not have Redis set up, but I am unable to do so; I get the following error:
The method thenReturn(Map<String,capture#2-of ?>) in the type OngoingStubbing<Map<String,capture#2-of ?>> is not applicable for the arguments (Map<String,capture#3-of ? extends Session>)
I do not really understand what this error is telling me. Below is how I am attempting to mock the method:
Map<String, ? extends Session> sessions = new HashMap<>();
@MockBean
private FindByIndexNameSessionRepository<?> sessionRepository;
when(this.sessionRepository.findByPrincipalName(VALID_SUB)).thenReturn(sessions);
What do I need to do to be able to mock this method? The class RedisSession is not accessible, so I cannot create an instance of it to use.
This is not a problem related to mocking, but simply a generic type mismatch. You defined the repository as FindByIndexNameSessionRepository<?>, while your sessions reference type is Map<String, ? extends Session>, so your repository returns ? (2), while you're trying to return an object containing ? extends Session (3). The numbering in the last sentence marks the bounds (?) according to the error you've provided: bounds defined in different places are treated as different type definitions and do not match (read more here).
What you need to do is define types for both the repository and the object it should return so that they match. One way of doing that would be simply sticking to the interface (Session); if you wanted to make it more concrete, you could instead use a generic type definition at the class level (<T extends Session>) and apply it to both the repository and the map.
@MockBean
private FindByIndexNameSessionRepository<Session> sessionRepository;

@Test
void test() {
    Map<String, Session> sessions = new HashMap<>();
    when(sessionRepository.findByPrincipalName(VALID_SUB))
            .thenReturn(sessions);
    ...
}

class TypedIndexNameSessionTest<T extends Session> {

    @MockBean
    private FindByIndexNameSessionRepository<T> sessionRepository;

    @Test
    void emptySessions() {
        Map<String, T> sessions = new HashMap<>();
        when(sessionRepository.findByPrincipalName(VALID_SUB))
                .thenReturn(sessions);
        ...
    }
}
I've tested the code locally and pushed it to my GitHub repository - you can see the full example there (all tests pass).

Capturing entity information in custom entity listener

I would like a custom entity listener to generate an auto-incremented alias for a few of the entities.
I have implemented a util class to generate auto-incremented aliases for the entities in a distributed environment, as follows:
@Component
public class AutoIncrementingIdGenerationUtil {

    private final RedisTemplate<String, Object> redisTemplate;

    public AutoIncrementingIdGenerationUtil(RedisTemplate<String, Object> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    public String getNextSequenceNumber(String keyName) {
        RedisAtomicLong counter = new RedisAtomicLong(keyName,
                Objects.requireNonNull(redisTemplate.getConnectionFactory()));
        // incrementAndGet() returns a long; convert it to match the declared return type
        return String.valueOf(counter.incrementAndGet());
    }
}
Now, I have several entities in my application, and for a FEW OF THE ENTITIES, I would like to generate the alias.
So I am writing my own custom entity listener as follows:
@Component
public class CustomEntityListener<T> {

    private final AutoIncrementingIdGenerationUtil autoIncrementingIdGenerationUtil;

    public CustomEntityListener(AutoIncrementingIdGenerationUtil autoIncrementingIdGenerationUtil) {
        this.autoIncrementingIdGenerationUtil = autoIncrementingIdGenerationUtil;
    }

    @PrePersist
    void onPrePersist(Object entity) {   // <---- HERE I WOULD LIKE TO CAST TO CONCRETE ENTITY TYPE
        if (StringUtils.isBlank(entity.getAlias())) {
            entity.setAlias(autoIncrementingIdGenerationUtil.getNextSequenceNumber(entity.getEntityType()));
        }
    }
As mentioned above, not all of the entities have an alias attribute, and I do not have a good idea of how to handle this. One bad idea is to check the entity type, but in that case there would be too many if-else branches and typecasts, which will not look good. Any better idea of how to do it?
Another related question in the same context: if an entity already has a @PrePersist method of its own, will the method defined in the entity listener override it, or will both of them run?
Entity listeners cannot be parameterized. Just make the relevant entities implement an interface, e.g. Aliased, with a setAlias() method. You'll then have a single type to cast to.
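A minimal sketch of that approach; Aliased, AliasEntityListener and getEntityType() are illustrative names (not an existing API), and StringUtils is assumed to be the commons-lang3 one from the question. The interface and the listener would live in separate source files:

import javax.persistence.PrePersist;
import org.apache.commons.lang3.StringUtils; // or whichever StringUtils the project already uses
import org.springframework.stereotype.Component;

// Entities that want an auto-generated alias opt in by implementing this interface.
public interface Aliased {
    String getAlias();
    void setAlias(String alias);
    String getEntityType(); // used as the Redis counter key, as in the question
}

@Component
public class AliasEntityListener {

    private final AutoIncrementingIdGenerationUtil autoIncrementingIdGenerationUtil;

    public AliasEntityListener(AutoIncrementingIdGenerationUtil autoIncrementingIdGenerationUtil) {
        this.autoIncrementingIdGenerationUtil = autoIncrementingIdGenerationUtil;
    }

    @PrePersist
    void onPrePersist(Object entity) {
        if (entity instanceof Aliased) {              // single cast target, no per-entity if-else
            Aliased aliased = (Aliased) entity;
            if (StringUtils.isBlank(aliased.getAlias())) {
                aliased.setAlias(autoIncrementingIdGenerationUtil
                        .getNextSequenceNumber(aliased.getEntityType()));
            }
        }
    }
}

Entities that need an alias implement Aliased and register the listener via @EntityListeners; everything else is left untouched.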
Also, why use Redis? Doesn't your DB have sequences?

Dynamically securing Spring Rest endpoint

Currently my code looks like this:
#PreAuthorize("hasAuthority('admin')")
#RequestMapping(value = "/xxxx", method = RequestMethod.POST, consumes = MediaType.APPLICATION_JSON_UTF8_VALUE)
public ResponseEntity<> method(#RequestBody RequestClass request) {
}
As you can see the allowed authorities are hard-coded in java code.
Is there a way to override the behaviour of PreAuthorize, or to load the proper endpoint configuration at startup from an external source (database or configuration file)?
I suppose you might give something like this a try (might need some tweaking) but it's ugly and I would not do it myself...
Set up your method-to-role mappings in your configuration. For example:
permissions.method1: admin
permissions.method2: admin
permissions.method3: user
Then use a @ConfigurationProperties class to load those mappings into a Map.
@ConfigurationProperties("")
public class SecurityMappingProperties {

    private final Map<String, String> permissions = new HashMap<>();

    public Map<String, String> getPermissions() {
        return permissions;
    }
}
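For the binding to work, the properties class also has to be registered as a bean; one way (assuming you don't simply annotate it with @Component) is:

import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableConfigurationProperties(SecurityMappingProperties.class)
public class SecurityMappingConfig {
}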
Then set up a service to handle the lookup.
@Service
public class MethodPermissionService {

    @Autowired
    private SecurityMappingProperties mappingProperties;

    // look up the mapped role and see if your user has it
    public Boolean lookupPermissionForMethod(String method) {
        return doesUserHaveRole(mappingProperties.getPermissions().get(method));
    }

    private Boolean doesUserHaveRole(String role) {
        // implement whatever logic you want to look up the requesting user's role...
        return false; // placeholder
    }
}
Then in your controllers, invoke the methodPermissionService and pass in the method name, like so...
#PreAuthorize("#methodPermissionService('method1')")
This, of course, would require you to have every secured method in all of your controllers to have an #Preauthorize with the matching method name as the argument to the methodPermissionService('xxx').
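Put together, a secured endpoint would then look roughly like this (the mapping mirrors the controller from the question; 'method1' has to match a key under permissions in your configuration):

@PreAuthorize("@methodPermissionService.lookupPermissionForMethod('method1')")
@RequestMapping(value = "/xxxx", method = RequestMethod.POST, consumes = MediaType.APPLICATION_JSON_UTF8_VALUE)
public ResponseEntity<?> method1(@RequestBody RequestClass request) {
    // business logic as before; whether we get here is decided by MethodPermissionService
    return ResponseEntity.ok().build();
}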
Since we are already in this rabbit hole, if you really wanted to, you could also just have a single place to declare all of them in some sort of MethodRoleHolder class where you can make them static Strings like the following:
public static final String METHOD1_SECURITY = "@methodPermissionService.lookupPermissionForMethod('method1')";
public static final String METHOD2_SECURITY = "@methodPermissionService.lookupPermissionForMethod('method2')";
then use them in your controllers...
@PreAuthorize(MethodRoleHolder.METHOD1_SECURITY)
Upfront caveat: I haven't actually tried this myself exactly as laid out here, but I have implemented a similar security scheme, just without the dynamic role-mapping lookup part.

Spring - Injection of beans using Builder pattern

Context
An application that utilizes Spring 4.1.7. All configurations are in XML files (not using annotations) and I'd rather keep it that way (but I can change the way things are done if I must).
Problem
I have created a new class that comes with a builder class.
Now I'd like to inject other beans into this new class. I can probably use lookup methods and similar solutions to do that and then use the new class's builder in the caller beans to create an instance. However, I'd rather have an instance of this new class injected into its caller beans than have them create one through the builder. This is where I'm not sure how to proceed. For example, this looks like an Abstract Factory to me, but I don't know how I can pass those parameters (which are passed to the builder) at runtime to the Abstract Factory and subsequently to the factories it builds.
Some code snippets to make the question clearer:
public final class Processor {

    private final StatusEnum newStatus;
    private final Long timeOut;

    // I'd like this to be injected by Spring through its setter (below)
    private DaoBean daoInstance;

    private Processor() {
        this.newStatus = null;
        this.timeOut = null;
    }

    private Processor(Builder builder) {
        this.newStatus = builder.getNewStatus();
        this.timeOut = builder.getTimeOut();
    }

    // To be called by Spring
    public void setDaoInstance(DaoBean instance) {
        this.daoInstance = instance;
    }

    public void updateDatabase() {
        daoInstance.update(newStatus, timeOut);
    }

    // Builder class
    public static final class Builder {

        private StatusEnum newStatus;
        private Long timeOut;
        // lots of other fields

        public Long getTimeOut() {
            return this.timeOut;
        }

        public StatusEnum getNewStatus() {
            return this.newStatus;
        }

        public Builder withTimeOut(Long timeOut) {
            this.timeOut = timeOut;
            return this;
        }

        public Builder withNewStatus(StatusEnum newStatus) {
            this.newStatus = newStatus;
            return this;
        }

        public Processor build() {
            return new Processor(this);
        }
    }
}
I'd like an instance of "DaoBean" to be injected into the "Processor" class. But to do that, Processor would have to be a bean, or else I have to utilize something like lookup methods. On the other hand, wherever I want to use Processor, I have to do something like this:
new Processor.Builder()
        .withTimeOut(1000L)
        .withNewStatus(StatusEnum.UPDATED)
        .build()
        .updateDatabase();
Instead of this, I wonder if I can make the Processor a bean that Spring can inject into its callers whilst maintaining its immutability. An instance of DaoBean can then be injected into the Processor by Spring. That way I'd be able to segregate the wiring code from the business logic.
It's worth mentioning that the Builder has a lot more than 2 fields and not all of them have to be set. This is why I thought an abstract factory is the way to go (building instances of the Processor in different ways).
One solution, while keeping the builder, would probably be to simply make the Builder itself a Spring bean...
That allows something like this:
@Autowired
private Builder builder;

public void someMethod() {
    Result result = builder.withX(...).doSomething();
}
This way, your Result object is immutable, can be created via a nice builder and the builder can inject the Spring bean (dao, in your case) into it without anyone even noticing that it's there.
And the only thing that changes is, that you don't create the builder yourself, but let Spring create it for you...
@Component
@Scope("prototype") // normally a good idea
public static class Builder {

    @Autowired
    private DaoBean dao;

    // your logic here
}
(Same works with JavaConfig or XML config, if you don't want to scan.)
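To tie that back to the Processor from the question, here is a sketch of what such a Spring-managed builder could look like. ProcessorBuilder is a made-up name; it simply reuses the existing Processor.Builder and setDaoInstance(), so Processor itself stays exactly as it is:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

@Component
@Scope("prototype") // each request for a builder gets a fresh, stateful instance
public class ProcessorBuilder {

    private final DaoBean daoInstance; // supplied by Spring, invisible to callers
    private StatusEnum newStatus;
    private Long timeOut;

    @Autowired
    public ProcessorBuilder(DaoBean daoInstance) {
        this.daoInstance = daoInstance;
    }

    public ProcessorBuilder withNewStatus(StatusEnum newStatus) {
        this.newStatus = newStatus;
        return this;
    }

    public ProcessorBuilder withTimeOut(Long timeOut) {
        this.timeOut = timeOut;
        return this;
    }

    public Processor build() {
        Processor processor = new Processor.Builder()
                .withNewStatus(newStatus)
                .withTimeOut(timeOut)
                .build();
        processor.setDaoInstance(daoInstance); // wiring stays out of the business code
        return processor;
    }
}

Callers autowire ProcessorBuilder, chain the with... methods and call build(); the DAO wiring never shows up in their code.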
Especially with many combinations, I prefer a builder pattern, since a factory would need complex method signatures. Of course, the builder has the disadvantage that you cannot check at compile time if a given combination of attribute types is at least theoretically acceptable. Ok, you could simulate that with various builders, but that would probably be overkill.

Spring Boot equivalent to XML multi-database configuration

I would like to port two projects to Spring Boot 1.1.6. They are each part of a larger project. They both need to make SQL connections to 1 of 7 production databases per web request, based on region. One of them persists configuration settings to a Mongo database. They are both functional at the moment, but the SQL configuration is XML-based and the Mongo one is application.properties-based. I'd like to move to either XML or annotations before release to simplify maintenance.
This is my first try at this forum, so I may need some guidance in that arena as well. I put the multi-database tag on there; most of those questions deal with two connections open at a time. Here only one is open at a time and only the URL changes; the schema and the rest are the same.
In XML Fashion ...
@Controller
public class CommonController {

    private CommonService CommonService_i;

    @RequestMapping(value = "/rest/Practice/{enterprise_id}", method = RequestMethod.GET)
    public @ResponseBody List<Map<String, Object>> getPracticeList(@PathVariable("enterprise_id") String enterprise_id) {
        CommonService_i = new CommonService(enterprise_id);
        return CommonService_i.getPracticeList();
    }

@Service
public class CommonService {

    private ApplicationContext ctx = null;
    private JdbcTemplate template = null;
    private DataSource datasource = null;
    private SimpleJdbcCall jdbcCall = null;

    public CommonService(String enterprise_id) {
        ctx = new ClassPathXmlApplicationContext("database-beans.xml");
        datasource = ctx.getBean(enterprise_id, DataSource.class);
        template = new JdbcTemplate(datasource);
    }
Each time a request is made, a new instance of the required service is created with the appropriate database connection.
In the Spring Boot world, I've come across one article that extended TomcatDataSourceConfiguration:
http://xantorohara.blogspot.com/2013/11/spring-boot-jdbc-with-multiple.html That at least allowed me to create a Java configuration class; however, I cannot come up with a way to change the prefix for the ConfigurationProperties per request like I am doing with the XML above. I can set up multiple configuration classes, but the @Qualifier("00002") in the DAO has to be a static value. // The value for annotation attribute Qualifier.value must be a constant expression
@Configuration
@ConfigurationProperties(prefix = "Region1")
public class DbConfigR1 extends TomcatDataSourceConfiguration {

    @Bean(name = "dsRegion1")
    public DataSource dataSource() {
        return super.dataSource();
    }

    @Bean(name = "00001")
    public JdbcTemplate jdbcTemplate(DataSource dsRegion1) {
        return new JdbcTemplate(dsRegion1);
    }
}
On the Mongo side, I am able to define variables in the ConfigurationProperties class and, if there is a matching entry in the appropriate application.properties file, it overwrites it with the value from the file. If not, it uses the value in the code. That does not work on the JDBC side: if you define a variable in your config classes, that value is what is used. (yeah.. I know it says mondoUrl)
@ConfigurationProperties(prefix = "spring.mongo")
public class MongoConnectionProperties {

    private String mondoURL = "localhost";

    public String getMondoURL() {
        return mondoURL;
    }

    public void setMondoURL(String mondoURL) {
        this.mondoURL = mondoURL;
    }
There was a question answered today that got me a little closer: Spring Boot application.properties value not populating. The answer showed me how to at least get @Value to function. With that, I can set up a dbConfigProperties class that grabs the @Value. The only issue is that the value grabbed by @Value is only available when the program first starts. I'm not certain how to use that other than seeing it in the console log when the program starts. What I do know now is that, at some point in the @Autowired of the dbConfigProperties class, it does return the appropriate value. By the time I want to use it, though, it is returning ${spring.datasource.url} instead of the value.
Ok... someone please tell me that @Value is not my only choice. I put the following code in my controller, and I'm able to reliably retrieve one value, yay. I suppose I could hard-code each possible property name from my properties file as an argument for this function and populate a class. I'm clearly doing something wrong.
private String url;
//private String propname = "${spring.datasource.url}"; //can't use this

@Value("${spring.datasource.url}")
public void setUrl(String val) {
    this.url = val;
    System.out.println("==== value ==== " + url);
}
This was awesome... finally some progress. I believe I am giving up on changing ConfigurationProperties, and on @Value for that matter. With this answer, I can access the beans created at startup. Y'all were probably wondering why I didn't in the first place... still learning. I'm bumping him up. That saved my bacon. https://stackoverflow.com/a/24595685/4028704
The plan now is to create a JdbcTemplate-producing bean for each of the regions, like this:
@Configuration
@ConfigurationProperties(prefix = "Region1")
public class DbConfigR1 extends TomcatDataSourceConfiguration {

    @Bean(name = "dsRegion1")
    public DataSource dataSource() {
        return super.dataSource();
    }

    @Bean(name = "00001")
    public JdbcTemplate jdbcTemplate(DataSource dsRegion1) {
        return new JdbcTemplate(dsRegion1);
    }
}
When I call my service, I'll use something like this:
public AccessBeans(ServletRequest request, String enterprise_id) {
    ctx = RequestContextUtils.getWebApplicationContext(request);
    template = ctx.getBean(enterprise_id, JdbcTemplate.class);
}
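For completeness, a rough sketch of a controller using that helper per request; AccessBeans exposing a getTemplate() accessor and the practice table in the query are assumptions, and the bean names ("00001", "00002", ...) come from the region configuration classes above:

import java.util.List;
import java.util.Map;
import javax.servlet.http.HttpServletRequest;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
public class RegionAwareController {

    @RequestMapping(value = "/rest/Practice/{enterprise_id}", method = RequestMethod.GET)
    public @ResponseBody List<Map<String, Object>> getPracticeList(HttpServletRequest request,
            @PathVariable("enterprise_id") String enterprise_id) {
        // Resolve the JdbcTemplate bean named after the region ("00001", "00002", ...)
        JdbcTemplate template = new AccessBeans(request, enterprise_id).getTemplate();
        return template.queryForList("SELECT * FROM practice");
    }
}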
Still open to better ways or insight into foreseeable issues, etc but this way seems to be about equivalent to my current XML based ways. Thoughts?

Resources