DataSourcePool or @Reference DataSource in WorkflowProcess - OSGi

I am writing an AEM custom workflow component using the AEM archetype. All is good.
I can write code that uses a @Reference annotation:
@Reference(target = "(&(objectclass=javax.sql.DataSource)(datasource.name=MYDB))")
private DataSource ds;
This works well - I can query, get rows, etc.
However, I do not want to hard-code MYDB.
The documentation leads me to believe I can merely add:
@Reference
private DataSourcePool dsp;
and then look up the DataSource. However, this does not produce any results: I can iterate over all the datasource names and there are none.
Is it a permissions thing?
My component code looks like this:
public class queryForData implements WorkflowProcess {

    private static final Logger log = LoggerFactory.getLogger(queryForData.class);

    @Reference(target = "(&(objectclass=javax.sql.DataSource)(datasource.name=MYDB))")
    private DataSource ds;

    @Override
    public void execute(WorkItem workItem, WorkflowSession workflowSession, MetaDataMap args) throws WorkflowException {
        // blah, blah, blah
    }
}

Basically, I added a Day Commons JDBC connection pool configuration in the configuration manager. What I was missing was the datasource name field at the bottom of the dialog, so I overlooked it. Once I gave it a "name", the DataSource was found in the code by merely using:
@Reference
private DataSourcePool dsp;
and looking up the DataSource by name.
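For reference, a minimal sketch of that lookup, assuming the Day Commons DataSourcePool API (com.day.commons.datasource.poolservice); the query and connection handling around it are illustrative:
@Reference
private DataSourcePool dsp;

private void queryData() throws Exception {
    // Look up the pool entry by the name configured in the JDBC pool dialog.
    DataSource ds = (DataSource) dsp.getDataSource("MYDB");
    try (Connection connection = ds.getConnection();
         PreparedStatement stmt = connection.prepareStatement("SELECT 1");
         ResultSet rs = stmt.executeQuery()) {
        while (rs.next()) {
            // process rows
        }
    }
}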

Related

Spring Autowire configuration in Flink

I am trying to use the combination of Flink and Spring Boot and I'm having some problems.
Let's say I have this flow:
1. Get a JSON string that has one field, date, containing a date string.
2. Use a map function and an ObjectMapper to parse it into an object with a LocalDateTime field.
3. Print it.
This is a simple use case that describes my problem.
So, I have a Word class representing a word that contains a LocalDateTime field:
@Data
public class Word {
    @JsonDeserialize(using = LocalDateTimeSerde.class)
    LocalDateTime date;
}
The LocalDateTime deserializer looks like this (I want to autowire the app configuration):
@RequiredArgsConstructor(onConstructor = @__(@Autowired))
@JsonComponent
public class LocalDateTimeSerde extends JsonDeserializer<LocalDateTime> {

    private final AppConf conf;

    @Override
    public LocalDateTime deserialize(JsonParser jsonParser, DeserializationContext deserializationContext) throws IOException, JsonProcessingException {
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern(this.conf.getDateFormatter());
        return LocalDateTime.parse(jsonParser.getText(), formatter);
    }
}
AppConf.java, representing the configuration of the application, is:
@Data
@Configuration
@ConfigurationProperties(value = "app")
public class AppConf {
    private String dateFormatter;
}
DemoApplication.java:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment(1);
String example = "{\"date\":\"2019-01-29 00:00\"}";
var stream = env
        .fromElements(example)
        .map(x -> new ObjectMapper().readValue(x, Word.class))
        .returns(Word.class);
stream.print();
env.execute("Demo App");
The exception I'm getting is:
Caused by: java.lang.IllegalArgumentException: Class com.example.demo.LocalDateTimeSerde has no default (no arg) constructor
The main problem here is that the deserialization code runs on the TaskManager, where Spring Boot plays no part, so it doesn't inject AppConf into the class.
Adding @NoArgsConstructor will not solve the problem.
I think I know why it is happening: the Flink master serializes the classes to the workers, and over there Spring Boot doesn't component-scan and take control.
Is there any solution for that? I really want to combine Spring with Flink in the workers' functions as well.
Thanks.
In general, I personally don't think it's a good idea to mix those concepts. The easiest solution is to use @Autowired only on the job manager and use explicit dependency injection when you go into Flink-land.
For example, you could extract the date pattern in the DemoApplication and set it on the ObjectMapper, as sketched below. (Don't forget to initialize the ObjectMapper only once in your real code!)
If you really want to use autowiring, I guess you need to manually trigger the autowiring on the task manager. There is a related post specifically for ObjectMapper.
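A minimal sketch of that explicit-injection approach; WordParser is an illustrative name, and it assumes the @JsonDeserialize annotation is dropped from Word so the deserializer registered here applies. The pattern is a plain String taken from AppConf on the job manager, so it serializes to the task managers along with the function, while the ObjectMapper is rebuilt once per task in open():
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.datatype.jsr310.deser.LocalDateTimeDeserializer;

public class WordParser extends RichMapFunction<String, Word> {

    private final String pattern;           // plain String, serializes to the workers
    private transient ObjectMapper mapper;  // rebuilt on each task manager

    public WordParser(String pattern) {
        this.pattern = pattern;
    }

    @Override
    public void open(Configuration parameters) {
        // Created once per task, not once per element.
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern(pattern);
        SimpleModule module = new SimpleModule();
        module.addDeserializer(LocalDateTime.class, new LocalDateTimeDeserializer(formatter));
        mapper = new ObjectMapper().registerModule(module);
    }

    @Override
    public Word map(String json) throws Exception {
        return mapper.readValue(json, Word.class);
    }
}
On the job manager side, Spring stays in charge of the configuration: env.fromElements(example).map(new WordParser(conf.getDateFormatter())).print();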

OData (Olingo) "inhibit" endpoint

My question is: what is the best way to inhibit an endpoint that is automatically provided by Olingo?
I am playing with a simple app based on Spring Boot and Apache Olingo. In short, this is my servlet registration:
@Configuration
public class CxfServletUtil {

    @Bean
    public ServletRegistrationBean getODataServletRegistrationBean() {
        ServletRegistrationBean odataServletRegistrationBean = new ServletRegistrationBean(new CXFNonSpringJaxrsServlet(), "/user.svc/*");
        Map<String, String> initParameters = new HashMap<String, String>();
        initParameters.put("javax.ws.rs.Application", "org.apache.olingo.odata2.core.rest.app.ODataApplication");
        initParameters.put("org.apache.olingo.odata2.service.factory", "com.olingotest.core.CustomODataJPAServiceFactory");
        odataServletRegistrationBean.setInitParameters(initParameters);
        return odataServletRegistrationBean;
    } ...
where my ODataJPAServiceFactory is
@Component
public class CustomODataJPAServiceFactory extends ODataJPAServiceFactory implements ApplicationContextAware {

    private static ApplicationContext context;
    private static final String PERSISTENCE_UNIT_NAME = "myPersistenceUnit";
    private static final String ENTITY_MANAGER_FACTORY_ID = "entityManagerFactory";

    @Override
    public ODataJPAContext initializeODataJPAContext()
            throws ODataJPARuntimeException {
        ODataJPAContext oDataJPAContext = this.getODataJPAContext();
        try {
            EntityManagerFactory emf = (EntityManagerFactory) context.getBean(ENTITY_MANAGER_FACTORY_ID);
            oDataJPAContext.setEntityManagerFactory(emf);
            oDataJPAContext.setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
            return oDataJPAContext;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
    ...
My entity is quite simple ...
@Entity
public class User {

    @Id
    private String id;

    @Basic
    private String firstName;

    @Basic
    private String lastName;
    ....
Olingo is doing its job perfectly and it helps me with the generation of all the endpoints around CRUD operations for my entity.
My question is: how can I "inhibit" some of them? Let's say, for example, that I don't want to enable deleting my entity.
I could try to use a Filter - but this seems a bit harsh. Are there any other, better ways to solve my problem?
Thanks for the help.
As you have said, you could use a filter, but then you are really coupled to the URI schema used by Olingo. Also, things become complicated when you have multiple, related entity sets (because you can navigate from one to the other, making the URIs more complex).
There are two things that you can do, depending on what you want to achieve:
1. If you want fine-grained control over which operations are allowed, you can create a wrapper for the ODataSingleProcessor and throw ODataExceptions where you want to disallow an operation (see the sketch after this list). You can either always throw exceptions (i.e. completely disabling an operation type) or you can use the URI info parameters to obtain the target entity set and decide whether to throw an exception or call the standard single processor. I have used this approach to create a read-only OData service here (basically, I just created an ODataSingleProcessor which delegates some calls to the standard one + overrode a method in the service factory to wrap the standard single processor in my wrapper).
2. If you want to completely un-expose / ignore a given entity or some properties, then you can use a JPA-EDM mapping model and exclude the desired components. You can find an example of such a mapping here: github. The mapping model is just an XML file which maps the JPA entities / properties to EDM entity types / properties. In order for Olingo to pick it up, you can pass the name of the file to the setJPAEdmMappingModel method of the ODataJPAContext in your initialize method.
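A rough sketch of the wrapper idea in option 1, assuming the Olingo 2 ODataSingleProcessor API; the class name and the "Users" entity set are illustrative, and imports from org.apache.olingo.odata2.api.* are omitted for brevity:
// Delegating processor that blocks DELETE, either globally or per entity set.
public class RestrictedODataProcessor extends ODataSingleProcessor {

    private final ODataSingleProcessor delegate;

    public RestrictedODataProcessor(ODataSingleProcessor delegate) {
        this.delegate = delegate;
    }

    @Override
    public ODataResponse deleteEntity(DeleteUriInfo uriInfo, String contentType) throws ODataException {
        // Per-entity-set decision via the URI info; here deletes on "Users" are rejected.
        if ("Users".equals(uriInfo.getTargetEntitySet().getName())) {
            throw new ODataNotImplementedException();
        }
        return delegate.deleteEntity(uriInfo, contentType);
    }

    @Override
    public ODataResponse readEntity(GetEntityUriInfo uriInfo, String contentType) throws ODataException {
        return delegate.readEntity(uriInfo, contentType);
    }

    // ... delegate the remaining read/write methods the same way.
}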

How to specify DataSource in JdbcTemplate?

In my application.properties I have set:
datasource.test.driverClass=org.postgresql.Driver
datasource.test.url=jdbc:postgresql://localhost:5433/test
datasource.test.username=admin
datasource.test.password=admin
logging.level.com.eternity = DEBUG
In my controller, I am trying to execute an SQL query from a string, like this:
String selectQueryPartOne = "SELECT name, (" + StringUtils.join(sumString, " + ") + ") AS 'Price' FROM house WHERE NOT (" + StringUtils.join(sumString, " IS NULL OR ") + " IS NULL);";
JdbcTemplate statement = new JdbcTemplate();
statement.queryForList(selectQueryPartOne);
I expected this to work; however, I am receiving the following error:
java.lang.IllegalArgumentException: No DataSource specified
I've discovered that I need to call setDataSource on my statement object first. However, I have no idea where I can get this dataSource object. Could you help?
When you create the JdbcTemplate instance yourself, you are working outside of the Spring dependency injection, and therefore will not have the DataSource injected. You need to use the Spring-provided instance via autowiring, something like:
@Controller
public class MyController {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @RequestMapping("/")
    public String myAction() {
        // do stuff with the jdbc template
        return "home";
    }
}
Also, the Spring and Spring Boot documentation are great resources for further study on working with Spring.
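One caveat, as an assumption about the configuration shown in the question: Spring Boot's auto-configuration binds the default DataSource to the spring.datasource.* keys, so custom-prefixed entries like datasource.test.url are not picked up automatically. Renaming them along these lines (values are the asker's own) lets Boot build the DataSource that gets injected into the autowired JdbcTemplate:
spring.datasource.driver-class-name=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://localhost:5433/test
spring.datasource.username=admin
spring.datasource.password=admin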

raw JDBC, Guice and Transactions

I am trying to create a service with multiple DAOs, currently AddressDao and CustomerDao. Now I want to create a transaction that spans the two, something like:
@Inject
private CustomerDao customerDao;

@Inject
private AddressDao addressDao;

Customer getCustomer(int id) {
    Customer customer = customerDao.getCustomer(id);
    customer.setAddress(addressDao.getAddress(customer.getAddressId()));
    return customer;
}
Inside the DAOs, my code looks like this:
public class CustomerDaoJdbcImpl implements CustomerDao {

    private static final Logger logger = LoggerFactory.getLogger(CustomerDaoJdbcImpl.class);

    @Inject
    private Database db;

    public Customer getCustomer(int id) {
        try (Connection connection = db.getConnection()) {
            ...
        } catch (SQLException e) {
            ...
        }
    }
}
Now, since each DAO obtains its own Connection, I can't span a transaction across them.
Also, I think I don't quite get it right and maybe need a good book to understand everything.
What are the preferred solutions? Or am I doing it wrong?
Currently I think I have a solution made up, but it lacks threading.
I got some code from:
https://stackoverflow.com/a/2353795/2250209
and here:
https://github.com/mybatis/guice/tree/master/src/main/java/org/mybatis/guice/transactional
Currently I have a Database class which pulls the Connection from the DataSource. This class gets injected into a DAO, and if I annotate the DAO or service, the connection is kept open until I call commit or rollback. I don't know if that is the best pattern, though, since some people recommend closing the connection inside the method.
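For what it's worth, a minimal sketch of a thread-safe variant of such a Database class, using a ThreadLocal so a service-level transaction shares one connection across DAOs; the inTransaction method and its shape are illustrative, not from the original post:
import java.sql.Connection;
import java.sql.SQLException;
import java.util.concurrent.Callable;

import javax.inject.Inject;
import javax.sql.DataSource;

public class Database {

    private final DataSource dataSource;
    // Holds the connection of the transaction running on the current thread, if any.
    private final ThreadLocal<Connection> current = new ThreadLocal<>();

    @Inject
    public Database(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    // DAOs call this; inside inTransaction() they all receive the same connection.
    // Note: DAOs must not close a transaction-managed connection, so the
    // try-with-resources in CustomerDaoJdbcImpl would need to account for that.
    public Connection getConnection() throws SQLException {
        Connection c = current.get();
        return c != null ? c : dataSource.getConnection();
    }

    public <T> T inTransaction(Callable<T> work) throws Exception {
        try (Connection c = dataSource.getConnection()) {
            c.setAutoCommit(false);
            current.set(c);
            try {
                T result = work.call();
                c.commit();
                return result;
            } catch (Exception e) {
                c.rollback();
                throw e;
            } finally {
                current.remove();
            }
        }
    }
}
The service method from the question would then become db.inTransaction(() -> { ... }), which is roughly what the MyBatis-Guice @Transactional interceptor linked above automates.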

Spring Boot equivalent to XML multi-database configuration

I would like to port two projects to Spring Boot 1.1.6. They are each part of a larger project. They both need to make SQL connections to 1 of 7 production databases per web request, based on region. One of them persists configuration settings to a Mongo database. They are both functional at the moment, but the SQL configuration is XML-based and the Mongo configuration is application.properties-based. I'd like to move to either XML or annotations before release to simplify maintenance.
This is my first try at this forum, so I may need some guidance in that arena as well. I put the multi-database tag on here, but most of those questions deal with two connections open at a time. There is only one open at a time here, and only the URL changes; the schema and the rest are the same.
In XML Fashion ...
@Controller
public class CommonController {

    private CommonService CommonService_i;

    @RequestMapping(value = "/rest/Practice/{enterprise_id}", method = RequestMethod.GET)
    public @ResponseBody List<Map<String, Object>> getPracticeList(@PathVariable("enterprise_id") String enterprise_id) {
        CommonService_i = new CommonService(enterprise_id);
        return CommonService_i.getPracticeList();
    }

@Service
public class CommonService {

    private ApplicationContext ctx = null;
    private JdbcTemplate template = null;
    private DataSource datasource = null;
    private SimpleJdbcCall jdbcCall = null;

    public CommonService(String enterprise_id) {
        ctx = new ClassPathXmlApplicationContext("database-beans.xml");
        datasource = ctx.getBean(enterprise_id, DataSource.class);
        template = new JdbcTemplate(datasource);
    }
Each time a request is made, a new instance of the required service is created with the appropriate database connection.
In the Spring Boot world, I've come across one article that extended TomcatDataSourceConfiguration:
http://xantorohara.blogspot.com/2013/11/spring-boot-jdbc-with-multiple.html That at least allowed me to create a Java configuration class. However, I cannot come up with a way to change the prefix for the ConfigurationProperties per request like I am doing with the XML above. I can set up multiple configuration classes, but the @Qualifier("00002") in the DAO has to be a static value. // The value for annotation attribute Qualifier.value must be a constant expression
@Configuration
@ConfigurationProperties(prefix = "Region1")
public class DbConfigR1 extends TomcatDataSourceConfiguration {

    @Bean(name = "dsRegion1")
    public DataSource dataSource() {
        return super.dataSource();
    }

    @Bean(name = "00001")
    public JdbcTemplate jdbcTemplate(DataSource dsRegion1) {
        return new JdbcTemplate(dsRegion1);
    }
}
On the Mongo side, I am able to define variables in the ConfigurationProperties class and, if there is a matching entry in the appropriate application.properties file, it overwrites the value in the code; if not, it uses the value in the code. That does not work on the JDBC side: if you define a variable in your config classes, that value is what is used. (Yeah... I know it says mondoURL.)
@ConfigurationProperties(prefix = "spring.mongo")
public class MongoConnectionProperties {

    private String mondoURL = "localhost";

    public String getMondoURL() {
        return mondoURL;
    }

    public void setMondoURL(String mondoURL) {
        this.mondoURL = mondoURL;
    }
There was a question answered today that got me a little closer: Spring Boot application.properties value not populating. The answer showed me how to at least get @Value to function. With that, I can set up a dbConfigProperties class that grabs the @Value. The only issue is that the value grabbed by @Value is only available when the program first starts. I'm not certain how to use that other than seeing it in the console log when the program starts. What I do know now is that, at some point in the @Autowired of the dbConfigProperties class, it does return the appropriate value. By the time I want to use it, though, it returns ${spring.datasource.url} instead of the value.
Ok... someone please tell me that @Value is not my only choice. I put the following code in my controller, and I'm able to reliably retrieve one value, yay. I suppose I could hard-code each possible property name from my properties file as an argument for this function and populate a class, but I'm clearly doing something wrong.
private String url;
// private String propname = "${spring.datasource.url}"; // can't use this

@Value("${spring.datasource.url}")
public void setUrl(String val) {
    this.url = val;
    System.out.println("==== value ==== " + url);
}
This was awesome... finally some progress. I believe I am giving up on changing ConfigurationProperties, and on @Value for that matter. With this guy's answer, I can access the beans created at startup. Y'all were probably wondering why I didn't in the first place... still learning. I'm bumping him up; that saved my bacon. https://stackoverflow.com/a/24595685/4028704
The plan now is to create a JdbcTemplate-producing bean for each of the regions, like this:
@Configuration
@ConfigurationProperties(prefix = "Region1")
public class DbConfigR1 extends TomcatDataSourceConfiguration {

    @Bean(name = "dsRegion1")
    public DataSource dataSource() {
        return super.dataSource();
    }

    @Bean(name = "00001")
    public JdbcTemplate jdbcTemplate(DataSource dsRegion1) {
        return new JdbcTemplate(dsRegion1);
    }
}
When I call my service, I'll use something like this:
public AccessBeans(ServletRequest request, String enterprise_id) {
    ctx = RequestContextUtils.getWebApplicationContext(request);
    template = ctx.getBean(enterprise_id, JdbcTemplate.class);
}
Still open to better ways, or insight into foreseeable issues, etc., but this way seems about equivalent to my current XML-based approach. Thoughts?
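One possible refinement, sketched under the assumption that Spring Boot's DataSourceBuilder and per-bean @ConfigurationProperties binding are available in your version (the region prefixes and enterprise IDs here are illustrative): build one JdbcTemplate per region up front and resolve them through a map keyed by enterprise_id, instead of fetching beans from the context on every request.
import java.util.HashMap;
import java.util.Map;

import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.jdbc.DataSourceBuilder;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class RegionJdbcConfig {

    @Bean
    @ConfigurationProperties(prefix = "region1.datasource")
    public DataSource dsRegion1() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "region2.datasource")
    public DataSource dsRegion2() {
        return DataSourceBuilder.create().build();
    }

    // One JdbcTemplate per region, keyed by enterprise_id; inject this map
    // into services instead of looking beans up from the context per request.
    @Bean
    public Map<String, JdbcTemplate> templatesByEnterpriseId(
            @Qualifier("dsRegion1") DataSource dsRegion1,
            @Qualifier("dsRegion2") DataSource dsRegion2) {
        Map<String, JdbcTemplate> templates = new HashMap<>();
        templates.put("00001", new JdbcTemplate(dsRegion1));
        templates.put("00002", new JdbcTemplate(dsRegion2));
        return templates;
    }
}
A service would then take the Map<String, JdbcTemplate> as a constructor argument and call templates.get(enterprise_id), keeping the per-request logic free of context lookups.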
