Connect LDAP from Spring

I have to build a web application based on Spring that lets the user manage LDAP data. The connection to the LDAP server must be done with the JNDI framework only (no Spring LDAP allowed).
For this, I wrote a utility class to do the basic operations (add, update, delete, list, ...).
Here is a short excerpt from this class:
public class LdapUtility {

    private static LdapUtility instance;
    private DirContext dirContext;

    public static LdapUtility getInstance() {
        if(LdapUtility.instance == null)
            LdapUtility.instance = new LdapUtility();
        return LdapUtility.instance;
    }

    /**
     * Connect to the LDAP
     */
    private LdapUtility() {
        Hashtable env = new Hashtable();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://localhost:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "cn=Manager,dc=my-domain,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");
        try {
            dirContext = new InitialDirContext(env);
        }
        catch(Exception ex) {
            dirContext = null;
        }
    }

    public void addUser(User u) {
        dirContext.createSubcontext(....); // add user in the LDAP
    }
}
With this code, I can access all my methods by calling LdapUtility.getInstance()..., but the connection to the LDAP server is never released.
Another way would be to connect to the LDAP server before each operation, but in that case there would be too many connections to the LDAP server...
So, here is my question: what is the most elegant/smartest way to access these methods?
Thank you in advance :-)

Since you're already using Spring, I would recommend using Spring LDAP:
Spring LDAP is a Java library for simplifying LDAP operations, based on the pattern of Spring's JdbcTemplate. The framework relieves the user of common chores, such as looking up and closing contexts, looping through results, encoding/decoding values and filters, and more.
Especially if you're not familiar with LDAP and its potential performance problems, it can help to start off with a utility library like this that does all the heavy lifting for you.
You configure the LDAP connection settings in the spring config:
<bean id="contextSource" class="org.springframework.ldap.core.support.LdapContextSource">
    <property name="url" value="ldap://localhost:389" />
    <property name="base" value="dc=example,dc=com" />
    <property name="userDn" value="cn=Manager" />
    <property name="password" value="secret" />
</bean>

<bean id="ldapTemplate" class="org.springframework.ldap.core.LdapTemplate">
    <constructor-arg ref="contextSource" />
</bean>
You can then just use the LdapTemplate wherever you need to perform an LDAP action:
return ldapTemplate.search(
    "", "(objectclass=person)",
    new AttributesMapper() {
        public Object mapFromAttributes(Attributes attrs)
                throws NamingException {
            return attrs.get("cn").get();
        }
    });

Without Spring (it being forbidden), I would quickly implement something similar myself:
(when being lazy) create a simple callback interface (such as you can find in Spring -- JpaCallback.execute(EntityManager em)) -- but for LDAP -- MyLdapCallback.execute(LdapConnection connection) -- instead of LdapConnection you can imagine anything you require -- objects from OpenLDAP or the JDK's Context. Something like (just for illustration):
...
interface LdapCallback<T> {
    T execute(DirContext ctx) throws NamingException, IOException;
}
...
private <T> T execute(LdapCallback<T> callback) throws NamingException, IOException {
    T result = null;
    LdapContext ctx = new InitialLdapContext(); // or pass your environment Hashtable here
    try {
        result = callback.execute(ctx);
    } finally {
        // if you negotiated StartTLS, close the StartTlsResponse here before the context
        ctx.close();
    }
    return result;
}
...
Once done, you create anonymous classes for each LDAP call and invoke them via execute(callback).
(having more time) implement option 1 above, plus create an AOP aspect that wraps methods marked with an annotation so that they execute within the wrapper above (without doing so explicitly in my code).
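The annotation-plus-wrapper idea can also be sketched with a plain JDK dynamic proxy instead of full AOP. Everything below (the LdapManaged annotation, the UserRepository interface, the open/close runnables standing in for creating and closing a DirContext) is a hypothetical illustration, not code from the answer above:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Proxy;

// Hypothetical marker for methods that need a managed LDAP context.
@Retention(RetentionPolicy.RUNTIME)
@interface LdapManaged {}

// Hypothetical repository interface; in real code its methods would use a DirContext.
interface UserRepository {
    @LdapManaged
    String findCn(String uid);
}

public class LdapProxyDemo {

    // Wraps every @LdapManaged method of the interface: runs `open` before the
    // call and `close` after it (stand-ins for creating and closing the context).
    static <T> T wrap(Class<T> iface, T target, Runnable open, Runnable close) {
        return iface.cast(Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[]{iface},
                (proxy, method, args) -> {
                    if (!method.isAnnotationPresent(LdapManaged.class)) {
                        return method.invoke(target, args);
                    }
                    open.run();                    // e.g. new InitialDirContext(env)
                    try {
                        return method.invoke(target, args);
                    } finally {
                        close.run();               // context released even on failure
                    }
                }));
    }

    public static void main(String[] args) {
        int[] events = new int[2];
        UserRepository repo = wrap(UserRepository.class,
                uid -> "cn=" + uid,
                () -> events[0]++,
                () -> events[1]++);
        System.out.println(repo.findCn("jdoe"));            // cn=jdoe
        System.out.println(events[0] + "/" + events[1]);    // 1/1
    }
}
```

The try/finally inside the handler is what guarantees the "close" side always runs, which is exactly the chore the callback pattern above delegates away from the business code.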

There are several ways to connect to LDAP. Using javax.naming.* is one of them. In the Javadoc you may find that the classes of your SPI provider manage their own connections, so you don't have to care about it -- that may be an answer to your question -- see the JDK docs on how Context manages connections and networking: http://docs.oracle.com/javase/6/docs/api/javax/naming/ldap/LdapContext.html
If you are accustomed to more JDBC-like access, you may find http://www.openldap.org/jldap/ more to your liking. There you have the connections completely under your control and you treat them much the same way as in JDBC. You may use any pooling library you like.

Not knowing the exact requirements, I interpret the core question as "when to open/close the connection".
My crystal ball tells me you may want to use a connection pool. True, you don't close the connection explicitly, as this is handled by the pool, but that may be OK for your assignment. It's fairly easy:
// Enable connection pooling
env.put("com.sun.jndi.ldap.connect.pool", "true");
The complete source code is referenced in Oracle's basic LDAP tutorial.
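For completeness, a sketch of what the full environment might look like with pooling enabled. The host and credentials are the placeholders from the question; the 30-second idle timeout is my own illustrative value, and note it is a JVM-wide system property, not a context property:

```java
import java.util.Hashtable;
import javax.naming.Context;

public class PooledLdapEnv {

    // JNDI environment whose contexts borrow connections from a shared pool.
    static Hashtable<String, Object> pooledEnv() {
        Hashtable<String, Object> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://localhost:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "cn=Manager,dc=my-domain,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");
        // Return this context's connection to the pool on close() instead of dropping it.
        env.put("com.sun.jndi.ldap.connect.pool", "true");
        // Pool-wide idle timeout (ms) is a system property, not a context property.
        System.setProperty("com.sun.jndi.ldap.connect.pool.timeout", "30000");
        return env;
    }
}
```

A DirContext built from this environment hands its connection back to the pool when close() is called, so the "connect before each operation" approach from the question stays cheap.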

Related

Spring Integration AOP for Logging outbound Http requests

I was looking at a post from 2014 about using Spring AOP for logging HTTP requests/replies:
Spring integration + logging response time for http adapters(or any endpoint)
To this end, I tried this AOP configuration:
<aop:config>
    <aop:aspect id="myAspect" ref="inboundOutboundHttpLogging">
        <aop:pointcut id="handleRequestMessageMethod"
            expression="execution(* org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleRequestMessage(*))
                        and
                        args(message)" />
        <aop:before method="requestMessageSent" pointcut-ref="handleRequestMessageMethod" arg-names="message"/>
    </aop:aspect>
</aop:config>
Is there perhaps a newer way of using AOP for logging HTTP requests? I want to avoid having to put per-request logging (i.e. outbound-gateway advice on each gateway).
Thanks for any pointers.
The handleRequestMessage() is essentially an input message to this gateway and its output. So, if you don't like implementing an AbstractRequestHandlerAdvice and adding it into each of your gateways via their <request-handler-advice-chain>, then consider using a <wire-tap> for the input and output channels of those gateways.
You may implement, though, a BeanPostProcessor.postProcessBeforeInitialization() to add your custom AbstractRequestHandlerAdvice into those HTTP gateways you are interested in.
My point is that the <aop:aspect> you are presenting really might lead to some unexpected behavior, like that final-method concern you have edited out from your question...
Based upon the suggestions made by @artem-bilan, I was able to find a solution similar to AOP for injecting a logging AbstractRequestHandlerAdvice into HTTP outbound request processing. I'm contributing this as a possible solution for anyone else who comes across this question.
As @artem-bilan mentions, there is a mechanism for injecting an AbstractRequestHandlerAdvice into an AbstractReplyProducingMessageHandler such as an HttpRequestExecutingMessageHandler. In my case, I want to log the message contents (header and payload) prior to the HTTP call and also log the returned message (header and payload). This works nicely.
@artem-bilan suggests that the BeanPostProcessor mechanism can inject the advice without having to add that declaration to each HTTP outbound bean. The BeanPostProcessor looks like this:
public class AddHttpOutboundAdvicePostProcessor implements BeanPostProcessor {

    final List<Advice> adviceList;

    AddHttpOutboundAdvicePostProcessor(List<Advice> adviceList) {
        this.adviceList = adviceList;
    }

    @Override
    public Object postProcessAfterInitialization(@NonNull Object bean,
                                                 @NonNull String beanName)
            throws BeansException {
        if (bean instanceof AbstractHttpRequestExecutingMessageHandler) {
            ((AbstractHttpRequestExecutingMessageHandler) bean).setAdviceChain(adviceList);
        }
        return bean;
    }
}
We need to set up this bean in our context. (I'm a die-hard declarative fan, hence the XML.)
<bean id="addHttpLoggingPostProcessor"
      class="com.my.package.AddHttpOutboundAdvicePostProcessor">
    <constructor-arg name="adviceList">
        <util:list>
            <ref bean="outboundLogger" />
        </util:list>
    </constructor-arg>
</bean>
Here, outboundLogger is a bean that manages the request-handler advice. In my implementation, I send a copy of the outbound message to a channel for logging beforehand, and a copy of the response message down another channel for logging the response. The XML declaration of the bean takes the two channel names as constructor arguments:
<bean id="outboundLogger" class="com.my.package.HttpRequestProcessorLogger">
    <constructor-arg name="requestLoggingChannelName" value="XXX" />
    <constructor-arg name="responseLoggingChannelName" value="YYY" />
</bean>
where XXX and YYY are the names of channels to the components that perform the logging. I've set these channels to be ExecutorChannels so that the logging is performed asynchronously.
The HttpRequestProcessorLogger bean manages the call to handleRequestMessage():
public class HttpRequestProcessorLogger extends AbstractRequestHandlerAdvice {

    private MessageChannel requestLoggingChannel;
    private MessageChannel responseLoggingChannel;
    private String requestLoggingChannelName;
    private String responseLoggingChannelName;
    private BeanFactory beanFactory;

    public HttpRequestProcessorLogger(String requestLoggingChannelName, String responseLoggingChannelName) {
        this.requestLoggingChannelName = requestLoggingChannelName;
        this.responseLoggingChannelName = responseLoggingChannelName;
    }

    @Override
    protected Object doInvoke(ExecutionCallback callback, Object target, Message<?> message) {
        getChannels();
        requestLoggingChannel.send(message);
        final Object result = callback.execute();
        final Message<?> outputMessage =
                (MessageBuilder.class.isInstance(result) ? ((MessageBuilder<?>) result).build()
                        : (Message<?>) result);
        responseLoggingChannel.send(outputMessage);
        return outputMessage;
    }

    private synchronized void getChannels() {
        if (requestLoggingChannelName != null) {
            final DestinationResolver<MessageChannel> channelResolver =
                    ChannelResolverUtils.getChannelResolver(this.beanFactory);
            requestLoggingChannel = channelResolver.resolveDestination(requestLoggingChannelName);
            responseLoggingChannel = channelResolver.resolveDestination(responseLoggingChannelName);
            requestLoggingChannelName = null;
            responseLoggingChannelName = null;
        }
    }

    @Override
    public void setBeanFactory(@NonNull BeanFactory beanFactory) throws BeansException {
        this.beanFactory = beanFactory;
    }
}

Spring Integration Flow with Jdbc Message source which has dynamic query

I am trying to do change data capture from an Oracle DB using Spring Cloud Data Flow with Kafka as the broker. I am using a polling mechanism for this: I poll the database with a basic select query at regular intervals to capture any updated data. For a more fail-proof system, I have persisted my last poll time in the Oracle DB and use it to get the data that was updated after the last poll.
public MessageSource<Object> jdbcMessageSource() {
    JdbcPollingChannelAdapter jdbcPollingChannelAdapter =
            new JdbcPollingChannelAdapter(this.dataSource, this.properties.getQuery());
    jdbcPollingChannelAdapter.setUpdateSql(this.properties.getUpdate());
    return jdbcPollingChannelAdapter;
}

@Bean
public IntegrationFlow pollingFlow() {
    IntegrationFlowBuilder flowBuilder = IntegrationFlows
            .from(jdbcMessageSource(), spec -> spec.poller(Pollers.fixedDelay(3000)));
    flowBuilder.channel(this.source.output());
    flowBuilder.transform(trans, "transform");
    return flowBuilder.get();
}
My queries in application properties are as below:
query: select * from kafka_test where LAST_UPDATE_TIME >(select LAST_POLL_TIME from poll_time)
update : UPDATE poll_time SET LAST_POLL_TIME = CURRENT_TIMESTAMP
This is working perfectly for me; I am able to get the CDC from the DB with this approach.
The problem I am looking at now is this:
Creating a table just to maintain the poll time is an overhead. I would like to maintain this last poll time in a Kafka topic and retrieve it from there when I make the next poll.
I have modified the jdbcMessageSource method as below to try that:
public MessageSource<Object> jdbcMessageSource() {
    String query = "select * from kafka_test where LAST_UPDATE_TIME > '" + <Last poll time value read from kafka comes here> + "'";
    JdbcPollingChannelAdapter jdbcPollingChannelAdapter =
            new JdbcPollingChannelAdapter(this.dataSource, query);
    return jdbcPollingChannelAdapter;
}
But Spring Data Flow instantiates the pollingFlow() bean (please see the code above) only once. Hence whatever query is built first will remain the same. I want to update the query with the new poll time for each poll.
Is there a way to write a custom IntegrationFlow that has this query updated every time I make a poll?
I have tried out IntegrationFlowContext for that but wasn't successful.
Thanks in advance!
With the help of both the answers above, I was able to figure out the approach:
write a JDBC template, wrap it as a bean, and use it in the integration flow.
@EnableBinding(Source.class)
@AllArgsConstructor
public class StockSource {

    private DataSource dataSource;

    @Autowired
    private JdbcTemplate jdbcTemplate;

    private MessageChannelFactory messageChannelFactory; // You can use the normal message channel available in Spring Cloud Data Flow as well.

    private List<String> findAll() {
        jdbcTemplate = new JdbcTemplate(dataSource);
        String time = "10/24/60"; // this means 10 seconds for Oracle DB
        String query = << your query here, e.g. select * from test where (last_updated_time > time) >>;
        return jdbcTemplate.query(query, new RowMapper<String>() {
            @Override
            public String mapRow(ResultSet rs, int rowNum) throws SQLException {
                // any row mapper operations that you want to do with your result after the poll
                ...
                // Change the time here for the next poll to the DB.
                return result;
            }
        });
    }

    @Bean
    public IntegrationFlow supplyPollingFlow() {
        IntegrationFlowBuilder flowBuilder = IntegrationFlows
                .from(this::findAll, spec -> spec.poller(Pollers.fixedDelay(5000)));
        flowBuilder.channel(<<Your message channel>>);
        return flowBuilder.get();
    }
}
In our use case, we were persisting the last poll time in a Kafka topic. This was to make the application stateless. Every new poll to the DB will now have a new time in the where condition.
P.S.: your messaging broker (Kafka/RabbitMQ) should be running locally, or connect to it if it is hosted on a different platform.
God speed!!!
See Artem's answer for the mechanism for a dynamic query in the standard adapter; an alternative, however, would be to simply wrap a JdbcTemplate in a bean and invoke it with
IntegrationFlows.from(myPojo(), "runQuery", e -> ...)
...
or even a simple lambda
.from(() -> jdbcTemplate...)
We have this test configuration (sorry, it is an XML):
<inbound-channel-adapter query="select * from item where status=:status" channel="target"
    data-source="dataSource" select-sql-parameter-source="parameterSource"
    update="delete from item"/>

<beans:bean id="parameterSource" factory-bean="parameterSourceFactory"
        factory-method="createParameterSourceNoCache">
    <beans:constructor-arg value=""/>
</beans:bean>

<beans:bean id="parameterSourceFactory"
        class="org.springframework.integration.jdbc.ExpressionEvaluatingSqlParameterSourceFactory">
    <beans:property name="parameterExpressions">
        <beans:map>
            <beans:entry key="status" value="@statusBean.which()"/>
        </beans:map>
    </beans:property>
    <beans:property name="sqlParameterTypes">
        <beans:map>
            <beans:entry key="status" value="#{ T(java.sql.Types).INTEGER}"/>
        </beans:map>
    </beans:property>
</beans:bean>

<beans:bean id="statusBean"
    class="org.springframework.integration.jdbc.config.JdbcPollingChannelAdapterParserTests$Status"/>
Pay attention to the ExpressionEvaluatingSqlParameterSourceFactory and its createParameterSourceNoCache() factory method. Its result can be used for the select-sql-parameter-source.
The JdbcPollingChannelAdapter has a setSelectSqlParameterSource on the matter.
So, you configure an ExpressionEvaluatingSqlParameterSourceFactory to be able to resolve some query parameter as an expression for some bean method invocation to get a desired value from Kafka. Then createParameterSourceNoCache() will help you obtain the expected SqlParameterSource.
There is some info in docs as well: https://docs.spring.io/spring-integration/docs/current/reference/html/#jdbc-inbound-channel-adapter
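Stripped of the framework, the common idea in both answers is to defer the timestamp lookup to poll time rather than bean-creation time. A minimal sketch of just that idea (class and supplier names are mine; the Kafka read is stubbed as a Supplier):

```java
import java.util.function.Supplier;

// Hypothetical sketch: the query string is rebuilt on every poll, so each poll
// sees the latest persisted time instead of the one captured at bean creation.
public class DynamicQueryBuilder {

    private final Supplier<String> lastPollTime; // e.g. read from a Kafka topic

    public DynamicQueryBuilder(Supplier<String> lastPollTime) {
        this.lastPollTime = lastPollTime;
    }

    // Called on every poll.
    public String nextQuery() {
        return "select * from kafka_test where LAST_UPDATE_TIME > '"
                + lastPollTime.get() + "'";
    }
}
```

Both the ExpressionEvaluatingSqlParameterSourceFactory approach and the JdbcTemplate-in-a-bean approach are, in effect, framework-level ways of installing such a deferred lookup.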

How to poll directory for a file?

I need to be able to poll a directory for a specific file using SCP, and once the file has been processed, it needs to keep polling.
Is this possible with Spring Batch?
The normal way to handle this is using Spring Integration. The way I'd address it is with a Spring Integration flow that uses an SFTP Inbound Channel Adapter to retrieve the files, then passes the transferred file's name to Spring Batch to launch the job. The flow would actually be similar to the sample in the SpringBatchIntegration module in my Spring Batch webinar here: https://github.com/mminella/SpringBatchWebinar
In that example, I use Twitter to launch the job. The only thing you'd need to change is the Twitter piece for the SFTP.
I had to solve the same question (but just accessing the local filesystem) and I did not find any solution in the framework, so I ended up creating my own class which polls for the file and creates a resource. I know this is just a workaround, but I haven't found a better way to do it so far.
I can't remember where (maybe in the "retry handling" part), but I read in the documentation something like "batch jobs should not try to solve issues like files not found, connections down and so on; these kinds of errors should make the job raise an error to be handled by operators", so I gave up...
On the other hand, Spring Retry was part of Spring Batch and is now a separate library; maybe you can just assume the file is there and, if the reader does not find it, let the step fail and establish a "retry policy" for that step, but for me that's overkill.
This is what I did:
<bean id="resourceFactory"
    class="com.mycompany.batch.zip.ResourceFactory">
    <property name="retryAttemps" value="${attemps}" />
    <property name="timeBetweenAttemps" value="${timeBetweenAttemps}"/>
</bean>

<bean id="myResource"
    factory-bean="resourceFactory" factory-method="create" scope="step">
    <constructor-arg value="${absolutepath}" type="java.lang.String" />
</bean>

<!-- step scope to avoid looking for the file at deployment time -->
<bean id="myReader"
    class="org.springframework.batch.item.xml.StaxEventItemReader" scope="step">
    <property name="fragmentRootElementName" value="retailer" />
    <property name="unmarshaller" ref="reportUnmarshaller" />
    <property name="resource" ref="myResource" />
</bean>
And this is my class:
public class ResourceFactory {

    public static final Logger LOG = LoggerFactory.getLogger(ResourceFactory.class);

    private int retryAttemps;
    private long timeBetweenAttemps;

    public Resource create(String resource) throws IOException, InterruptedException {
        Resource r;
        File f = new File(resource);
        int attemps = 1;
        while (!f.exists()) {
            if (attemps < this.retryAttemps) {
                attemps++;
                LOG.warn("File " + resource + " not found, waiting " + timeBetweenAttemps
                        + " before retrying. Attemp: " + attemps + " of " + this.retryAttemps);
                Thread.sleep(this.timeBetweenAttemps);
            } else {
                throw new FileNotFoundException(resource);
            }
        }
        // Open the resource only once the file exists; keeping this outside the
        // loop ensures r is always assigned before the return below.
        if (resource != null && resource.endsWith(".zip")) {
            ZipFile zipFile = new ZipFile(resource);
            ZipEntry entry = zipFile.entries().nextElement();
            if (entry == null) {
                throw new FileNotFoundException("The zip file has no entries inside");
            }
            // TODO Test if a buffered stream is faster than the raw InputStream
            InputStream is = new BufferedInputStream(zipFile.getInputStream(entry));
            r = new InputStreamResource(is);
            if (LOG.isInfoEnabled()) {
                int size = (int) entry.getSize();
                LOG.info("Opening a compressed file of " + size + " bytes");
            }
        } else {
            LOG.info("Opening a regular file");
            r = new FileSystemResource(f);
        }
        return r;
    }
}
If anyone knows a better way to do this, I'll gladly remove this answer (and implement the new solution).
PS: BTW, I've found some faults in my code while reviewing this post, so for me this is being helpful even with no other answers :)
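Stripped of the Spring wiring, the waiting part of the ResourceFactory above is a plain poll-and-sleep loop. Here is a self-contained sketch of just that part (class and method names are mine, not from the original answer):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class FilePoller {

    // Polls `dir` until `name` appears, checking up to maxAttempts times and
    // sleeping delayMillis between checks. Returns the path, or throws
    // NoSuchFileException when the file never shows up.
    static Path waitForFile(Path dir, String name, int maxAttempts, long delayMillis)
            throws IOException, InterruptedException {
        Path candidate = dir.resolve(name);
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (Files.exists(candidate)) {
                return candidate;
            }
            Thread.sleep(delayMillis);
        }
        throw new NoSuchFileException(candidate.toString());
    }
}
```

Throwing when the attempts run out matches the documentation's advice quoted above: let the step fail loudly so an operator (or a retry policy) can deal with the missing file.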

AS400ConnectionPool in Spring

Can I use AS400ConnectionPool in Spring? If so, please provide an example.
Yes, you can... I have never seen that class, but looking at the API this is pretty straightforward:
<bean id="as400ConnectionPool" class="com.ibm.as400.access.AS400ConnectionPool">
    <property name="maxConnections" value="128"/>
</bean>
Now you can simply inject the pool into your services:
@Autowired
private AS400ConnectionPool testPool;
// ...
AS400 newConn = testPool.getConnection("myAS400", "myUserID", "myPassword", AS400.COMMAND);
It is even easier with Java configuration:
@Configuration
public class As400Config {

    @Bean
    public AS400ConnectionPool testPool() throws ConnectionPoolException {
        // Create an AS400ConnectionPool.
        AS400ConnectionPool testPool = new AS400ConnectionPool();
        // Set a maximum of 128 connections to this pool.
        testPool.setMaxConnections(128);
        // Preconnect 5 connections to the AS400.COMMAND service.
        testPool.fill("myAS400", "myUserID", "myPassword", AS400.COMMAND, 5);
        return testPool;
    }
}
Note that you cannot easily call testPool.fill() in XML configuration.

How to disable freemarker caching in Spring MVC

I'm using Spring MVC v3 with FreeMarker views and cannot disable caching.
I tried setting cache to false on the viewResolver element (in spring-servlet.xml), but it didn't work.
Basically, I'd like to make some changes to the FreeMarker templates and see those changes in the browser with a refresh only (i.e. without restarting the application).
Any hints on how to do that?
In my XML the following was successful:
<bean id="freemarkerMailConfiguration" class="org.springframework.ui.freemarker.FreeMarkerConfigurationFactoryBean">
    <property name="templateLoaderPaths" value="classpath:emailtemplates/task,classpath:emailtemplates/user"/>
    <!-- Activate the following to disable template caching -->
    <property name="freemarkerSettings" value="cache_storage=freemarker.cache.NullCacheStorage" />
</bean>
This is my mail configuration, but the freemarkerConfig should be interesting for you, too.
I don't configure FreeMarker with XML but with @Configuration annotated classes, since I prefer the Spring Boot style. You can disable FreeMarker's cache like this:
@Bean
public FreeMarkerConfigurer freeMarkerConfigurer() throws IOException, TemplateException {
    FreeMarkerConfigurer configurer = new FreeMarkerConfigurer() {
        @Override
        protected void postProcessConfiguration(freemarker.template.Configuration config)
                throws IOException, TemplateException {
            ClassTemplateLoader classTplLoader = new ClassTemplateLoader(context.getClassLoader(), "/templates");
            ClassTemplateLoader baseMvcTplLoader = new ClassTemplateLoader(FreeMarkerConfigurer.class, ""); // TODO try to access spring.ftl directly
            MultiTemplateLoader mtl = new MultiTemplateLoader(new TemplateLoader[] {
                classTplLoader,
                baseMvcTplLoader
            });
            config.setTemplateLoader(mtl);
            config.setCacheStorage(new NullCacheStorage());
        }
    };
    configurer.setDefaultEncoding("UTF-8");
    configurer.setPreferFileSystemAccess(false);
    return configurer;
}
The key is in:
config.setCacheStorage(new NullCacheStorage());
But you can also use this instruction instead:
config.setTemplateUpdateDelayMilliseconds(0);
It should work for you.
In application.properties:
spring.freemarker.cache=false
As defined by the manual:

If you change the template file, then FreeMarker will re-load and re-parse the template automatically when you get the template next time. However, since checking if the file has been changed can be time consuming, there is a Configuration-level setting called "update delay". This is the time that must elapse since the last checking for a newer version of a certain template before FreeMarker will check that again. This is set to 5 seconds by default. If you want to see the changes of templates immediately, set it to 0.
After searching around, the configuration key was in the freemarker.template.Configuration javadocs, at the setSetting(key, value) method.
So, in short, just set the config template_update_delay to 0 for immediate change detection.
<bean id="freemarkerConfig" class="org.springframework.web.servlet.view.freemarker.FreeMarkerConfigurer">
    <property name="templateLoaderPath" value="/WEB-INF/ftl/"/>
    <property name="freemarkerSettings">
        <props>
            <prop key="template_update_delay">0</prop>
            <prop key="default_encoding">UTF-8</prop>
        </props>
    </property>
</bean>
Did you check the FreeMarker documentation? It contains some hints regarding how to influence template caching at the FreeMarker Configuration level. I'm not sure if you have access to the FreeMarker Configuration object from inside Spring MVC, but if you do, then the documentation page mentioned above could point you towards a possible solution.
I wasted the last two days (not entirely for this project) trying to disable the cache. It turned out I had the two options antiJARLocking and antiResourceLocking set in my context.xml; with those, the templates will ALWAYS be cached.
I had the same problem which I could solve only by implementing a custom template loader. Here is the working code:
protected void init() throws Exception {
    freemarkerConfig = new Configuration();
    freemarkerConfig.setObjectWrapper(ObjectWrapper.DEFAULT_WRAPPER);
    freemarkerConfig.setTemplateLoader(new CacheAgnosticTemplateLoader(new DefaultResourceLoader(), pdfTemplatePath));
}

protected static class CacheAgnosticTemplateLoader extends SpringTemplateLoader {

    public CacheAgnosticTemplateLoader(ResourceLoader resourceLoader, String templateLoaderPath) {
        super(resourceLoader, templateLoaderPath);
    }

    @Override
    public long getLastModified(Object templateSource) {
        // disabling template caching: pretend the template was just modified
        return new Date().getTime();
    }
}
It seems that in the recently released FreeMarker version 2.3.17, a legal and simpler way to do it has appeared: freemarker.cache.NullCacheStorage.