Spring JDBC inbound channel adapter

I'm trying to write a Spring program that polls a database and reads the selected records. I've seen examples for XML, but I'd like to know how to do it in Java config. Can someone show me an example?

You need a JdbcPollingChannelAdapter @Bean definition, marked with @InboundChannelAdapter:
@Bean
@InboundChannelAdapter(value = "fooChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<?> storedProc(DataSource dataSource) {
    return new JdbcPollingChannelAdapter(dataSource, "SELECT * FROM foo where status = 0");
}
http://docs.spring.io/spring-integration/docs/4.3.11.RELEASE/reference/html/overview.html#programming-tips
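For completeness, the records sent to fooChannel can be consumed with a @ServiceActivator. The sketch below is only an illustration (the FooConsumerConfig class name and the println handler are assumptions, not part of the original answer); by default JdbcPollingChannelAdapter emits a `List<Map<String, Object>>`, one map per selected row.

```java
import java.util.List;
import java.util.Map;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.MessageHandler;

@Configuration
public class FooConsumerConfig {

    // JdbcPollingChannelAdapter emits a List<Map<String, Object>> payload,
    // one map per selected row.
    @Bean
    @ServiceActivator(inputChannel = "fooChannel")
    public MessageHandler fooHandler() {
        return message -> {
            @SuppressWarnings("unchecked")
            List<Map<String, Object>> rows = (List<Map<String, Object>>) message.getPayload();
            rows.forEach(row -> System.out.println("polled row: " + row));
        };
    }
}
```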

Related

JdbcPollingChannelAdapter and IntegrationFlow: update not rolled back when an exception occurs in the integration flow

My use case is a Spring Boot application with a JdbcPollingChannelAdapter that fetches data from a PostgreSQL database, updates the fetched rows, and moves forward with the message flow (using IntegrationFlowBuilder) to apply some transforms to the ResultSet and publish the results to RabbitMQ.
The JdbcPollingChannelAdapter is configured to fetch data every 60 seconds with a select-for-update query, followed by an update query to flag the status from NEW to PUBLISH:
The SQL query: select * from table where status= 'NEW' order by tms_creation limit 100 for update;
The update query: update table set cod_etat = 'PUBLISH', tms_modification = now() where id in (:id)
Also, no max rows per poll is configured, which means the JDBC poller executes the SQL request as many times as there is data (with status NEW) present.
First issue: I stop my RabbitMQ and leave my microservice running. The JdbcPollingChannelAdapter fetches the first ResultSet, passes it through the message flow, and processes the update. The message flow processes the ResultSet and sends it through a channel to RabbitMQ (using Spring Cloud Stream). The send fails, but no rollback occurs, which means the ResultSet has been flagged as published.
I have been looking around in the documentation to figure out what I have missed, so any help would be appreciated.
Second issue: I run 3 instances of my application on PCF and need to handle concurrent access to the rows in the table. My transaction and the select-for-update query in the JdbcPollingChannelAdapter are supposed to acquire row-level locks for the current transaction, as per the SQL query (select for update). But what happens is that more than one instance can get the same rows, which the lock is supposed to prevent. This leads to multiple instances handling the same data and publishing it multiple times.
My code is as follows:
@EnableConfigurationProperties(ProprietesSourceJdbc.class)
@Component
public class KafkaGuy {

    private static final Logger LOG = LoggerFactory.getLogger(KafkaGuy.class);

    private ProprietesSourceJdbc proprietesSourceJdbc;
    private DataSource sourceDeDonnees;
    private DemandeSource demandeSource;
    private ObjectMapper objectMapper;
    private JdbcTemplate jdbcTemplate;

    public KafkaGuy(ProprietesSourceJdbc proprietesSourceJdbc, DemandeSource demandeSource, DataSource dataSource, JdbcTemplate jdbcTemplate, ObjectMapper objectMapper) {
        this.proprietesSourceJdbc = proprietesSourceJdbc;
        this.demandeSource = demandeSource;
        this.sourceDeDonnees = dataSource;
        this.objectMapper = objectMapper;
        this.jdbcTemplate = jdbcTemplate;
    }

    @Bean
    public MessageSource<Object> jdbcSourceMessage() {
        JdbcPollingChannelAdapter jdbcSource = new JdbcPollingChannelAdapter(this.sourceDeDonnees, this.proprietesSourceJdbc.getQuery());
        jdbcSource.setUpdateSql(this.proprietesSourceJdbc.getUpdate());
        return jdbcSource;
    }

    @Bean
    public IntegrationFlow fluxDeDonnees() {
        IntegrationFlowBuilder flowBuilder = IntegrationFlows.from(jdbcSourceMessage());
        flowBuilder
                .split()
                .log(LoggingHandler.Level.INFO, message ->
                        message.getHeaders().get("sequenceNumber")
                        + " événements publiés sur le bus de message sur "
                        + message.getHeaders().get("sequenceSize")
                        + " événements lus (lot)")
                .transform(Transformers.toJson())
                .enrichHeaders(h -> h.headerExpression("type", "payload.typ_evenement"))
                .publishSubscribeChannel(publishSubscribeSpec -> publishSubscribeSpec
                        .subscribe(flow -> flow
                                .transform(Transformers.toJson())
                                .transform(kafkaGuyTransformer())
                                .channel(this.demandeSource.demandePreinscriptionOuput()))
                );
        return flowBuilder.get();
    }

    @Bean
    public KafkaGuyTransformer kafkaGuyTransformer() {
        return new KafkaGuyTransformer();
    }

    @Bean(name = PollerMetadata.DEFAULT_POLLER)
    public PollerMetadata defaultPoller() {
        PollerMetadata pollerMetadata = new PollerMetadata();
        PeriodicTrigger trigger = new PeriodicTrigger(this.proprietesSourceJdbc.getTriggerDelay(), TimeUnit.SECONDS);
        pollerMetadata.setTrigger(trigger);
        pollerMetadata.setMaxMessagesPerPoll(proprietesSourceJdbc.getMaxRowsPerPoll());
        return pollerMetadata;
    }

    public class KafkaGuyTransformer implements GenericTransformer<Message, Message> {

        @Override
        public Message transform(Message message) {
            Message<String> msg = null;
            try {
                DemandeRecueDTO dto = objectMapper.readValue(message.getPayload().toString(), DemandeRecueDTO.class);
                msg = MessageBuilder.withPayload(dto.getTxtDonnee())
                        .copyHeaders(message.getHeaders())
                        .build();
            } catch (Exception e) {
                LOG.error(e.getMessage(), e);
            }
            return msg;
        }
    }
}
I am new In spring integration and sorry if is not well explained. Any help is appreciate.
Everything looks good and should work as you have described. The only problem I see is that no transaction is configured for the IntegrationFlows.from(jdbcSourceMessage()).
Consider using PollerMetadata.setAdviceChain() with a TransactionInterceptor.
Another way is to use a PollerSpec with its transactional() option.
This way you won't rely on local database transactions, which are committed as soon as the ResultSet processing returns. With a transaction at the application level, there is not going to be a commit until the polling thread exits.
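A minimal Java DSL sketch of that transactional() option, under assumptions: the queries are abbreviated from the question, the transactionManager parameter must be the manager governing the DataSource, and the terminal handle() is a placeholder for the real RabbitMQ publishing step. With this wiring, a downstream failure rolls back the update that flagged the rows.

```java
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.jdbc.JdbcPollingChannelAdapter;
import org.springframework.transaction.PlatformTransactionManager;

public class TransactionalPollConfig {

    @Bean
    public IntegrationFlow fluxDeDonnees(DataSource dataSource,
                                         PlatformTransactionManager transactionManager) {
        JdbcPollingChannelAdapter jdbcSource = new JdbcPollingChannelAdapter(
                dataSource, "select * from table where status = 'NEW' limit 100 for update");
        jdbcSource.setUpdateSql("update table set cod_etat = 'PUBLISH' where id in (:id)");

        return IntegrationFlows
                .from(jdbcSource,
                      // transactional() wraps the whole poll cycle (select, downstream
                      // processing, update) in one transaction, so a send failure
                      // rolls back the status update
                      e -> e.poller(p -> p.fixedDelay(60000).transactional(transactionManager)))
                .handle(m -> System.out.println(m.getPayload())) // placeholder for the real flow
                .get();
    }
}
```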

Spring Integration - Dynamic MailReceiver configuration

I'm pretty new to Spring Integration; anyway, I'm using it in order to receive mails and process them.
I used this Spring configuration class:
@Configuration
@EnableIntegration
@PropertySource(value = { "classpath:configuration.properties" }, encoding = "UTF-8", ignoreResourceNotFound = false)
public class MailReceiverConfiguration {

    private static final Log logger = LogFactory.getLog(MailReceiverConfiguration.class);

    @Autowired
    private EmailTransformerService emailTransformerService;

    // Configurazione AE
    @Bean
    public MessageChannel inboundChannelAE() {
        return new DirectChannel();
    }

    @Bean(name = {"aeProps"})
    public Properties aeProps() {
        Properties javaMailPropertiesAE = new Properties();
        javaMailPropertiesAE.put("mail.store.protocol", "imap");
        javaMailPropertiesAE.put("mail.debug", Boolean.TRUE);
        javaMailPropertiesAE.put("mail.auth.debug", Boolean.TRUE);
        javaMailPropertiesAE.put("mail.smtp.socketFactory.fallback", "false");
        javaMailPropertiesAE.put("mail.imap.socketFactory.class", "javax.net.ssl.SSLSocketFactory");
        return javaMailPropertiesAE;
    }

    @Bean(name = "mailReceiverAE")
    public MailReceiver mailReceiverAE(@Autowired MailConfigurationBean mcb, @Autowired @Qualifier("aeProps") Properties javaMailPropertiesAE) throws Exception {
        return ConfigurationUtil.getMailReceiver("imap://USERNAME:PASSWORD@MAILSERVER:PORT/INBOX", new BigDecimal(2), javaMailPropertiesAE);
    }

    @Bean
    @InboundChannelAdapter(autoStartup = "true",
            channel = "inboundChannelAE",
            poller = {@Poller(fixedRate = "${fixed.rate.ae}",
                    maxMessagesPerPoll = "${max.messages.per.poll.ae}") })
    public MailReceivingMessageSource pollForEmailAE(@Autowired MailReceiver mailReceiverAE) {
        MailReceivingMessageSource mrms = new MailReceivingMessageSource(mailReceiverAE);
        return mrms;
    }

    @Transformer(inputChannel = "inboundChannelAE", outputChannel = "transformerChannelAE")
    public MessageBean transformitAE(MimeMessage mailMessage) throws Exception {
        // amministratore email inbox
        MessageBean messageBean = emailTransformerService.transformit(mailMessage);
        return messageBean;
    }

    @Splitter(inputChannel = "transformerChannelAE", outputChannel = "nullChannel")
    public List<Message<?>> splitIntoMessagesAE(final MessageBean mb) {
        final List<Message<?>> messages = new ArrayList<Message<?>>();
        for (EmailFragment emailFragment : mb.getEmailFragments()) {
            Message<?> message = MessageBuilder.withPayload(emailFragment.getData())
                    .setHeader(FileHeaders.FILENAME, emailFragment.getFilename())
                    .setHeader("directory", emailFragment.getDirectory()).build();
            messages.add(message);
        }
        return messages;
    }
}
So far, so good: I start my microservice, this component listens on the specified mail server, and mails are downloaded.
Now I have this requirement: the mail server configuration (I mean the string "imap://USERNAME:PASSWORD@MAILSERVER:PORT/INBOX") must be taken from a database, and it must be configurable. At any time a system administrator can change it, and the mail receiver must use the new configuration.
As far as I understand, I should create a new instance of MailReceiver when a new configuration is present and use it in the InboundChannelAdapter.
Is there any best practice for doing this? I found this solution: ImapMailReceiver NO STORE attempt on READ-ONLY folder (Failure) [THROTTLED];
In that solution I can inject the ThreadPoolTaskScheduler if I define it in my configuration class; I can also inject the DirectChannel, but every time I would have to create a new MailReceiver and a new ImapIdleChannelAdapter, not to mention this WARN message I get when the ImapIdleChannelAdapter starts:
java.lang.RuntimeException: No beanfactory
    at org.springframework.integration.expression.ExpressionUtils.createStandardEvaluationContext(ExpressionUtils.java:79)
    at org.springframework.integration.mail.AbstractMailReceiver.onInit(AbstractMailReceiver.java:403)
Is there a better way to satisfy my scenario?
Thank you
Angelo
The best way to do this is to use the Java DSL and dynamic flow registration.
Documentation here.
That way, you can unregister the old flow and register a new one, each time the configuration changes.
It will automatically handle injecting dependencies such as the bean factory.
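A hedged sketch of that dynamic registration via IntegrationFlowContext. The service name, the reconfigure() entry point, and the 5-second poll are illustrative assumptions; the idea is simply: remove the old flow's registration, build a new flow against the new store URI, and register it (registration wires in the bean factory, which avoids the "No beanfactory" error from manually constructed adapters).

```java
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.context.IntegrationFlowContext;
import org.springframework.integration.mail.ImapMailReceiver;
import org.springframework.integration.mail.dsl.Mail;
import org.springframework.stereotype.Service;

@Service
public class DynamicMailFlowService {

    private final IntegrationFlowContext flowContext;
    private IntegrationFlowContext.IntegrationFlowRegistration registration;

    public DynamicMailFlowService(IntegrationFlowContext flowContext) {
        this.flowContext = flowContext;
    }

    // Hypothetical entry point, called whenever the admin saves a new mail URL.
    public synchronized void reconfigure(String storeUri) {
        if (this.registration != null) {
            // Stop and remove the flow built for the previous configuration.
            this.flowContext.remove(this.registration.getId());
        }
        IntegrationFlow flow = IntegrationFlows
                .from(Mail.imapInboundAdapter(new ImapMailReceiver(storeUri)),
                      e -> e.poller(p -> p.fixedDelay(5000)))
                .channel("inboundChannelAE") // hand off to the existing transformer chain
                .get();
        this.registration = this.flowContext.registration(flow).register();
    }
}
```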

Spring Integration : get poll expression from database

I have an FTP message source and I want to enable the user to configure the poll frequency through an application.
This is the current configuration of the Inbound channel adapter
@Bean
@InboundChannelAdapter(channel = "fromSmartpath", poller = @Poller(cron = "0 15 8 ? * MON,TUE,WED,THU,FRI,SAT"))
public MessageSource<File> sftpMessageSource() throws SftpException {
    SftpInboundFileSynchronizingMessageSource source = new SftpInboundFileSynchronizingMessageSource(
            sftpInboundFileSynchronizer());
    source.setLocalDirectory(new File(Constants.LOCAL_REPOSITORY_PATH));
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new FileSystemPersistentAcceptOnceFileListFilter(metaDataStore(), "metastore"));
    return source;
}
My goal is to retrieve the cron expression from the database. Is There a way to achieve this?
Thank you
The cron expression ends up in a CronTrigger. You can develop a service which SELECTs the expression from the DB in its afterPropertiesSet() and returns it via a getter.
Then you declare a @Bean for the CronTrigger and call that getter from the service during its definition.
The @Poller on the @InboundChannelAdapter has a trigger option to refer to the existing bean.
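The steps above can be sketched as follows. The table and column names in the SELECT are illustrative assumptions, as are the bean names; the adapter would then reference the trigger bean with @Poller(trigger = "sftpTrigger") instead of the cron attribute.

```java
import org.springframework.beans.factory.InitializingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.scheduling.support.CronTrigger;

@Configuration
public class PollCronConfig {

    // Hypothetical service: loads the cron expression once at startup.
    public static class CronExpressionService implements InitializingBean {

        private final JdbcTemplate jdbcTemplate;
        private String cronExpression;

        public CronExpressionService(JdbcTemplate jdbcTemplate) {
            this.jdbcTemplate = jdbcTemplate;
        }

        @Override
        public void afterPropertiesSet() {
            // Table/column names are assumptions for illustration.
            this.cronExpression = this.jdbcTemplate.queryForObject(
                    "SELECT cron_expression FROM poller_config WHERE name = 'sftp'", String.class);
        }

        public String getCronExpression() {
            return this.cronExpression;
        }
    }

    @Bean
    public CronExpressionService cronExpressionService(JdbcTemplate jdbcTemplate) {
        return new CronExpressionService(jdbcTemplate);
    }

    // Referenced from the adapter as @Poller(trigger = "sftpTrigger").
    @Bean
    public CronTrigger sftpTrigger(CronExpressionService service) {
        return new CronTrigger(service.getCronExpression());
    }
}
```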

Send and receive files from FTP in Spring Boot

I'm new to the Spring Framework and, indeed, I'm learning and using Spring Boot. Recently, in the app I'm developing, I got Quartz Scheduler working, and now I want to get Spring Integration working there: an FTP connection to a server to write and read files from.
What I want is really simple (I was able to do it in a previous Java application). I've got two Quartz jobs scheduled to fire at different times daily: one of them reads a file from an FTP server and the other writes a file to an FTP server.
I'll detail what I've developed so far.
@SpringBootApplication
@ImportResource("classpath:ws-config.xml")
@EnableIntegration
@EnableScheduling
public class MyApp extends SpringBootServletInitializer {

    @Autowired
    private Configuration configuration;

    //...

    @Bean
    public DefaultFtpsSessionFactory myFtpsSessionFactory() {
        DefaultFtpsSessionFactory sess = new DefaultFtpsSessionFactory();
        Ftp ftp = configuration.getFtp();
        sess.setHost(ftp.getServer());
        sess.setPort(ftp.getPort());
        sess.setUsername(ftp.getUsername());
        sess.setPassword(ftp.getPassword());
        return sess;
    }
}
I've named the following class FtpGateway:
@Component
public class FtpGateway {

    @Autowired
    private DefaultFtpsSessionFactory sess;

    public void sendFile() {
        // todo
    }

    public void readFile() {
        // todo
    }
}
I'm reading this documentation to learn how to do so. Spring Integration's FTP support seems to be event driven, so I don't know how I can execute either sendFile() or readFile() from my jobs when the trigger fires at an exact time.
The documentation tells me something about using an Inbound Channel Adapter (to read files from an FTP server?), an Outbound Channel Adapter (to write files to an FTP server?) and an Outbound Gateway (to do what?):
Spring Integration supports sending and receiving files over FTP/FTPS by providing three client side endpoints: Inbound Channel Adapter, Outbound Channel Adapter, and Outbound Gateway. It also provides convenient namespace-based configuration options for defining these client components.
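For the "write a file from a Quartz job" half, an outbound channel adapter driven through a messaging gateway is one common shape. This is a hedged sketch, not the asker's code: the channel name, remote directory, and gateway interface are assumptions, and the SessionFactory would be the myFtpsSessionFactory bean defined above (note that bean is FTPS; a plain FTP DefaultFtpSessionFactory works the same way here).

```java
import java.io.File;

import org.apache.commons.net.ftp.FTPFile;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.file.remote.session.SessionFactory;
import org.springframework.integration.ftp.outbound.FtpMessageHandler;
import org.springframework.messaging.MessageHandler;

@Configuration
public class FtpOutboundSketch {

    // Outbound channel adapter: writes each message payload (a File)
    // to the remote directory.
    @Bean
    @ServiceActivator(inputChannel = "toFtpChannel")
    public MessageHandler ftpOutboundAdapter(SessionFactory<FTPFile> sessionFactory) {
        FtpMessageHandler handler = new FtpMessageHandler(sessionFactory);
        handler.setRemoteDirectoryExpression(new LiteralExpression("/Salida")); // assumed directory
        return handler;
    }

    // A Quartz job can inject this gateway and simply call sendFile(file)
    // at whatever time the trigger fires.
    @MessagingGateway(defaultRequestChannel = "toFtpChannel")
    public interface SimpleFtpGateway {
        void sendFile(File file);
    }
}
```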
So, I haven't got it clear as how to follow.
Please, could anybody give me a hint?
Thank you!
EDIT:
Thank you @M. Deinum. First, I'll try a simple task: read a file from the FTP server, with the poller running every 5 seconds. This is what I've added:
@Bean
public FtpInboundFileSynchronizer ftpInboundFileSynchronizer() {
    FtpInboundFileSynchronizer fileSynchronizer = new FtpInboundFileSynchronizer(myFtpsSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setPreserveTimestamp(true);
    fileSynchronizer.setRemoteDirectory("/Entrada");
    fileSynchronizer.setFilter(new FtpSimplePatternFileListFilter("*.csv"));
    return fileSynchronizer;
}

@Bean
@InboundChannelAdapter(channel = "ftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> ftpMessageSource() {
    FtpInboundFileSynchronizingMessageSource source =
            new FtpInboundFileSynchronizingMessageSource(ftpInboundFileSynchronizer());
    source.setLocalDirectory(new File(configuracion.getDirFicherosDescargados()));
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new AcceptOnceFileListFilter<File>());
    return source;
}

@Bean
@ServiceActivator(inputChannel = "ftpChannel")
public MessageHandler handler() {
    return new MessageHandler() {
        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            Object payload = message.getPayload();
            if (payload instanceof File) {
                File f = (File) payload;
                System.out.println(f.getName());
            } else {
                System.out.println(message.getPayload());
            }
        }
    };
}
Then, when the app is running, I put a new CSV file into the "Entrada" remote folder, but the handler() method isn't run after 5 seconds... Am I doing something wrong?
Please add @Scheduled(fixedDelay = 5000) over your poller method.
You should use Spring Batch with a tasklet. It is far easier to configure the bean, cron time, and input source with the existing interfaces provided by Spring.
https://www.baeldung.com/introduction-to-spring-batch
The example above covers both annotation-based and XML-based configuration; you can use either.
Another benefit: you can make use of listeners and parallel steps. This framework can also be used in a Reader - Processor - Writer manner.

Spring Boot Actuator Health Indicator

We have been using Spring Boot for several projects now; we are on the latest version, 1.2.3. We are incorporating Actuator. So far things are working well, except we are finding that the (default) /health indicator shows that the service is down. This is not true. These are services that are implemented with datasources; they may also call other SOAP or REST services. What does the health service look at to decide whether a service is down?
As @derFuerst said, the DataSourceHealthIndicator has a default query to check whether the DB is up or not.
If you want to use the proper vendor-specific query, you should write your own health indicator in your configuration class, like this in the case of an Oracle data source:
@Autowired(required = false)
private DataSource dataSource;

@Bean
@Primary
public DataSourceHealthIndicator dataSourceHealthIndicator() {
    return new DataSourceHealthIndicator(dataSource, "SELECT 1 FROM DUAL");
}
The DataSourceHealthIndicator is used to check availability. The default query is SELECT 1, but there are some product-specific queries, too.
You can write your own HealthIndicator: either implement the interface or extend AbstractHealthIndicator.
To disable the default DB health check, put this line into your application properties: management.health.db.enabled=false.
Hope that helps
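A minimal sketch of the AbstractHealthIndicator route mentioned above. The class name, the downstream-service detail, and the pingDownstream() placeholder are assumptions for illustration; the only real contract is overriding doHealthCheck(Health.Builder) and calling up() or down() on the builder.

```java
import org.springframework.boot.actuate.health.AbstractHealthIndicator;
import org.springframework.boot.actuate.health.Health;
import org.springframework.stereotype.Component;

// Hypothetical indicator: reports UP as long as a downstream SOAP/REST
// dependency answers. Registered automatically because it is a @Component
// implementing HealthIndicator.
@Component
public class DownstreamServiceHealthIndicator extends AbstractHealthIndicator {

    @Override
    protected void doHealthCheck(Health.Builder builder) throws Exception {
        if (pingDownstream()) {
            builder.up().withDetail("downstream", "reachable");
        } else {
            builder.down().withDetail("downstream", "unreachable");
        }
    }

    private boolean pingDownstream() {
        // Placeholder: e.g. open a connection to the dependency with a short timeout.
        return true;
    }
}
```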
The above answer helped me start my research, but for me it was not enough:
@Bean
@Primary
public DataSourceHealthIndicator dataSourceHealthIndicator() {
    return new DataSourceHealthIndicator(dataSource, "SELECT 1 FROM DUAL");
}
Here is the configuration that made it work for me: define a HealthIndicator @Bean like the following and provide the required query:
@Bean
@Primary
public HealthIndicator dbHealthIndicator() {
    return new DataSourceHealthIndicator(dataSource, "SELECT 1 FROM DUMMY");
}
If no query is provided, SELECT 1 will be used, as @derFuerst said. Here is the default implementation of DataSourceHealthIndicator:
public DataSourceHealthIndicator(DataSource dataSource, String query) {
    super("DataSource health check failed");
    this.dataSource = dataSource;
    this.query = query;
    this.jdbcTemplate = dataSource != null ? new JdbcTemplate(dataSource) : null;
}
...
protected String getValidationQuery(String product) {
    String query = this.query;
    if (!StringUtils.hasText(query)) {
        DatabaseDriver specific = DatabaseDriver.fromProductName(product);
        query = specific.getValidationQuery();
    }
    if (!StringUtils.hasText(query)) {
        query = "SELECT 1";
    }
    return query;
}
Hi everyone, I'm a beginner at health checks with Actuator. I used the solution below and it worked:
@Autowired(required = false)
private DataSource dataSource;

@Bean
@Primary
public DataSourceHealthIndicator dataSourceHealthIndicator() {
    return new DataSourceHealthIndicator(dataSource, "SELECT 1 FROM DUAL");
}
But can anyone please let me know how the validation query works even though there isn't a table named DUAL? Also, as per my understanding, when we request /actuator/health, all implementations of HealthIndicator are called automatically and their health check methods are executed. Kindly correct me if I'm wrong.