How can I poll multiple files from a particular directory using the Spring Integration API without XML configuration, preferably using the Java annotation-based approach?
I want to get the list of polled files, iterate through them, and process them further. That is the requirement. Is any sample
code available to meet it? Thanks in advance. Below is the code fragment I used.
@Bean
@InboundChannelAdapter(value = "fileInputChannel", poller = @Poller(fixedDelay = "60000", maxMessagesPerPoll = "5"))
public MessageSource<File> fileReadingMessageSource() {
    txtSource = new FileReadingMessageSource();
    txtSource.setDirectory(inputDir);
    txtSource.setFilter(new SimplePatternFileListFilter("*.txt"));
    txtSource.setScanEachPoll(true);
    return txtSource;
}
@Bean
@Transformer(inputChannel = "fileInputChannel", outputChannel = "processFileChannel")
public FileToStringTransformer fileToStringTransformer() {
    Message<File> message1 = txtSource.receive();
    File file1 = message1.getPayload();
    return new FileToStringTransformer();
}
But irrespective of the number of files in the input directory, the message source always fetches only one file. I am not sure how to make it fetch multiple files.
You can try adding a task executor to your poller configuration.
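For example (just a sketch, with an assumed pollerExecutor bean name and pool size, reusing the message source from the question), the poller could be wired like this:

@Bean
public TaskExecutor pollerExecutor() {
    // threads that the poller hands its work off to
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(5);
    return executor;
}

@Bean
@InboundChannelAdapter(value = "fileInputChannel",
        poller = @Poller(fixedDelay = "60000", maxMessagesPerPoll = "5", taskExecutor = "pollerExecutor"))
public MessageSource<File> fileReadingMessageSource() {
    FileReadingMessageSource source = new FileReadingMessageSource();
    source.setDirectory(inputDir);
    source.setFilter(new SimplePatternFileListFilter("*.txt"));
    source.setScanEachPoll(true);
    return source;
}

Note that with maxMessagesPerPoll = "5" each poll emits up to five messages, one File per message, so a downstream endpoint on fileInputChannel is invoked once per file rather than once per poll.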
Follow this post for polling multiple files concurrently, though it doesn't follow the annotation-based approach, and this post for sequential polling of files; that one does involve the use of annotations.
I am developing a Spring Batch application on Spring Boot. The batch process is fire-and-forget: the migration process is triggered by an HTTP GET request and doesn't wait for the result. There are six CSV files that contain the migration data, and this data must be processed and transferred to the DB. The source file path is provided via the application.yml file. The batch config (template) looks as follows:
@Configuration
public class BeginCureBatchConfig {

    @Value("${batch-files.begincure-source}")
    private String dataSourceFilePath;

    @Autowired
    ItemReader<BeginCure> itemReader;

    @Autowired
    ItemProcessor<BeginCure, BeginCure> itemProcessor;

    @Bean
    public Job beginCureJob(JobBuilderFactory jbFactory, StepBuilderFactory stp, ItemWriter<BeginCure> beginCure) {
        // code goes here
    }

    @Bean
    public FlatFileItemReader<BeginCure> beginCureItemReader() {
        FlatFileItemReader<BeginCure> ffReader = new FlatFileItemReader<>();
        ffReader.setResource(new FileSystemResource(dataSourceFilePath));
        ffReader.setLineMapper(lineMapper());
        return ffReader;
    }

    @Bean
    public LineMapper<BeginCure> lineMapper() {
        //..
    }
}
The problem is that when the project is deployed on OpenShift, the batch looks for the files even though the migration has not been triggered.
I think this part of the code is giving the error: ffReader.setResource(new FileSystemResource(dataSourceFilePath));
Is there any workaround or solution? Has anybody had this kind of problem?
Thank you in advance^^
My current project is based on Spring Integration. I am developing this project using Spring Boot.
My goal is to use Spring Integration to complete the task below.
1. I want to create a listener in Spring Integration, to know when a file has been uploaded to the SFTP server.
I also want to get clarity on why we use SftpInboundFileSynchronizer.
Logger logger = LoggerFactory.getLogger(SftpConfig.class);

@Bean
public SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    factory.setHost(sftpHost);
    factory.setPort(sftpPort);
    factory.setUser(sftpUser);
    if (sftpPrivateKey != null) {
        factory.setPrivateKey(sftpPrivateKey);
        factory.setPrivateKeyPassphrase(privateKeyPassPhrase);
    } else {
        factory.setPassword(sftpPassword);
    }
    factory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<ChannelSftp.LsEntry>(factory);
}

@Bean
public SftpInboundFileSynchronizer sftpInboundFileSynchronizer() {
    SftpInboundFileSynchronizer fileSynchronizer = new SftpInboundFileSynchronizer(sftpSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setRemoteDirectory(sftpRemoteDirectoryDownload);
    fileSynchronizer.setFilter(new SftpSimplePatternFileListFilter(sftpRemoteDirectoryDownloadFilter));
    return fileSynchronizer;
}
I have referred to some Stack Overflow posts and gained some knowledge of how to work with Spring Integration. As I am new to Spring Integration, is this the correct approach to creating a listener and reading files?
Please provide some sample code showing how to create a listener that will detect when a file has been uploaded to SFTP.
There are no such events to listen for from the remote SFTP server. What we suggest so far is a passive polling approach: the specific SourcePollingChannelAdapter endpoint asks the resource for data according to some pre-configured timing trigger. On the other hand, that endpoint is supplied with some MessageSource implementation, which in the case of SFTP is SftpInboundFileSynchronizingMessageSource if you are going to rely on synchronization with a local directory before processing files.
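As a minimal sketch (the channel name, poll interval, and local directory here are assumptions, not taken from your code), the synchronizer you already defined could be wired like this:

@Bean
@InboundChannelAdapter(channel = "sftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> sftpMessageSource() {
    SftpInboundFileSynchronizingMessageSource source =
            new SftpInboundFileSynchronizingMessageSource(sftpInboundFileSynchronizer());
    source.setLocalDirectory(new File("/tmp/sftp-inbound")); // placeholder local mirror directory
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new AcceptOnceFileListFilter<File>());
    return source;
}

@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler sftpFileHandler() {
    // called once per newly synchronized file; the payload is the local File
    return message -> System.out.println("New file downloaded: " + message.getPayload());
}

The AcceptOnceFileListFilter on the local side prevents the same already-downloaded file from being re-emitted on every poll.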
Please, consult more with docs for some clarifications and details: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-inbound
Here you can find some samples: https://github.com/spring-projects/spring-integration-samples
My current project is based on Spring Integration. I am developing this project using Spring Boot.
My goal is to use Spring Integration to complete the tasks below.
Connect to SFTP.
Check whether the directory has been created locally at a specific folder.
Check for the eligible file extensions (CSV and XLSX only).
Download all the content from the SFTP remote directory to the local directory, and track the file transfer start time and end time.
Read the file from the local directory line by line and extract certain column info.
Can you give me some suggestions?
And how can I get the transfer start time?
Note: I have to develop this requirement as a REST API. Please provide some guidance on how I can achieve this using Spring Integration.
Thanks. :)
public class SftpConfig {

    @Value("${sftp.host}")
    private String sftpHost;

    @Value("${sftp.port:22}")
    private int sftpPort;

    @Value("${sftp.user}")
    private String sftpUser;

    @Value("${sftp.password:#{null}}")
    private String sftpPassword;

    @Value("${sftp.remote.directory:/}")
    private String sftpRemoteDirectory;

    @Value("${sftp.privateKey:#{null}}")
    private Resource sftpPrivateKey;

    @Value("${sftp.privateKeyPassPhrase:}")
    private String privateKeyPassPhrase;

    @Value("${sftp.remote.directory.download.filter:*.*}")
    private String sftpRemoteDirectoryDownloadFilter;

    @Value("${sftp.remote.directory.download:/}")
    private String sftpRemoteDirectoryDownload;

    @Value("${sftp.local.directory.download:${java.io.tmpdir}/localDownload}")
    private String sftpLocalDirectoryDownload;

    /*
     * The SftpSessionFactory creates the SFTP sessions. This is where you define
     * the host, user and key information for your SFTP server.
     */
    // Creating a session for the remote destination SFTP server folder
    @Bean
    public SessionFactory<ChannelSftp.LsEntry> sftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost(sftpHost);
        factory.setPort(sftpPort);
        factory.setUser(sftpUser);
        if (sftpPrivateKey != null) {
            factory.setPrivateKey(sftpPrivateKey);
            factory.setPrivateKeyPassphrase(privateKeyPassPhrase);
        } else {
            factory.setPassword(sftpPassword);
        }
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<ChannelSftp.LsEntry>(factory);
    }

    /*
     * The SftpInboundFileSynchronizer uses the session factory defined above.
     * Here we set information about the remote directory to fetch files from.
     * We could also set filters here to control which files get downloaded.
     */
    @Bean
    public SftpInboundFileSynchronizer sftpInboundFileSynchronizer() {
        SftpInboundFileSynchronizer synchronizer = new SftpInboundFileSynchronizer(sftpSessionFactory());
        synchronizer.setDeleteRemoteFiles(false);
        synchronizer.setRemoteDirectory(sftpRemoteDirectoryDownload);
        synchronizer.setFilter(new SftpSimplePatternFileListFilter(sftpRemoteDirectoryDownloadFilter));
        return synchronizer;
    }
If you take a look at the SftpStreamingMessageSource instead: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-streaming, plus use a FileSplitter to read the file line by line (it supports "first line as header", too), you won't need to worry about transferring the file to a local directory. You will do everything with the remote content on demand.
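As a rough sketch of that streaming approach (the channel names, poll interval, *.csv pattern and the fileHeader header name are assumptions), reusing the sftpSessionFactory() bean from your config:

@Bean
public SftpRemoteFileTemplate sftpRemoteFileTemplate() {
    return new SftpRemoteFileTemplate(sftpSessionFactory());
}

@Bean
@InboundChannelAdapter(channel = "streamChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<InputStream> sftpStreamingMessageSource() {
    SftpStreamingMessageSource source = new SftpStreamingMessageSource(sftpRemoteFileTemplate());
    source.setRemoteDirectory(sftpRemoteDirectoryDownload);
    source.setFilter(new SftpSimplePatternFileListFilter("*.csv"));
    return source;
}

@Bean
@Splitter(inputChannel = "streamChannel")
public FileSplitter fileSplitter() {
    FileSplitter splitter = new FileSplitter();
    splitter.setFirstLineAsHeader("fileHeader"); // publish the first line as a header rather than as data
    splitter.setOutputChannelName("lineChannel");
    return splitter;
}

@Bean
@ServiceActivator(inputChannel = "lineChannel")
public MessageHandler lineHandler() {
    // each remaining line of the remote file arrives as a separate String payload
    return message -> System.out.println(message.getPayload());
}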
On the other hand, since you talk about a REST API, you are probably going to have some @RestController or a Spring Integration HTTP Inbound Gateway: https://docs.spring.io/spring-integration/docs/current/reference/html/http.html#http-inbound. Then you need to think about using an SftpOutboundGateway with an MGET command: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-outbound-gateway.
If you still need to track a download time for every single file, consider using that gateway twice: once with a LIST command and NAME_ONLY, and a second time with a GET command. At that point you can add a ChannelInterceptor to the input and output channels of the second gateway, so you will have the file name to correlate and can capture the start and stop time before and after that gateway.
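A very rough sketch of such an interceptor (the bookkeeping map and the way it distinguishes the two channels are my own assumptions, not a prescribed pattern):

public class TransferTimingInterceptor implements ChannelInterceptor {

    private final Map<String, Long> startTimes = new ConcurrentHashMap<>();

    @Override
    public Message<?> preSend(Message<?> message, MessageChannel channel) {
        Object payload = message.getPayload();
        if (payload instanceof String) {
            // request channel of the GET gateway: the payload is the remote file name from LIST ... NAME_ONLY
            startTimes.put((String) payload, System.currentTimeMillis());
        } else {
            // reply channel of the GET gateway: the downloaded content, with the remote name in a header
            String fileName = message.getHeaders().get(FileHeaders.REMOTE_FILE, String.class);
            Long start = fileName != null ? startTimes.remove(fileName) : null;
            if (start != null) {
                System.out.println(fileName + " transferred in " + (System.currentTimeMillis() - start) + " ms");
            }
        }
        return message;
    }
}

The same instance would be registered on both the request and the reply channel of the GET gateway, for example via channel.addInterceptor(...) or a @GlobalChannelInterceptor bean.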
I'm new to the Spring Framework and, indeed, I'm learning and using Spring Boot. Recently, in the app I'm developing, I got Quartz Scheduler working, and now I want to make Spring Integration work there: an FTP connection to a server to write and read files.
What I want is really simple (I've been able to do it in a previous Java application). I've got two Quartz jobs scheduled to fire at different times daily: one of them reads a file from an FTP server and the other one writes a file to an FTP server.
I'll detail what I've developed so far.
@SpringBootApplication
@ImportResource("classpath:ws-config.xml")
@EnableIntegration
@EnableScheduling
public class MyApp extends SpringBootServletInitializer {

    @Autowired
    private Configuration configuration;

    //...

    @Bean
    public DefaultFtpsSessionFactory myFtpsSessionFactory() {
        DefaultFtpsSessionFactory sess = new DefaultFtpsSessionFactory();
        Ftp ftp = configuration.getFtp();
        sess.setHost(ftp.getServer());
        sess.setPort(ftp.getPort());
        sess.setUsername(ftp.getUsername());
        sess.setPassword(ftp.getPassword());
        return sess;
    }
}
I've named the following class FtpGateway:
@Component
public class FtpGateway {

    @Autowired
    private DefaultFtpsSessionFactory sess;

    public void sendFile() {
        // todo
    }

    public void readFile() {
        // todo
    }
}
I'm reading this documentation to learn how to do so. Spring Integration's FTP support seems to be event driven, so I don't know how I can execute either sendFile() or readFile() from my jobs when the trigger is fired at an exact time.
The documentation tells me something about using an Inbound Channel Adapter (to read files from an FTP?), an Outbound Channel Adapter (to write files to an FTP?) and an Outbound Gateway (to do what?):
Spring Integration supports sending and receiving files over FTP/FTPS by providing three client side endpoints: Inbound Channel Adapter, Outbound Channel Adapter, and Outbound Gateway. It also provides convenient namespace-based configuration options for defining these client components.
So, it's not clear to me how to proceed.
Please, could anybody give me a hint?
Thank you!
EDIT:
Thank you @M. Deinum. First, I'll try a simple task: read a file from the FTP, with the poller running every 5 seconds. This is what I've added:
@Bean
public FtpInboundFileSynchronizer ftpInboundFileSynchronizer() {
    FtpInboundFileSynchronizer fileSynchronizer = new FtpInboundFileSynchronizer(myFtpsSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setPreserveTimestamp(true);
    fileSynchronizer.setRemoteDirectory("/Entrada");
    fileSynchronizer.setFilter(new FtpSimplePatternFileListFilter("*.csv"));
    return fileSynchronizer;
}

@Bean
@InboundChannelAdapter(channel = "ftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> ftpMessageSource() {
    FtpInboundFileSynchronizingMessageSource source = new FtpInboundFileSynchronizingMessageSource(inbound);
    source.setLocalDirectory(new File(configuracion.getDirFicherosDescargados()));
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new AcceptOnceFileListFilter<File>());
    return source;
}
@Bean
@ServiceActivator(inputChannel = "ftpChannel")
public MessageHandler handler() {
    return new MessageHandler() {
        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            Object payload = message.getPayload();
            if (payload instanceof File) {
                File f = (File) payload;
                System.out.println(f.getName());
            } else {
                System.out.println(message.getPayload());
            }
        }
    };
}
Then, when the app is running, I put a new CSV file into the "Entrada" remote folder, but the handler() method isn't run after 5 seconds... Am I doing something wrong?
Please add @Scheduled(fixedDelay = 5000) over your poller method.
You should use Spring Batch with a tasklet. It is far easier to configure the beans, cron timing, and input source with the existing interfaces provided by Spring.
https://www.baeldung.com/introduction-to-spring-batch
The example above is both annotation- and XML-based; you can use either.
Another benefit: you can take advantage of listeners and parallel steps. This framework can also be used in a Reader - Processor - Writer manner.
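For reference, a minimal tasklet sketch in Java config (the job, step, and bean names are made up for illustration):

@Bean
public Step ftpDownloadStep(StepBuilderFactory stepBuilderFactory) {
    return stepBuilderFactory.get("ftpDownloadStep")
            .tasklet((contribution, chunkContext) -> {
                // read or download the file here
                return RepeatStatus.FINISHED;
            })
            .build();
}

@Bean
public Job ftpDownloadJob(JobBuilderFactory jobBuilderFactory, Step ftpDownloadStep) {
    return jobBuilderFactory.get("ftpDownloadJob")
            .incrementer(new RunIdIncrementer())
            .start(ftpDownloadStep)
            .build();
}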
I have a Spring Batch service containing an ItemWriter to write data to a CSV file.
I used the example given by the Spring Batch guide: https://spring.io/guides/gs/batch-processing/
I tried to modify the ItemWriter to create the CSV again.
The problems I am facing are:
It does not create the CSV file if it is not present.
If I make the file available beforehand, it does not write data to it.
@Bean
public ItemWriter<Person> writer(DataSource dataSource) {
    FlatFileItemWriter<Person> csvWriter = new FlatFileItemWriter<Person>();
    csvWriter.setResource(new ClassPathResource("csv/new-data.csv"));
    csvWriter.setShouldDeleteIfExists(true);
    DelimitedLineAggregator<Person> lineAggregator = new DelimitedLineAggregator<Person>();
    lineAggregator.setDelimiter(",");
    BeanWrapperFieldExtractor<Person> fieldExtractor = new BeanWrapperFieldExtractor<Person>();
    String[] names = {"firstName", "lastName"};
    fieldExtractor.setNames(names);
    lineAggregator.setFieldExtractor(fieldExtractor);
    csvWriter.setLineAggregator(lineAggregator);
    return csvWriter;
}
I have gone through various links, but they show examples with XML-based configuration. How do I do it completely in Java?
You are using a ClassPathResource to write. I'm not sure, but I don't think you can write to a ClassPathResource. Try using a normal FileSystemResource and try again.
Moreover, how do you inject the writer? Are you sure that it really is instantiated as a Spring bean?
Why do you have DataSource as a parameter, since you don't need a data source to instantiate a FlatFileItemWriter?
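For illustration, here is a sketch of the same writer using a FileSystemResource (the output path is just a placeholder, and the DataSource parameter is dropped since it is not needed):

@Bean
public FlatFileItemWriter<Person> writer() {
    FlatFileItemWriter<Person> csvWriter = new FlatFileItemWriter<>();
    csvWriter.setName("personItemWriter"); // used for the writer's keys in the step ExecutionContext
    csvWriter.setResource(new FileSystemResource("output/new-data.csv")); // placeholder path
    csvWriter.setShouldDeleteIfExists(true);

    BeanWrapperFieldExtractor<Person> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[] {"firstName", "lastName"});

    DelimitedLineAggregator<Person> lineAggregator = new DelimitedLineAggregator<>();
    lineAggregator.setDelimiter(",");
    lineAggregator.setFieldExtractor(fieldExtractor);

    csvWriter.setLineAggregator(lineAggregator);
    return csvWriter;
}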