I am new to Spring Integration.
I am trying to poll multiple file locations asynchronously with the Spring Integration Java DSL. I need to get the file name, perform some operations with it, and finally push the file to S3. My question is: can these file operations be performed in the task executor, or should they go in the service activator handler? I am not sure which is the right place.
@Autowired
private AWSFileManager awsFileManager;

@Bean
public IntegrationFlow inboundChannelFlow(@Value("${file.poller.delay}") long delay,
        @Value("${file.poller.messages}") int maxMsgsPerPoll,
        TaskExecutor taskExecutor, MessageSource<File> fileSource) {
    return IntegrationFlows.from(fileSource,
            c -> c.poller(Pollers.fixedDelay(delay)
                    .taskExecutor(taskExecutor)
                    .maxMessagesPerPoll(maxMsgsPerPoll)))
            .handle("AWSFileManager", "fileUpload")
            .channel(ApplicationConfiguration.inboundChannel)
            .get();
}

@Bean
TaskExecutor taskExecutor(@Value("${file.poller.thread.pool.size}") int poolSize) {
    ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
    //Runnable task1 = () -> {this.methodsamp();};
    taskExecutor.setCorePoolSize(poolSize);
    //taskExecutor.execute(task1);
    return taskExecutor;
}
@Async
public void methodsamp() {
    try {
        awsFileManager.fileUpload();
        System.out.println("test");
    }
    catch (Exception ex) {
        // exception is swallowed here; it should at least be logged
    }
}
I have attached the sample code above.
Also, is there a way to retrieve the file names of the files in the channel? I need to pass the name as a parameter to the fileUpload method.
Please advise.
Your question isn't clear. The TaskExecutor provides the thread context for the flow. The Service Activator (.handle()) is exactly the place for your business logic method, and that method can be performed on a thread from the executor. You are already using both of them correctly in your IntegrationFlow.
The FileReadingMessageSource produces a message with the java.io.File as its payload. So, the way to get the file name is simply File.getName()!
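For example, here is a minimal sketch of that idea with the name handling inside the service activator; the fileUpload(File, String) signature and the S3UploadFlowConfig class name are assumptions for illustration, not code from the original post:

import java.io.File;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskExecutor;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;

@Configuration
public class S3UploadFlowConfig {

    @Autowired
    private AWSFileManager awsFileManager;

    @Bean
    public IntegrationFlow inboundChannelFlow(MessageSource<File> fileSource, TaskExecutor taskExecutor) {
        return IntegrationFlows.from(fileSource,
                c -> c.poller(Pollers.fixedDelay(5000)
                        .taskExecutor(taskExecutor)
                        .maxMessagesPerPoll(10)))
                // the payload here is the java.io.File produced by the poller
                .handle(File.class, (file, headers) -> {
                    String fileName = file.getName();          // the file name you asked about
                    awsFileManager.fileUpload(file, fileName); // assumed signature of your upload method
                    return null;                               // no reply; the flow ends here
                })
                .get();
    }
}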
I have a Spring SFTP output adapter that I start via "adapter.start()" in my main program. Once started, the adapter transfers and uploads all the files in the specified directory as expected. But I want to stop the adapter after all the files have been transferred. How do I detect if all the files have been transferred so I can issue an adapter.stop()?
@Bean
public IntegrationFlow sftpOutboundFlow() {
    return IntegrationFlows.from(Files.inboundAdapter(new File(sftpOutboundDirectory))
                    .filterExpression("name.endsWith('.pdf') OR name.endsWith('.PDF')")
                    .preventDuplicates(true),
            e -> e.id("sftpOutboundAdapter")
                    .autoStartup(false)
                    .poller(Pollers.trigger(new FireOnceTrigger())
                            .maxMessagesPerPoll(-1)))
            .log(LoggingHandler.Level.INFO, "sftp.outbound", m -> m.getPayload())
            .log(LoggingHandler.Level.INFO, "sftp.outbound", m -> m.getHeaders())
            .handle(Sftp.outboundAdapter(outboundSftpSessionFactory())
                    .useTemporaryFileName(false)
                    .remoteDirectory(sftpRemoteDirectory))
            .get();
}
@Artem Bilan has already given the answer, but here is a concrete implementation of what he said, for those who are Spring Integration noobs like me.
Define a service to get the PDF files on demand:
@Service
public class MyFileService {

    public List<File> getPdfFiles(final String srcDir) {
        File[] files = new File(srcDir).listFiles((dir, name) -> name.toLowerCase().endsWith(".pdf"));
        return Arrays.asList(files == null ? new File[]{} : files);
    }
}
Define a Gateway to start the SFTP upload flow on demand:
@MessagingGateway
public interface SFtpOutboundGateway {

    @Gateway(requestChannel = "sftpOutboundFlow.input")
    void uploadFiles(List<File> files);
}
Define the Integration Flow to upload the files to the SFTP server via Sftp.outboundGateway:
@Configuration
@EnableIntegration
public class FtpFlowIntegrationConfig {

    // could also be bound via @Value
    private String sftpRemoteDirectory = "/path/to/remote/dir";

    @Bean
    public SessionFactory<ChannelSftp.LsEntry> outboundSftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost("localhost");
        factory.setPort(22222);
        factory.setUser("client1");
        factory.setPassword("password123");
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(factory);
    }

    @Bean
    public IntegrationFlow sftpOutboundFlow(RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate) {
        return e -> e
                .log(LoggingHandler.Level.INFO, "sftp.outbound", Message::getPayload)
                .log(LoggingHandler.Level.INFO, "sftp.outbound", Message::getHeaders)
                .handle(
                        Sftp.outboundGateway(remoteFileTemplate, AbstractRemoteFileOutboundGateway.Command.MPUT, "payload")
                );
    }

    @Bean
    public RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate(SessionFactory<ChannelSftp.LsEntry> outboundSftpSessionFactory) {
        RemoteFileTemplate<ChannelSftp.LsEntry> template = new SftpRemoteFileTemplate(outboundSftpSessionFactory);
        template.setRemoteDirectoryExpression(new LiteralExpression(sftpRemoteDirectory));
        template.setAutoCreateDirectory(true);
        template.setUseTemporaryFileName(false);
        template.afterPropertiesSet();
        return template;
    }
}
Wiring up:
public class SpringApp {

    public static void main(String[] args) {
        // bootstrap the context however your application does it; with Spring Boot, for example:
        ConfigurableApplicationContext ctx = SpringApplication.run(SpringApp.class, args);

        final MyFileService fileService = ctx.getBean(MyFileService.class);
        final SFtpOutboundGateway sFtpOutboundGateway = ctx.getBean(SFtpOutboundGateway.class);

        // trigger the sftp upload flow manually - only once
        sFtpOutboundGateway.uploadFiles(fileService.getPdfFiles("/path/to/local/dir")); // local dir with the PDFs
    }
}
Important notes:
1.
@Gateway(requestChannel = "sftpOutboundFlow.input")
void uploadFiles(List<File> files);

Here the DirectChannel named sftpOutboundFlow.input is used to pass the message with the payload (the List<File> files) to the receiver. If this channel does not exist yet, the Gateway creates it implicitly.
2.
@Bean
public IntegrationFlow sftpOutboundFlow(RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate) { ... }
Since IntegrationFlow is a functional interface, we can simplify the flow a little by writing it as a lambda over the IntegrationFlowDefinition. During the bean registration phase, the IntegrationFlowBeanPostProcessor converts this inline (lambda) IntegrationFlow into a StandardIntegrationFlow and processes its components. An IntegrationFlow defined with a lambda gets a DirectChannel as its inputChannel, registered in the application context as a bean named sftpOutboundFlow.input in the sample above (flow bean name + ".input"). That's why we use that name in the SFtpOutboundGateway gateway.
Ref: https://spring.io/blog/2014/11/25/spring-integration-java-dsl-line-by-line-tutorial
3.
@Bean
public RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate(SessionFactory<ChannelSftp.LsEntry> outboundSftpSessionFactory) { ... }
see: Remote directory for sftp outbound gateway with DSL
Flowchart:
But I want to stop the adapter after all the files have been transferred.
Logically, this is not what this kind of component has been designed for. Since you do not have a constantly changing local directory, it is probably better to think about an event-driven solution that lists the files in the directory via some explicit action. Yes, that can be a call from main, but only once for the whole content of the directory, and that's all.
And exactly for this reason the Sftp.outboundGateway() with Command.MPUT is there for you:
https://docs.spring.io/spring-integration/reference/html/sftp.html#using-the-mput-command.
You can still trigger an IntegrationFlow, but it could start from a @MessagingGateway interface called from main with the local directory whose files should be listed and uploaded:
https://docs.spring.io/spring-integration/reference/html/dsl.html#java-dsl-gateway
I have written REST services in Spring that are running perfectly fine.
Now I need to perform some DB transactions before returning the response to the user.
This DB transaction is independent of the response being returned.
For example:
#PostMapping("login")
public TransactionResponse loginAuthentication(#Valid #RequestBody LoginRequestBody loginRequest) {
TransactionResponse transactionResponse = new TransactionResponse();
try {
transactionResponse = loginService.validateUser(loginRequest);
//independent transaction needs to be executed in a separate thread
loginSerice.addLoginLog(transactionResponse);
//return below response without waiting to compelete above log transaction
return transactionResponse;
}
catch (Exception e) {
return CommonUtils.setErrorResponse(transactionResponse, e);
}
}
I have read up on async controllers in Spring MVC (link). Although the controller executes its functionality in a separate thread, I don't want to wait for the DB transaction to complete: after the response is retrieved from the service layer, it should be forwarded to the user without any delay.
Any suggestions?
The Spring version is 4.3.
I am posting this answer to help fellow developers with the same kind of requirement (executing a void method in a separate thread).
Since I am not experienced with multithreaded/asynchronous environments, I wanted to keep it simple by using Spring's asynchronous methods.
So, first I created the thread pool:
@Configuration
@EnableAsync
public class ThreadConfig {

    @Bean
    public TaskExecutor threadPoolTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);
        executor.setMaxPoolSize(4);
        executor.setThreadNamePrefix("WEBAPP");
        executor.initialize();
        return executor;
    }
}
Then I created a service that will execute my code in a separate thread.
@Async
@Service
@Transactional
public class LoggingService {

    public void logintransaction() throws Exception {
        System.out.println("start login logging");
        Thread.sleep(5000);
        System.out.println("exit");
    }
}
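A side note, not part of the original answer: if the application context ever ends up with more than one TaskExecutor bean, @Async can be bound to a specific one by bean name. A minimal variation of the method above, assuming the threadPoolTaskExecutor bean defined earlier:

// Optional variation: pin @Async explicitly to the "threadPoolTaskExecutor" bean.
@Async("threadPoolTaskExecutor")
public void logintransaction() throws Exception {
    System.out.println("start login logging");
    Thread.sleep(5000);
    System.out.println("exit");
}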
Lastly, I called the above service from my controller. As I can see, "Total Time Taken" is printed first and "start login logging" is printed afterwards, which means my new method is executed in a new thread.
@Autowired
private LoggingService loggingService;

@PostMapping("login")
public TransactionResponse loginAuthentication(@Valid @RequestBody LoginRequestBody loginRequest) {
    long startTime = System.currentTimeMillis();
    TransactionResponse transactionResponse = new TransactionResponse();
    try {
        transactionResponse = loginService.validateUser(loginRequest);
        // independent transaction that needs to be executed in a separate thread
        //loginService.addLoginLog(transactionResponse);
        loggingService.logintransaction();
        // return the response below without waiting for the above log transaction to complete
        System.err.println("Total Time Taken=>" + (System.currentTimeMillis() - startTime));
        return transactionResponse;
    }
    catch (Exception e) {
        return CommonUtils.setErrorResponse(transactionResponse, e);
    }
}
Thanks
Please refer to the system diagram attached.
ISSUE: When I try to post a message to the input channel, the code tries to connect to the DB and throws an exception that it is unable to connect.
Code inside 5: read from a channel, apply the business logic (empty for now), and send the response to another channel.
@Bean
public IntegrationFlow sendToBusinessLogictoNotifyExternalSystem() {
    return IntegrationFlows
            .from("CommonChannelName")
            .handle("Business Logic Class name") // business logic is empty for now
            .channel("QueuetoAnotherSystem")
            .get();
}
I have written the JUnit test for 5 as given below:
@Autowired
PublishSubscribeChannel CommonChannelName;

@Autowired
PollableChannel QueuetoAnotherSystem;

@Test
public void sendToBusinessLogictoNotifyExternalSystem() {
    Message<?> message = MessageBuilder.withPayload("World")
            .setHeader(MessageHeaders.REPLY_CHANNEL, QueuetoAnotherSystem).build();
    this.CommonChannelName.send(message);
    Message<?> receive = QueuetoAnotherSystem.receive(5000);
    assertNotNull(receive);
    assertEquals("World", receive.getPayload());
}
ISSUE: As you can see from the system diagram, my code also has a DB connection on a different flow.
When I try to post a message to the producer channel, the code tries to connect to the DB and throws an exception that it is unable to connect.
I do not want this to happen, because the JUnit test should never depend on the DB and should run anywhere, anytime.
How do I fix this exception?
NOTE: Not sure if it matters, but the application is a Spring Boot application. I have used Spring Integration inside the code to read from and write to queues.
Since the common channel is a publish/subscribe channel, the message goes to both flows.
If this is a follow-up to this question/answer, you can prevent the DB flow from being invoked by calling stop() on the sendToDb flow (as long as you set ignoreFailures to true on the pub/sub channel, as I suggested there).
((Lifecycle) sendToDb).stop();
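For reference, the ignoreFailures setting mentioned above could be applied when the common pub/sub channel is declared as a bean; this is only a sketch, with the bean name taken from the question:

// Sketch: declare the common pub/sub channel with ignoreFailures so that a failure in one
// subscriber (e.g. the stopped DB flow) does not prevent the message from reaching the others.
@Bean(name = "CommonChannelName")
public PublishSubscribeChannel commonChannelName() {
    PublishSubscribeChannel channel = new PublishSubscribeChannel();
    channel.setIgnoreFailures(true);
    return channel;
}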
JUNIT TEST CASE - UPDATED:
@Autowired
PublishSubscribeChannel CommonChannelName;

@Autowired
PollableChannel QueuetoAnotherSystem;

@Autowired
SendResponsetoDBConfig sendResponsetoDBConfig;

@Test
public void sendToBusinessLogictoNotifyExternalSystem() {
    Lifecycle flowToDB = (Lifecycle) sendResponsetoDBConfig.sendToDb();
    flowToDB.stop();
    Message<?> message = MessageBuilder.withPayload("World")
            .setHeader(MessageHeaders.REPLY_CHANNEL, QueuetoAnotherSystem).build();
    this.CommonChannelName.send(message);
    Message<?> receive = QueuetoAnotherSystem.receive(5000);
    assertNotNull(receive);
    assertEquals("World", receive.getPayload());
}
CODE FOR 4: the flow that handles the message to the DB
public class SendResponsetoDBConfig {

    @Bean
    public IntegrationFlow sendToDb() {
        System.out.println("******************* Inside SendResponsetoDBConfig.sendToDb ***********");
        return IntegrationFlows
                .from("Common Channel Name")
                .handle("DAO Impl to store into DB")
                .get();
    }
}
NOTE: ******************* Inside SendResponsetoDBConfig.sendToDb *********** never gets printed.
I'm new to the Spring Framework and, indeed, I'm learning and using Spring Boot. Recently, in the app I'm developing, I got Quartz Scheduler working, and now I want to get Spring Integration working there: an FTP connection to a server to write and read files from.
What I want is really simple (I've been able to do it in a previous Java application). I've got two Quartz jobs scheduled to fire at different times daily: one of them reads a file from an FTP server and the other one writes a file to an FTP server.
I'll detail what I've developed so far.
@SpringBootApplication
@ImportResource("classpath:ws-config.xml")
@EnableIntegration
@EnableScheduling
public class MyApp extends SpringBootServletInitializer {

    @Autowired
    private Configuration configuration;

    //...

    @Bean
    public DefaultFtpsSessionFactory myFtpsSessionFactory() {
        DefaultFtpsSessionFactory sess = new DefaultFtpsSessionFactory();
        Ftp ftp = configuration.getFtp();
        sess.setHost(ftp.getServer());
        sess.setPort(ftp.getPort());
        sess.setUsername(ftp.getUsername());
        sess.setPassword(ftp.getPassword());
        return sess;
    }
}
I've named the following class FtpGateway:
@Component
public class FtpGateway {

    @Autowired
    private DefaultFtpsSessionFactory sess;

    public void sendFile() {
        // todo
    }

    public void readFile() {
        // todo
    }
}
I'm reading this documentation to learn how to do so. Spring Integration's FTP support seems to be event driven, so I don't know how I can execute either sendFile() or readFile() from my jobs when the trigger is fired at an exact time.
The documentation tells me something about using an Inbound Channel Adapter (to read files from an FTP server?), an Outbound Channel Adapter (to write files to an FTP server?) and an Outbound Gateway (to do what?):
Spring Integration supports sending and receiving files over FTP/FTPS by providing three client side endpoints: Inbound Channel Adapter, Outbound Channel Adapter, and Outbound Gateway. It also provides convenient namespace-based configuration options for defining these client components.
So it isn't clear to me how to proceed.
Could anybody please give me a hint?
Thank you!
EDIT:
Thank you @M. Deinum. First, I'll try a simple task: read a file from the FTP server; the poller will run every 5 seconds. This is what I've added:
@Bean
public FtpInboundFileSynchronizer ftpInboundFileSynchronizer() {
    FtpInboundFileSynchronizer fileSynchronizer = new FtpInboundFileSynchronizer(myFtpsSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setPreserveTimestamp(true);
    fileSynchronizer.setRemoteDirectory("/Entrada");
    fileSynchronizer.setFilter(new FtpSimplePatternFileListFilter("*.csv"));
    return fileSynchronizer;
}

@Bean
@InboundChannelAdapter(channel = "ftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> ftpMessageSource() {
    FtpInboundFileSynchronizingMessageSource source =
            new FtpInboundFileSynchronizingMessageSource(ftpInboundFileSynchronizer());
    source.setLocalDirectory(new File(configuracion.getDirFicherosDescargados()));
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new AcceptOnceFileListFilter<File>());
    return source;
}

@Bean
@ServiceActivator(inputChannel = "ftpChannel")
public MessageHandler handler() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            Object payload = message.getPayload();
            if (payload instanceof File) {
                File f = (File) payload;
                System.out.println(f.getName());
            } else {
                System.out.println(message.getPayload());
            }
        }
    };
}
Then, when the app is running, I put a new CSV file into the "Entrada" remote folder, but the handler() method isn't run after 5 seconds... Am I doing something wrong?
Please add @Scheduled(fixedDelay = 5000) over your poller method.
You should use Spring Batch with a tasklet. It is far easier to configure the beans, the cron schedule, and the input source with the existing interfaces provided by Spring.
https://www.baeldung.com/introduction-to-spring-batch
The example above covers both annotation-based and XML-based configuration; you can use either.
Another benefit is that you can make use of listeners and parallel steps. This framework can also be used in a Reader - Processor - Writer manner.
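To make that suggestion a bit more concrete, here is a rough sketch of a tasklet-based job; all the names are illustrative and the actual FTP read/write logic is left as a placeholder (Spring Batch 4 style builders):

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class FtpBatchConfig {

    @Bean
    public Step ftpStep(StepBuilderFactory steps) {
        return steps.get("ftpStep")
                .tasklet((contribution, chunkContext) -> {
                    // put the FTP read or write logic here
                    return RepeatStatus.FINISHED;
                })
                .build();
    }

    @Bean
    public Job ftpJob(JobBuilderFactory jobs, Step ftpStep) {
        return jobs.get("ftpJob")
                .start(ftpStep)
                .build();
    }
}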
I am trying to set up an integration flow where the input channel is a queue that acts as a non-blocking queue. What I see now is that if I trigger message processing from a Spring MVC controller and the processing takes time, the controller won't return; it waits for the message handler (service activator) to complete.
Here's the integration configuration:
#Bean(name = "createIssue.input")
MessageChannel queueInput() {
return MessageChannels.queue(10)
.get();
}
#Bean(name = PollerMetadata.DEFAULT_POLLER)
PollerMetadata poller() {
return Pollers.fixedRate(100)
.maxMessagesPerPoll(1)
.get();
}
#Bean
IntegrationFlow createIssue() {
return IntegrationFlows.from(queueInput())
.split()
.transform(mytransformer, "convert")
.handle(myservice, "createIssue")
.get();
}
myservice and mytransformer are just regular Spring beans.
I have a Spring MVC REST controller that writes to the createIssue.input queue using a gateway in one of its GET handlers.
If I set a breakpoint in the myservice.createIssue() method, I can see that the controller does not return from its method, so the external service that called the controller has to wait for my service to complete.
What I am trying to achieve is an async processing queue where the gateway just writes a message into the queue and returns immediately. How can I achieve that?
You should use a void Gateway there, which really acts as a "just send" component:
public interface Cafe {

    @Gateway(requestChannel = "orders")
    void placeOrder(Order order);
}
And the .handle() at the end of your IntegrationFlow should send its reply to the nullChannel, or the createIssue() method should simply not return anything.
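Putting that together with the flow from the question, the end of the flow could look like this if createIssue() still returns a value; this is just a sketch of the suggestion above, reusing the beans from the question:

@Bean
IntegrationFlow createIssue() {
    return IntegrationFlows.from(queueInput())
            .split()
            .transform(mytransformer, "convert")
            .handle(myservice, "createIssue")
            .channel("nullChannel")   // any reply from createIssue() is discarded here
            .get();
}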