Spring SFTP Outbound Adapter - determining when files have been sent
I have a Spring SFTP output adapter that I start via "adapter.start()" in my main program. Once started, the adapter transfers and uploads all the files in the specified directory as expected. But I want to stop the adapter after all the files have been transferred. How do I detect if all the files have been transferred so I can issue an adapter.stop()?
@Bean
public IntegrationFlow sftpOutboundFlow() {
    return IntegrationFlows.from(Files.inboundAdapter(new File(sftpOutboundDirectory))
                    .filterExpression("name.endsWith('.pdf') OR name.endsWith('.PDF')")
                    .preventDuplicates(true),
            e -> e.id("sftpOutboundAdapter")
                    .autoStartup(false)
                    .poller(Pollers.trigger(new FireOnceTrigger())
                            .maxMessagesPerPoll(-1)))
            .log(LoggingHandler.Level.INFO, "sftp.outbound", m -> m.getPayload())
            .log(LoggingHandler.Level.INFO, "sftp.outbound", m -> m.getHeaders())
            .handle(Sftp.outboundAdapter(outboundSftpSessionFactory())
                    .useTemporaryFileName(false)
                    .remoteDirectory(sftpRemoteDirectory))
            .get();
}
@Artem Bilan has already given the answer, but here is a concrete implementation of what he described, for those who are new to Spring Integration like me:
Define a service to get the PDF files on demand:
@Service
public class MyFileService {

    public List<File> getPdfFiles(final String srcDir) {
        File[] files = new File(srcDir).listFiles((dir, name) -> name.toLowerCase().endsWith(".pdf"));
        return Arrays.asList(files == null ? new File[]{} : files);
    }
}
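As a side note, the same listing can also be sketched with java.nio instead of File.listFiles. This is only an illustration, not part of the original answer; PdfLister is a hypothetical helper name, and unlike the service above it throws an IOException for a missing directory rather than returning an empty list:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class PdfLister {

    // Lists *.pdf files (case-insensitive) in srcDir using a DirectoryStream filter.
    public static List<File> listPdfFiles(String srcDir) throws IOException {
        List<File> result = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(
                Paths.get(srcDir),
                p -> p.getFileName().toString().toLowerCase().endsWith(".pdf"))) {
            for (Path p : stream) {
                result.add(p.toFile());
            }
        }
        return result;
    }
}
```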
Define a Gateway to start the SFTP upload flow on demand:
@MessagingGateway
public interface SFtpOutboundGateway {

    @Gateway(requestChannel = "sftpOutboundFlow.input")
    void uploadFiles(List<File> files);
}
Define the Integration Flow to upload the files to the SFTP server via Sftp.outboundGateway:
@Configuration
@EnableIntegration
public class FtpFlowIntegrationConfig {

    // could also be bound via @Value
    private String sftpRemoteDirectory = "/path/to/remote/dir";

    @Bean
    public SessionFactory<ChannelSftp.LsEntry> outboundSftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
        factory.setHost("localhost");
        factory.setPort(22222);
        factory.setUser("client1");
        factory.setPassword("password123");
        factory.setAllowUnknownKeys(true);
        return new CachingSessionFactory<>(factory);
    }

    @Bean
    public IntegrationFlow sftpOutboundFlow(RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate) {
        return e -> e
                .log(LoggingHandler.Level.INFO, "sftp.outbound", Message::getPayload)
                .log(LoggingHandler.Level.INFO, "sftp.outbound", Message::getHeaders)
                .handle(
                        Sftp.outboundGateway(remoteFileTemplate, AbstractRemoteFileOutboundGateway.Command.MPUT, "payload")
                );
    }

    @Bean
    public RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate(SessionFactory<ChannelSftp.LsEntry> outboundSftpSessionFactory) {
        RemoteFileTemplate<ChannelSftp.LsEntry> template = new SftpRemoteFileTemplate(outboundSftpSessionFactory);
        template.setRemoteDirectoryExpression(new LiteralExpression(sftpRemoteDirectory));
        template.setAutoCreateDirectory(true);
        template.setUseTemporaryFileName(false);
        template.afterPropertiesSet();
        return template;
    }
}
Wiring up:
public class SpringApp {

    public static void main(String[] args) {
        // obtain the application context, e.g. from Spring Boot
        ConfigurableApplicationContext ctx = SpringApplication.run(SpringApp.class, args);

        final MyFileService fileService = ctx.getBean(MyFileService.class);
        final SFtpOutboundGateway sFtpOutboundGateway = ctx.getBean(SFtpOutboundGateway.class);

        // trigger the sftp upload flow manually - only once
        sFtpOutboundGateway.uploadFiles(fileService.getPdfFiles("/path/to/local/dir"));
    }
}
Important notes:
1.
@Gateway(requestChannel = "sftpOutboundFlow.input")
void uploadFiles(List<File> files);

Here the DirectChannel named sftpOutboundFlow.input is used to pass the message with the payload (the List<File> files) to the receiver. If this channel has not been created yet, the gateway creates it implicitly.
2.
@Bean
public IntegrationFlow sftpOutboundFlow(RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate) { ... }
Since IntegrationFlow is a Consumer functional interface, we can simplify the flow a little by using the IntegrationFlowDefinition. During the bean registration phase, the IntegrationFlowBeanPostProcessor converts this inline (lambda) IntegrationFlow into a StandardIntegrationFlow and processes its components. An IntegrationFlow defined with a lambda gets a DirectChannel as its inputChannel, registered in the application context as a bean named sftpOutboundFlow.input in the sample above (flow bean name + ".input"). That is why we use that name in the SFtpOutboundGateway gateway.
Ref: https://spring.io/blog/2014/11/25/spring-integration-java-dsl-line-by-line-tutorial
3.
@Bean
public RemoteFileTemplate<ChannelSftp.LsEntry> remoteFileTemplate(SessionFactory<ChannelSftp.LsEntry> outboundSftpSessionFactory) { ... }
see: Remote directory for sftp outbound gateway with DSL
But I want to stop the adapter after all the files have been transferred.
Logically, this is not what this kind of component has been designed for. Since you are not going to have a constantly changing local directory, it is probably better to think about an event-driven solution that lists the files in the directory via some explicit action. Yes, it can be a call from the main, but only once for all the content of the dir, and that's all.
And for this reason the Sftp.outboundGateway() with Command.MPUT is there for you:
https://docs.spring.io/spring-integration/reference/html/sftp.html#using-the-mput-command.
You can still trigger an IntegrationFlow, but it could start from a @MessagingGateway interface to be called from a main with a local directory in which to list files for uploading:
https://docs.spring.io/spring-integration/reference/html/dsl.html#java-dsl-gateway
Related
How to create instance-specific message queues in a Spring Boot REST API
I have a number of microservices, each running in its own container in a load-balanced environment. I need each instance of these microservices to create a RabbitMQ queue when it starts up and delete it when it stops. I have currently defined the following property in my application properties file:

config_queue: config_${PID}

My message queue listener looks like this:

public class ConfigListener {

    Logger logger = LoggerFactory.getLogger(ConfigListener.class);

    // https://www.programcreek.com/java-api-examples/index.php?api=org.springframework.amqp.rabbit.annotation.RabbitListener
    @RabbitListener(bindings = @QueueBinding(
            value = @Queue(value = "${config_queue}", autoDelete = "true"),
            exchange = @Exchange(value = AppConstants.TOPIC_CONFIGURATION, type = ExchangeTypes.FANOUT)))
    public void configChanged(String message) {
        // ... application logic
    }
}

All this works great when I run the microservice. A queue with the config prefix and the process id gets created and is auto-deleted when I stop the service. However, when I run this service and others in their individual Docker containers, all services have the same PID, and that is 1. Does anybody have any idea how I can create a queue that is unique to that instance? Thanks in advance for your help.
Use an AnonymousQueue instead:

@SpringBootApplication
public class So72030217Application {

    public static void main(String[] args) {
        SpringApplication.run(So72030217Application.class, args);
    }

    @RabbitListener(queues = "#{configQueue.name}")
    public void listen(String in) {
        System.out.println(in);
    }
}

@Configuration
class Config {

    @Bean
    FanoutExchange fanout() {
        return new FanoutExchange("config");
    }

    @Bean
    Queue configQueue() {
        return new AnonymousQueue(new Base64UrlNamingStrategy("config_"));
    }

    @Bean
    Binding binding() {
        return BindingBuilder.bind(configQueue()).to(fanout());
    }
}

AnonymousQueues are auto-delete and use a Base64-encoded UUID in the name.
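The naming idea can be illustrated with plain JDK code. This is only a sketch of the general scheme (a prefix plus URL-safe Base64 of a random UUID), not the exact Base64UrlNamingStrategy implementation; QueueNames is a hypothetical helper:

```java
import java.nio.ByteBuffer;
import java.util.Base64;
import java.util.UUID;

public class QueueNames {

    // Builds a per-instance queue name from a random UUID, similar in spirit to
    // Spring AMQP's Base64UrlNamingStrategy (the exact encoding there may differ).
    public static String uniqueQueueName(String prefix) {
        UUID uuid = UUID.randomUUID();
        ByteBuffer bb = ByteBuffer.wrap(new byte[16]);
        bb.putLong(uuid.getMostSignificantBits());
        bb.putLong(uuid.getLeastSignificantBits());
        return prefix + Base64.getUrlEncoder().withoutPadding().encodeToString(bb.array());
    }
}
```

Unlike ${PID}, such a name stays unique even when every container reports PID 1.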
Spring Cloudstream 3 + RabbitMQ configuration to existing queue
I'm learning Cloud Stream and cannot map the cloudstream Function<String, String> onto an existing queue. I'm just creating the hello world app from the Spring Cloud documentation, but don't really understand the part regarding binding names. I have q.test (existing) on my RabbitMQ app, but when I use this code and configuration, my app always creates a new queue q.test.anonymous.someRandomString. Does anybody have a configuration example for this?

@SpringBootApplication
public class CloudstreamApplication {

    public static void main(String[] args) {
        SpringApplication.run(CloudstreamApplication.class, args);
    }

    @Bean
    public Function<String, String> uppercase() {
        return value -> {
            System.out.println("Received: " + value);
            return value.toUpperCase();
        };
    }
}

application.yml:

spring.cloud.stream:
  function.bindings:
    uppercase-in-0: q.test
  bindings:
    uppercase-in-0.destination: q.test

Thanks
See the binder documentation - Using Existing Queues/Exchanges. If you have an existing exchange/queue that you wish to use, you can completely disable automatic provisioning as follows, assuming the exchange is named myExchange and the queue is named myQueue:

spring.cloud.stream.bindings.<binding name>.destination=myExchange
spring.cloud.stream.bindings.<binding name>.group=myQueue
spring.cloud.stream.rabbit.bindings.<binding name>.consumer.bindQueue=false
spring.cloud.stream.rabbit.bindings.<binding name>.consumer.declareExchange=false
spring.cloud.stream.rabbit.bindings.<binding name>.consumer.queueNameGroupOnly=true
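For readers who use application.yml rather than flat properties, the same five settings could be written roughly as follows. This is a sketch: the binding name uppercase-in-0 is taken from the question above, while myExchange/myQueue are the placeholders from the answer:

```yaml
spring:
  cloud:
    stream:
      bindings:
        uppercase-in-0:
          destination: myExchange
          group: myQueue
      rabbit:
        bindings:
          uppercase-in-0:
            consumer:
              bind-queue: false
              declare-exchange: false
              queue-name-group-only: true
```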
Spring Integration: how to access the returned values from last Subscriber
I'm trying to implement an SFTP file upload of 2 files which has to happen in a certain order: first a pdf file, and after successful upload of that, a text file with meta information about the pdf. I followed the advice in this thread, but can't get it to work properly.

My Spring Boot configuration:

@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
    final DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory(true);
    final Properties jschProps = new Properties();
    jschProps.put("StrictHostKeyChecking", "no");
    jschProps.put("PreferredAuthentications", "publickey,password");
    factory.setSessionConfig(jschProps);
    factory.setHost(sftpHost);
    factory.setPort(sftpPort);
    factory.setUser(sftpUser);
    if (sftpPrivateKey != null) {
        factory.setPrivateKey(sftpPrivateKey);
        factory.setPrivateKeyPassphrase(sftpPrivateKeyPassphrase);
    } else {
        factory.setPassword(sftpPassword);
    }
    factory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<>(factory);
}

@Bean
@BridgeTo
public MessageChannel toSftpChannel() {
    return new PublishSubscribeChannel();
}

@Bean
@ServiceActivator(inputChannel = "toSftpChannel")
@Order(0)
public MessageHandler handler() {
    final SftpMessageHandler handler = new SftpMessageHandler(sftpSessionFactory());
    handler.setRemoteDirectoryExpression(new LiteralExpression(sftpRemoteDirectory));
    handler.setFileNameGenerator(message -> {
        if (message.getPayload() instanceof byte[]) {
            return (String) message.getHeaders().get("filename");
        } else {
            throw new IllegalArgumentException("File expected as payload.");
        }
    });
    return handler;
}

@ServiceActivator(inputChannel = "toSftpChannel")
@Order(1)
public String transferComplete(@Payload byte[] file, @Header("filename") String filename) {
    return "The SFTP transfer complete for file: " + filename;
}

@MessagingGateway
public interface UploadGateway {

    @Gateway(requestChannel = "toSftpChannel")
    String upload(@Payload byte[] file, @Header("filename") String filename);
}

My test case:

final String pdfStatus = uploadGateway.upload(content, documentName);
log.info("Upload of {} completed, {}.", documentName, pdfStatus);

From the return of the gateway upload call I expect to get the String confirming the upload, e.g. "The SFTP transfer complete for file: ...", but instead I get the content of the uploaded file as byte[]:

Upload of 123456789.1.pdf completed, 37,80,68,70,45,49,46,54, ... (raw PDF bytes)

What am I missing?
I think @Order(0) doesn't work together with @Bean. To fix it you should do this in that bean definition instead:

final SftpMessageHandler handler = new SftpMessageHandler(sftpSessionFactory());
handler.setOrder(0);

See the Reference Manual for more info:

When using these annotations on consumer @Bean definitions, if the bean definition returns an appropriate MessageHandler (depending on the annotation type), attributes such as outputChannel, requiresReply etc., must be set on the MessageHandler @Bean definition itself.

In other words: if you can use a setter, you have to. We don't process annotations for this case because there is no guarantee what should get precedence. So, to avoid such confusion, we have left you only the setter choice.

UPDATE

I see your problem and it is here:

@Bean
@BridgeTo
public MessageChannel toSftpChannel() {
    return new PublishSubscribeChannel();
}

That is confirmed by the logs:

Adding {bridge:dmsSftpConfig.toSftpChannel.bridgeTo} as a subscriber to the 'toSftpChannel' channel
Channel 'org.springframework.context.support.GenericApplicationContext#b3d0f7.toSftpChannel' has 3 subscriber(s).
started dmsSftpConfig.toSftpChannel.bridgeTo

So, you really have one more subscriber to that toSftpChannel, and it is a BridgeHandler with an output to the replyChannel header. Since its default order is

private volatile int order = Ordered.LOWEST_PRECEDENCE;

this one becomes the first subscriber, and exactly this one returns you that byte[], just because it is the payload of the request. You need to decide if you really need such a bridge. There is no workaround for the @Order, though...
Discussion about spring integration sftp
I use Spring Integration SFTP to download and upload files. In the documentation I found:

Spring Integration supports sending and receiving files over SFTP by providing three client side endpoints: Inbound Channel Adapter, Outbound Channel Adapter, and Outbound Gateway.

When I want to download files I must assign the local directory, and when I want to upload files I must assign the remote directory. But I can't assign the directory when I write the code, because my directory is associated with the date. How can I assign the directory at runtime? Here is my code:

@Bean
public SessionFactory<LsEntry> sftpSessionFactory() {
    DefaultSftpSessionFactory defaultSftpSessionFactory = new DefaultSftpSessionFactory();
    defaultSftpSessionFactory.setHost(host);
    defaultSftpSessionFactory.setPort(Integer.parseInt(port));
    defaultSftpSessionFactory.setUser(username);
    defaultSftpSessionFactory.setPassword(password);
    defaultSftpSessionFactory.setAllowUnknownKeys(true);
    return new CachingSessionFactory<LsEntry>(defaultSftpSessionFactory);
}

@Bean
public SftpRemoteFileTemplate sftpRemoteFileTemplate() {
    return new SftpRemoteFileTemplate(sftpSessionFactory());
}

@Bean
@ServiceActivator(inputChannel = "sftpChannel")
public MessageHandler handlerGet() {
    SftpOutboundGateway sftpOutboundGateway = new SftpOutboundGateway(sftpSessionFactory(), "mget", "payload");
    sftpOutboundGateway.setLocalDirectory(new File(localDirectory));
    sftpOutboundGateway.setFilter(new SftpSimplePatternFileListFilter("*.txt"));
    sftpOutboundGateway.setSendTimeout(1000);
    return sftpOutboundGateway;
}

In the MessageHandler, I must assign the localDirectory in the outbound gateway. When I want to change my localDirectory by day, I must download the file to the localDirectory and then move it to the target directory. How can I assign the localDirectory at runtime, such that today I download to 20170606/ and tomorrow I download to 20170607/?
Edit: this is my gateway and test:

public interface OutboundGatewayOption {

    @Gateway(requestChannel = "sftpChannel")
    public List<File> getFiles(String dir);
}

@Test
public void test2() {
    outboundGatewayOption.getFiles("upload/20160920/");
}
sftpOutboundGateway.setLocalDirectoryExpression(
        new SpelExpressionParser().parseExpression("headers['whereToPutTheFiles']"));

or

parseExpression("#someBean.getDirectoryName(payload)")

etc. The expression must evaluate to a String representing the directory's absolute path. While evaluating the expression, the remote directory is available as the variable #remoteDirectory.
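Given the date-based layout in the question (20170606 today, 20170607 tomorrow), the directory string fed into such an expression can be computed with plain java.time. DatedDirs and the base path are hypothetical, shown only to illustrate the naming scheme:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DatedDirs {

    // Builds a per-day local download directory, e.g. /data/downloads/20170606,
    // using the yyyyMMdd pattern (BASIC_ISO_DATE) from the question.
    public static String dailyDir(String base, LocalDate date) {
        return base + "/" + date.format(DateTimeFormatter.BASIC_ISO_DATE);
    }
}
```

A bean exposing such a method could then be referenced from the SpEL expression, e.g. "#someBean.getDirectoryName(payload)" as in the answer above.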
Send and receive files from FTP in Spring Boot
I'm new to the Spring Framework and, indeed, I'm learning and using Spring Boot. Recently, in the app I'm developing, I made the Quartz Scheduler work, and now I want to make Spring Integration work there: an FTP connection to a server to write and read files from.

What I want is really simple (and I've been able to do it in a previous Java application). I've got two Quartz jobs scheduled to fire at different times daily: one of them reads a file from an FTP server and the other one writes a file to an FTP server. I'll detail what I've developed so far.

@SpringBootApplication
@ImportResource("classpath:ws-config.xml")
@EnableIntegration
@EnableScheduling
public class MyApp extends SpringBootServletInitializer {

    @Autowired
    private Configuration configuration;

    // ...

    @Bean
    public DefaultFtpsSessionFactory myFtpsSessionFactory() {
        DefaultFtpsSessionFactory sess = new DefaultFtpsSessionFactory();
        Ftp ftp = configuration.getFtp();
        sess.setHost(ftp.getServer());
        sess.setPort(ftp.getPort());
        sess.setUsername(ftp.getUsername());
        sess.setPassword(ftp.getPassword());
        return sess;
    }
}

The following class I've named FtpGateway:

@Component
public class FtpGateway {

    @Autowired
    private DefaultFtpsSessionFactory sess;

    public void sendFile() {
        // todo
    }

    public void readFile() {
        // todo
    }
}

I'm reading this documentation to learn how to do so. Spring Integration's FTP support seems to be event driven, so I don't know how I can execute either sendFile() or readFile() from my jobs when the trigger is fired at an exact time. The documentation tells me something about using an Inbound Channel Adapter (to read files from an FTP?), an Outbound Channel Adapter (to write files to an FTP?) and an Outbound Gateway (to do what?):

Spring Integration supports sending and receiving files over FTP/FTPS by providing three client side endpoints: Inbound Channel Adapter, Outbound Channel Adapter, and Outbound Gateway. It also provides convenient namespace-based configuration options for defining these client components.

So, I haven't got it clear how to proceed. Please, could anybody give me a hint? Thank you!

EDIT:

Thank you @M. Deinum. First, I'll try a simple task: read a file from the FTP; the poller will run every 5 seconds. This is what I've added:

@Bean
public FtpInboundFileSynchronizer ftpInboundFileSynchronizer() {
    FtpInboundFileSynchronizer fileSynchronizer = new FtpInboundFileSynchronizer(myFtpsSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setPreserveTimestamp(true);
    fileSynchronizer.setRemoteDirectory("/Entrada");
    fileSynchronizer.setFilter(new FtpSimplePatternFileListFilter("*.csv"));
    return fileSynchronizer;
}

@Bean
@InboundChannelAdapter(channel = "ftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> ftpMessageSource() {
    FtpInboundFileSynchronizingMessageSource source =
            new FtpInboundFileSynchronizingMessageSource(ftpInboundFileSynchronizer());
    source.setLocalDirectory(new File(configuration.getDirFicherosDescargados()));
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new AcceptOnceFileListFilter<File>());
    return source;
}

@Bean
@ServiceActivator(inputChannel = "ftpChannel")
public MessageHandler handler() {
    return new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            Object payload = message.getPayload();
            if (payload instanceof File) {
                File f = (File) payload;
                System.out.println(f.getName());
            } else {
                System.out.println(message.getPayload());
            }
        }
    };
}

Then, when the app is running, I put a new csv file into the "Entrada" remote folder, but the handler() method isn't run after 5 seconds... Am I doing something wrong?
Please add @Scheduled(fixedDelay = 5000) over your poller method.
You should use Spring Batch with a Tasklet. It is far easier to configure the bean, cron time, and input source with the existing interfaces provided by Spring. https://www.baeldung.com/introduction-to-spring-batch The example there covers both annotation- and XML-based configuration; you can use either. Another benefit: you can take advantage of listeners and parallel steps. This framework can also be used in a Reader - Processor - Writer manner.