Spring SFTP & file outbound gateway chain

I'm new to Spring and could use some help figuring out how to correctly chain an SFTP outbound gateway with a file outbound gateway. I want to confirm that a file has been SFTP'd, and then move it to an archive location.
Essentially, I have a directory where files are dropped to be SFTP'd somewhere else. After the transfer, each file is supposed to be moved to an Archive directory.
Each piece of code works independently, but fails when I attempt to connect the two. I can't use the reply channel the way I normally would, because the SFTP gateway's reply confirms where the file has been saved remotely, and it is that message, not the original file, that would get moved to the archive directory.
I suspect that order does not do what I think it does.
Currently, the file is moved to the archive directory about 90% of the time without being SFTP'd at all.
Is this possible, or am I just barking up the wrong tree? Is there a way to configure the sftp:outbound-gateway downstream, or should I try a different approach?
<!-- START: SFTP files -->
<int-file:inbound-channel-adapter
        directory="file:${sftp.repo}"
        channel="SFTPchannel"
        prevent-duplicates="false"
        ignore-hidden="true" />

<int-sftp:outbound-gateway
        session-factory="SFTPFactory"
        request-channel="SFTPchannel"
        order="1"
        command="mput"
        command-options="-1"
        expression="payload"
        mode="REPLACE"
        use-temporary-file-name="false"
        remote-filename-generator="filenameGenerator"
        auto-create-directory="false"
        remote-directory="${sftp.remote.destination}" />

<int-file:outbound-gateway
        request-channel="SFTPchannel"
        order="2"
        directory-expression="'${repository.directory}/' + new java.text.SimpleDateFormat('yyyyMMdd').format(new java.util.Date())"
        mode="REPLACE"
        auto-create-directory="true"
        filename-generator="filenameGenerator"
        delete-source-files="true"
        reply-channel="nullChannel" />
<!-- END: SFTP files -->

See the retry-and-more sample, and particularly the Expression Evaluating Advice Demo therein. It covers your exact use case.
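A note on why the current configuration misbehaves: two subscribers on the same DirectChannel are not chained by order; the default dispatcher hands each message to exactly one subscriber (round-robin), so the archive gateway frequently runs instead of the SFTP gateway. The demo's approach is to make the SFTP gateway the only subscriber and archive the file from an ExpressionEvaluatingRequestHandlerAdvice that fires only on success. Below is a minimal sketch of that idea against the configuration above, assuming Spring Integration 5.x property names and a hypothetical archive.directory property:

<int-sftp:outbound-gateway
        session-factory="SFTPFactory"
        request-channel="SFTPchannel"
        command="mput"
        expression="payload"
        mode="REPLACE"
        use-temporary-file-name="false"
        remote-filename-generator="filenameGenerator"
        remote-directory="${sftp.remote.destination}"
        reply-channel="nullChannel">
    <int-sftp:request-handler-advice-chain>
        <!-- evaluated only after the SFTP call returns without an exception:
             moves the original local file (the request payload) to the archive -->
        <bean class="org.springframework.integration.handler.advice.ExpressionEvaluatingRequestHandlerAdvice">
            <property name="onSuccessExpressionString"
                      value="payload.renameTo(new java.io.File('${archive.directory}', payload.name))"/>
            <property name="onFailureExpressionString"
                      value="payload.absolutePath + ' failed to transfer'"/>
        </bean>
    </int-sftp:request-handler-advice-chain>
</int-sftp:outbound-gateway>

With the advice doing the archiving, the second <int-file:outbound-gateway> subscriber on SFTPchannel (and both order attributes) can be dropped.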

Related

Delete source file from S3 bucket: s3-inbound-streaming-channel-adapter

The source files are not being deleted from the S3 bucket after they are successfully transferred to the target directory.
Step 1: Use an inbound streaming channel adapter to stream source files from S3 to a local directory (working fine).
Step 2: Delete the source files once they have been successfully transferred (not working).
The configuration is below:
<int-aws:s3-inbound-streaming-channel-adapter id="s3FilesInbound"
        channel="s3FilesChannel"
        session-factory="s3SessionFactory"
        filename-regex="^.*\.(txt|csv)$"
        remote-directory-expression="bucket_name"
        auto-startup="true">
    <integration:poller id="s3FilesChannelPoller"
            fixed-delay="1000"
            max-messages-per-poll="1"/>
</int-aws:s3-inbound-streaming-channel-adapter>
<integration:stream-transformer id="streamTransformer"
        input-channel="s3FilesChannel"
        output-channel="s3FilesChannelOut"/>

<integration:chain id="filesS3ChannelChain"
        input-channel="s3FilesChannelOut">
    <file:outbound-gateway id="fileInS3ArchiveChannel"
            directory="local_directory"
            filename-generator-expression="headers.file_remoteFile">
        <file:request-handler-advice-chain>
            <ref bean="retryAdvice"/>
        </file:request-handler-advice-chain>
    </file:outbound-gateway>
    <integration:gateway request-channel="nullChannel"
            error-channel="errorChannel"/>
</integration:chain>
Since you use an <integration:stream-transformer> there, I don't see a reason to rely on the <int-aws:s3-inbound-streaming-channel-adapter>; with the former you eliminate the streaming purpose of the latter.
I'd suggest you take a look at the regular <int-aws:s3-inbound-channel-adapter>, which already has a delete-remote-files="true" option.
On the other hand, you can still do it with what you have so far, but you need something like <integration:outbound-channel-adapter expression="@s3SessionFactory.getSession().remove(headers['file_remoteDirectory'] + '/' + headers['file_remoteFile'])"/>.
Those headers are populated by the AbstractRemoteFileStreamingMessageSource.
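Concretely, that adapter can be appended as the last element of the existing chain, so the remote object is removed only after the local write has succeeded. A minimal sketch reusing the names from the question (note the @ bean reference and the quoted header names in SpEL):

<integration:chain input-channel="s3FilesChannelOut">
    <file:outbound-gateway directory="local_directory"
            filename-generator-expression="headers.file_remoteFile"/>
    <!-- reached only if the gateway above completed without an exception -->
    <integration:outbound-channel-adapter
            expression="@s3SessionFactory.getSession().remove(headers['file_remoteDirectory'] + '/' + headers['file_remoteFile'])"/>
</integration:chain>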

How to upload to the SFTP user's home directory using Spring Integration

I'm trying to upload a file via SFTP using Spring Integration (version 4.1.2).
Does anyone know how to configure the sftp:outbound-channel-adapter so that the file gets uploaded automatically to the user's home directory, without indicating the full directory path in the remote-directory attribute (e.g. remote-directory="/home/sftp_user")?
The solution is that remote-directory must be set to an empty string.
Please note that the following won't work, as it fails the XML model validation:
<sftp:outbound-channel-adapter
        ...
        remote-directory=""
        ...
/>
I ended up reading the remote-directory from a configuration property, which I set to an empty string.
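For illustration, a sketch of that property-based workaround (the property and channel names here are hypothetical): the placeholder resolves to an empty string, which passes XML validation and makes the adapter upload relative to the session's starting directory, i.e. the user's home.

<!-- in sftp.properties the value is deliberately left empty:
     sftp.remote.dir= -->
<sftp:outbound-channel-adapter
        session-factory="sftpSessionFactory"
        channel="toSftpChannel"
        remote-directory="${sftp.remote.dir}"/>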

NFS inbound adapter configuration for multiple directories using a single inbound adapter

In Spring Integration, I want to poll files from different source directories (each configured interface has its own source directory), set up dynamically as sourcePath in a YAML file like the one below. Any number of interfaces can be added by the user.
interfaces:
  -
    sourceType: NFS
    sourcePath: /Interface-1/Inbound/text
    target: Interface-1
    targetType: S3
    targetPath: test-bucket-1
  -
    sourceType: NFS
    sourcePath: /Interface-2/Inbound/text
    target: Interface-2
    targetType: S3
    targetPath: test-bucket-2
Is it possible to poll files from the different source folders using a single inbound adapter (for instance by swapping the directory through an atomic reference), or do I need more than one inbound adapter?
Currently the application polls files from a base directory:
<file:inbound-channel-adapter id="filesInboundChannel"
        directory="file:${base.path}"
        auto-startup="false"
        scanner="scanner"
        auto-create-directory="true">
    <integration:poller id="poller"
            max-messages-per-poll="${max.messages.per.poll}"
            fixed-rate="${message.read.frequency}"
            task-executor="pollingExecutor">
        <integration:transactional transaction-manager="transactionManager"/>
    </integration:poller>
</file:inbound-channel-adapter>
Can someone give advice on this, or is there another way to achieve the same goal?
Yes, you can use a single <file:inbound-channel-adapter> for this task. To make it rotate over a list of directories, configure an AbstractMessageSourceAdvice implementation on the <poller> of that adapter and change the directory whenever afterReceive(boolean messageReceived, MessageSource<?> source) reports false for the receive operation; that way the next poll already scans the new directory.
As a sample, take a look at the recently introduced RotatingServerAdvice: https://github.com/spring-projects/spring-integration/blob/master/spring-integration-file/src/main/java/org/springframework/integration/file/remote/aop/RotatingServerAdvice.java
https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-channels-section.html#conditional-pollers
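For the local file adapter, the wiring would look something like the sketch below: a custom advice bean (the class name is hypothetical; it would implement the contract described above and call FileReadingMessageSource.setDirectory(...) after an empty poll), placed in the poller's advice chain:

<bean id="rotatingDirectoryAdvice" class="com.example.RotatingDirectoryAdvice"/>

<file:inbound-channel-adapter id="filesInboundChannel"
        directory="file:${base.path}" scanner="scanner">
    <integration:poller fixed-rate="${message.read.frequency}"
            max-messages-per-poll="${max.messages.per.poll}">
        <integration:advice-chain>
            <ref bean="rotatingDirectoryAdvice"/>
        </integration:advice-chain>
    </integration:poller>
</file:inbound-channel-adapter>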

Downloading all files from an FTP directory, running into problems with escaping spaces

I have Camel configured to download all files from a specific FTP directory. This is all easy enough, and everything seems to be working fine. However, I run into errors when a file name contains a space, such as File 123.csv. I know I could target specific files with an escape character, but these files are dynamic in nature and change daily, so I will not know in advance which files may or may not have spaces.
I figure I can just read all the file URIs and make adjustments from there, but I was wondering if there is a Camel-specific way to handle this.
Errors:
java.lang.IllegalArgumentException: Illegal character in path at index 60: hdfs://test.net/user/CamelTests/File Layout.csv
GenericFileOnCompletion - Rollback file strategy: org.apache.camel.component.file.strategy.GenericFileRenameProcessStrategy@fe8d1b for file: RemoteFile[File Layout.csv]
Camel Code
from("{{ftp.serverLP}}/Memo/Loss?username=ftp&password=pass")
.to("hdfs2://Test.net/user/CamelTests/?fileSystemType=HDFS")
.log("Downloaded file ${file:name} complete.");
Try changing the .to(..) endpoint to a non-HDFS file system.
The error posted indicates a problem with the destination the files are being copied to (HDFS), not with the FTP source.
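As a quick check, the route can be pointed at a plain file: endpoint; if the spaces then survive the copy, the problem is isolated to the HDFS URI handling. A sketch in Camel's Spring XML DSL (the local target path is a placeholder):

<camelContext xmlns="http://camel.apache.org/schema/spring">
    <route>
        <from uri="{{ftp.serverLP}}/Memo/Loss?username=ftp&amp;password=pass"/>
        <!-- temporary local target instead of hdfs2:, just for the test -->
        <to uri="file:/tmp/camel-tests"/>
        <log message="Downloaded file ${file:name} complete."/>
    </route>
</camelContext>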

Configuring Spring Batch XML

Issue: Using Spring Batch, I need to read a file whose name contains today's date, e.g. test_02032015.txt. This file will be in the directory /test/example, in a Unix environment that I need to fetch the file from.
The question is how to configure the Spring Batch XML so that the above-mentioned file is read.
Any pointers to a relevant website or solution would be of great help.
You have a few ways to address a requirement like this:
If you don't need to worry about the other files in the directory, you can just use a wildcard in the file name, like this:
<property name="resource" value="data/iosample/input/*.xml" />
Another alternative would be to pass the value into the job as a parameter and reference it like this (the bean must be step-scoped for jobParameters to resolve):
<property name="resource" value="#{jobParameters['input.file']}" />
Finally, you could use SpEL to build the file name (sorry, I don't have an example of that handy).
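For the record, a sketch of that last option (assuming the date pattern in the name is MMddyyyy and a FlatFileItemReader; the embedded #{...} expression is evaluated once, when the bean is created):

<bean id="reader" class="org.springframework.batch.item.file.FlatFileItemReader">
    <property name="resource"
              value="file:/test/example/test_#{new java.text.SimpleDateFormat('MMddyyyy').format(new java.util.Date())}.txt"/>
    <!-- lineMapper and the rest of the reader configuration omitted -->
</bean>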
