I am using Spring Integration with FTP. When I use int-ftp:inbound-channel-adapter, everything works fine, but int-ftp:inbound-streaming-channel-adapter yields the following error:
cvc-complex-type.2.4.c: The matching wildcard is strict, but no declaration can be found for element 'int-ftp:inbound-streaming-channel-adapter'.
What could I be missing?
The relevant portion of my configuration is:
<int-ftp:inbound-streaming-channel-adapter id="ftpInbound"
channel="ftpChannel"
session-factory="ftpClientFactory"
filename-pattern="*.txt"
filename-regex=".*\.txt"
filter="customFilter"
remote-file-separator="/"
comparator="comparator"
remote-directory-expression="'/OUT/SDI402_CARATT_JD'">
<int:poller fixed-rate="1000" />
</int-ftp:inbound-streaming-channel-adapter>
Use Spring Integration 4.3.1.RELEASE (or later) in your dependencies; the streaming inbound channel adapters were only introduced in 4.3, so earlier schema versions do not declare that element. That solved the exact same problem in my case.
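For a Maven build, the dependency would look something like this (a sketch; the coordinates are the standard ones for the FTP module, and you may want a newer 4.3.x patch version):

```xml
<!-- Spring Integration FTP module; the 4.3 line is the first whose
     schema defines <int-ftp:inbound-streaming-channel-adapter> -->
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-ftp</artifactId>
    <version>4.3.1.RELEASE</version>
</dependency>
```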
I have some legacy C programs that read an input file and write their results to an output file. Both files are specified as program arguments, so a call looks like the following:
prj.exe a.dat a.out
Based on Artem Bilan's suggestion, I created a project with the following Spring configuration. It works in terms of invoking the executable! However, I still have a problem with the outbound channel. First, it contains nothing, and I am getting the error "unsupported Message payload type". Second, and more importantly, I need to process the output file a.out with a Java program. What is the best way to organize this workflow? Is it possible to replace the inbound-channel-adapter, which is useless in this case, with something more useful?
<int-file:inbound-channel-adapter id="producer-file-adapter"
channel="inboundChannel" directory="file:/Users/anarinsky/springint/chem"
prevent-duplicates="true">
<int:poller fixed-rate="5000" />
</int-file:inbound-channel-adapter>
<int:channel id="inboundChannel" />
<int:channel id="outboundChannel" />
<int:service-activator input-channel="inboundChannel" output-channel="outboundChannel"
expression="new ProcessBuilder('/Users/anarinsky/springint/chem/prj', '/Users/anarinsky/springint/chem/a.dat', '/Users/anarinsky/springint/chem/a.out').start()">
</int:service-activator>
<int-file:outbound-channel-adapter
channel="outboundChannel" id="consumer-file-adapter"
directory="file:/Users/anarinsky/springint/chem"/>
Something like this:
<int:service-activator expression="new ProcessBuilder('prj.exe', 'a.dat', 'a.out').start()"/>
?
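One sketch that addresses both symptoms (the SpEL ternary, the success check, and the hard-coded paths below are all assumptions, not a tested recipe): the expression above returns a java.lang.Process, which the file outbound adapter cannot write, hence the "unsupported Message payload type" error. Waiting for the process to finish and then returning the output file as the payload gives the downstream components something they can handle:

```xml
<!-- Sketch only: runs the external program, waits for it to exit,
     then sends the output file downstream as a java.io.File payload.
     The '== 0' success check and the absolute paths are assumptions. -->
<int:service-activator input-channel="inboundChannel" output-channel="outboundChannel"
    expression="new ProcessBuilder('/Users/anarinsky/springint/chem/prj',
                    '/Users/anarinsky/springint/chem/a.dat',
                    '/Users/anarinsky/springint/chem/a.out').start().waitFor() == 0
                ? new java.io.File('/Users/anarinsky/springint/chem/a.out')
                : null"/>
```

With a File payload on outboundChannel, you could also point a service-activator at a POJO that parses a.out, instead of (or before) the outbound file adapter.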
I am attempting to define a simple message flow in Spring Integration that reads from one channel and then dumps the messages into a Kafka queue. To do this, I am using spring-integration-kafka. The problem is I get an EvaluationContext error I can't decipher.
Here is my configuration in XML:
<int:channel id="myStreamChannel"/>
<int:gateway id="myGateway" service-interface="com.myApp.MyGateway" >
<int:method name="process" request-channel="myStreamChannel"/>
</int:gateway>
<int:channel id="activityOutputChannel"/>
<int:transformer input-channel="myStreamChannel" output-channel="activityOutputChannel" ref="activityTransformer"/>
<int-kafka:outbound-channel-adapter id="kafkaOutboundChannelAdapter"
kafka-producer-context-ref="kafkaProducerContext"
auto-startup="false"
channel="activityOutputChannel"
topic="my-test"
message-key-expression="header.messageKey">
<int:poller fixed-delay="1000" time-unit="MILLISECONDS" receive-timeout="0" task-executor="taskExecutor"/>
</int-kafka:outbound-channel-adapter>
<task:executor id="taskExecutor"
pool-size="5-25"
queue-capacity="20"
keep-alive="120"/>
<int-kafka:producer-context id="kafkaProducerContext" producer-properties="producerProperties">
<int-kafka:producer-configurations>
<int-kafka:producer-configuration broker-list="kafkaserver.com:9092"
key-class-type="java.lang.String"
value-class-type="java.lang.String"
topic="my-test"
key-encoder="stringEncoder"
value-encoder="stringEncoder"
compression-codec="snappy"/>
</int-kafka:producer-configurations>
</int-kafka:producer-context>
When I run my application via Spring Boot, I get this exception:
Error creating bean with name 'org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler#0': Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: [Assertion failed] - this argument is required; it must not be null
This is the offending line in the stack trace:
at org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler.onInit(KafkaProducerMessageHandler.java:68)
Here is what happens at Line 68:
Assert.notNull(this.evaluationContext);
So the EvaluationContext is null. I have no idea why.
By the way, when I replace the Kafka endpoint with a trivial stdout endpoint to print the message body, everything works fine.
Can you tell me what is wrong with my Kafka endpoint configuration that there is no EvaluationContext available?
Your issue is a case of version-mismatch hell.
Spring Boot pulls in a Spring Integration Core version that doesn't support IntegrationEvaluationContextAware, which is what populates the EvaluationContext in the KafkaProducerMessageHandler.
So, you should upgrade Spring Integration Kafka to the most recent release: https://github.com/spring-projects/spring-integration-kafka/releases
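If you manage dependencies with Maven, pinning the version explicitly would look roughly like this (the version number is illustrative; pick the latest one from the releases page linked above):

```xml
<!-- Illustrative only: use the latest spring-integration-kafka release
     that matches your Spring Integration Core version -->
<dependency>
    <groupId>org.springframework.integration</groupId>
    <artifactId>spring-integration-kafka</artifactId>
    <version>1.3.0.RELEASE</version>
</dependency>
```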
I am trying to build a simple utility that will copy files from multiple directories on one SFTP server to another server.
I tried using the SFTP outbound gateway to poll a single high-level directory with the "mget" command, but it did not work. So I thought of writing two inbound adapters (not a good solution, but I badly wanted to get this done!).
<int-sftp:inbound-channel-adapter
id="pdbInbound"
session-factory="sftpSessionFactory"
auto-create-local-directory="true" delete-remote-files="true"
filename-pattern="*.*" remote-directory="${remote.pdb.directory}"
local-directory="${local.pdb.directory}">
<int:poller fixed-rate="5000"/>
</int-sftp:inbound-channel-adapter>
<int-sftp:inbound-channel-adapter
id="galaxyInbound"
session-factory="sftpSessionFactory"
auto-create-local-directory="true" delete-remote-files="true"
filename-pattern="*.*" remote-directory="${remote.galaxy.directory}"
local-directory="${local.galaxy.directory}" >
<int:poller fixed-rate="5000"/>
</int-sftp:inbound-channel-adapter>
The above code works perfectly fine, and the files are copied to the local directories as expected.
The problem appears when I need to transfer these files to a remote directory with the same directory structure as the source directory. I could not achieve this using the SFTP outbound gateway with command="mput" and command-options="-R", so I tried writing two outbound adapters, as shown below. But only one directory is written to the remote server.
Any idea what is going wrong here?
<int:service-activator input-channel="pdbInbound" output-channel="pdbOutbound" expression="payload"/>
<int:service-activator input-channel="galaxyInbound" output-channel="galaxyOutbound" expression="payload"/>
<int-sftp:outbound-channel-adapter id="sftPdbOutboundAdapter" auto-create-directory="true"
session-factory="sftpSessionFactory"
auto-startup="true"
channel="pdbOutbound"
charset="UTF-8"
remote-file-separator="/"
remote-directory="${remote.out.pdb.directory}"
mode="REPLACE">
</int-sftp:outbound-channel-adapter>
<int-sftp:outbound-channel-adapter id="sftpGalaxyOutboundAdapter" auto-create-directory="true"
auto-startup="true"
session-factory="sftpSessionFactory"
channel="galaxyOutbound"
charset="UTF-8"
remote-file-separator="/"
remote-directory="${remote.out.galaxy.directory}"
mode="REPLACE">
</int-sftp:outbound-channel-adapter>
<int:poller default="true" fixed-delay="50"/>
Note: I am using the same SFTP server (but different directories) for inbound and outbound files, for testing purposes.
You need to explain your issues in more detail - "did not work" is woefully inadequate, and you won't get much help here with such a question. You need to show what you tried and what you observed.
There are test cases for both recursive mget and recursive mput.
The directory structure for the tests is shown in a comment at the top of that file.
I suggest you compare those with what you tried and come back here if you have a specific question or observation. The best thing to do to diagnose these issues is to turn on DEBUG logging, including for jsch.
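For reference, a recursive mput through the outbound gateway looks roughly like this (a sketch, not a drop-in fix: it assumes a mputChannel carrying a java.io.File payload that points at the local directory to transfer, and the channel/property names are invented; check the reference documentation for the exact attribute set in your version):

```xml
<!-- Sketch: recursively sends the local directory in the message
     payload to the remote directory. 'mputChannel', 'mputResultChannel',
     and '${remote.out.directory}' are hypothetical names. -->
<int-sftp:outbound-gateway id="sftpMputGateway"
    session-factory="sftpSessionFactory"
    request-channel="mputChannel"
    reply-channel="mputResultChannel"
    command="mput"
    command-options="-R"
    remote-directory="${remote.out.directory}"/>
```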
I am trying to use CXF with multiple endpoints and two bus definitions. Here is my configuration:
<jaxws:endpoint id="csSegSEndPoint"
implementor="#csSegServices"
address="/ESTSServices"
bus="busEST">
</jaxws:endpoint>
<cxf:bus name="busEST">
<cxf:inInterceptors>
<ref bean="logInbound"/>
</cxf:inInterceptors>
<cxf:outInterceptors>
<ref bean="logOutbound"/>
</cxf:outInterceptors>
</cxf:bus>
The thing is that when I start the application I get:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'csSegSEndPoint': Could not resolve matching constructor (hint: specify index/type/name arguments for simple parameters to avoid type ambiguities)
I did some searching and I cannot figure out what I am doing wrong. Could you please help me?
NOTE: I am using CXF 2.2.10.
Regards,
masch
This never really worked until sometime in the CXF 2.4.x line. Definitely upgrade: 2.2.10 is ancient, buggy, unsupported, and has a bunch of security issues.
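If you are on Maven, upgrading is mostly a matter of bumping the CXF version (the version below is purely illustrative of a post-2.4 line; pick whatever the current supported release is):

```xml
<!-- Illustrative: any 2.4.x-or-later CXF release; 2.7.18 is just
     an example of a much newer line than 2.2.10 -->
<dependency>
    <groupId>org.apache.cxf</groupId>
    <artifactId>cxf-rt-frontend-jaxws</artifactId>
    <version>2.7.18</version>
</dependency>
```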
I am trying to move a working Spring WAR to an OSGi environment (GlassFish 3.1 with Blueprint, Spring 3.0.5).
The application loads a properties file from disk, like this:
<bean id="myProperties" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="location" value="${my_conf}/my.properties"/>
<property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE"/>
</bean>
I can see in the debugger that ${my_conf}/my.properties is resolved to the existing path (c:\conf\my.properties).
I use the property jms.url, defined in my.properties, in the next bean declaration:
<amq:broker useJmx="false" persistent="false" brokerName="embeddedbroker">
<amq:transportConnectors>
<amq:transportConnector uri="tcp://${jms.url}"/>
<amq:transportConnector uri="vm://embeddedbroker" />
</amq:transportConnectors>
</amq:broker>
At deployment I get the exception "Could not resolve placeholder ${jms.url}".
Why does it fail? Is there another way to load properties from a file on disk?
Thank you.
Since it's an OSGi environment, you will need the spring-osgi-core jar added to your application. Take a look at this link to configure a property-placeholder for the OSGi framework.
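A typical Spring DM (spring-osgi) setup sources properties through the OSGi Configuration Admin service rather than from a raw file path. A minimal sketch (the persistent id and namespace prefixes are assumptions):

```xml
<!-- Sketch: assumes the 'osgix' (osgi-compendium) and 'ctx' (context)
     namespaces from Spring DM are declared on the root <beans> element.
     'my.app' is a hypothetical Configuration Admin persistent id. -->
<osgix:cm-properties id="myProps" persistent-id="my.app"/>
<ctx:property-placeholder properties-ref="myProps"/>
```

This keeps the placeholder resolution inside the OSGi container's own configuration mechanism, which avoids the classloading quirks that plain PropertyPlaceholderConfigurer runs into under OSGi.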
This isn't a solution, but an explanation of my problem.
The problem is related to this bug in Spring 3 and OSGi.
I had to turn Spring's logging up to debug level to understand it.