I'm implementing a TCP client using Spring Integration.
The requirements are:
1. Receive the IP address or hostname of the TCP server over a UDP connection (from somewhere).
2. Open a TCP connection to the destination host from the previous step and send some business data to that server.
I use the Spring Integration framework, version 2.2.0.RELEASE, and the problem is that in the default configuration of the tcp-connection-factory, the host attribute has to be hardcoded in the XML. For example:
<ip:tcp-connection-factory id="client" type="client" host="localhost" port="1234" single-use="true"/>
The question is how to avoid the static definition of the destination host in the application context and instead lazily initialize the tcp-connection-factory once the destination host is known.
I know that this flow could easily be implemented with the standard Java networking APIs; the question is specifically about the Spring Integration API.
At this time, the configuration is static.
You could, however, use a technique similar to that used in the dynamic FTP sample, which configures FTP outbound adapters at runtime.
<int-ip:tcp-connection-factory> provides an instance of AbstractConnectionFactory, and <int-ip:tcp-outbound-channel-adapter> accepts that instance via its connection-factory attribute, so nothing stops you from implementing your own RoutingConnectionFactory.
The implementation may rely on some value stored in a ThreadLocal. The idea is shown here:
https://github.com/spring-projects/spring-amqp/blob/master/spring-rabbit/src/main/java/org/springframework/amqp/rabbit/connection/AbstractRoutingConnectionFactory.java,
https://github.com/spring-projects/spring-framework/blob/master/spring-jdbc/src/main/java/org/springframework/jdbc/datasource/lookup/AbstractRoutingDataSource.java
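For example, here is a minimal, untested sketch of such a routing factory, assuming the Spring Integration 2.2-era API in which getConnection() can be overridden; the class name, the ThreadLocal accessor, and the single fixed port are illustrative assumptions, not anything the framework provides:

    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;

    import org.springframework.integration.ip.tcp.connection.AbstractClientConnectionFactory;
    import org.springframework.integration.ip.tcp.connection.TcpConnection;
    import org.springframework.integration.ip.tcp.connection.TcpNetClientConnectionFactory;

    public class RoutingTcpClientConnectionFactory extends AbstractClientConnectionFactory {

        // Set by the UDP-receiving part of the flow before the TCP send
        private static final ThreadLocal<String> TARGET_HOST = new ThreadLocal<String>();

        private final ConcurrentMap<String, TcpNetClientConnectionFactory> delegates =
                new ConcurrentHashMap<String, TcpNetClientConnectionFactory>();

        private final int port;

        public RoutingTcpClientConnectionFactory(int port) {
            super("unused", port); // the real host is chosen per getConnection() call
            this.port = port;
        }

        public static void setTargetHost(String host) {
            TARGET_HOST.set(host);
        }

        @Override
        public TcpConnection getConnection() throws Exception {
            String host = TARGET_HOST.get();
            TcpNetClientConnectionFactory delegate = this.delegates.get(host);
            if (delegate == null) {
                delegate = new TcpNetClientConnectionFactory(host, this.port);
                delegate.setSingleUse(true);
                delegate.start();
                TcpNetClientConnectionFactory race = this.delegates.putIfAbsent(host, delegate);
                if (race != null) {
                    delegate = race; // another thread created it first
                }
            }
            return delegate.getConnection();
        }
    }

The bean can then be referenced from the connection-factory attribute of <int-ip:tcp-outbound-channel-adapter>, with the UDP-handling code calling setTargetHost(...) before the send.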
It's not currently possible/easy, even if you customize or extend the tcp-connection-factory class to be able to connect to changing hosts. There is an open new-feature request in JIRA to provide this functionality.
I'm not quite sure if Spring Integration is the right toolset for me.
I would like to store connection data (SFTP/FTP) in a database and use it on a schedule to fetch data.
But I have several problems now:
Can I dynamically add SFTP/FTP jobs in Spring Integration?
Can I cluster Spring Integration jobs?
I have found several solutions for having multiple SFTP polls, but they don't work.
For example: spring integration : solutions/tips on connect multiple sftp server?
Thanks for your feedback.
You can do that using Spring Integration Java DSL dynamic flows: https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-runtime-flows
So:
Use a JDBC Inbound Channel Adapter to poll the settings from the database: https://docs.spring.io/spring-integration/docs/current/reference/html/jdbc.html#jdbc-inbound-channel-adapter
Create a dynamic flow using an IntegrationFlowContext, populate the SFTP server connection factory and remote directory into an SFTP Inbound Channel Adapter, and start that dynamic flow (see the sketch after this list): https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-inbound
Another option to consider is a RotatingServerAdvice: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-rotating-server-advice
To make such a solution robust in a cluster, you should use an SftpPersistentAcceptOnceFileListFilter configured with a shared MetadataStore: https://docs.spring.io/spring-integration/docs/current/reference/html/system-management.html#metadata-store.
This sample demonstrates the dynamic-flow technique for TCP/IP, but the principle is the same: https://github.com/spring-projects/spring-integration-samples/tree/master/advanced/dynamic-tcp-client.
Also see this SO thread: how can i connect with different SFTP server dynamically?
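A minimal sketch of the dynamic-flow step, assuming the host, port, user, password, and remote directory come from the row polled in step 1 (the method name, local directory, and polling interval are illustrative):

    import java.io.File;

    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.dsl.IntegrationFlows;
    import org.springframework.integration.dsl.Pollers;
    import org.springframework.integration.dsl.context.IntegrationFlowContext;
    import org.springframework.integration.sftp.dsl.Sftp;
    import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;

    public IntegrationFlowContext.IntegrationFlowRegistration registerSftpFlow(
            IntegrationFlowContext flowContext, String host, int port,
            String user, String password, String remoteDir) {

        // Connection settings come from the database row polled in step 1
        DefaultSftpSessionFactory sessionFactory = new DefaultSftpSessionFactory();
        sessionFactory.setHost(host);
        sessionFactory.setPort(port);
        sessionFactory.setUser(user);
        sessionFactory.setPassword(password);
        sessionFactory.setAllowUnknownKeys(true);

        IntegrationFlow flow = IntegrationFlows
                .from(Sftp.inboundAdapter(sessionFactory)
                                .remoteDirectory(remoteDir)
                                .localDirectory(new File("downloads/" + host)),
                        e -> e.poller(Pollers.fixedDelay(5000)))
                .handle(message -> System.out.println("Fetched: " + message.getPayload()))
                .get();

        // Register and start the flow at runtime
        return flowContext.registration(flow).register();
    }

Keep the returned registration so the flow can later be stopped and removed via flowContext.remove(registration.getId()).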
I want to know the major difference between the VM and JMS components of Mule ESB. Can someone help me understand it?
As per the Mule documentation, the VM transport is for intra-JVM communication between Mule flows; when you use VM in your flow, you can communicate between different flows in the same application.
A flow with a VM inbound endpoint cannot be called from an external application, so such a flow is effectively private to the application. By default the VM transport uses in-memory queues.
Please go through the documentation: https://docs.mulesoft.com/mule-user-guide/v/3.8/vm-transport-reference
On the other hand, as per the Mule documentation, JMS allows communication between different components of a distributed application; the JMS transport lets you easily send and receive messages to queues and topics of any message service that implements the JMS specification.
A flow with a JMS inbound endpoint can be called externally, unlike VM. Documentation is here: https://docs.mulesoft.com/mule-user-guide/v/3.8/jms-transport-reference
Within an application, when you pass control from one flow to another, use VM; VM can be used as both inbound and outbound.
Between applications, for example when application A wants to send something to application B (an external application), use JMS.
I need to know which framework or API to use for the following requirement. I am currently using native Java code for all of this.
Requirement
I have an application where there can be multiple JMS/REST/TCP connections, and these connections can grow at runtime. The user will have a screen to define new incoming or outgoing connections. The native code works fine, but I want to make use of an efficient framework or API such as Spring, Camel, etc.
Need Guidance.
I have been able to get this all working. There are multiple ways to do dynamic JMS.
1. I used the Spring JMS API and created the dynamic JMS connections by loading a dynamic child context into the application. For this I followed Spring's dynamic FTP sample and inserted JMS beans into the example instead of the FTP ones (a sketch follows below).
Spring Dynamic FTP Sample
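A rough sketch of that child-context technique, modeled on the dynamic FTP sample (the template file name, property names, and returned-context handling are my own placeholders, not the sample's literal code): the template XML defines the JMS beans with ${...} placeholders, and the runtime values are pushed into the child context's environment before it is refreshed:

    import java.util.Properties;

    import org.springframework.context.ApplicationContext;
    import org.springframework.context.ConfigurableApplicationContext;
    import org.springframework.context.support.ClassPathXmlApplicationContext;
    import org.springframework.core.env.PropertiesPropertySource;
    import org.springframework.core.env.StandardEnvironment;

    public ConfigurableApplicationContext createJmsContext(
            String brokerUrl, String queueName, ApplicationContext parent) {

        // The template XML defines a JMS connection factory and adapters using
        // ${broker.url} and ${jms.queue} placeholders (hypothetical names)
        ClassPathXmlApplicationContext child = new ClassPathXmlApplicationContext(
                new String[] { "META-INF/spring/dynamic-jms-template.xml" }, false, parent);

        StandardEnvironment env = new StandardEnvironment();
        Properties props = new Properties();
        props.setProperty("broker.url", brokerUrl);
        props.setProperty("jms.queue", queueName);
        env.getPropertySources().addFirst(new PropertiesPropertySource("runtime", props));
        child.setEnvironment(env);

        child.refresh(); // beans are created only now, with the runtime values
        return child;    // keep a reference so the context can be closed later
    }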
I've been searching extensively for a description of how to set up JMS access from a remote client to a file-based JNDI MQSeries provider, without success.
My JMS client works OK on the same Linux machine as my MQSeries 7.5 server using file-based JNDI.
How does one set up a remote client to use file-based JNDI? Is it even possible, or must one use LDAP?
I've seen hints that one should be able to have a remote client, but nothing very clear.
I'm using Spring's JmsTemplate, which uses a provider URL. On the same machine, my Tomcat context.xml uses a file: URL with a file name which, as I say, works OK collocated with the MQSeries server.
Thanks
Not a problem. If you are using file-based JNDI then you just need to add a QCF (queue connection factory) that contains the appropriate information for the remote queue manager, i.e. hostname, port number, and channel name:
DEFINE QCF(myQCF) QMANAGER(MQWT1) CHANNEL(TEST.CHL) HOSTNAME(22.22.22.22) PORT(1414) TRANSPORT(CLIENT) FAILIFQUIESCE(YES)
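On the remote client the lookup then works like any other JNDI lookup. A minimal sketch, assuming the myQCF name from the definition above, a hypothetical queue named myQueue, and a hypothetical directory holding the .bindings file generated by JMSAdmin:

    import java.util.Hashtable;

    import javax.jms.ConnectionFactory;
    import javax.naming.Context;
    import javax.naming.InitialContext;

    import org.springframework.jms.core.JmsTemplate;

    public class RemoteJndiClient {

        public static void main(String[] args) throws Exception {
            Hashtable<String, String> env = new Hashtable<String, String>();
            env.put(Context.INITIAL_CONTEXT_FACTORY,
                    "com.sun.jndi.fscontext.RefFSContextFactory");
            // Directory containing the .bindings file written by JMSAdmin;
            // the path is an assumption (e.g. a shared or copied directory)
            env.put(Context.PROVIDER_URL, "file:/shared/jndi");

            Context ctx = new InitialContext(env);
            ConnectionFactory qcf = (ConnectionFactory) ctx.lookup("myQCF");

            JmsTemplate template = new JmsTemplate(qcf);
            template.convertAndSend("myQueue", "hello from a remote client");
        }
    }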
I was assuming that there was more to file-based JNDI than there is. All it does is read a property file. Using the "file" URL format allows you to read remote files.
The example given for a Spring-injected endpoint is as follows:
<endpoint id="hl7listener" uri="mina:tcp://localhost:8888?sync=true&amp;codec=hl7codec"/>
How do I set up a client-mode endpoint so that it will connect to a specific port on another server?
How do I configure the endpoint to listen for inbound connections? (The example seems to be a listener, as indicated by its descriptive id, but why?)
Note: I am not actually using the HL7 protocol or codec; I will be developing my own codec for a proprietary protocol.
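For what it's worth, a minimal Camel Java DSL sketch of the distinction (the hosts, ports, and #myCodec registry reference are illustrative): a mina endpoint consumed with from(...) listens for inbound connections, i.e. server mode, while the same URI style used in to(...) connects out as a client:

    import org.apache.camel.builder.RouteBuilder;

    public class TcpRoutes extends RouteBuilder {

        @Override
        public void configure() throws Exception {
            // Server mode: listens on local port 8888 for incoming connections
            from("mina:tcp://0.0.0.0:8888?sync=true&codec=#myCodec")
                    .to("log:inbound");

            // Client mode: connects to a specific port on another server
            from("direct:send")
                    .to("mina:tcp://remote-host:9999?sync=true&codec=#myCodec");
        }
    }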
Was this answered in the thread here?