I'm not quite sure if Spring Integration is the right toolset for me.
I would like to store connection data (SFTP/FTP) in a database and use it on a schedule to fetch data.
But I have several questions now:
Can I dynamically add SFTP/FTP jobs in Spring Integration?
Can I cluster Spring Integration jobs?
I have found several solutions for having multiple SFTP polls, but they don't work.
For example: spring integration : solutions/tips on connect multiple sftp server?
Thanks for your feedback.
You can do that using Spring Integration Java DSL dynamic flows: https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-runtime-flows
So:
Use a JDBC Inbound Channel Adapter to poll the settings from the database: https://docs.spring.io/spring-integration/docs/current/reference/html/jdbc.html#jdbc-inbound-channel-adapter
Create dynamic flows using an IntegrationFlowContext: populate the SFTP server connection factory and remote directory into an SFTP Inbound Channel Adapter, then start that dynamic flow (see the sketch below): https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-inbound
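For illustration, a minimal sketch of such a dynamic flow registration, assuming Spring Integration 5.x with spring-integration-sftp on the classpath; the credentials, directories, poll interval, and flow id are placeholders:

import java.io.File;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.dsl.context.IntegrationFlowContext;
import org.springframework.integration.sftp.dsl.Sftp;
import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;

public class DynamicSftpRegistrar {

    @Autowired
    private IntegrationFlowContext flowContext;

    public void registerSftpFlow(String host, int port, String user, String password, String remoteDir) {
        DefaultSftpSessionFactory sessionFactory = new DefaultSftpSessionFactory();
        sessionFactory.setHost(host);
        sessionFactory.setPort(port);
        sessionFactory.setUser(user);
        sessionFactory.setPassword(password);
        sessionFactory.setAllowUnknownKeys(true); // illustration only; verify host keys in production

        IntegrationFlow flow = IntegrationFlows
                .from(Sftp.inboundAdapter(sessionFactory)
                                .remoteDirectory(remoteDir)
                                .localDirectory(new File("local/" + host)),
                        e -> e.poller(Pollers.fixedDelay(30_000)))
                .handle(message -> System.out.println("Downloaded: " + message.getPayload()))
                .get();

        flowContext.registration(flow)
                .id("sftpFlow-" + host) // keep the id so the flow can be stopped and removed later
                .register();
    }
}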
Another option to consider is the RotatingServerAdvice: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-rotating-server-advice
To make such a solution robust in a cluster, you should use an SftpPersistentAcceptOnceFileListFilter configured with a shared MetadataStore: https://docs.spring.io/spring-integration/docs/current/reference/html/system-management.html#metadata-store.
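For example, with spring-integration-redis backing the shared store (a sketch; the injected redisConnectionFactory and the "sftp-" key prefix are assumptions):

import org.springframework.integration.metadata.ConcurrentMetadataStore;
import org.springframework.integration.redis.metadata.RedisMetadataStore;
import org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter;

// Shared store so that each remote file is processed by only one node in the cluster.
ConcurrentMetadataStore metadataStore = new RedisMetadataStore(redisConnectionFactory);
SftpPersistentAcceptOnceFileListFilter filter =
        new SftpPersistentAcceptOnceFileListFilter(metadataStore, "sftp-");
// Then apply it to the adapter: Sftp.inboundAdapter(sessionFactory).filter(filter)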
This sample demonstrates the technique with dynamic flows for TCP/IP, but the principle is the same: https://github.com/spring-projects/spring-integration-samples/tree/master/advanced/dynamic-tcp-client.
Also see this SO thread: how can i connect with different SFTP server dynamically?
I'm trying to find examples of Kafka Connect with Spring Boot. It looks like there is no Spring Boot integration for Kafka Connect. Can someone point me in the right direction to be able to listen to changes on a MySQL DB?
Kafka Connect doesn't really need Spring Boot because there is nothing for you to code for it, and it really works best when run in distributed mode, as a cluster, not embedded within other (single-instance) applications. I suppose if you did want to do it, then you could copy relevant portions of the source code, but that of course isn't using Spring Boot, and you'd have to wire it all yourself.
The framework itself consists of a few core Java dependencies that have already been written (Debezium or the Confluent JDBC Connector, for your MySQL example) and two config files: one for Kafka Connect itself (bootstrap servers, serializers, etc.) and another for the actual MySQL connector. So, if you want to use Kafka Connect, run it by itself and just write the consumer in the Spring app.
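That consumer can be very small; a sketch, assuming spring-kafka is on the classpath and that the connector writes to a topic following Debezium's usual <server>.<database>.<table> naming (the topic and group id here are assumptions):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class ChangeEventListener {

    @KafkaListener(topics = "dbserver1.inventory.customers", groupId = "my-app")
    public void onChangeEvent(String payload) {
        // Debezium publishes change events as JSON; parse and route as needed
        System.out.println("MySQL change event: " + payload);
    }
}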
The alternatives to Kafka Connect itself would be to use Apache Camel within a Spring application, Spring Integration, or Spring Cloud Data Flow, and interface with their Kafka "components" (which aren't using the Connect API, AFAIK).
Another option, specific to listening to MySQL, is to use the Debezium Engine within your own code.
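A sketch of that approach, assuming the debezium-api and debezium-connector-mysql dependencies; the connection settings are placeholders, and the connector's required offset/history storage settings are omitted:

import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import io.debezium.engine.ChangeEvent;
import io.debezium.engine.DebeziumEngine;
import io.debezium.engine.format.Json;

public class MySqlChangeListener {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("name", "my-engine");
        props.setProperty("connector.class", "io.debezium.connector.mysql.MySqlConnector");
        props.setProperty("database.hostname", "localhost"); // placeholder connection settings
        props.setProperty("database.port", "3306");
        props.setProperty("database.user", "debezium");
        props.setProperty("database.password", "secret");
        // ... plus the connector's required offset/history storage settings, omitted here

        DebeziumEngine<ChangeEvent<String, String>> engine = DebeziumEngine.create(Json.class)
                .using(props)
                .notifying(event -> System.out.println("Change event: " + event.value()))
                .build();

        ExecutorService executor = Executors.newSingleThreadExecutor();
        executor.execute(engine); // runs until the engine is closed
    }
}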
In our project we have a requirement to connect to IBM IMS and get data. Many of the existing applications do this through code that is tightly coupled with IMS.
In one of the applications we are using Spring's CCI support, providing the CCI ConnectionFactory to the JdbcTemplate and using it in a relational (kind of) manner.
However, we are building a new application which is not using the Spring Framework. We are making use of Java CDI and its aspects. But to integrate it with IMS through CCI, I can see Spring is the best option. Does anyone have experience with these CCI connections? Which way do you think is best? And are there any other Java frameworks you are familiar with, apart from Spring's support?
Appreciate your help and input.
I had the same question 5 months ago, and it was very hard to collect information about JCA. If your project runs on WildFly or JBoss, take a look at my inbound-ra-example project. First you must know what kind of resource adapter (RA) you need: inbound or outbound. In short, an inbound RA acts as a server for external data and sends the data to a message-driven bean; an outbound RA is called from an EJB via a connection factory and initiates the connection to the external information system. Read the readme.md of my example project. An inbound RA is much more difficult than an outbound RA. Generate the skeleton of your RA with the IronJacamar code generator; I described the process in my example project.
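For the outbound case, the call from an EJB through the standard CCI API looks roughly like this sketch; the JNDI name and the concrete InteractionSpec depend on your resource adapter and are assumptions here:

import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.resource.ResourceException;
import javax.resource.cci.Connection;
import javax.resource.cci.ConnectionFactory;
import javax.resource.cci.Interaction;
import javax.resource.cci.InteractionSpec;
import javax.resource.cci.Record;

@Stateless
public class ImsClientBean {

    @Resource(lookup = "java:/eis/ImsConnectionFactory") // placeholder JNDI name
    private ConnectionFactory connectionFactory;

    public Record callIms(InteractionSpec spec, Record input) throws ResourceException {
        Connection connection = connectionFactory.getConnection();
        try {
            Interaction interaction = connection.createInteraction();
            try {
                return interaction.execute(spec, input); // returns the output record from the EIS
            } finally {
                interaction.close();
            }
        } finally {
            connection.close();
        }
    }
}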
I need to know which framework or API to use for the following requirement. I am currently using native Java code for all of this.
Requirement
I have an application where there could be multiple JMS/REST/TCP connections. These connections can grow at runtime. Users will have a screen to define new incoming or outgoing connections. The native code works fine, but I want to make use of an efficient framework or API like Spring, Camel, etc.
Need Guidance.
I have been able to get this all working. There are multiple solutions for doing dynamic JMS:
1. I used the Spring JMS API and created the dynamic JMS connections by loading a dynamic child context into the application. For this I followed Spring's dynamic FTP sample and inserted JMS beans into the example instead of the FTP ones (see the sketch after the link below).
Spring Dynamic FTP Sample
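Conceptually, the per-connection piece contributed by such a child context boils down to something like this sketch, assuming spring-jms and an ActiveMQ broker; the broker URL and queue name are placeholders:

import javax.jms.MessageListener;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.jms.listener.DefaultMessageListenerContainer;

public class DynamicJmsRegistrar {

    // A sketch of creating a JMS listener container at runtime.
    public DefaultMessageListenerContainer addJmsListener(String brokerUrl, String queueName) {
        ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory(brokerUrl);
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setConnectionFactory(connectionFactory);
        container.setDestinationName(queueName);
        container.setMessageListener((MessageListener) message ->
                System.out.println("Received: " + message));
        container.afterPropertiesSet(); // initialize manually when created outside a context
        container.start();
        return container; // keep the reference so the container can be stopped later
    }
}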
I've managed to get Spring XD working for a scenario where I have data coming in from one JMS broker.
I am potentially facing a scenario where data ingestion could happen from different sources, thereby needing me to connect to different brokers.
Based on my current understanding, I'm not quite sure how to do this, as there is a JMS config file which allows you to set up only one broker.
Is there a workaround to this?
At the moment, you would have to create a separate jms-[provider]-infrastructure-context.xml for each broker (in modules/common); say, call the provider activemq2.
Then use --provider=activemq2 in the module definition.
(I recently used this technique to test sonicmq and hornetq providers).
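A stream definition using that extra provider might then look like this in the XD shell (the stream name and log sink are arbitrary):

stream create --name jmsTest --definition "jms --provider=activemq2 | log" --deploy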
I am new to Spring Batch. I want to run Spring Batch jobs on server A and launch those jobs from server B using Spring Batch Admin. Is that possible? I have looked into the following two ways:
1. JMX way: I could convert Spring Batch beans into MBeans, but I can't read them from Spring Batch Admin. Can you tell me how to read MBeans from Spring Batch Admin and launch them?
2. Common repository: I think if I use the same DB repository for both Spring Batch and Spring Batch Admin, then I can launch remote jobs from Spring Batch Admin (from server B). But in the job XML file in Spring Batch Admin, what should the classpath for the tasklet be?
Can you help with the above, or tell me if another way exists?
We ended up implementing a framework using MQ communication to handle this. Each 'batch node' registers itself and any 'batch class' parameters such as 'nodeType=A' or 'jobSizeiCanHandle=BIG' (these are fictitious, but you get the point). The client console reads this information and queries the nodes via MQ for the job list. It then submits job requests with parameters via a rudimentary text-based protocol (property-file format):
command=START_JOB
job=JobABC
param1=x
param2=y
One of the batch nodes will pick up the message and start the job. It will return success/fail status in the same manner, with a message using the same correlation ID, so the client can show the response to the user.
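As a rough illustration, the node-side listener for that protocol might look like this sketch, assuming Spring JMS and Spring Batch; the queue name and parameter handling are placeholders:

import java.io.StringReader;
import java.util.Properties;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;

@Component
public class JobRequestListener {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRegistry jobRegistry;

    @JmsListener(destination = "batch.requests") // placeholder queue name
    public void onRequest(String text) throws Exception {
        Properties request = new Properties();
        request.load(new StringReader(text)); // the message body is in property-file format

        if ("START_JOB".equals(request.getProperty("command"))) {
            Job job = jobRegistry.getJob(request.getProperty("job"));
            JobParameters params = new JobParametersBuilder()
                    .addString("param1", request.getProperty("param1", ""))
                    .addString("param2", request.getProperty("param2", ""))
                    .toJobParameters();
            jobLauncher.run(job, params);
            // a real implementation would reply success/fail with the same correlation ID
        }
    }
}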
This allows us to do what you're talking about AND trigger the jobs via an external scheduler (Control-M). The 'nodeType=A' mentioned above allows us to query individual nodes (the nodes listen where 'nodeType=A' or 'nodeType=*'). This allows commands to be 'targeted' to specific nodes if that is necessary.
Keep in mind, this is our own console, not the Spring Batch Admin console. So perhaps that doesn't help you, but building a simple console doesn't take that long using the Spring Batch APIs (4 or 5 asps).
The batch nodes could also have started up simple services like HTTP REST services or 'whatever', but we use MQ heavily and I liked the idea of not having to preregister nodes (the framework code doesn't know/care that it's in an HTTP container, so it couldn't register the endpoint easily). With MQ, the channel is preconfigured and all apps just 'use it', so it seemed easier.
Good luck.
I am trying to do the same thing, but it seems that in order to launch a job directly from Spring Batch Admin, all the job resources have to be added to the Spring Batch Admin web app. Maybe try RESTful job submission with Spring MVC (a sketch follows).
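A sketch of such a RESTful submission endpoint, assuming the jobs are registered in the local application's JobRegistry; the controller path and parameter handling are assumptions:

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.configuration.JobRegistry;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class JobLaunchController {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRegistry jobRegistry;

    @PostMapping("/jobs/{name}")
    public String launch(@PathVariable String name) throws Exception {
        Job job = jobRegistry.getJob(name);
        JobParameters params = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis()) // make each run unique
                .toJobParameters();
        return jobLauncher.run(job, params).getStatus().toString();
    }
}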
#chau
One way to keep Spring Batch Admin as-is, but 'discover' and 'invoke' remote jobs, is to provide your own implementations of org.springframework.batch.admin.service.JobService and org.springframework.batch.core.launch.JobOperator that can query and invoke jobs from a remote job registry/repository.
You can find a custom implementation of JobService and a JMX-enabled job administrator in https://github.com/regunathb/Trooper/tree/master/batch-core, as org.trpr.platform.batch.impl.spring.admin.SimpleJobService and org.trpr.platform.batch.impl.spring.jmx.JobAdministrator.
The Spring beans XML that uses these beans is here: https://github.com/regunathb/Trooper/blob/master/batch-core/src/main/resources/packaged/common-batch-config.xml