Using Citrus to mock SFTP and Kafka for integration testing Spring Boot Apache Camel XML-based routes?

I am working with a Spring Boot application that was written using Apache Camel Spring XML routes. There is very little Java-based application logic; the application is nearly entirely written in XML and built around the various Camel routes.
The routes are configured to connect to the different environments and systems through property files, using properties such as KAFKA_URL and KAFKA_PORT. Inside one of the routes, the application connects with the following and consumes/produces messages with it:
<to id="route_id_replaced_for_question" uri="kafka:{{env:KAFKA_URL:{{KAFKA_URL}}}}:{{env:KAFKA_PORT:{{KAFKA_PORT}}}}?topic={{env:KAFKA_TOPIC:{{topic_to_connect_to}}}}&kerberosRenewJitter=1&kerberosRenewWindowFactor=1&{{kafka.ssl.props}}&{{kafka.encryption.props}}"/>
Additionally, we connect to an SFTP server, which I am also trying to mock using Citrus. That follows a similar pattern where:
<from id="_requestFile" uri="{{env:FTP_URL:{{FTP_URL}}}}:{{env:FTP_PORT:{{FTP_PORT}}}}/{{env:FTP_FILE_DIR:{{FTP_FILE_DIR}}}}/?delete=true&fileExist=Append&password={{env:FTP_PASSWORD:{{FTP_PASSWORD}}}}&delay={{env:FTP_POLL_DELAY:{{FTP_POLL_DELAY}}}}&username={{env:FTP_USER:{{FTP_USER}}}}"/>
Inside my integration test, I have configured a Citrus EmbeddedKafkaServer bean with the following:
@Bean
public EmbeddedKafkaServer embeddedKafkaServer() {
    return new EmbeddedKafkaServerBuilder()
            .kafkaServerPort(9092)
            .topics("topic_to_connect_to")
            .build();
}
and a Citrus SFTP server with:
@Bean
public SftpServer sftpServer() {
    return CitrusEndpoints.sftp()
            .server()
            .port(2222)
            .autoStart(true)
            .user("username")
            .password("passwordtoconnectwith")
            .userHomePath("filedirectory/filestoreadfrom")
            .build();
}
Ideally, my test will connect to the mock SFTP server, and I will push a file to the appropriate directory using Citrus. My application will then read the file in, process it, and publish to a topic on the embedded Kafka cluster, which I can verify in the test.
I was under the impression that I would set KAFKA_PORT to 9092 and KAFKA_URL to localhost, as well as FTP_URL to localhost and FTP_PORT to 2222 (amongst the other properties needed) inside my properties file, but that does not seem to connect me to the embedded cluster or SFTP server.
What piece of the puzzle am I missing to have my Spring Boot application connect to both of these mocked instances and run its business logic processing from there?

I resolved this issue. It was due to using a very old version of Kafka (1.0.0 or older), which was missing some of the methods that are called when Citrus attempts to create new topics. If someone encounters a similar problem using Citrus, I recommend starting by evaluating the version of Kafka your service is on and determining whether it needs to be updated.
For the SFTP connection, the server and client were not being autowired, and therefore never started.
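For anyone hitting that second issue, the fix amounts to making sure the Citrus endpoint beans are actually injected into the test, so Spring instantiates and auto-starts them. A minimal sketch, assuming a plain Spring Boot integration test (the class and comments are hypothetical):
@SpringBootTest
public class SftpToKafkaRouteIT {

    // Injecting the beans forces Spring to create and auto-start them;
    // if nothing references them, the servers may never come up.
    @Autowired
    private SftpServer sftpServer;

    @Autowired
    private EmbeddedKafkaServer embeddedKafkaServer;

    // test methods drop a file into the SFTP user home path and then
    // verify that the message arrives on the embedded Kafka topic
}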

Related

How to pass Camel VM messages between multiple Spring Boot applications in Tomcat

We have an application stack, deployed in Tomcat, that consists of several Spring Boot applications. As part of our operations, we want to send some messages to a vm endpoint, where a camel route will consume those messages and then publish them to a JMS topic for any of the other Spring Boot applications that are interested in messages on that topic.
When I start the application stack, there are three spring boot apps that utilize camel, and I see camel start properly in the logs. But when one of the apps sends a message to the vm endpoint, the route that consumes from that endpoint and routes the messages to the jms topic does not seem to get that message. I have placed the camel-core jar in my tomcat lib directory. In the spring boot maven plugin configuration, I have specified an exclusion of the camel-core jar. Oddly enough, that jar is in the WEB-INF/lib of the war anyway! So I have stopped Tomcat, removed that jar from the exploded war, and restarted Tomcat, but that does not change the behavior of the messaging.
Here are the versions that we are using:
Spring Boot 2.3.1
Camel 3.4.2
Tomcat 8.5.5
The first Spring Boot app, which links everything together with the camel route that consumes from the vm endpoint and produces the message on the JMS topic, is our "routing engine". It uses camel-spring-boot-starter, spring-boot-starter-artemis, camel-vm-starter, artemis-jms-server and camel-jms-starter. Its RouteBuilder's configure method looks like this:
from("vm:task")
.log(LoggingLevel.WARN, "********** Received task message");
.to("jms:topic:local.private.task")
.routeId("taskToJms");
The app that produces messages to the vm endpoint uses camel-spring-boot-starter and camel-vm-starter. In that app, a @Service class receives a ProducerTemplate that is auto-wired in the constructor. When the application invokes this component to send the message, I see a line in the logs that says
o.a.c.impl.engine.DefaultProducerCache (169) - >>>> vm://task Exchange[]
so it appears that the message is being produced and sent properly to the vm endpoint. However, I see no indication that it has been received/consumed in the routing engine's camel route, since the route's log line is not logging anything, and since I see no other indications of receiving the message in the log. The strange thing is that I am not getting the error of not having any consumers on the vm:task endpoint that I was getting before I put the camel-core jar in tomcat's lib directory.
Am I doing anything obviously wrong? How can I get the spring boot maven plugin to really exclude camel-core? And why are the messages (sent to the vm endpoint) not being consumed by the route in the routing engine? Thanks in advance for any help.
Edit: I was able to keep camel-core out of the war files by adding an exclusion to the configuration of the war plugin, but I was still not able to consume the message on the vm endpoint.
I will post the answer, or at least "an" answer, for anyone who might have found themselves in the puzzling situation that I found myself in.
In short, the answer is that it is best to avoid trying to send VM messages across separate application contexts within one big JVM like Tomcat. Instead, use something like JMS. I used Artemis, and I stood up an embedded broker in one of the Spring Boot apps in Tomcat. In the other apps (the clients), I needed to connect to the embedded Artemis server, which requires adding a @Configuration class (in the module that stands up the embedded broker) that implements ArtemisConfigurationCustomizer:
@Configuration
public class ArtemisConfig implements ArtemisConfigurationCustomizer {

    @Override
    public void customize(final org.apache.activemq.artemis.core.config.Configuration configuration) {
        configuration.addConnectorConfiguration("nettyConnector",
                new TransportConfiguration(NettyConnectorFactory.class.getName()));
        configuration.addAcceptorConfiguration(new TransportConfiguration(NettyAcceptorFactory.class.getName()));
    }
}
That lets your other apps connect to the embedded Artemis broker. Also, you do not have to worry about upgrading the camel-core jar in your Tomcat shared lib folder when you upgrade Camel to a different version. It's good to keep things simple for maintenance purposes!
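On the client side, the other Spring Boot apps can then point a plain connection factory at the netty acceptor opened above. A minimal sketch (the host and port are assumptions; 61616 is the Artemis default):
@Configuration
public class ArtemisClientConfig {

    // Connects to the embedded broker's netty acceptor; adjust the URL
    // if the broker was configured on a different host or port.
    @Bean
    public ConnectionFactory connectionFactory() {
        return new ActiveMQConnectionFactory("tcp://localhost:61616");
    }
}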
Anyway, I hope this helps somebody else who might find themselves here someday.

Spring Integration - dynamic multiple SFTP/FTP sessions with different folders

I'm not quite sure if Spring Integration is the right toolset for me.
I would like to enter connection data (SFTP/FTP) into a database and use it on a schedule to fetch data.
But I have several questions now:
Can I dynamically add SFTP/FTP jobs in Spring Integration?
Can I cluster Spring Integration jobs?
I have found several solutions for having multiple SFTP polls, but they don't work.
For example: spring integration : solutions/tips on connect multiple sftp server?
Thanks for your feedback.
You can do that using Spring Integration Java DSL dynamic flows: https://docs.spring.io/spring-integration/docs/current/reference/html/dsl.html#java-dsl-runtime-flows
So:
You use a JDBC Inbound Channel Adapter to poll the settings from the database: https://docs.spring.io/spring-integration/docs/current/reference/html/jdbc.html#jdbc-inbound-channel-adapter
You create dynamic flows using an IntegrationFlowContext, populate the SFTP server connection factory and remote directory into an SFTP Inbound Channel Adapter, and start that dynamic flow (see the sketch after this list): https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-inbound
Another option to consider is a RotatingServerAdvice: https://docs.spring.io/spring-integration/docs/current/reference/html/sftp.html#sftp-rotating-server-advice
To make such a solution robust in a cluster, you should use an SftpPersistentAcceptOnceFileListFilter configured with a shared MetadataStore: https://docs.spring.io/spring-integration/docs/current/reference/html/system-management.html#metadata-store
This sample demonstrates the technique with dynamic flows for TCP/IP, but the principle is the same: https://github.com/spring-projects/spring-integration-samples/tree/master/advanced/dynamic-tcp-client
Also see this SO thread: how can i connect with different SFTP server dynamically?
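A minimal sketch of the dynamic-flow registration mentioned above, assuming the host, port, credentials, and directories come from the polled database row (all names here are hypothetical):
@Autowired
private IntegrationFlowContext flowContext;

public void registerSftpFlow(String host, int port, String user, String password, String remoteDir) {
    // one session factory per server entry from the database
    DefaultSftpSessionFactory sessionFactory = new DefaultSftpSessionFactory();
    sessionFactory.setHost(host);
    sessionFactory.setPort(port);
    sessionFactory.setUser(user);
    sessionFactory.setPassword(password);
    sessionFactory.setAllowUnknownKeys(true);

    IntegrationFlow flow = IntegrationFlows
            .from(Sftp.inboundAdapter(sessionFactory)
                            .remoteDirectory(remoteDir)
                            .localDirectory(new File("local/" + host)),
                    e -> e.poller(p -> p.fixedDelay(5000)))
            .handle(message -> System.out.println("Downloaded: " + message.getPayload()))
            .get();

    // registering starts the flow; keep the id if you need to stop or remove it later
    flowContext.registration(flow).id("sftpFlow-" + host).register();
}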

Apache Kafka Connect with Spring Boot

I'm trying to find examples of Kafka Connect with Spring Boot. It looks like there is no Spring Boot integration for Kafka Connect. Can someone point me in the right direction to be able to listen to changes on a MySQL DB?
Kafka Connect doesn't really need Spring Boot because there is nothing for you to code for it, and it works best when run in distributed mode, as a cluster, not embedded within other (single-instance) applications. I suppose if you did want to do it, then you could copy relevant portions of the source code, but that of course isn't using Spring Boot, and you'd have to wire it all up yourself.
The framework itself consists of a few core Java dependencies that have already been written (Debezium or the Confluent JDBC Connector, for your MySQL example) and two config files: one for Kafka Connect to know the bootstrap servers, serializers, etc., and another for the actual MySQL connector. So, if you want to use Kafka Connect, run it by itself, then just write the consumer in the Spring app.
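For instance, once a connector is streaming MySQL changes to a topic, the Spring Boot side can be a plain listener. A minimal sketch using spring-kafka (the topic name and group id are assumptions):
@Component
public class MysqlChangeListener {

    // Each record is a change event emitted by the connector
    // (e.g. Debezium's JSON envelope for the watched table).
    @KafkaListener(topics = "mysql.mydb.customers", groupId = "my-app")
    public void onChange(String changeEventJson) {
        System.out.println("Received change event: " + changeEventJson);
    }
}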
The alternatives to Kafka Connect itself would be to use Apache Camel within a Spring application (or Spring Integration), or Spring Cloud Data Flow, and to interface with their Kafka "components" (which aren't using the Connect API, AFAIK).
Another option, specific to listening to MySQL, is to use the Debezium Engine within your code.
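A rough sketch of that embedded-engine approach (the connector properties are trimmed and hypothetical; the MySQL connector needs more settings than shown, see the Debezium docs):
Properties props = new Properties();
props.setProperty("name", "mysql-engine");
props.setProperty("connector.class", "io.debezium.connector.mysql.MySqlConnector");
props.setProperty("offset.storage", "org.apache.kafka.connect.storage.FileOffsetBackingStore");
props.setProperty("offset.storage.file.filename", "/tmp/offsets.dat");
props.setProperty("database.hostname", "localhost");
props.setProperty("database.port", "3306");
props.setProperty("database.user", "debezium");
props.setProperty("database.password", "secret");

// the engine runs on its own thread and hands each change event to the consumer
DebeziumEngine<ChangeEvent<String, String>> engine = DebeziumEngine.create(Json.class)
        .using(props)
        .notifying(event -> System.out.println("Change event: " + event.value()))
        .build();
Executors.newSingleThreadExecutor().execute(engine);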

Spring JMS injection causing application not to startup

We have a Spring application that publishes to and listens on queues on a remote application server. My publisher and listener, which are Spring based, live within our own application server.
One of the problems in our test environments is that the other app server is not up, so when our test application starts and tries to inject the JmsTemplate with its connectionFactory, it blows up because the connection is not valid, and our entire application fails to load. This is causing grief for the other developers in our group who have nothing to do with JMS. All they want to do is run and test their code, but the JmsTemplate connectionFactory is down.
Does anyone have any suggestion for making Spring ignore some bad injections, which would allow our application to start properly?
Thanks
I believe this could be achieved by defining separate Spring profiles and then passing the profile as a parameter in your test environments while starting your application. You could then mock or ignore any beans.
Example
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
@Profile("test")
public class AppConfigTest {
    // define mock or no-op versions of the JMS beans here so the
    // application can start without the remote connection factory
}
JVM param / System property
-Dspring.profiles.active=test
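For example, the test profile could supply a stand-in JmsTemplate so that nothing tries to open a real connection. A hypothetical sketch using Mockito (the class name is an assumption):
@Configuration
@Profile("test")
public class MockJmsConfig {

    // A no-op JmsTemplate: sends and receives simply do nothing,
    // so the context starts even though the remote broker is down.
    @Bean
    public JmsTemplate jmsTemplate() {
        return Mockito.mock(JmsTemplate.class);
    }
}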

Configure jndi.xml in ServiceMix to work with MQSeries

My J2EE app is currently running on ServiceMix. Now I want to add JMS to my app. The application should be able to send/receive JMS messages to/from a queue that lives on MQSeries.
mq.hostname=10.3.6.19
mq.channel=CHANNEL
mq.queueManager=QManager
mq.port=1422
What I would like to do is:
1. Create a jndi.xml file and configure the JMS stuff there.
2. Have my app initialize the context, look up the JNDI name, and create a connection, queueManager, queue, etc.
3. Develop send and receive methods.
My question is: can you tell me how to do the 1st and 2nd steps? (The JNDI configuration inside ServiceMix is different from Tomcat's and others; ServiceMix uses a Spring-based JNDI provider: http://servicemix.apache.org/jndi-configuration.html)
I just ran into something similar with WebLogic. The following link uses Spring DM to integrate with WebSphere MQ. It also takes it to the next logical step and adds Camel to the mix.
http://lowry-techie.blogspot.com/2010/11/camel-integration-with-websphere-mq.html
Without using Spring DM, you may run into classloader issues when trying to load the InitialContextFactory from the WebSphere jar (this is an issue I had with the WebLogic jar).
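For the 2nd step, the lookup side is plain JNDI once the connection factory has been registered. A minimal sketch using javax.naming and javax.jms (the JNDI name "jms/mqConnectionFactory" is an assumption; use whatever name your jndi.xml binds):
// Look up the connection factory that jndi.xml registered and open a connection.
public Connection connect() throws Exception {
    InitialContext context = new InitialContext();
    ConnectionFactory connectionFactory =
            (ConnectionFactory) context.lookup("jms/mqConnectionFactory");
    Connection connection = connectionFactory.createConnection();
    connection.start();
    return connection;
}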
