Merge other Java server code into Spring Boot code - spring

I have some legacy Java code with a TCP server that sends JSON data to a callback.
I also have a Spring server that works for the current generation of the application: it receives data from clients, validates it, persists it and sends it to other stakeholders.
Now I need the same features for my legacy code, but because of internal legacy formats I cannot rewrite it in Spring. I am looking to merge that legacy TCP server with my Spring utilities (validation, persistence, etc.). I use Spring features such as @Autowired, @Component, @Service and @Repository, so how can I use that TCP server along with these features?
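One common way to do this without rewriting the legacy code is to have the legacy server talk only to a plain callback, and let Spring supply that callback. Below is a minimal stdlib-only sketch of that shape; all class names (`LegacyTcpServer`, the callback) are hypothetical stand-ins for the actual legacy code:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

// Stand-in for the legacy TCP server: it knows nothing about Spring,
// it only pushes each received JSON line into a callback.
class LegacyTcpServer implements Runnable {
    private final ServerSocket serverSocket;
    private final Consumer<String> jsonCallback;

    LegacyTcpServer(int port, Consumer<String> jsonCallback) throws Exception {
        this.serverSocket = new ServerSocket(port);   // port 0 = pick a free port
        this.jsonCallback = jsonCallback;
    }

    int port() {
        return serverSocket.getLocalPort();
    }

    @Override
    public void run() {
        try (Socket client = serverSocket.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                jsonCallback.accept(line); // hand each JSON line to the callback
            }
        } catch (Exception e) {
            // shutdown or I/O error; real code would log this
        }
    }
}
```

In the Spring Boot application, a `@Component` implementing `SmartLifecycle` (or a `CommandLineRunner`) could construct the server and pass a method reference such as `validationService::process` as the callback. The legacy code stays untouched, while every message still flows through the `@Autowired` validation and persistence beans.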

Related

Using Citrus to mock SFTP and Kafka for Integration Testing Spring-Boot apache-camel xml based routes?

I am working with a Spring Boot application that was written using Apache Camel Spring XML routes. There is very little Java-based application logic; it is nearly entirely written in XML and based on the various Camel routes.
The routes are configured to connect to the different environments and systems through property files, using properties such as KAFKA_URL and KAFKA_PORT. Inside one of the implemented routes, the application connects with the following and consumes/produces messages to it:
<to id="route_id_replaced_for_question" uri="kafka:{{env:KAFKA_URL:{{KAFKA_URL}}}}:{{env:KAFKA_PORT:{{KAFKA_PORT}}}}?topic={{env:KAFKA_TOPIC:{{topic_to_connect_to}}}}&amp;kerberosRenewJitter=1&amp;kerberosRenewWindowFactor=1&amp;{{kafka.ssl.props}}&amp;{{kafka.encryption.props}}"/>
Additionally, we connect to an SFTP server, which I am also trying to mock using Citrus. That follows a similar pattern where:
<from id="_requestFile" uri="{{env:FTP_URL:{{FTP_URL}}}}:{{env:FTP_PORT:{{FTP_PORT}}}}/{{env:FTP_FILE_DIR:{{FTP_FILE_DIR}}}}/?delete=true&amp;fileExist=Append&amp;password={{env:FTP_PASSWORD:{{FTP_PASSWORD}}}}&amp;delay={{env:FTP_POLL_DELAY:{{FTP_POLL_DELAY}}}}&amp;username={{env:FTP_USER:{{FTP_USER}}}}"/>
Inside of my integration test, I have configured a Citrus' EmbeddedKafkaServer class with the following:
@Bean
public EmbeddedKafkaServer embeddedKafkaServer() {
    return new EmbeddedKafkaServerBuilder()
            .kafkaServerPort(9092)
            .topics("topic_to_connect_to")
            .build();
}
and a Citrus SFTP server with:
@Bean
public SftpServer sftpServer() {
    return CitrusEndpoints.sftp()
            .server()
            .port(2222)
            .autoStart(true)
            .user("username")
            .password("passwordtoconnectwith")
            .userHomePath("filedirectory/filestoreadfrom")
            .build();
}
Ideally, my test will connect to the mock SFTP server, and I will push a file to the appropriate directory using Citrus, which my application will then read in, process, and publish to a topic on the embedded Kafka cluster and verify in the test.
I was under the impression that I would set KAFKA_PORT to 9092 and KAFKA_URL to localhost, as well as FTP_URL to localhost and FTP_PORT to 2222 (amongst the other properties needed) inside of my properties file, but that does not seem to connect me to the embedded cluster or SFTP server.
What piece of the puzzle am I missing to have my Spring Boot application connect to both of these mocked instances and run its business logic processing from there?
I resolved this issue - it was due to using a very old version of Kafka (1.0.0 or older), which was missing some of the methods that are called when Citrus attempts to build new topics. If someone encounters a similar problem using Citrus, I recommend starting by evaluating the version of Kafka your service is on and determining if it needs to be updated.
For the SFTP connection, the server or client was not being autowired, and therefore never started.
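For reference, with the mock beans listening on the fixed local ports shown above, the test property overrides might look like this. The property names come from the route placeholders in the question; the file name and the scheme on FTP_URL are assumptions:

```properties
# application-test.properties (hypothetical) - point the Camel routes at the mocks
KAFKA_URL=localhost
KAFKA_PORT=9092
KAFKA_TOPIC=topic_to_connect_to
FTP_URL=sftp://localhost
FTP_PORT=2222
FTP_USER=username
FTP_PASSWORD=passwordtoconnectwith
FTP_FILE_DIR=filedirectory/filestoreadfrom
FTP_POLL_DELAY=1000
```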

Apache Kafka Connect With Springboot

I'm trying to find examples of Kafka Connect with Spring Boot. It looks like there is no Spring Boot integration for Kafka Connect. Can someone point me in the right direction to be able to listen to changes on a MySQL DB?
Kafka Connect doesn't really need Spring Boot because there is nothing for you to code for it, and it works best when run in distributed mode, as a cluster, not embedded within other (single-instance) applications. I suppose if you did want to do it, then you could copy relevant portions of the source code, but that of course isn't using Spring Boot, and you'd have to wire it all up yourself.
The framework itself consists of a few core Java dependencies that have already been written (Debezium or the Confluent JDBC connector, for your MySQL example), and two config files: one for Kafka Connect to know the bootstrap servers, serializers, etc., and another for the actual MySQL connector. So, if you want to use Kafka Connect, run it by itself, then just write the consumer in the Spring app.
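As a rough illustration of those two files for a standalone worker with the Debezium MySQL connector (every host name, credential, and name below is a placeholder, and exact keys vary by Debezium version):

```properties
# worker.properties - tells the Kafka Connect worker where the cluster is
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=/tmp/connect.offsets

# mysql-connector.properties - the actual Debezium MySQL source connector
name=mysql-source
connector.class=io.debezium.connector.mysql.MySqlConnector
database.hostname=localhost
database.port=3306
database.user=debezium
database.password=secret
database.server.id=1
database.server.name=mydb
```

A standalone worker would then be started with something like `connect-standalone worker.properties mysql-connector.properties`, and the Spring app simply consumes from the topics the connector writes to.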
The alternatives to Kafka Connect itself would be to use Apache Camel within a Spring application (Spring Integration) or Spring Cloud Data Flow and interface with those Kafka "components" (which aren't using the Connect API, AFAIK).
Another option, specific for listening to MySQL, is to use Debezium Engine within your code.

Best way to use JCA CCI connections - Alternative to Spring CCI Support

In our project we have a requirement to connect to IBM IMS and get data. Many of the existing applications have done it through code tightly coupled with IMS.
In one of the applications we are using Spring CCI support, providing the CCIConnectionFactory to the JdbcTemplate and using it in a relational (kind of) manner.
However, we are building a new application which does not use the Spring framework; we are making use of Java CDI and its aspects. But to integrate it with IMS through CCI, I can see Spring is the best option. Has anyone experience with these CCI connections? What way do you think is best? And are there any other Java frameworks you are familiar with, apart from Spring's support?
Appreciate your help and input.
I had the same question five months ago, and it was very hard to collect information about JCA. If your project runs on WildFly or JBoss, take a look at my inbound-ra-example project. First you must know what kind of resource adapter (RA) you need, inbound or outbound. In short, an inbound RA acts as a server for external data and sends the data to a message-driven bean, while an outbound RA is called from an EJB via a connection factory and initiates the connection to the external information system. Read the readme.md of my example project. An inbound RA is much more difficult than an outbound RA. Generate the skeleton of your RA with the IronJacamar code generator; I described the process in my example project.

Dynamic JMS Connections

I need to know which framework or API to use for the following requirement. I am currently using native Java code for all of this.
Requirement
I have an application where there could be multiple JMS/REST/TCP connections, and these connections can grow at runtime. Users will have a screen to define new incoming or outgoing connections. The native approach works fine, but I want to make use of an efficient framework or API like Spring or Camel.
Need guidance.
I have been able to get this all working. There are multiple solutions for dynamic JMS:
1. I used the Spring JMS API and created the dynamic JMS connections by loading a dynamic child context into the application. For this I followed Spring's dynamic FTP sample and inserted JMS beans into the example instead of the FTP ones.
Spring Dynamic FTP Sample

How to send events (push notifications) from a Java EE application?

I have a Java EE application where the model is updated very frequently from various sources. In addition, I have a rich client application which triggers some actions via remote EJBs but which should also display the model changes at least every second.
What is the easiest/best option for sending the changes from the Java EE application to the Java client application? So far I have the following options:
polling a remote EJB every second from the client
polling a servlet (Is it preferable to use JSON/XML instead of Java object serialization? Any other serialization?)
websockets (Is it possible to send Java objects here? Or must the result be serialized to JSON, for example?)
tcp socket connection (The server would provide a port which the client connects to on startup. Model changes are sent via standard object serialization. Is this "allowed" in a Java EE app?)
Option 1 is the easiest one; you can use asynchronous EJB methods there:
SERVER
@Asynchronous
public Future<String> getUpdatedModel() {
    // here create a blocking process until something interesting happens
    return new AsyncResult<String>("model has changed!");
}
CLIENT
while (true) {
    Future<String> updatedModel = bean.getUpdatedModel(); // re-invoke for each update
    String response = updatedModel.get(); // blocks until the server side returns
    // process response here
}
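Outside a container, the same blocking-Future long-poll pattern can be sketched with plain java.util.concurrent. This is a simplified stand-in for the asynchronous EJB, not the EJB API itself, and all names are illustrative:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.SynchronousQueue;

// Simplified stand-in for the async EJB: the "server" side blocks until a
// model change is published, then completes the Future the client waits on.
class ModelChangeService {
    private final SynchronousQueue<String> changes = new SynchronousQueue<>();
    private final ExecutorService pool = Executors.newSingleThreadExecutor();

    // server side: something interesting happened
    void publish(String change) throws InterruptedException {
        changes.put(change); // hands the change to a waiting client call
    }

    // client side: the equivalent of the @Asynchronous EJB method
    Future<String> getUpdatedModel() {
        return pool.submit(changes::take); // blocks until publish() is called
    }

    void shutdown() {
        pool.shutdown();
    }
}
```

The client loop above then behaves the same way: each `get()` blocks until the server publishes the next change.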
Option 2 looks like option 1, but you have to take care of marshalling objects yourself, so don't bother using a plain servlet.
Option 3 looks interesting, as WebSockets are going to be included in Java EE 7 (for now you can use open-source Comet implementations for servlets). In my opinion it is not designed for communication in enterprise applications, but it might be fine for your use case. There are plenty of JSON serializers available (e.g. Gson); I use that kind of communication between JS and Java, and it works fine.
Option 4 breaks major Java EE principles (opening your own sockets is forbidden), and I would discourage you from using it.
Option 5: listen to Xie and use JMS! If you use a JMS topic, you can just send a message when a particular event occurs, and all connected clients will receive the message asynchronously. It is the natural Java EE way of solving this kind of problem, with out-of-the-box transactions, message redelivery and persistence if necessary.
