How to leverage Armeria's JSON to gRPC transcoding function in a Spring Boot project - spring-boot

We have an existing Spring Boot project with a poorly organized API management setup, so we want to do something like what grpc-gateway does, but without adding a sidecar to our existing service. We found that Armeria has a wonderful JSON to gRPC transcoding function. How do we leverage it in our existing Spring Boot project?

We found that Armeria has a wonderful JSON to gRPC transcoding function.
I guess a minimal example may look like the following:
import com.linecorp.armeria.server.Server;
import com.linecorp.armeria.server.ServerBuilder;
import com.linecorp.armeria.server.grpc.GrpcService;

// MyGrpcService is your gRPC service implementation; its proto methods need
// google.api.http options, from which the transcoded HTTP routes are derived.
final GrpcService grpcService =
        GrpcService.builder()
                   .addService(new MyGrpcService())
                   .enableHttpJsonTranscoding(true) // enable HTTP/JSON transcoding
                   .build();
final ServerBuilder sb = Server.builder();
sb.http(8080);           // listen on port 8080
sb.service(grpcService); // bind the gRPC service and its transcoded routes
final Server server = sb.build();
Runtime.getRuntime().addShutdownHook(new Thread(() -> server.stop().join()));
server.start().join();   // start the server
How do we leverage it in our existing Spring Boot project?
Armeria also offers Spring Boot integration; an example can be found in the Armeria examples repository on GitHub.
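A rough sketch of that integration (assuming the armeria-spring-boot starter is on the classpath; the configuration class and MyGrpcService are illustrative names):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.linecorp.armeria.server.grpc.GrpcService;
import com.linecorp.armeria.spring.ArmeriaServerConfigurator;

@Configuration
public class ArmeriaConfig {

    // The starter creates the Armeria Server; this bean customizes its ServerBuilder.
    @Bean
    public ArmeriaServerConfigurator armeriaServerConfigurator() {
        return sb -> sb.service(
                GrpcService.builder()
                           .addService(new MyGrpcService()) // your existing gRPC service
                           .enableHttpJsonTranscoding(true)
                           .build());
    }
}

With that in place, the same Spring Boot application serves both the gRPC endpoints and their transcoded HTTP/JSON counterparts, without a sidecar.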
You can also ask on Slack or in the GitHub issues if you have any additional or follow-up questions.

Related

What's the best solution to make Kotlin and Elasticsearch communicate?

Using Google I found two possible solutions for now:
- using Spring Boot with Kotlin
- using this Kotlin client https://github.com/jillesvangurp/kt-search
I've already finished the Android client application in Kotlin, but now I have to find a way to make this client communicate with Elasticsearch.
What would be the best solution for my problem that I could look up online?
Thanks in advance
First, if you don't use Spring in the current application, integrating it (if possible) would be a lot of work and most likely not worth it.
Another alternative would be to use the officially supported Java client for Elasticsearch (https://www.elastic.co/guide/en/elasticsearch/client/java-api-client/current/index.html).
BUT, you should keep in mind that if you put the Elasticsearch client in your client application, any credentials it uses will ship with the client application too.
They would be accessible to everyone using the Android client, and a user could perform any request against the Elasticsearch cluster. Also, you couldn't change them without updating the client.
So it is likely better to use a three-tier architecture (https://www.ibm.com/topics/three-tier-architecture) and create an API service to handle the Elasticsearch requests.
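For illustration, a minimal sketch with the official Java client (the "articles" index, the Article record, and the localhost address are assumptions for the example):

import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch.core.SearchResponse;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import co.elastic.clients.transport.rest_client.RestClientTransport;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;

public class SearchExample {
    // Hypothetical document type for the example index.
    public record Article(String title, String body) {}

    public static void main(String[] args) throws Exception {
        RestClient restClient = RestClient.builder(new HttpHost("localhost", 9200)).build();
        try (RestClientTransport transport =
                     new RestClientTransport(restClient, new JacksonJsonpMapper())) {
            ElasticsearchClient client = new ElasticsearchClient(transport);

            // Full-text match query against the hypothetical "articles" index.
            SearchResponse<Article> response = client.search(
                    s -> s.index("articles")
                          .query(q -> q.match(m -> m.field("title").query("kotlin"))),
                    Article.class);

            response.hits().hits().forEach(hit -> System.out.println(hit.source()));
        }
    }
}

But per the answer above, this client belongs in a backend API service that the Android app calls, not in the Android app itself.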

Spring Integration to poll folder and call REST with file contents

The use case is that I need to poll a folder on a remote server, and if a new file is copied to that folder, I need to call a REST GET API with the file contents. The REST API will process the file contents using Spring Batch.
I am trying to use Spring Boot with Spring Integration for that purpose, but I am having issues finding my way. Is Spring Integration best suited for this purpose? If yes, can I have a simple example of just Spring Integration picking up the file and calling the REST API?
Or can I simply use the Java Watcher service?
It's not clear what your remote folder is, but you can't use the Java WatchService for that purpose anyway. Typically the remote directory is on an (S)FTP server. Spring Integration provides channel adapters to poll such a remote directory over the mentioned protocols.
See more info in docs: https://docs.spring.io/spring-integration/docs/current/reference/html/ftp.html#ftp-inbound
You probably don't need to keep a local copy of the remote file, in which case you can consider using a streaming channel adapter instead: https://docs.spring.io/spring-integration/docs/current/reference/html/ftp.html#ftp-streaming
Once the file content is emitted into the channel configured on that adapter, you can use an HTTP Outbound Channel Adapter to call the REST API, as sketched below: https://docs.spring.io/spring-integration/docs/current/reference/html/http.html#http-outbound
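A rough sketch of such a flow with the Java DSL (assuming Spring Integration 6.x with the spring-integration-sftp and spring-integration-http modules; host, credentials, directory, and the REST URL are placeholders):

import java.time.Duration;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.HttpMethod;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.remote.RemoteFileTemplate;
import org.springframework.integration.http.dsl.Http;
import org.springframework.integration.sftp.dsl.Sftp;
import org.springframework.integration.sftp.session.DefaultSftpSessionFactory;
import org.springframework.integration.transformer.StreamTransformer;

@Configuration
public class SftpToRestConfig {

    @Bean
    public DefaultSftpSessionFactory sftpSessionFactory() {
        DefaultSftpSessionFactory factory = new DefaultSftpSessionFactory();
        factory.setHost("sftp.example.com"); // placeholder
        factory.setPort(22);
        factory.setUser("user");             // placeholder
        factory.setPassword("secret");       // placeholder
        factory.setAllowUnknownKeys(true);   // acceptable for a sketch, not production
        return factory;
    }

    @Bean
    public IntegrationFlow sftpToRest(DefaultSftpSessionFactory sessionFactory) {
        return IntegrationFlow
                // Poll the remote directory and stream new files without a local copy.
                .from(Sftp.inboundStreamingAdapter(new RemoteFileTemplate<>(sessionFactory))
                          .remoteDirectory("/inbound"),
                      e -> e.poller(Pollers.fixedDelay(Duration.ofSeconds(10))))
                // Read the remote stream into a String payload.
                .transform(new StreamTransformer("UTF-8"))
                // One-way HTTP call with the file contents; adjust the method and URL
                // to whatever your REST API actually expects.
                .handle(Http.outboundChannelAdapter("http://localhost:8080/api/process")
                            .httpMethod(HttpMethod.POST))
                .get();
    }
}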
You can investigate the samples project for some inspiration: https://github.com/spring-projects/spring-integration-samples

Using Citrus to mock SFTP and Kafka for integration testing of Spring Boot Apache Camel XML-based routes?

I am working with a Spring Boot application that was written using Apache Camel Spring XML routes. There is very little Java-based application logic; the application is nearly entirely written in XML, based on the various Camel routes.
The routes are configured to connect to the different environments and systems through property files, using properties such as KAFKA_URL and KAFKA_PORT. Inside one of the implemented routes, the application connects with the following and consumes/produces messages to it:
<to id="route_id_replaced_for_question" uri="kafka:{{env:KAFKA_URL:{{KAFKA_URL}}}}:{{env:KAFKA_PORT:{{KAFKA_PORT}}}}?topic={{env:KAFKA_TOPIC:{{topic_to_connect_to}}}}&kerberosRenewJitter=1&kerberosRenewWindowFactor=1&{{kafka.ssl.props}}&{{kafka.encryption.props}}"/>
Additionally, we connect to an SFTP server, which I am also trying to mock using Citrus. That follows a similar pattern where:
<from id="_requestFile" uri="{{env:FTP_URL:{{FTP_URL}}}}:{{env:FTP_PORT:{{FTP_PORT}}}}/{{env:FTP_FILE_DIR:{{FTP_FILE_DIR}}}}/?delete=true&fileExist=Append&password={{env:FTP_PASSWORD:{{FTP_PASSWORD}}}}&delay={{env:FTP_POLL_DELAY:{{FTP_POLL_DELAY}}}}&username={{env:FTP_USER:{{FTP_USER}}}}"/>
Inside my integration test, I have configured Citrus' EmbeddedKafkaServer with the following:
@Bean
public EmbeddedKafkaServer embeddedKafkaServer() {
    return new EmbeddedKafkaServerBuilder()
            .kafkaServerPort(9092)
            .topics("topic_to_connect_to")
            .build();
}
and a Citrus SFTP server with:
@Bean
public SftpServer sftpServer() {
    return CitrusEndpoints.sftp()
            .server()
            .port(2222)
            .autoStart(true)
            .user("username")
            .password("passwordtoconnectwith")
            .userHomePath("filedirectory/filestoreadfrom")
            .build();
}
Ideally, my test will connect to the mock SFTP server, and I will push a file to the appropriate directory using Citrus, which my application will then read in, process, and publish to a topic on the embedded Kafka cluster, where the test can verify it.
I was under the impression that I would set KAFKA_PORT to 9092 and KAFKA_URL to localhost, as well as FTP_URL to localhost and FTP_PORT to 2222 (amongst the other properties needed) in my properties file, but that does not seem to connect me to the embedded cluster or the SFTP server.
What piece of the puzzle am I missing to have my Spring Boot application connect to both of these mocked instances and run its business logic processing from there?
I resolved this issue - it was due to using a very old version of Kafka (1.0.0 or older), which was missing some of the methods that are called when Citrus attempts to build new topics. If someone encounters a similar problem with Citrus, I recommend starting by evaluating the version of Kafka your service is on and determining whether it needs to be updated.
For the SFTP connection, the server and client were not being autowired, and therefore never started.
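For reference, a hypothetical test skeleton (class names are made up, package names follow the com.consol.citrus versions of Citrus; the point is that the SftpServer and EmbeddedKafkaServer beans must be injected into the Spring test context, otherwise they are never instantiated and never started):

import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit.jupiter.SpringExtension;

import com.consol.citrus.kafka.embedded.EmbeddedKafkaServer;
import com.consol.citrus.ssh.server.SftpServer;

@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = EndpointConfig.class) // hypothetical config holding the beans above
class FileToKafkaIT {

    @Autowired
    private SftpServer sftpServer;             // autoStart(true) takes effect once the bean is created

    @Autowired
    private EmbeddedKafkaServer embeddedKafka; // embedded broker on port 9092

    // Citrus test actions that upload a file over SFTP and verify the Kafka message go here.
}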

Circuit breaker open when uploading large file

I have an existing project with the following microservice architecture: Client --> API Gateway (Spring Cloud, using Hystrix as circuit breaker) --> UploadService. When uploading a small file (POST /upload/video), everything is fine. But when the file is larger, the upload takes much longer, Hystrix trips OPEN, and the fallback is returned.
Does anyone have an established practice for my case, or how can I set up the timeout for only the POST /upload/video request in Hystrix?
It appears that you need to configure a larger timeout in the Hystrix client;
in your example this is the "API Gateway (Spring Cloud using Hystrix as circuit breaker)".
I imagine that your code will look something like this:
HystrixCommandProperties.Setter commandProperties = HystrixCommandProperties.Setter()
        .withExecutionTimeoutInMilliseconds(yourDesiredTimeoutValue);
HystrixCommand.Setter yourHystrixCommand = HystrixCommand.Setter
        .withGroupKey(HystrixCommandGroupKey.Factory.asKey("UploadService")) // your command's group key
        .andCommandPropertiesDefaults(commandProperties);
Here is an introduction to Hystrix at Baeldung
Edit:
"Hystrix Client" here means the client software that is using Hystrix.

How to monitor streaming apps inside SCDF?

I am a novice with Spring Cloud Data Flow and Spring Cloud Stream applications.
Currently my project looks like the following:
I route a POST request from an outside client through a Zuul API gateway to a microservice called Composite. Composite creates a stream using a REST POST and deploys it onto the Spring Cloud Data Flow Server. As far as I know, the microservices mongodb and file run as co-existing JVM processes. If my client has to know the status of the stream and the status of the processed data, how should the Composite microservice interact with the Spring Cloud Data Flow Server? Currently, when I make the POST call to deploy the stream, I don't even get the status from the SCDF Server. Does SCDF expose any hooks to look at the individual apps? Also, how can I change the flow at runtime to create a dynamic mesh?
Currently I am using the Local Spring Cloud Data Flow Server for development.
The runtime platform is local.
The Local runtime is recommended only for development purposes; if you're preparing for production, please make sure to choose a platform variant (e.g., cf, k8s, yarn, ...) that comes with the non-functional requirements to support reliable and durable execution of all the applications running in a streaming pipeline.
As far as I know, the microservices mongodb and file run as co-existing JVM processes.
If your stream definition is file | mongodb, you'd have 2 different JVMs even when using the Local runtime. They're independent Boot applications.
How should Composite Microservice interact with Spring Cloud Data Flow Server?
It's not clear what you mean by "composite" here. All the microservice applications in SCDF communicate via messaging middleware such as Kafka or Rabbit. SCDF provides the orchestration capability to run such applications on various runtime platforms.
Currently, when I make the POST call to deploy the stream, I don't even get the status from the SCDF Server.
You can use SCDF's REST APIs to query the current status of the apps, and they are platform agnostic. You can view the list of supported APIs by hitting the root URL of the server; there's a gap in the docs that we will fix. The stream definition and runtime APIs could be useful for status checks.
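As a sketch (assuming a local SCDF server on its default port 9393; the stream name is a placeholder), the Composite service could poll status with something like:

import org.springframework.web.client.RestTemplate;

RestTemplate rest = new RestTemplate();
// Stream definition, including its current status (e.g. deploying, deployed)
String definition = rest.getForObject(
        "http://localhost:9393/streams/definitions/{name}", String.class, "mystream");
// Runtime view of every deployed app
String runtime = rest.getForObject("http://localhost:9393/runtime/apps", String.class);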
Does SCDF expose any hooks to look at the individual apps?
Once the apps are deployed in a runtime platform, you can take advantage of Boot's actuator endpoints to explore more details such as trace, metrics, health, and env, among others, at each application level. See Boot's actuator endpoints for more details. For instance, if your mongodb app is running locally on port 23000, then you can check granular metrics for this application at: http://localhost:23000/metrics.
[As an FYI: future SCDF releases would include integrating Spring Boot + Spring Cloud Sleuth metrics and visual representation of the same.]
Also, how can I change the flow at runtime to create a dynamic mesh?
If you're referring to editing a running streaming pipeline with additions/deletes, we are currently exploring a design approach to support this functionality.
