I'm trying to build a microservice using Quarkus that uses Thrift as the communication protocol. I'm still fairly new to Apache Thrift and Quarkus.
Is there a way to implement a Thrift server without needing to migrate over to gRPC?
Apache Thrift is a complete RPC (remote procedure call) system. You can build clients and servers using just Apache Thrift in a wide range of languages, and any Thrift client can call any Thrift server hosting the desired service interface, regardless of language/platform. You can find tutorials for each of the supported languages here: https://thrift.apache.org/tutorial/
The Apache Software Foundation hosted Thrift project and the CNCF hosted gRPC project are competing RPC systems and do not interoperate or share code.
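To make that concrete, a standalone Thrift server in Java is only a few lines. The sketch below assumes a service called MyService defined in your .thrift IDL, with code generated via `thrift --gen java`, and a hypothetical MyServiceHandler class implementing the generated interface; in a Quarkus app you could start something like this from a bean observing the StartupEvent.

```java
// Sketch only: "MyService" and "MyServiceHandler" are hypothetical names.
// MyService is generated from your .thrift IDL (thrift --gen java), and
// MyServiceHandler is your class implementing MyService.Iface.
import org.apache.thrift.server.TServer;
import org.apache.thrift.server.TThreadPoolServer;
import org.apache.thrift.transport.TServerSocket;
import org.apache.thrift.transport.TServerTransport;

public class ThriftServerMain {
    public static void main(String[] args) throws Exception {
        MyService.Processor<MyServiceHandler> processor =
                new MyService.Processor<>(new MyServiceHandler());

        TServerTransport transport = new TServerSocket(9090);
        TServer server = new TThreadPoolServer(
                new TThreadPoolServer.Args(transport).processor(processor));

        System.out.println("Thrift server listening on port 9090");
        server.serve(); // blocks, serving requests on a thread pool
    }
}
```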
I've seen brokers like RabbitMQ or Apache Pulsar provide a WebSocket API to connect your browser directly to the broker.
AFAIK, there is no equivalent for Apache Kafka: you have to implement an intermediary WebSocket server yourself.
Why does Confluent, which owns Apache Kafka development, not provide an out-of-the-box WebSocket API like RabbitMQ or Pulsar?
Confluent does not own Apache Kafka development. Apache Kafka is a project owned by the Apache Software Foundation (ASF).
There are several examples of how to use WebSockets with Apache Kafka:
https://www.confluent.io/blog/data-stream-processing-with-kafka-streams-bitrock-and-confluent/
https://dev.to/victorgil/kafka-websockets-angular-event-driven-microservices-all-the-way-to-the-frontend-12aa
https://medium.com/swlh/angular-spring-boot-kafka-how-to-stream-realtime-data-the-reactive-way-510a0f1e5881
Your question seems somewhat rhetorical, but if you would actually like to see WebSockets as part of Apache Kafka, then the first step would be to raise a Kafka Improvement Proposal (KIP) for discussion in the community. There is a dev mailing list where you can get further guidance on the process of contributing code.
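As a rough illustration of such an intermediary server, here is a minimal sketch using Spring Boot with spring-kafka and STOMP over WebSocket. The topic name, group ID, and destination are placeholders, and the usual @EnableWebSocketMessageBroker configuration is omitted:

```java
// Sketch only: topic "events", group "ws-bridge", and destination
// "/topic/events" are placeholders. Assumes spring-kafka and
// spring-websocket are on the classpath and that a standard
// @EnableWebSocketMessageBroker configuration exists elsewhere.
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Component;

@Component
public class KafkaToWebSocketBridge {

    private final SimpMessagingTemplate template;

    public KafkaToWebSocketBridge(SimpMessagingTemplate template) {
        this.template = template;
    }

    // Consume each Kafka record and push it to browser clients
    // subscribed to the STOMP destination.
    @KafkaListener(topics = "events", groupId = "ws-bridge")
    public void forward(String message) {
        template.convertAndSend("/topic/events", message);
    }
}
```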
There is already an open source project connecting Kafka with WebSockets: https://github.com/b/kafka-websocket
If you have requirements to connect to Kafka from a browser, I suggest considering browserifying this: http://github.com/confluentinc/kafka-rest-node
There is a Kafka connector to Ably hosted on the Confluent Hub. Ably is essentially a serverless WebSocket option (with pub/sub and message queues). You will also find a Kafka rule on Ably's website.
I've seen Apache NiFi compared to similar ETL tools like Apache Flume, Airflow, and Kafka. These are ETL tools more than ESBs or request mediators.
ESBs/request mediators can be used to orchestrate web services and expose a single service (a proxy service) which is expected to serve concurrent HTTP requests efficiently.
My question is, can I use Apache NiFi for the same purpose? To provide service orchestration and serve proxy service endpoints using NiFi's processors such as HandleHttpRequest? Is it designed to handle real-time concurrent requests efficiently?
You brought up a few technologies that are quite different.
Apache NiFi is a dataflow management tool. Unlike Kafka Streams, Airflow, or Apache Flume, it does not require you to write your own code; you can do almost anything you need using the existing processors developed by Apache.
Airflow, by contrast, is a workflow management tool, comparable with Oozie.
NiFi is built for real-time performance, but not for serving as a REST API. It can start a flow based on an HTTP request, as you said, though.
Hope it helps
I am trying to export data from Kafka to an Oracle DB. I've searched related questions and the web, but I could not figure out whether we need a platform (Confluent, etc.) or not. I read the link below, but it's not clear enough.
https://docs.confluent.io/3.2.2/connect/connect-jdbc/docs/sink_connector.html
So, what do we actually need to export the data without a third-party platform? Thanks in advance.
It's not clear what you mean by "third-party" here.
What you linked to is Kafka Connect, which is Apache 2.0 licensed and open source.
Kafka Connect is a plugin ecosystem: you install connectors individually, written by anyone (or write your own), just like any other Java dependency (i.e., a third party).
The JDBC connector just happens to be maintained by Confluent, and you can use the Confluent Hub CLI to install it within any Kafka Connect distribution (or use the Kafka Connect Docker images from Confluent).
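For illustration only, a standalone-mode sink configuration might look like the following; the connector name, topic, connection details, and auto-create behavior are placeholder assumptions, not values from your setup:

```properties
# Hypothetical Kafka Connect JDBC sink config (.properties, standalone mode).
# All names, hosts, and credentials below are placeholders.
name=oracle-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=my-topic
connection.url=jdbc:oracle:thin:@//db-host:1521/ORCLPDB1
connection.user=user
connection.password=secret
insert.mode=insert
auto.create=true
```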
Alternatively, you can use Apache Spark, Flink, NiFi, or many other Kafka consumer libraries to read the data and then open an Oracle transaction per record batch (sketched below).
Or you can explore non-JVM Kafka libraries as well and use a language you're more familiar with for the Oracle operations.
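To make the consumer-based route concrete, here is a minimal sketch using the plain Java Kafka consumer and JDBC; the topic, table, and connection details are placeholders, and you'd need the kafka-clients and Oracle JDBC (ojdbc) dependencies:

```java
// Sketch only: topic, table, and connection details are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaToOracle {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "oracle-sink");
        props.put("enable.auto.commit", "false"); // commit offsets only after the DB commit
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1", "user", "secret")) {
            conn.setAutoCommit(false);
            consumer.subscribe(List.of("my-topic"));

            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                if (records.isEmpty()) {
                    continue;
                }
                try (PreparedStatement ps = conn.prepareStatement(
                        "INSERT INTO events (payload) VALUES (?)")) {
                    for (ConsumerRecord<String, String> record : records) {
                        ps.setString(1, record.value());
                        ps.addBatch();
                    }
                    ps.executeBatch();
                }
                conn.commit();         // one Oracle transaction per record batch
                consumer.commitSync(); // then advance the Kafka offsets
            }
        }
    }
}
```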
I have a functioning application using Spring Boot, RabbitMQ & MySQL locally. I'm curious how I can upload this app to the AWS environment and get it working seamlessly.
The only part where I'm lost is how to get RabbitMQ in the cloud? Any suggestions?
I see three options for your needs:

1. Use the Amazon MQ managed service. This uses ActiveMQ under the hood and supports the AMQP protocol (so you can continue to use the RabbitMQ client). Here's an article on how to do it: https://aws.amazon.com/blogs/compute/migrating-from-rabbitmq-to-amazon-mq/
2. Use a third-party managed service (such as CloudAMQP). This is similar to the first option, but you can choose a RabbitMQ provider if you wish.
3. Install RabbitMQ on an EC2 instance and manage it yourself. This is the most flexible option, but it will require more effort on your part and will probably cost more. I would recommend it only if you have special requirements that are not met by a hosted service.
In all cases, I would also recommend using a messaging abstraction such as Spring Messaging or Apache Camel to isolate your code from the messaging implementation. This reduces the boilerplate code you need for messaging and lets you focus on your application logic.
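As one possible shape of that isolation (using Spring AMQP rather than the raw client; the queue name and payload are placeholders, and the queue is assumed to already exist on the broker):

```java
// Sketch only: the "orders" queue is a placeholder and is assumed to
// already exist on the broker. The broker address and credentials come
// from spring.rabbitmq.* properties, so pointing the same code at a
// cloud-hosted RabbitMQ (e.g. CloudAMQP) is a configuration change,
// not a code change.
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderMessaging {

    private final RabbitTemplate rabbitTemplate;

    public OrderMessaging(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    // Publish to the "orders" queue via the default exchange.
    public void send(String order) {
        rabbitTemplate.convertAndSend("orders", order);
    }

    // Consume from the same queue.
    @RabbitListener(queues = "orders")
    public void receive(String order) {
        System.out.println("Received: " + order);
    }
}
```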
Recently I have been working with an IoT department; right now our project is under discussion and we are designing the core architecture of the application. The client specification is that we must use the MQTT protocol to communicate between the devices and a Java application (Eclipse Paho client).
It's a web application based on Spring Boot and a microservice architecture, but I am not able to find a good API gateway solution that provides MQTT support.
I found that Zuul is good, but is there an alternative like Kong?
MQTT is a TCP stream based protocol, so API gateways that operate on HTTP / Layer 7 are not going to fit the bill.
There are extensions to commercial API gateways available, such as the Axway MQTT Proxy described here.
While not an API gateway, Confluent also has an MQTT proxy that allows simple integration with Kafka; however, if you have already written an application that implements the backend, then Kafka is going to require some re-architecting.
The other option is really to go with a simple TCP stream reverse proxy like nginx or HAProxy.
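Whichever proxy you put in front, the Java side can stay a plain Eclipse Paho client, as the question mentions. A minimal sketch (the proxy address, client ID, and topic filter are placeholders):

```java
// Sketch only: the proxy address, client ID, and topic filter are
// placeholders. Uses the Eclipse Paho MQTTv3 client mentioned in the
// question (org.eclipse.paho:org.eclipse.paho.client.mqttv3).
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttConnectOptions;

public class DeviceListener {
    public static void main(String[] args) throws Exception {
        MqttClient client = new MqttClient("tcp://proxy-host:1883", "backend-service");

        MqttConnectOptions options = new MqttConnectOptions();
        options.setAutomaticReconnect(true);
        options.setCleanSession(true);
        client.connect(options);

        // Subscribe to telemetry from all devices at QoS 1.
        client.subscribe("devices/+/telemetry", 1, (topic, message) ->
                System.out.println(topic + ": " + new String(message.getPayload())));
    }
}
```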
If I were asked to build something like this, I'd go straight to Kafka. It and MQTT are a very neat architectural fit and operate very well together, but it really depends on your requirements.