Kafka-Connect : Can source and sink use same topic? - apache-kafka-connect

I am using Kafka-MongoDB-Connect where
Source is the Debezium Connector
Sink is the Kafka-MongoDB-Connector by MongoDB
Can I use the same topic and MongoDB collection for both the Source and the Sink Connector? Will it cause messages to go into an infinite loop?

Related

Using Apache Flink's Elastic Search sink vs Kafka sink with Elastic Search connector

I have several Flink data streams that will ultimately end up in Elasticsearch. Is it better to use Flink's Elasticsearch sink, or Flink's Kafka sink combined with the Kafka Connect Elasticsearch sink connector? What would the tradeoffs be?

How to connect Flink and Elasticsearch in Pyflink?

I aim to create a project using Kafka > Flink > ElasticSearch > Kibana with real-time processing.
I can consume messages from Kafka in Flink, but I cannot connect Flink to ElasticSearch. How can I send the Kafka messages that Flink consumed to ElasticSearch?
My Python 3.8 environment includes: apache-flink==1.15.0
You can use the Table API to create an Elasticsearch Sink table:
https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/python/table/python_table_api_connectors/#how-to-use-connectors
https://nightlies.apache.org/flink/flink-docs-stable/docs/connectors/table/elasticsearch/#how-to-create-an-elasticsearch-table
If you need to convert your DataStream to the Table API, you can find some help here: https://nightlies.apache.org/flink/flink-docs-stable/docs/dev/table/data_stream_api/
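For example, here is a minimal PyFlink sketch of declaring an Elasticsearch sink table with the Table API and writing into it. It assumes Flink 1.15 with the flink-sql-connector-elasticsearch7 jar available, an Elasticsearch cluster at http://localhost:9200, and a hypothetical Kafka-backed table named kafka_source; adjust all of these to your setup.

from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming Table API environment
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Elasticsearch sink table; the index name and columns are made up for illustration
t_env.execute_sql("""
    CREATE TABLE es_sink (
        id STRING,
        message STRING
    ) WITH (
        'connector' = 'elasticsearch-7',
        'hosts' = 'http://localhost:9200',
        'index' = 'my_index'
    )
""")

# Write whatever you consumed from Kafka (hypothetical table kafka_source) into Elasticsearch
t_env.execute_sql("INSERT INTO es_sink SELECT id, message FROM kafka_source").wait()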

Kafka Connect Elastic Search Sink Connector support for pushing to alias indexes

I am planning to use a shared index associated with alias indexes to support multi-tenancy for an event monitoring solution. I want to use the Kafka Connect ElasticSearch Sink Connector to push events from a Kafka topic to an alias index instead of using the index directly. Does Kafka Connect currently support such a feature?

Formatting data to use Confluent JDBC Sink Connector via ksql

I'd like to use the Confluent JDBC Sink Connector via ksql to write to a ClickHouse database.
I have a C# application that writes the data to a Kafka topic. How can I format the message from my application so that it is acceptable for the sink to write to the database? I don't want to use the Schema Registry or other ksql constructs.
KSQL accepts JSON or CSV data. However, ClickHouse has its own Kafka connector, so you shouldn't need the JDBC Sink, which will only work with messages that carry a schema (meaning you would need the Schema Registry, which is not only a KSQL construct and can be used from your C# code as well).
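If you still want to avoid the Schema Registry, the JDBC Sink can also work with plain JSON as long as each message embeds its schema in the Kafka Connect JsonConverter envelope (value.converter=org.apache.kafka.connect.json.JsonConverter with value.converter.schemas.enable=true on the sink). Below is a rough sketch from Python of what such a message looks like; the topic name and fields are hypothetical, and the same envelope applies to your C# producer.

# Rough sketch: produce a JSON message with an embedded Connect schema so the
# JDBC Sink can map it to table columns. Topic name and fields are hypothetical.
import json
from confluent_kafka import Producer

envelope = {
    "schema": {
        "type": "struct",
        "optional": False,
        "fields": [
            {"field": "id", "type": "int64", "optional": False},
            {"field": "name", "type": "string", "optional": True},
        ],
    },
    "payload": {"id": 1, "name": "example"},
}

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("clickhouse_events", value=json.dumps(envelope).encode("utf-8"))
producer.flush()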

Kafka-Connect vs Filebeat & Logstash

I'm looking to consume from Kafka and save data into Hadoop and Elasticsearch.
I've seen 2 ways of doing this currently: using Filebeat to consume from Kafka and send it to ES, or using the Kafka Connect framework, which has Kafka-Connect-HDFS and Kafka-Connect-Elasticsearch modules.
I'm not sure which one to use to send streaming data. Though I think that if, at some point, I want to take data from Kafka and place it into Cassandra, I can use a Kafka Connect module for that, whereas no such feature exists for Filebeat.
Kafka Connect can handle streaming data and is a bit more flexible. If you are just going to Elasticsearch, Filebeat is a clean integration for log sources. However, if you are going from Kafka to a number of different sinks, Kafka Connect is probably what you want. I'd recommend checking out the connector hub to see some examples of the open source connectors currently at your disposal: http://www.confluent.io/product/connectors/
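As an illustration of the Kafka Connect route, here is a rough sketch of registering the Confluent Elasticsearch sink connector through the Connect REST API from Python. The connector name, topic, hostnames, and ports are assumptions to adjust for your environment.

# Rough sketch: register an Elasticsearch sink connector via the Kafka Connect
# REST API. Connector name, topic, and index settings are hypothetical.
import json
import requests

connector = {
    "name": "es-sink-logs",
    "config": {
        "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
        "tasks.max": "1",
        "topics": "logs",
        "connection.url": "http://localhost:9200",
        "key.ignore": "true",
        "schema.ignore": "true",
    },
}

resp = requests.post(
    "http://localhost:8083/connectors",  # default Connect REST port
    headers={"Content-Type": "application/json"},
    data=json.dumps(connector),
)
resp.raise_for_status()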
