Sending Kafka connector logs/events to New Relic - debugging

Currently we are using a Kafka connector to get data from a source, which is working as expected. Now I am planning to send the Kafka connector logs/events to New Relic for better debugging.
Is there a way to have Kafka Connect send the connector logs to New Relic? If so, how can this be achieved?
Thanks in advance.

I believe you can implement the New Relic sink connector.
More links:
https://newrelic.com/blog/how-to-relic/kafka-connect-for-new-relic
https://docs.newrelic.com/whats-new/2020/10/kafka-connect-unlock-open-source-alternative-instrumentation-sources/
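For reference, a sink connector is registered through the Kafka Connect REST API. Below is a minimal sketch in Python; the connector class and config keys are placeholders, and the exact names are in the linked posts and the connector's documentation.

    # Hypothetical sketch: register a New Relic sink connector via the
    # Kafka Connect REST API (default port 8083). The connector class and
    # config keys are placeholders; check the linked docs for the real names.
    import json
    import requests

    connector = {
        "name": "newrelic-sink",
        "config": {
            "connector.class": "REPLACE_WITH_NEW_RELIC_SINK_CONNECTOR_CLASS",
            "topics": "connect-logs",                 # topic(s) carrying the events
            "api.key": "REPLACE_WITH_NEW_RELIC_KEY",  # placeholder credential key
            "tasks.max": "1",
        },
    }

    resp = requests.post(
        "http://localhost:8083/connectors",
        headers={"Content-Type": "application/json"},
        data=json.dumps(connector),
    )
    resp.raise_for_status()
    print(resp.json())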

Related

ruby-kafka: is it possible to publish to two Kafka instances at the same time?

The current flow of the project I'm working on involves pushing to a local Kafka using the ruby-kafka gem.
Now the need has arisen to add a producer for a remote Kafka and to duplicate the messages there as well.
I'm looking for a better way than calling Kafka.new(...) twice...
Could you please help me? Do you have any ideas?
Another approach to consider would be writing the data once from your application and then asynchronously replicating the messages from one Kafka cluster to another. There are multiple ways of doing this, including Apache Kafka's MirrorMaker, Confluent's Replicator, Uber's uReplicator, etc.
Disclaimer: I work for Confluent.
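If you go the replication route, here is a minimal sketch of a MirrorMaker 2 configuration (Kafka 2.4+); the cluster aliases, bootstrap addresses, and topic pattern are assumptions for illustration.

    # mm2.properties - mirror topics from the local cluster to the remote one
    clusters = local, remote
    local.bootstrap.servers = localhost:9092
    remote.bootstrap.servers = remote-kafka:9092

    # enable one-way replication and choose which topics to mirror
    local->remote.enabled = true
    local->remote.topics = my-topic.*

    # run with: bin/connect-mirror-maker.sh mm2.properties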

Streaming data from Oracle with Kafka

I'm starting with Kafka and I need to track the inserts into a specific Oracle table and send the new records through Kafka as they happen. I have no control over the database, so, in principle, Debezium is excluded. How can I do this without using triggers?
I've made a producer that reads data from Oracle with a Java program in Eclipse, but that would make constant requests to the database. I use Java to simulate an ETL with a consumer.
PS: I work on Windows, but that's secondary.
If I understand your problem correctly, you are trying to route inserts from Kafka to an Oracle database. There are a few possibilities:
You implement a Kafka consumer, and as soon as your Kafka cluster receives a message, the consumer makes an insert (a minimal sketch follows below). You could reuse your Java code here - just remove the polling part. Please visit here
If you have Kafka deployed in a cloud environment and are using it as a service (AWS MSK), you would have the option of handling the events there. Again, you can use a Java program or write a Python script to make the inserts. Please visit here
I would also like to understand your throughput requirements - whether you really need Kafka as a distributed messaging system, or whether a simple AWS SQS queue would work just fine. If you can use SQS, things would be straightforward for you: you create a queue and write a listener in Python or Java.
boto3 is an excellent Python library for working with SQS.
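A minimal sketch of the first option, assuming the kafka-python and python-oracledb packages; the topic name, connection details, and table layout are placeholders.

    # Sketch: consume from Kafka and insert each record into Oracle.
    import json
    import oracledb
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "my-topic",                              # placeholder topic name
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
    cur = conn.cursor()

    for msg in consumer:
        rec = msg.value
        # placeholder table and columns; adapt to your schema
        cur.execute(
            "INSERT INTO target_table (id, payload) VALUES (:1, :2)",
            [rec["id"], json.dumps(rec)],
        )
        conn.commit()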

Apache NiFi stream into InfluxDB

Within a cloud application I'm using NiFi (=> I'm a newbie) to work with data streams published by an MQTT broker. So far so good.
In the end I want to stream into an InfluxDB. That's the point I'm struggling with.
Does anybody have any experience with a processor for such a setup? Is there a suitable processor for writing data into InfluxDB?
Thanks a lot.
Kind regards,
T_F
There is a PutInfluxDB processor which accepts the incoming flowfile and writes its content to InfluxDB as line protocol.
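For context, that means the flowfile content should already be formatted as InfluxDB line protocol, one record per line; the measurement, tags, fields, and timestamp below are made-up examples.

    # measurement,tag=value,... field=value,... timestamp(ns)
    sensor_data,device=mqtt-client-1,location=lab temperature=21.7,humidity=48 1609459200000000000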

Consume protobuf messages from Graphite

I'd like to know if I can send data to Graphite using protobuf.
I have an application that sends statistics in protobuf format and I want to start sending those statistics to Graphite.
I searched on Google and all I found was this https://graphite.readthedocs.io/en/latest/search.html?q=protobuf&check_keywords=yes&area=default# but it's not clear whether it's only for Graphite's internal core usage.
Thanks!
Yes you can; I think it has been available since version 1.x.
See this example in Python:
https://github.com/graphite-project/carbon/blob/master/examples/example-protobuf-client.py
You will have to enable the listener in the Carbon configuration:
https://github.com/graphite-project/carbon/blob/master/conf/carbon.conf.example#L113-L117
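For reference, the settings look roughly like the sketch below; verify the exact option names and default port against the linked carbon.conf.example, as they may differ between versions.

    # in the [cache] section of carbon.conf
    PROTOBUF_RECEIVER_INTERFACE = 0.0.0.0
    PROTOBUF_RECEIVER_PORT = 2005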

Monitoring Kafka Spout

I am trying to monitor the performance of the Kafka spout for my project. I have used the KafkaSpout that is included in the apache-storm-0.9.2-incubating release.
Is it possible to monitor the throughput of the Kafka spout using the Kafka offset monitoring tool? Is there another, better way to monitor the spout?
Thanks,
Palak Shah
The latest Yahoo Kafka Manager has added metrics information, and you can see TPS, bytes in/out, etc.
https://github.com/yahoo/kafka-manager
We could not find any tool that provides the offsets for all consumers, including the kafka-spout consumer, so we ended up building one ourselves. You can get the tool from here:
https://github.com/Symantec/kafka-monitoring-tool
It might be of use to you.
