How to put two KSQLDB tables in the same index in Elasticsearch - elasticsearch

I have two tables in ksqlDB that I want to put in the same Elasticsearch index,
but the Elasticsearch Service Sink Connector for Confluent Platform does not support
topic-modifying transforms such as:
io.confluent.connect.transforms.ExtractTopic$Key
io.confluent.connect.transforms.ExtractTopic$Value
as noted in the documentation: https://docs.confluent.io/kafka-connect-elasticsearch/current/overview.html#limitations
Are there other ways of doing it?
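One possible workaround (a sketch only, with hypothetical stream/table/column names): merge both tables into a single topic inside ksqlDB itself, so the sink connector only ever sees one topic and no topic-modifying transform is needed.

```sql
-- Hypothetical sketch: re-read each table's changelog topic as a stream,
-- then funnel both into one combined topic for the sink connector.
CREATE STREAM stream_a (id VARCHAR KEY, payload VARCHAR)
  WITH (KAFKA_TOPIC='TABLE_A_TOPIC', VALUE_FORMAT='JSON');
CREATE STREAM stream_b (id VARCHAR KEY, payload VARCHAR)
  WITH (KAFKA_TOPIC='TABLE_B_TOPIC', VALUE_FORMAT='JSON');

-- One backing topic for both sources; point the sink connector at it.
CREATE STREAM combined WITH (KAFKA_TOPIC='combined_topic') AS
  SELECT * FROM stream_a;
INSERT INTO combined SELECT * FROM stream_b;
```

The combined topic then maps to a single Elasticsearch index under the connector's default topic-to-index naming, with no transform involved.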

Related

Kafka sink connector: can I send a single message to multiple index documents in Elasticsearch?

I am receiving a very complex JSON payload in a topic message, so I want to do some computations on it using SMTs and send the results to different Elasticsearch index documents. Is that possible?
I am not able to find a solution for this.
The Elasticsearch sink connector only writes to one index per record, based on the topic name. The Confluent documentation explicitly states that topic-altering transforms such as RegexRouter will not work as expected.
I'd suggest looking at the Logstash Kafka input and Elasticsearch output as an alternative; however, I'm still not sure how you'd "split" a record into multiple documents there either.
You may need an intermediate Kafka consumer such as Kafka Streams or ksqlDB to extract your nested JSON and emit multiple records that you expect in Elasticsearch.
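The ksqlDB route suggested above can be sketched roughly as follows (stream, topic, and column names are hypothetical; EXPLODE is ksqlDB's table function that fans an array out into one row per element):

```sql
-- Hypothetical sketch: one input message carries an array; emit one
-- output record per array element, then sink the 'exploded' topic.
CREATE STREAM input (id VARCHAR KEY, items ARRAY<VARCHAR>)
  WITH (KAFKA_TOPIC='INPUT_TOPIC', VALUE_FORMAT='JSON');

CREATE STREAM exploded AS
  SELECT id, EXPLODE(items) AS item
  FROM input
  EMIT CHANGES;
```

Each row in `exploded` becomes its own document when the sink connector consumes that topic.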

JanusGraph - How is the data stored in ElasticSearch and Cassandra?

I'm using JanusGraph with ElasticSearch and Cassandra.
My question is how JanusGraph stores the data when I create a new entity, given that I'm using two databases (Cassandra and ElasticSearch).
I understand that ElasticSearch is used as the index backend and Cassandra as the storage, but:
What does JanusGraph do when I persist new data? Will it duplicate the same data into Cassandra and also into ElasticSearch (since the latter is also a database)?
If the answer to the first item is yes, then when we perform a query that traverses the graph, will JanusGraph run the query on Cassandra, and switch the query to ElasticSearch when it is a full-text search?
If the answer to the first item is no, is all the data stored in Cassandra, with JanusGraph just using the ElasticSearch index to search the Cassandra database?
ElasticSearch indexes the data stored in Cassandra.
When you do graph traversals, it uses the search index to retrieve the data from Cassandra. Cheers!
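The split between the two backends is visible in a typical JanusGraph configuration (hostnames below are placeholders): Cassandra is the storage backend that holds the graph data itself, while Elasticsearch is only the index backend.

```properties
# Cassandra stores the graph data; Elasticsearch stores only the search index.
storage.backend=cql
storage.hostname=127.0.0.1
index.search.backend=elasticsearch
index.search.hostname=127.0.0.1
```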

Confluent Elasticsearch Sink connector, write.method : "UPSERT" on different key

In the Confluent Elasticsearch Sink connector, I am trying to write to the same Elasticsearch index from two different topics. The first topic uses INSERT and the other uses UPSERT. For UPSERT, I want to update the JSON document based on some field other than "_id". Is that possible? If yes, how can I do that?
Set key.ignore=false and use your existing primary-key columns as the record key, so they become the _id for each JSON document.
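As a sketch (connector name, topic name, and URL are hypothetical), the UPSERT side of the config would look like the following. Note that the connector always derives _id from the record key, so to upsert on a different field you would need to re-key the topic upstream (e.g. in ksqlDB or Kafka Streams) so that field becomes the key:

```json
{
  "name": "es-sink-upsert",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "upsert_topic",
    "write.method": "upsert",
    "key.ignore": "false",
    "connection.url": "http://localhost:9200"
  }
}
```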

Elasticsearch Run Node Client and index data in memory

I want to run Elasticsearch alongside my Tomcat server, pull data from a database, and put it into an Elasticsearch index. Any pointers will help.
Elasticsearch has a Java API that you can use to do what you want: https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/index.html
If you are new to ES, the Definitive Guide is a very good document: https://www.elastic.co/guide/en/elasticsearch/guide/current/index.html
By the way, Elasticsearch is a full-text search engine. If you are looking for an in-memory data solution, maybe you should consider something like Apache Ignite: http://ignite.apache.org/

How does ELK (Elasticsearch, Logstash, Kibana) work

How are events indexed and stored by Elasticsearch when using ELK (Elasticsearch, Logstash, Kibana)?
How does Elasticsearch work in ELK?
Looks like you got downvoted for not just reading up at elastic.co, but...
logstash picks up unstructured data from log files and other sources, transforms it into structured data, and inserts it into elasticsearch.
elasticsearch is the document repository. While it's widely used for log information, it's a text engine at heart and can analyze the data (tokenization, stop words, stemming, etc).
kibana reads from elasticsearch and allows you to explore the data and make dashboards.
That's the 30,000-ft overview.
Elasticsearch serves as the database in the ELK Stack.
You can read more information about Elasticsearch and ELK Stack here: https://www.elastic.co/guide/en/elasticsearch/guide/current/index.html.
First of all, you will have log files that your system writes logs to.
For example, when you add a new record to the database, you write the record to the log file in whatever form you need, like:
date,"name":"system","serial":"1234" .....
After that, you add a configuration in Logstash to parse the data from the logs,
so it becomes structured fields like:
name : system
.....
and the data will be saved in Elasticsearch.
Kibana is used to preview the Elasticsearch data,
and you can also send a request to Elasticsearch with the required query and get your data from it.
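The flow described above corresponds to a minimal Logstash pipeline along these lines (the file path, parsing filter, and index name are placeholders, not taken from the question):

```
input {
  # Tail the application's log files.
  file { path => "/var/log/myapp/*.log" }
}
filter {
  # Parse the raw comma-separated line into structured fields
  # (grok is the usual choice for more complex formats).
  kv { field_split => "," }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "myapp-logs-%{+YYYY.MM.dd}"
  }
}
```

Kibana then points at the resulting `myapp-logs-*` indices for exploration and dashboards.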
