Merging data from an Azure SQL database to Elasticsearch using Logstash - elasticsearch

I have a SQL Server database hosted on Azure,
and I want to migrate the data to the Elastic Stack. For my Logstash configuration file, I tried the approach in this post: https://medium.com/#erangadulshan.14/pushing-relational-data-to-elasticsearch-using-logstash-jdbc-input-plugin-48af81ed1000 but it doesn't work. I have installed the SQL Server driver msodbcsql17, but it still doesn't work (I don't have the path to any jar file).
What should I do?

You can reference this tutorial: How to copy SQL Server data to Elasticsearch using Logstash.
As a developer working with SQL Server, there was a need to import data from the database to Elasticsearch and analyze the data in Kibana.
Since Elasticsearch is an open-source project built with Java that mostly interoperates with other open-source projects, documentation on importing data from SQL Server to ES using Logstash is scarce.
This tutorial shows how to import SQL Server data to Elasticsearch (version 6.2) using Logstash and verify the result in Kibana.
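The heart of that tutorial is the Logstash pipeline configuration using the jdbc input plugin. A minimal sketch, assuming you have downloaded Microsoft's JDBC driver jar (the server name, credentials, table, and index below are placeholders):

```conf
input {
  jdbc {
    # Path to the Microsoft JDBC driver jar -- note this is a JDBC jar,
    # not the ODBC driver (msodbcsql17), which Logstash cannot use
    jdbc_driver_library => "/path/to/mssql-jdbc-7.2.2.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb"
    jdbc_user => "myuser"
    jdbc_password => "mypassword"
    # Each row returned by this query becomes one Elasticsearch document
    statement => "SELECT * FROM dbo.MyTable"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mytable"
  }
}
```

This also addresses the "I don't have the path of any jar file" point in the question: the jdbc input plugin needs the JDBC driver jar on disk, which is a separate download from the ODBC driver.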
Hope this helps.

Related

Logstash Plugin for CosmosDB to Elasticsearch

Can you please suggest which logstash plugin is used for pulling data from Cosmos DB to Elasticsearch using Logstash?
If no such plug-ins, is there any other way to do the same?
Based on the Logstash plugins for Microsoft Azure Services and this thread, it seems that a Cosmos DB input plugin is not supported so far.
From what I can find, you could use an ADF copy activity to transfer your Cosmos DB data into one of the supported input sources above, then complete the subsequent work.
For example, use ADF to transfer the Cosmos DB data into a SQL database, then follow this link to integrate with your Elasticsearch service.

How to load data from Cassandra into ELK

I have installed Cassandra 3.11.3 on my Ubuntu virtual machine. I have also installed the ELK stack (Elasticsearch, Logstash, Kibana).
How can I visualize the Cassandra data in Kibana using the ELK stack? Please let me know the detailed configuration I will need in order to get data from the Cassandra database into the Kibana dashboard.
I did a similar thing using Kafka, with the structure below:
Cassandra -> Confluent Kafka -> Elasticsearch.
It's pretty easy to do, as the connectors are provided by Confluent.
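As a sketch of the last hop in that pipeline, the Elasticsearch sink is registered with Kafka Connect via a JSON config along these lines (this assumes Confluent's kafka-connect-elasticsearch connector is installed; the connector name, topic, and URL are placeholders):

```json
{
  "name": "cassandra-to-es-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "cassandra_table_topic",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

POSTing this to the Kafka Connect REST API creates a sink that writes every record from the topic into an Elasticsearch index named after the topic.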
But if you only need to visualize the data, you can try Banana, which gels well with Cassandra.
Note: Banana is a forked version of Kibana.

How can I get elasticsearch data in h2o?

I have data loaded in elasticsearch.
How can I get elasticsearch data in h2o?
There is no direct way or API available to load data into H2O from Elasticsearch. H2O supports files and JDBC, so you can write the data from ES into a CSV file, then import that file into H2O using POST /3/ImportFiles. You can refer to my related answer at how to create an h2oframe.
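A rough sketch of that route in Python. The index name, field list, and file path are placeholders, and the `elasticsearch` and `h2o` client libraries are assumed to be installed; only the CSV-flattening step runs without a live cluster, so the cluster-dependent calls are shown commented out:

```python
import csv

def hits_to_csv(hits, path, fields):
    """Flatten Elasticsearch hits (dicts carrying a '_source' key) into a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for hit in hits:
            src = hit["_source"]
            writer.writerow({k: src.get(k) for k in fields})

# Pull every document from the index with a scrolling search, then hand the
# CSV to H2O (h2o.import_file issues POST /3/ImportFiles under the hood).
# Requires a running ES cluster and H2O instance, hence commented out here:
#
# from elasticsearch import Elasticsearch
# from elasticsearch.helpers import scan
# import h2o
#
# es = Elasticsearch("http://localhost:9200")
# hits = scan(es, index="my-index", query={"query": {"match_all": {}}})
# hits_to_csv(hits, "my-index.csv", fields=["field1", "field2"])
#
# h2o.init()
# frame = h2o.import_file("my-index.csv")
```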
The latest version of Elasticsearch comes with an SQL interface that can be connected to via JDBC or ODBC. I haven't attempted to use this with H2O, but in theory...

Where does elasticsearch store data

I have managed to install and try Elasticsearch.
I thought I needed to install a NoSQL server like MongoDB,
but Elasticsearch seems to embed its own storage or database system.
So I think Elasticsearch is not just a search tool.
It also provides storage and database functions. Is this correct?
Thanks

Run query on couchbase data imported using sqoop and hadoop connector

I am using Sqoop with the Couchbase Hadoop connector to import some data from Couchbase to HDFS.
As stated in
http://docs.couchbase.com/hadoop-plugin-1.1/#limitations
querying is not supported for couchbase.
I want a solution to run queries using the Hadoop connector.
For example:
I have 2 documents in the db as follows:
{'doctype':'a'}
and
{'doctype':'b'}
I need to get only the docs where doctype = 'a'.
Is there a way to do this?
If you want to select data from Couchbase, you don't need the Hadoop connector... you can just use a Couchbase view that filters on doc.doctype == 'a'.
See the Couchbase views documentation.
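For reference, such a view's map function (written in JavaScript in the Couchbase view editor; the design document and view names are up to you) could look like this sketch:

```javascript
// Map function for a Couchbase view: index only documents whose
// doctype is 'a', emitting the document id as the key.
function (doc, meta) {
  if (doc.doctype == 'a') {
    emit(meta.id, null);
  }
}
```

Querying that view through the views REST API then returns only the matching documents.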
On the other hand, I recommend the new N1QL query functionality from Couchbase. It is a quite flexible query language (similar to SQL); see the online N1QL tutorial.
Note: If you look at N1QL's compatibility requirements, it needs v2.2 and higher; see N1QL Compatibility. You will need to deploy a Couchbase N1QL query server and point it to your existing CB v2.2 cluster. See: Couchbase N1QL queries on server.
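With N1QL, the same filter becomes a one-line query (the bucket name `default` is a placeholder for your bucket):

```sql
SELECT * FROM default WHERE doctype = 'a';
```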
Suggesting another alternative to Sqoop for the above requirement, called 'Couchdoop'.
Couchdoop uses views to fetch data from Couchbase, so we can write a view as per our need and use Couchdoop to hit that view and fetch the data.
https://github.com/Avira/couchdoop
Worked for me.
