I've used Nutch and Elasticsearch many times before; however, I believe I was using the default setup, in which Nutch used the binary transport method to communicate with Elasticsearch. It was simple and worked out of the box, so I've used it a lot.
I've been in the process of updating my crawl system, and it seems the better option now is to use the Jest REST API library.
However, I'm a bit confused about it...
First, how do I install the Jest library for use with Nutch and Elasticsearch? I know I can download or clone it via GitHub, but how is it connected?
Do I literally just update the dependencies in the /indexer-elastic-rest *.xml files for Nutch and then rebuild with ant?
My first install of Nutch was from the binary zip. I only recently started using the src package, so ant/maven is somewhat new to me, which is why this is all a bit confusing. All the blogs and articles just say to "rebuild with ant"...
Second, does the Jest library take care of all the Java REST API code, or do I have to write Java code now?
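For what it's worth, Jest wraps the Elasticsearch REST endpoints behind a small Java API, so you generally write a few lines of client code rather than raw HTTP handling; as far as I understand, the indexer-elastic-rest plugin is meant to do that part for you inside Nutch. A minimal standalone sketch, assuming Elasticsearch is listening on http://localhost:9200 and using placeholder index/type names:

```java
import io.searchbox.client.JestClient;
import io.searchbox.client.JestClientFactory;
import io.searchbox.client.config.HttpClientConfig;
import io.searchbox.core.Index;

public class JestExample {
    public static void main(String[] args) throws Exception {
        // Point a Jest client at the Elasticsearch REST endpoint
        JestClientFactory factory = new JestClientFactory();
        factory.setHttpClientConfig(new HttpClientConfig.Builder("http://localhost:9200")
                .multiThreaded(true)
                .build());
        JestClient client = factory.getObject();

        // Index one JSON document; "articles", "doc", and "1" are placeholders
        String source = "{\"title\":\"hello\",\"body\":\"world\"}";
        Index index = new Index.Builder(source)
                .index("articles")
                .type("doc")
                .id("1")
                .build();
        client.execute(index);
    }
}
```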
In my app, I'd like to use "Elastic App Search" functionality, especially facets. I expect it to work like this: https://github.com/elastic/search-ui
At this point, I have installed Elasticsearch and Kibana (using brew) and populated Elasticsearch with data. I am able to run it locally and make queries.
To install App Search (which is included in Elastic Enterprise Search), I am using the following instructions: https://www.elastic.co/downloads/enterprise-search.
I have done everything up to point 3.
In point 4:
I can't locate the elastic user password in the logs. I haven't set up any security or passwords so far, so I guess there is no password at this moment.
I haven't seen or used any Kibana token so far. I tried to generate one, as shown here, but it does not work for me. It seems the default path for Elasticsearch should be /usr/local/etc/elasticsearch, but I don't even have an etc directory in my /usr/local; instead, Elasticsearch is inside the Homebrew directory.
I can't find the http_ca.crt file anywhere in my Homebrew installation. Should I enable security in Elasticsearch first to generate this file?
Unlike Elasticsearch and Kibana, the Elastic Enterprise Search file I downloaded in step 1 is not an application but a regular directory. Where should I put it?
Does my approach even make sense? Is it possible to run this service locally just as I am running ES/Kibana? Most of the examples on the Internet only show how to run this service in Docker.
I was trying to find an open-source plugin to use LDAP/AD authentication with Elasticsearch/Kibana. I found Open Distro, which is currently based on Elasticsearch 7.10.2, and I wanted to use its security plugin in my existing regular ES stack, which runs 7.11.2, but it complains that it can't work with newer versions of ES. The problem is that I can't downgrade without losing my data.
Is there another open-source way to integrate LDAP, whether using Open Distro or another plugin?
You need to "downgrade" in that case if you want to stick with the latest version of this plugin, I think.
You could start a new cluster with 7.10.2 and use reindex-from-remote to bring your data into the "new" cluster: read from 7.11 and write to 7.10.
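A rough sketch of that reindex-from-remote call, sent to the new 7.10.2 cluster through the low-level Java REST client (the host names and index name below are placeholders, and the old cluster's address must be whitelisted via reindex.remote.whitelist on the new one):

```java
import org.apache.http.HttpHost;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.RestClient;

public class ReindexFromRemote {
    public static void main(String[] args) throws Exception {
        // Connect to the NEW 7.10.2 cluster; it pulls the data from the old 7.11 cluster
        RestClient client = RestClient.builder(
                new HttpHost("new-cluster-host", 9200, "http")).build();

        Request request = new Request("POST", "/_reindex");
        request.setJsonEntity(
                "{"
                + "  \"source\": {"
                + "    \"remote\": { \"host\": \"http://old-cluster-host:9200\" },"
                + "    \"index\": \"my-index\""
                + "  },"
                + "  \"dest\": { \"index\": \"my-index\" }"
                + "}");
        client.performRequest(request);
        client.close();
    }
}
```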
I'm using Google Cloud Composer, which essentially gives you an Apache Airflow environment. Since my application is written in Golang, I was looking for a Golang client library to create and execute DAGs in the Cloud Composer Airflow environment. Please let me know if there is one.
Following your clarification: you can't! Indeed, Composer is a managed version of Apache Airflow, where DAGs are described in Python, and in Python only.
You can reach the Composer/Airflow API with Go, and you can generate Python code with Go and Go templates. You can also define a custom operator in Airflow that runs a Go binary with the right parameters. However, the custom operator itself must be written in Python.
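As an illustration of the first option (shown in Java here, but the same HTTP call works from Go), triggering a DAG run through Airflow's stable REST API looks roughly like this; the webserver URL, DAG id, and token handling are placeholders, and on Composer the endpoint sits behind Google-managed authentication:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TriggerDag {
    public static void main(String[] args) throws Exception {
        // Placeholder webserver URL and DAG id; a real Composer call needs a
        // Google-issued bearer token for the environment's Airflow endpoint
        String url = "https://your-airflow-webserver/api/v1/dags/my_dag/dagRuns";

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + System.getenv("AIRFLOW_TOKEN"))
                .POST(HttpRequest.BodyPublishers.ofString("{\"conf\": {}}"))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```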
I have spent days trying to find a solution for what to use for this "simple" test:
inside the test, stand up an ES node that will be used for the test only
connect to that ES instance with the transport client and add a new index
(That is just the start of all the integration tests, but I need to see how to do one; the transport-client half we're after is sketched below.)
Note: we have to continue using the transport client; we know it is going away.
We are upgrading from ES version 2 to version 6 and trying to keep the integration tests as unchanged as possible.
This is exactly the same issue this person had on a lower version; we had it set up that way, but it is not supported anymore in 6. How do I accomplish the same now: Start elasticsearch within gradle build for integration tests
I keep finding comments that an "existing gradle plugin" will do the job. What is that plugin, and how do I use it?
Or an example with ESIntegTestCase would be perfect too; anything that works. Thank you!
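To show the scale of what we need, here is a minimal sketch of the transport-client half against ES 6, assuming a test node is already listening on localhost:9300 with cluster name test-cluster (both placeholders); the open question is only how to stand up that throwaway node from the build:

```java
import java.net.InetAddress;

import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.TransportAddress;
import org.elasticsearch.transport.client.PreBuiltTransportClient;

public class CreateIndexIT {
    public static void main(String[] args) throws Exception {
        Settings settings = Settings.builder()
                .put("cluster.name", "test-cluster") // must match the test node
                .build();

        // ES 6.x transport client; talks to the transport port (9300), not HTTP (9200)
        try (TransportClient client = new PreBuiltTransportClient(settings)
                .addTransportAddress(new TransportAddress(
                        InetAddress.getByName("localhost"), 9300))) {

            // Create the index used by the test and block until the response arrives
            client.admin().indices().prepareCreate("it-test-index").get();
        }
    }
}
```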
Elasticsearch has published its latest Java API. After much searching, I have failed to find a detailed change log.
I have some open-source elastic-helper code that synchronizes MySQL with ES. When I used the latest Java API (elasticsearch-5.0.1.jar), I got a lot of errors because of the API changes.
I want to find useful documentation to refer to.