Integrating Jackrabbit Oak with remote Solr - jackrabbit-oak

I'm new to Jackrabbit Oak and I'm trying to integrate it with a remote Solr server. I haven't managed to get it working: I've read the documentation and studied the Jackrabbit Oak 1.4.0 source without result. Can someone show me how? Thank you.

Assuming your Oak 1.4.x repository is set up correctly, creating a Solr index definition with the following structure should be enough to start seeing content ingested into the remote Solr instance (e.g. running at http://localhost:8983/solr/oak) configured with the config provided at [1]:
/oak:index/solrRemote
  - jcr:primaryType = "oak:QueryIndexDefinition"
  - type = "solr"
  - async = "async"
  + server
    - jcr:primaryType = "nt:unstructured"
    - solrServerType = "remote"
    - httpUrl = "http://localhost:8983/solr/oak"
[1]: https://github.com/apache/jackrabbit-oak/tree/trunk/oak-solr-core/src/main/resources/solr/oak
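For reference, the same index definition can be created programmatically through the standard JCR API. This is only a sketch: it assumes you already have a writable JCR Session on the Oak repository, and the node and property names simply mirror the structure above.

import javax.jcr.Node;
import javax.jcr.RepositoryException;
import javax.jcr.Session;

// Sketch only: "session" is assumed to be a writable JCR session on the Oak repository.
void createSolrIndex(Session session) throws RepositoryException {
    Node oakIndex = session.getRootNode().getNode("oak:index");
    Node solrRemote = oakIndex.addNode("solrRemote", "oak:QueryIndexDefinition");
    solrRemote.setProperty("type", "solr");
    solrRemote.setProperty("async", "async");
    Node server = solrRemote.addNode("server", "nt:unstructured");
    server.setProperty("solrServerType", "remote");
    server.setProperty("httpUrl", "http://localhost:8983/solr/oak");
    session.save();
}

Once the asynchronous indexer picks up the new definition, full-text queries should start being served by the remote Solr instance.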

Related

Can ElasticSearch be used as a persistent store for Apache Ignite?

I want to know if there's a way to configure Elasticsearch as the data source for Ignite. I browsed the web but did not find a solution.
I want to implement this integration for a Java application.
If I understand your idea correctly, there is a way to do it. As far as I can see, Elasticsearch supports SQL table-like data access, and it's available through a JDBC connection. On Ignite's side there is 3rd-party persistence, which uses JDBC to connect to an underlying store. To be honest I haven't tested it, but I suppose it should work.
I should also mention that you can use the GridGain Web Console to generate a simple Ignite project from an existing JDBC connection. This functionality can be found under the Configuration tab -> Create Cluster Configuration.
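To make the JDBC side concrete, here is a minimal, untested sketch, assuming the Elasticsearch SQL JDBC driver (x-pack-sql-jdbc) is on the classpath and the cluster is reachable at localhost:9200; the products index is made up for illustration. Once a connection like this works, the same JDBC URL could in principle be handed to Ignite's CacheJdbcPojoStoreFactory as the 3rd-party store.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class EsJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // Assumption: the Elasticsearch SQL JDBC driver is on the classpath and
        // the cluster exposes SQL at localhost:9200.
        try (Connection conn = DriverManager.getConnection("jdbc:es://localhost:9200");
             Statement stmt = conn.createStatement();
             // "products" is a hypothetical index used only for illustration.
             ResultSet rs = stmt.executeQuery("SELECT name, price FROM products LIMIT 10")) {
            while (rs.next()) {
                System.out.println(rs.getString("name") + " -> " + rs.getDouble("price"));
            }
        }
    }
}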

Running a local Elasticsearch node in 6.2

How do I start a local Elasticsearch node in my JUnit test and use the high-level REST client (Elasticsearch 6.2)?
Here is the code I tried:
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.node.Node;

String clusterName = "test";
// ES_WORKING_DIR points at a temporary directory used as the node's home path
Settings settings = Settings.builder()
    .put("path.home", ES_WORKING_DIR)
    .build();
new Node(settings).start();
and the error is:
java.lang.IllegalStateException: Unsupported transport.type []
An embedded Elasticsearch node is not supported anymore. You can read more detail here:
Embedded Elasticsearch not supported
By the way, in case you still want that feature (to run component integration tests, for example), there are some third-party libraries that may help you (see the sketch after the links):
https://github.com/allegro/embedded-elasticsearch
https://github.com/nitishgoyal13/elasticsearch-6-embedded-cluster
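As an illustration of the first library, here is a hedged sketch of what a test setup might look like with allegro's embedded-elasticsearch, assuming its builder API and an Elasticsearch 6.2.x distribution; the version, port and cluster name are placeholders you would adapt.

import java.util.concurrent.TimeUnit;

import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

import pl.allegro.tech.embeddedelasticsearch.EmbeddedElastic;
import pl.allegro.tech.embeddedelasticsearch.PopularProperties;

public class EmbeddedEsTestSupport {
    public static void main(String[] args) throws Exception {
        // Downloads and starts a real Elasticsearch 6.2.x process for the test.
        EmbeddedElastic elastic = EmbeddedElastic.builder()
                .withElasticVersion("6.2.4")                        // placeholder version
                .withSetting(PopularProperties.CLUSTER_NAME, "test")
                .withSetting(PopularProperties.HTTP_PORT, 9200)     // placeholder port
                .withStartTimeout(2, TimeUnit.MINUTES)
                .build()
                .start();

        // The high-level REST client can then talk to it like any external node.
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
            // ... run your test logic here ...
        } finally {
            elastic.stop();
        }
    }
}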

Elasticsearch - Index a large file using the Java API

We have a requirement to use Elasticsearch for full-text search. We have a Spring-based application, and for integration with ES we can use either the Elasticsearch Java API or Spring Data Elasticsearch.
The input will be a file of around 5 MB in size.
I went through examples for both the ES Java API and Spring Data; they have tutorials for inserting a JSON document, but there is no guidance on using a file as input to create documents or an index.
I am a newbie with Elasticsearch, so any guidance/help on this will be much appreciated.
EDIT:
I can see that there is an Ingest Attachment Processor plugin available in ES (https://www.elastic.co/guide/en/elasticsearch/plugins/master/ingest-attachment.html).
Can anybody point me to a sample cURL request or some Java code that uses this plugin?
1. You may use the Elasticsearch mapper attachments plugin. This plugin uses Apache Tika to ingest almost any well-known document type and make it searchable in Elasticsearch.
https://www.elastic.co/guide/en/elasticsearch/plugins/2.3/mapper-attachments.html
2. You can use Apache Tika to extract the useful content from the file and then use the Elasticsearch bulk indexing API to index it into ES (see the sketch below).
Hope that helps.
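As a rough illustration of option 2, here is a hedged sketch, assuming the 6.x high-level REST client and Apache Tika are on the classpath; the file path, the docs index and the doc type are placeholders.

import java.io.File;
import java.util.Collections;

import org.apache.http.HttpHost;
import org.apache.tika.Tika;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

public class FileIndexer {
    public static void main(String[] args) throws Exception {
        File file = new File("/path/to/document.pdf"); // placeholder path
        // Tika extracts plain text from PDF, Word, HTML, etc.
        String content = new Tika().parseToString(file);

        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {
            // "docs" index and "doc" type are placeholders for this sketch.
            IndexRequest request = new IndexRequest("docs", "doc", file.getName())
                    .source(Collections.singletonMap("content", content));
            client.index(request);
        }
    }
}

For many files, the same IndexRequest objects can be added to a BulkRequest instead of indexing one by one. Alternatively, with the ingest-attachment plugin installed, you would base64-encode the file into a field and set the attachment pipeline on the request so the text extraction happens server-side.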

How to combine neo4j and elasticsearch

I am developing a question-answering application, and for that I need to use Neo4j and Elasticsearch in the same Maven project. I am using Elasticsearch to make the application more robust.
As we know, Neo4j and Elasticsearch depend on different versions of Lucene, so whichever version I include as a dependency, I get an error.
Here is what I am doing:
First, Elasticsearch indexes the data, while the data and relationships are stored as a graph database using Neo4j. Then the user enters a query, through which the data is retrieved with the help of the indexes. The matching data is triggered in the graph database with a trigger score, which is then propagated along the graph to find relevant results for the user's query.
Is there any way I can integrate Neo4j and Elasticsearch in the same Maven project, or is there any other way through which these two modules can interact separately?
Thanks
Please check out our integration page:
http://neo4j.com/developer/elastic-search/
which has some discussion and also an example project to get you started:
http://github.com/neo4j-contrib/neo4j-elasticsearch
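If you want to keep both systems reachable from one Maven project without the Lucene clash, one option (not part of the answer above, just a hedged sketch) is to use only client drivers that don't pull Lucene onto the classpath: Elasticsearch's low-level REST client and the Neo4j Bolt driver. The index name, node labels, Cypher query and credentials below are made up for illustration.

import org.apache.http.HttpHost;
import org.apache.http.util.EntityUtils;
import org.elasticsearch.client.Response;
import org.elasticsearch.client.RestClient;
import org.neo4j.driver.v1.AuthTokens;
import org.neo4j.driver.v1.Driver;
import org.neo4j.driver.v1.GraphDatabase;
import org.neo4j.driver.v1.Session;

public class QaLookup {
    public static void main(String[] args) throws Exception {
        try (RestClient es = RestClient.builder(new HttpHost("localhost", 9200, "http")).build();
             Driver neo4j = GraphDatabase.driver("bolt://localhost:7687",
                     AuthTokens.basic("neo4j", "password"))) {      // placeholder credentials
            // Full-text lookup over HTTP; "questions" is a hypothetical index.
            Response response = es.performRequest("GET", "/questions/_search?q=text:coffee");
            String hits = EntityUtils.toString(response.getEntity());
            System.out.println(hits);

            // Expand the matches in the graph via Cypher; labels are placeholders.
            try (Session session = neo4j.session()) {
                session.run("MATCH (q:Question)-[:ABOUT]->(t:Topic) RETURN q, t LIMIT 10");
            }
        }
    }
}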

No support for the Elasticsearch river with Informix

When I try to connect the Elasticsearch JDBC river plugin to a Postgres or H2 database to get the data into the Elasticsearch engine, it behaves properly.
But in the case of Informix it always gives this kind of error:
java.sql.SQLException: No suitable driver found for jdbc:informix-sqli:
even after I put the jar file into the plugin/jdbc folder.
Does anybody have any idea about that?
The issue was with the jar. I had all 6 jars, but the JDBC DriverManager only auto-loads a driver whose jar contains a META-INF/services/java.sql.Driver entry. That entry was missing, so I had to specify the driver class explicitly in the Elasticsearch configuration, which is:
set JAVA_OPTS=%JAVA_OPTS% -Djdbc.drivers=com.informix.jdbc.IfxDriver
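For illustration only, here is a hedged sketch of the same mechanism in plain JDBC code: when the service file is missing, loading the driver class by hand registers it with DriverManager. The host, port, database and server name in the URL are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;

public class InformixDriverCheck {
    public static void main(String[] args) throws Exception {
        // Without META-INF/services/java.sql.Driver in the jar, DriverManager cannot
        // discover the driver on its own; loading the class registers it explicitly
        // (the -Djdbc.drivers system property above achieves the same thing).
        Class.forName("com.informix.jdbc.IfxDriver");

        try (Connection conn = DriverManager.getConnection(
                "jdbc:informix-sqli://dbhost:9088/mydb:INFORMIXSERVER=myserver",
                "user", "password")) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}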
