Using elasticsearch Java API - maven

I am very new to Elasticsearch. I found some simple Java code for using it:
import static org.elasticsearch.node.NodeBuilder.*;
// on startup
Node node = nodeBuilder().node();
Client client = node.client();
// on shutdown
node.close();
I am getting the following error:
package org.elasticsearch.node doesn't exist
Later, I found that I have to put some information in pom.xml. What is that? How do I make this simple program run?

Do you know what Maven is?
I mean that if you are using Maven, then you need to add elasticsearch-VERSION.jar
as a dependency in your pom.xml.
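For example, the dependency entry looks roughly like this (a sketch; 1.0.0 is an assumption, use the version that matches your cluster):

    <dependency>
        <groupId>org.elasticsearch</groupId>
        <artifactId>elasticsearch</artifactId>
        <!-- assumption: replace 1.0.0 with your server's version -->
        <version>1.0.0</version>
    </dependency>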
If not, then you need to add the elasticsearch jar to your project classpath, plus some other libs such as the following (it depends on the elasticsearch version you are using):
antlr-runtime-3.5.jar
asm-4.1.jar
asm-commons-4.1.jar
jna-3.3.0.jar
jts-1.12.jar
log4j-1.2.17.jar
lucene-analyzers-common-4.6.0.jar
lucene-codecs-4.6.0.jar
lucene-core-4.6.0.jar
lucene-expressions-4.6.0.jar
lucene-grouping-4.6.0.jar
lucene-highlighter-4.6.0.jar
lucene-join-4.6.0.jar
lucene-memory-4.6.0.jar
lucene-misc-4.6.0.jar
lucene-queries-4.6.0.jar
lucene-queryparser-4.6.0.jar
lucene-sandbox-4.6.0.jar
lucene-spatial-4.6.0.jar
lucene-suggest-4.6.0.jar
spatial4j-0.3.jar
I'd recommend using Maven because it makes dealing with dependencies much easier.
Hope this helps

Related

Gradle Idea plugin - issues with specifying test sources

I'm trying to create a custom source set and mark its contents in IntelliJ IDEA as a Test Sources Root. I tried to use the idea plugin and follow the official Gradle documentation, but it is not clear to me how it works.
First of all, the documentation specifies the following configuration setup:
idea {
    module {
        testSources.from(sourceSets["intTest"].java.srcDirs)
    }
}
When I try to use it I receive Unresolved reference: testSources. Where is it coming from?
Then I tried to use:
idea {
    module {
        testSourceDirs = intTest.java.srcDirs
    }
}
It works fine as long as I use only Java. After applying the Kotlin plugin, however, the kotlin and java + resources folders are again treated as Sources Root, not Test Sources. To fix that I had to change from:
    testSourceDirs = intTest.java.srcDirs
to:
    testSourceDirs = intTest.kotlin.srcDirs
and now all folders are Test Sources Root again. Since kotlin.srcDirs also includes java.srcDirs, it looks like you have to specify all of them, otherwise the setting is ignored... (a combined version is sketched below).
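For illustration, "specifying all of them" in one place would look roughly like this (a sketch in the Groovy DSL, assuming intTest refers to the custom source set as in the snippets above):

    idea {
        module {
            // kotlin.srcDirs already contains java.srcDirs, but listing both
            // makes the intent explicit
            testSourceDirs = intTest.java.srcDirs + intTest.kotlin.srcDirs
        }
    }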
Now the real issue came when I used the gradle-avro-plugin. Applying it made my folders marked as Sources Root again. I believe it is because it adds another avro directory, but only to the main source set.
Does anyone know how to keep the folders marked as Test Sources with both the kotlin and avro plugins applied? Am I doing something wrong here? Because this behaviour seems buggy in the first place.
Tested with:
IntelliJ IDEA 2022.3.1 (Ultimate Edition)
Gradle 6.8.3 and 7.4.2
Plugin id("com.github.davidmc24.gradle.plugin.avro") version "1.5.0"
Plugin kotlin("jvm") version "1.7.0"

Can't find ParameterizedTest and ValueSource

I made a simple project, trying to understand how ParameterizedTest and ValueSource work.
From the picture below, you can see it finds the import path, but it throws an error when I try to run the code:
Also the gradle file:
Here is a link to the entire project.
You need to put junit-jupiter-params in the testCompile configuration.
junit-jupiter-params exports types like @ParameterizedTest and @ValueSource that are needed at compile (and run) time.
See also: Missing org.junit.jupiter.params from JUnit5
Starting with version 5.4.0-M1, JUnit Jupiter provides an aggregator artifact that bundles all available Jupiter-defining artifacts for easy consumption. See https://sormuras.github.io/blog/2018-12-26-junit-jupiter-aggregator.html for details.
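In Gradle terms, either declaration should work (a sketch; the versions shown are illustrative):

    dependencies {
        // the artifact that actually contains @ParameterizedTest and @ValueSource
        testCompile 'org.junit.jupiter:junit-jupiter-params:5.4.0'
        // or, from 5.4.0 on, the aggregator that bundles api, params and engine:
        // testCompile 'org.junit.jupiter:junit-jupiter:5.4.0'
    }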

Running Spark job using YARN giving error: com.google.common.util.concurrent.Futures.withFallback

I am trying to run a Spark job using YARN, but I am getting the error below:
java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture;
    at com.datastax.driver.core.Connection.initAsync(Connection.java:176)
    at com.datastax.driver.core.Connection$Factory.open(Connection.java:721)
    at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:248)
    at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:194)
    at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:82)
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1307)
    at com.datastax.driver.core.Cluster.init(Cluster.java:159)
    at com.datastax.driver.core.Cluster.connect(Cluster.java:249)
    at com.figmd.processor.ProblemDataloader$ParseJson.call(ProblemDataloader.java:46)
    at com.figmd.processor.ProblemDataloader$ParseJson.call(ProblemDataloader.java:34)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:140)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:140)
    at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:618)
    at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:618)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    at org.apache.spark.scheduler.Task.run(Task.scala:56)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
Cluster details:
Spark 1.2.1, Hadoop 2.7.1
I have provided the class path using spark.driver.extraClassPath, and the hadoop user has access to that class path as well. But I think YARN is not getting the JARs on that classpath; the options were passed roughly as sketched below.
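(A reconstruction of the submit command, for reference; the paths and main class are placeholders, and spark.executor.extraClassPath is the executor-side counterpart, which is relevant here because the stack trace above comes from an executor task:)

    spark-submit \
      --master yarn-cluster \
      --conf spark.driver.extraClassPath=/path/to/guava-14.0.1.jar \
      --conf spark.executor.extraClassPath=/path/to/guava-14.0.1.jar \
      --class com.example.Main \
      application.jar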
I am not able to get to the root cause of it. Any help will be appreciated.
Thanks.
I faced the same problem, and the solution was to shade Guava to avoid the classpath collision.
If you're using sbt assembly to build your jar, you can just add this to your build.sbt:
assemblyShadeRules in assembly := Seq(
    ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
I wrote a blog post describing how I arrived at this solution: Making Hadoop 2.6 + Spark-Cassandra Driver Play Nice Together.
Hope it helps!
The issue is related to a Guava version mismatch.
withFallback was added in version 14 of Guava. It looks like you have Guava < 14 on your classpath.
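If you control the dependency tree, the direct fix is to pin a Guava version that still contains withFallback (a sketch for Maven; 14.0.1 is the first release line that has it, and the method was deprecated and eventually removed in much later Guava releases):

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <!-- assumption: any release >= 14 that still ships withFallback -->
        <version>14.0.1</version>
    </dependency>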
Adding to @Arjones' answer, if you are using Gradle + the Gradle Shadow plugin, you can add this to your build.gradle to relocate or rename the Guava classes.
shadowJar {
    relocate 'com.google.common', 'com.example.com.google.common'
}

OSGi bundle build issue in Gradle

I have a simple use case of building an OSGi bundle using the Gradle build tool. The build succeeds if there are Java files present in the build path, but fails otherwise.
I am using the 'osgi' plugin inside the Gradle script and trying to build without any Java files. The build always fails with the following error:
Could not copy MANIFEST.MF to
I am sure there must be some way to do this in Gradle, but I am not able to find it. Any idea what can be done to resolve this?
I ran into this today as well, and @Peter's fix didn't work for me (I hadn't applied the java plugin in the first place...). However, after hours of Googling I did find this thread, which helped me find the problem.
Basically, it seems that the error occurs (as Peter stated) when no class files are found in the jar - my guess is that the plugin then cannot scan the classes for package names on which to base all the Import and Export information.
My solution was to add the following to the manifest specification:
    classesDir = theSourceSet.output.classesDir
    classpath = theSourceSet.runtimeClasspath
In my actual build code, I loop over all source sets to create jar tasks for them, so then it looks like this:
sourceSets.each { ss ->
    assemble.dependsOn task("jar${ss.name.capitalize()}", type: Jar, dependsOn: ss.getCompileTaskName('Java')) {
        from ss.output
        into 'classes'
        manifest = osgiManifest {
            classesDir = ss.output.classesDir
            classpath = ss.runtimeClasspath
            // Other properties, like name and symbolicName, also set based on
            // the name of the source set
        }
        baseName = ss.name
    }
}
Running with --stacktrace indicates that the osgi plugin doesn't deal correctly with the case where both the osgi and the java plugins are applied, but no Java code is present. Removing the java plugin should solve the problem.
I had the same issue even when Java code was present.
Adding these two lines to the osgiManifest closure fixed the problem:
    classesDir = sourceSets.main.output.classesDir
    classpath = sourceSets.main.runtimeClasspath
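For context, the whole thing looks roughly like this (a sketch, assuming the java and osgi plugins are both applied, on a Gradle version that still ships the osgi plugin):

    jar {
        manifest = osgiManifest {
            classesDir = sourceSets.main.output.classesDir
            classpath = sourceSets.main.runtimeClasspath
            // name, symbolicName etc. can be set here as well
        }
    }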

using cxf in osgi: Provider org.apache.cxf.jaxws.spi.ProviderImpl not found

I am trying to publish some web services (using EndpointImpl.publish()) but I am getting this error:
Provider org.apache.cxf.jaxws.spi.ProviderImpl not found
the cxf-bundle is installed:
[ 79] [Active ] [Created ] [ 50] Apache CXF Bundle Jar (2.4.3.fuse-01-02)
An extract of the osgi:headers output shows the imported package:
Import-Package =
javax.jws,
javax.persistence;version="[1.1,2)",
javax.servlet;version="[2.5,3)",
javax.xml.bind,
javax.xml.bind.annotation,
javax.xml.bind.annotation.adapters,
javax.xml.datatype,
javax.xml.namespace,
javax.xml.parsers,
javax.xml.transform,
javax.xml.transform.stream,
javax.xml.validation,
javax.xml.ws;version="[2.2,3)",
javax.xml.ws.soap;version="[2.2,3)",
javax.xml.ws.wsaddressing;version="[2.2,3)",
org.apache.commons.lang;version="[2.5,3)",
org.apache.commons.logging;version="[1.1,2)",
org.apache.cxf.jaxws;version="[2.4,3)",
org.apache.cxf.jaxws.spi;version="[2.4,3)", <--- imported
org.apache.cxf.ws.addressing;version="[2.4,3)",
org.apache.felix.gogo.commands;version="[0.10,1)",
org.apache.openjpa.enhance;version="[2.2,3)",
org.apache.openjpa.util;version="[2.2,3)",
org.osgi.framework;version="[1.5,2)",
org.osgi.service.blueprint;version="[1.0.0,2.0.0)",
org.springframework.beans.factory.xml;version="[3.0,4)",
org.springframework.context;version="[3.0,4)",
org.springframework.context.support;version="[3.0,4)",
org.w3c.dom,
org.xml.sax
Require-Bundle =
org.apache.cxf.bundle
I am not sure what else I need to do.
In case it is important: the container is Karaf 2.2.7.
To address pooh's answer:
1- cxf-bundle is exporting this package: org.apache.cxf.jaxws.spi;version="2.4.3.fuse-01-02"
2- the bundle was started; the error occurred at runtime.
3- the manifest was created using the maven-bundle-plugin, which should create the entire import list.
4- the error happens while creating a webservice endpoint:
TopologyIFPortType impl = new TopologyWS();
String addressTopology = "http://localhost:" + port
        + "/nsp/webservice/topology";
topologyEndpoint = (EndpointImpl) Endpoint.create(impl);
topologyEndpoint.getFeatures().add(new WSAddressingFeature());
topologyEndpoint.publish(addressTopology);
The complete trace:
javax.xml.ws.spi.FactoryFinder$ConfigurationError: Provider org.apache.cxf.jaxws.spi.ProviderImpl not found
    at javax.xml.ws.spi.FactoryFinder$2.run(FactoryFinder.java:130)
    at javax.xml.ws.spi.FactoryFinder.doPrivileged(FactoryFinder.java:220)
    at javax.xml.ws.spi.FactoryFinder.newInstance(FactoryFinder.java:124)
    at javax.xml.ws.spi.FactoryFinder.access$200(FactoryFinder.java:44)
    at javax.xml.ws.spi.FactoryFinder$3.run(FactoryFinder.java:211)
    at javax.xml.ws.spi.FactoryFinder.doPrivileged(FactoryFinder.java:220)
    at javax.xml.ws.spi.FactoryFinder.find(FactoryFinder.java:160)
    at javax.xml.ws.spi.Provider.provider(Provider.java:43)
    at javax.xml.ws.Endpoint.create(Endpoint.java:41)
    at javax.xml.ws.Endpoint.create(Endpoint.java:37)
    at org.opennaas.extensions.idb.webservice.WebServiceHolder.startTopology(WebserviceControl.java:78)
    at org.opennaas.extensions.idb.webservice.WebServiceHolder.start(WebserviceControl.java:60)
    at org.opennaas.extensions.idb.webservice.WebserviceControl.startWebservices(WebserviceControl.java:32)
    at org.opennaas.extensions.idb.shell.StartWebservices.doExecute(StartWebservices.java:16)
    at org.apache.karaf.shell.console.OsgiCommandSupport.execute(OsgiCommandSupport.java:38)
    at org.apache.felix.gogo.commands.basic.AbstractCommand.execute(AbstractCommand.java:35)
    at org.apache.felix.gogo.runtime.CommandProxy.execute(CommandProxy.java:78)
    at org.apache.felix.gogo.runtime.Closure.executeCmd(Closure.java:474)
    at org.apache.felix.gogo.runtime.Closure.executeStatement(Closure.java:400)
    at org.apache.felix.gogo.runtime.Pipe.run(Pipe.java:108)
    at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:183)
    at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:120)
    at org.apache.felix.gogo.runtime.CommandSessionImpl.execute(CommandSessionImpl.java:89)
    at org.apache.karaf.shell.console.jline.Console.run(Console.java:240)
    at java.lang.Thread.run(Thread.java:679)
The version of CXF you use seems to be quite old. You should try the current release, 2.6.1. In 2.6 a lot of OSGi improvements were introduced.
You can install it using:
    features:chooseurl cxf 2.6.1
    features:install cxf
Don't worry, OSGi gives you full access to the information about which bundle uses which package, etc. You only have to know how to ask the system for the info you need to debug the problem.
Unfortunately I am not familiar with Karaf console commands; I am working more with ProSyst's mBeddedServer OSGi framework. But since all this is standard in OSGi, I can tell you what to look for, and you can find the needed commands in Karaf.
So, check the following:
1. Is the Apache CXF bundle successfully installed? Is it in the "active" state? (From your posting it seems that it is.)
2. What is the version of the org.apache.cxf.jaxws.spi package that it exports? This is different from the cxf bundle version! To see the package version, look inside the manifest of the cxf bundle for the Export-Package header.
3. Is your bundle installed and started successfully? Is it in the active state?
If the error "Provider not found" appears while your bundle is starting, then your dependencies do not match the packages provided by the cxf bundle; see point 2 above.
If, however, the error appears during runtime, it could have several causes:
You haven't imported all needed packages in your manifest. Try using analysis tools which can generate the manifest for you based on your source code.
or:
The code which does the publishing is located e.g. on the system classpath and uses the system classloader, which in OSGi, due to modularity and security reasons, doesn't have access to the bundle classloaders.
Check what is provided by the system classpath instead of as OSGi bundles. Anything there which uses Class.forName or other reflection methods won't work in the modular OSGi framework.
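If the publish call runs inside your own bundle, one common workaround for this class of lookup failure (a sketch, not confirmed for this exact setup) is to temporarily point the thread context classloader, which javax.xml.ws.spi.FactoryFinder consults, at a loader that can see the CXF provider:

    // Sketch: swap the thread context classloader around the publish call so
    // the JAX-WS FactoryFinder can locate org.apache.cxf.jaxws.spi.ProviderImpl.
    // impl, addressTopology and topologyEndpoint are the variables from the
    // question's snippet above.
    ClassLoader original = Thread.currentThread().getContextClassLoader();
    try {
        Thread.currentThread().setContextClassLoader(
                org.apache.cxf.jaxws.spi.ProviderImpl.class.getClassLoader());
        topologyEndpoint = (EndpointImpl) Endpoint.create(impl);
        topologyEndpoint.getFeatures().add(new WSAddressingFeature());
        topologyEndpoint.publish(addressTopology);
    } finally {
        Thread.currentThread().setContextClassLoader(original); // always restore
    }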
There are also other possibilities, but you'll need to provide more info. Was there an exception stack trace? Which classes are involved in this piece of code, and where on the classpath are they located? etc.
