java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator; - elasticsearch

I've been trying to connect Kafka to Elasticsearch using the Kafka Connect API. The Kafka version is 0.11.0.0. These are the steps I followed:
1. Cloning the Elasticsearch connector:
https://github.com/confluentinc/kafka-connect-elasticsearch.git
2. Building the connector:
$ cd kafka-connect-elasticsearch
$ mvn clean package
3. Finally, running the standalone worker:
$ bin/connect-standalone.sh config/connect-standalone.properties config/elasticsearch-connect.properties
It throws the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;
at org.reflections.Reflections.expandSuperTypes(Reflections.java:380)
at org.reflections.Reflections.<init>(Reflections.java:126)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:221)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:198)
at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:159)
at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:47)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:68)
I can't understand what's going wrong.

From experience, this error means that an older version of Guava appears earlier in your classpath. The Connect worker requires Guava >= 20 for org.reflections to work correctly.
kafka-connect-elasticsearch, or any other connector that ships Guava 18.0 or older, will prevent the worker from starting up. This error message means that the older Guava jar was encountered first on the classpath.
Two solutions:
Indeed, as Hans Jespersen mentions, using classloading isolation by setting plugin.path in the Connect worker's configuration (sketched below) will allow the connector to work as-is without interfering with the Connect framework.
If adding the connector to the CLASSPATH is your only option, make sure it's added after Kafka Connect's dependencies, so that the most recent Guava is picked up.
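For the first option, a minimal sketch of the worker configuration (the directory path is an assumption; point it at wherever you placed the connector's jars):
# config/connect-standalone.properties
# Directory (or comma-separated list of directories) scanned for plugins;
# the location below is only an example
plugin.path=/usr/local/share/kafka/plugins
With classloading isolation, the Guava 18 that ships with the connector stays inside the plugin's own classloader and no longer shadows the Guava that the worker itself needs.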

This appears to describe the answer to your problem: https://github.com/confluentinc/kafka-connect-elasticsearch/issues/104
It's a little confusing, but after you build the connector there are a number of things in the target directory. The kafka-connect-elasticsearch-.jar is only the JAR file with the connector code, but that doesn't include all the libraries. One of the directories in the target directory, namely target/kafka-connect-elasticsearch-*-development/share/java/kafka-connect-elasticsearch/, does contain all the libraries. Add this directory to the Kafka Connect worker's classpath, or copy all of those JAR files into a directory that is already on the classpath.
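For example, roughly (the paths and version placeholder are assumptions; substitute your actual build output):
# Put the packaged connector directory, with all its libraries, on the worker's classpath
export CLASSPATH="/path/to/kafka-connect-elasticsearch/target/kafka-connect-elasticsearch-<version>-development/share/java/kafka-connect-elasticsearch/*"
bin/connect-standalone.sh config/connect-standalone.properties config/elasticsearch-connect.properties
Whether an exported CLASSPATH lands before or after Kafka's own jars depends on your kafka-run-class.sh, so if the NoSuchMethodError persists, prefer the plugin.path approach above.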

For future readers: I had a similar issue with no Kafka involvement at all. I spent many hours trying to figure it out, and finally found the pattern.
In my project I used ro.isdc.wro4j:wro4j-extensions:1.8.0 and org.reflections:reflections:0.9.11:
<dependency>
    <groupId>ro.isdc.wro4j</groupId>
    <artifactId>wro4j-extensions</artifactId>
    <version>1.8.0</version>
</dependency>
<dependency>
    <groupId>org.reflections</groupId>
    <artifactId>reflections</artifactId>
    <version>0.9.11</version>
</dependency>
The conflict happened because ro.isdc.wro4j:wro4j-extensions:1.8.0 pulled in com.google.javascript:closure-compiler:jar:v20160315. I solved it by excluding closure-compiler from that dependency in the POM:
<dependency>
    <groupId>ro.isdc.wro4j</groupId>
    <artifactId>wro4j-extensions</artifactId>
    <version>1.8.0</version>
    <exclusions>
        <exclusion>
            <groupId>com.google.javascript</groupId>
            <artifactId>closure-compiler</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.reflections</groupId>
    <artifactId>reflections</artifactId>
    <version>0.9.11</version>
</dependency>
However, the problem is build-dependent, and you may not know in advance which jar causes the conflict. In that case, you can find the location of the jar that actually provides the class at runtime by running something like the following:
// Locate the jar that provides Guava's Sets class at runtime
CodeSource source = com.google.common.collect.Sets.class.getProtectionDomain().getCodeSource();
if (source != null) {
    logger.warn(source.getLocation().toString());
}
and check the output. In my case it was
file:/tmp/jetty-0.0.0.0-8087-ROOT.war-_-any-1541284168668078443.dir/webapp/WEB-INF/lib/closure-compiler-v20160315.jar
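If you prefer a build-time view, Maven's dependency:tree goal shows which declared dependency drags each artifact in, although it cannot reveal classes bundled inside an uber jar (which is exactly what happened here, with closure-compiler carrying its own copy of Guava's classes), so the runtime check above is the more reliable one:
mvn dependency:tree -Dverbose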
Hopefully this answer helps you find a way to solve the issue.
Mughrabi

I ran into this error recently (Jan 2020) on my localhost (macOS Catalina) and was able to resolve it by updating my .zprofile file (zsh shell). If you use the bash shell, you can make the changes in .bash_profile or .bashrc instead.
• The error was caused by my JVM classpath including a Guava version < 20. My .zprofile had exports for hadoop-3.1.1 and apache-hive-3.1.1, which were the source of the error.
• I am using kafka_2.12-2.0.0, so I just commented out the hadoop-3.1.1 and apache-hive-3.1.1 entries in my .zprofile (sketched below) and ran my Twitter Kafka connector successfully.
• I was able to trace this error by following the comments in this thread by Konstantine Karantasis.
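Roughly, the change amounted to commenting out those exports (the paths below are illustrative, not my exact ones):
# ~/.zprofile: commented out so the old Guava shipped with Hadoop/Hive
# stops leaking onto the JVM classpath
# export HADOOP_HOME=~/tools/hadoop-3.1.1
# export HIVE_HOME=~/tools/apache-hive-3.1.1
# export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin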
Hope this is helpful.

Related

Spark-submit is not using the protobuf version of my project

In my work project, I use spark-submit to launch my application onto a YARN cluster. I am quite new to Maven projects and pom.xml, but the problem I seem to be having is that Hadoop is using an older version of Google protobuf (2.5.0) than the internal dependencies I'm importing at work (2.6.1).
The error is here:
java.lang.NoSuchMethodError:
com/google/protobuf/LazyStringList.getUnmodifiableView()Lcom/google/protobuf/LazyStringList;
(loaded from file:/usr/hdp/2.6.4.0-91/spark2/jars/protobuf-java-2.5.0.jar
by sun.misc.Launcher$AppClassLoader#8b6f2bf7)
called from class protobuf.com.mycompany.group.otherproject.api.JobProto$Query
Since I'm not quite sure how to approach dependency issues like this, and I can't change the code of the internal dependency that uses 2.6.1, I added the required protobuf version as a dependency to my project, as well:
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>2.6.1</version>
</dependency>
Unfortunately, this hasn't resolved the issue. When the internal dependency (which does import 2.6.1 on its own) tries to use its proto, the conflict occurs.
Any suggestions on how I could force the usage of the newer, correct version would be greatly appreciated.
Ultimately I found the Maven Shade Plugin to be the answer. I shaded my company's version of protobufs, deployed our service as an uber jar, and the conflict was resolved.
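For reference, the relocation that approach involves looks roughly like this (the plugin version and shaded package prefix are assumptions; adapt them to your build):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <!-- Move our protobuf 2.6.1 classes into a private package so they
                         cannot clash with Hadoop's protobuf 2.5.0 on the cluster -->
                    <relocation>
                        <pattern>com.google.protobuf</pattern>
                        <shadedPattern>shaded.com.google.protobuf</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>
The uber jar produced by mvn package then carries the relocated protobuf classes with the application, so the cluster-provided protobuf-java-2.5.0.jar is never consulted for them.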

unable to resolve dependency for akamai edgegrid API

I am trying to use the Akamai EdgeGrid API to invalidate the Akamai cache. I have added the dependency below to my pom.xml, but my bundle stays in the installed state. More details below.
pom.xml dependency:
<dependency>
    <groupId>com.akamai.edgegrid</groupId>
    <artifactId>edgegrid-signer-apache-http-client</artifactId>
    <version>2.1.0</version>
    <scope>provided</scope>
</dependency>
The bundle is in the installed state; the Felix console says:
Imported Packages com.akamai.edgegrid.signer -- Cannot be resolved
error.log says:
Unable to resolve 497.82: missing requirement [497.82] osgi.wiring.package; (osgi.wiring.package=com.akamai.edgegrid.signer)
You have used <scope>provided</scope>, which means the jar is available at compile time, and at runtime the jar present in the runtime environment is used instead. Unfortunately, edgegrid-signer-apache-http-client-2.1.0.jar is not available on the AEM instance.
To resolve the issue, do not use <scope>provided</scope>.
Updated POM:
<dependency>
    <groupId>com.akamai.edgegrid</groupId>
    <artifactId>edgegrid-signer-apache-http-client</artifactId>
    <version>2.1.0</version>
</dependency>
Before deploying the bundle on AEM, extract the jar and check that edgegrid-signer-apache-http-client.jar, edgegrid-signer-core.jar, httpclient.jar, and httpcore.jar are part of the bundle.
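If any of them are missing, one common way to pull them in (assuming the bundle is built with the Apache Felix maven-bundle-plugin) is sketched below:
<plugin>
    <groupId>org.apache.felix</groupId>
    <artifactId>maven-bundle-plugin</artifactId>
    <extensions>true</extensions>
    <configuration>
        <instructions>
            <!-- Embed the EdgeGrid signer jars inside the bundle so Felix
                 does not have to resolve them from the OSGi container -->
            <Embed-Dependency>edgegrid-signer-apache-http-client,edgegrid-signer-core</Embed-Dependency>
            <Embed-Transitive>true</Embed-Transitive>
        </instructions>
    </configuration>
</plugin>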
Hopefully it will solve your issue. All the best.
Please let me know if you still face any issue.
-Mrutyunjaya

Apache Storm - package backtype.storm.tuple does not exist

I'm trying the Storm analysis presented here:
CallLogCounterBolt.java:4: error: package backtype.storm.tuple does not exist
import backtype.storm.tuple.Fields;
I ran into similar problems with another old Apache Storm tutorial. It turned out to be simply because the tutorial used deprecated classes from a previous version (0.9.6) while I was using a newer one (1.1.0). My suggestion is therefore to either look through the newer libraries for the corresponding classes and change your import statements accordingly, or check that the dependencies you are using are not masked by similarly named libraries.
The issue is with your Java classpath, which entirely depends on how you have set up your project. Rather than try to fix what you have, I'll give you a suggestion.
If you're using Java, then the "normal" way to create Storm topologies is with Maven, which should work with whatever IDE you're using (Eclipse, IntelliJ, etc.).
Once you have a skeleton Maven project set up, all you need to do is add the Storm dependencies. For example:
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>${storm.version}</version>
    <scope>provided</scope>
</dependency>
Here is an example POM file.
You should use the newer libraries, since backtype is deprecated. Go through the Apache Storm javadocs.
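For example, in Storm 1.x the classes moved from the backtype.storm packages to org.apache.storm, so the failing import would become (assuming a 1.x release such as 1.1.0):
// Old import from the tutorial, removed in Storm 1.x:
// import backtype.storm.tuple.Fields;

// Equivalent import in Storm 1.0 and later:
import org.apache.storm.tuple.Fields;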

are maven dependency exclusions necessary when using spring (mvc) + hadoop + hive?

I have a web app that works great. I tried to connect to Hadoop using Hive. The tests work fine, but I can't run the web app: transitive Maven dependencies of hadoop-core bring in J2EE jars that override Tomcat's and break the web app (specifically when loading the context).
Foolishly I thought maybe if I just use Spring Data built for CDH5 they would have covered all that. No such luck. I was following their docs here: https://github.com/spring-projects/spring-hadoop/wiki/Build-with-Cloudera-CDH5
Here's my current POM:
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-hadoop</artifactId>
    <version>2.0.4.RELEASE-cdh5</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>${hive.version}</version>
    <scope>runtime</scope>
</dependency>
Here is the error:
SEVERE: Servlet.service() for servlet [jsp] in context with path [] threw exception [java.lang.AbstractMethodError: javax.servlet.jsp.JspFactory.getJspApplicationContext(Ljavax/servlet/ServletContext;)Ljavax/servlet/jsp/JspApplicationContext;] with root cause
java.lang.AbstractMethodError: javax.servlet.jsp.JspFactory.getJspApplicationContext(Ljavax/servlet/ServletContext;)Ljavax/servlet/jsp/JspApplicationContext;
at org.apache.jasper.compiler.Validator$ValidateVisitor.<init>(Validator.java:515)
at org.apache.jasper.compiler.Validator.validateExDirectives(Validator.java:1817)
at org.apache.jasper.compiler.Compiler.generateJava(Compiler.java:217)
at org.apache.jasper.compiler.Compiler.compile(Compiler.java:373)
I also got this error when building directly from Cloudera's repos.
I could start stuffing exclusions in there, but that feels hacky, and I'm paranoid about other transitive dependency errors cropping up that I may not know about.
I've pored over the docs and the sample code and pom files here: https://github.com/spring-projects/spring-hadoop-samples/blob/master/hive/pom.xml
They don't seem to have exclusions in their POM files. However, I've seen other people do it, such as here: Spring + Maven + Hadoop
Is that the accepted way to work with these technologies? This is my first time, so I am seeking some confirmation here. Perhaps I'm missing something?
Is it canonical to simply have exclusions?
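For concreteness, the kind of exclusions I mean would look roughly like this (a sketch only; I haven't verified which artifact actually brings in the servlet/JSP jars, which mvn dependency:tree would show):
<dependency>
    <groupId>org.springframework.data</groupId>
    <artifactId>spring-data-hadoop</artifactId>
    <version>2.0.4.RELEASE-cdh5</version>
    <exclusions>
        <!-- Keep Hadoop's bundled servlet/JSP APIs off the webapp classpath;
             Tomcat supplies its own -->
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>javax.servlet.jsp</groupId>
            <artifactId>jsp-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>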

Setting javax.xml.ws.Service from JDK, instead of javaee-api with maven

I'm facing with this problem:
The method getPort(QName, Class<T>) in the type Service is not applicable for the arguments (QName, Class<AcessoDadosGeolocalizacao>, WebServiceFeature[])
I used wsimport to generate my clients, but now my Maven application is using the javax.xml.ws.Service class from:
<dependency>
    <groupId>javaee</groupId>
    <artifactId>javaee-api</artifactId>
    <version>5</version>
    <scope>provided</scope>
</dependency>
How can I use the javax.xml.ws.Service from JDK 6?
I added webservices-api to my pom.xml and the problem went away:
<dependency>
    <groupId>javax.xml</groupId>
    <artifactId>webservices-api</artifactId>
    <version>2.1-b14</version>
</dependency>
If I add this entry (webservices-api), it gives a runtime error when accessing the JAXB API. I found that JDK 6 should come first in the classpath order, ahead of the Maven library. I moved JDK 6 above the Maven library, and then it worked.
I ran into a similar issue with Eclipse and a Dynamic Web Application. It's not Maven related, but googling for that error returns only about seven results as of today's date, with three or more of them being relistings of the same Stack Exchange question on other websites, so I thought I'd add what helped me in case others hit a similar issue. The WAR was set to use JBoss AS5 and the VM was set to use Java 6. Because it's Eclipse and I had already consumed the web service, the error was not occurring on import, as the stubs had already been created. I ensured the Java facet was set to 1.6 (it had been 1.5), then cleaned and built, but the error persisted. I then noticed I had a reference on my build path to Java EE 1.5. I removed this, cleaned and built, and the error went away.
Hope this helps anyone else faced with the same issue!
