I'm developing a Spring Cloud Stream Kafka app with the Kafka Streams binder. I have two @Input and one @Output declarations, as follows:
import org.springframework.cloud.stream.annotation.Input
import org.springframework.cloud.stream.annotation.Output
import org.springframework.messaging.MessageChannel
import org.springframework.messaging.SubscribableChannel

internal interface OfferChannel {

    companion object {
        const val CREATE_OFFER = "create-offer"
        const val CREATE_OFFER_COUNTER = "create-offer-counter"
        const val OFFER_CREATED = "offer-created"
    }

    @Input(OFFER_CREATED)
    fun offerCreatedChannel(): SubscribableChannel

    @Input(CREATE_OFFER_COUNTER)
    fun createOfferAdminChannel(): SubscribableChannel

    @Output(CREATE_OFFER)
    fun createOfferChannel(): MessageChannel
}
During application startup I am getting the following error:
org.springframework.cloud.stream.binder.BinderException: Exception thrown while starting consumer:
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:455)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:99)
at org.springframework.cloud.stream.binder.AbstractBinder.bindConsumer(AbstractBinder.java:144)
at org.springframework.cloud.stream.binding.BindingService.lambda$rescheduleConsumerBinding$0(BindingService.java:171)
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.IllegalStateException: Only one LastSubscriberMessageHandler is allowed
at org.springframework.cloud.stream.binder.BinderErrorChannel.subscribe(BinderErrorChannel.java:44)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.registerErrorInfrastructure(AbstractMessageChannelBinder.java:704)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.registerErrorInfrastructure(AbstractMessageChannelBinder.java:627)
at org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder.createConsumerEndpoint(KafkaMessageChannelBinder.java:578)
at org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder.createConsumerEndpoint(KafkaMessageChannelBinder.java:135)
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindConsumer(AbstractMessageChannelBinder.java:402)
... 10 common frames omitted
Additionally, if I remove one of the @Input declarations, the application starts successfully. Any suggestions on what is going wrong here?
This is related to the versions of the libraries.
For example, my task was to switch to the 2.4 Kafka client while on org.springframework.boot version 2.2.3.RELEASE. First I updated to the latest 2.2.10.RELEASE. You would think that's not a big deal, since it's only a patch-version bump, but it turned out that 2.2.10.RELEASE caused this bug and I had to revert to 2.2.9.RELEASE.
The next step in my Kafka upgrade was to follow this guide:
https://docs.spring.io/spring-kafka/reference/html/#update-deps
So I set implementation "org.springframework.kafka:spring-kafka:2.4.9.RELEASE". Guess what: 2.4.9.RELEASE caused the same problem, so I downgraded it to 2.4.8.RELEASE.
My working config (not on the latest patch versions):
plugins {
    id 'org.springframework.boot' version '2.2.9.RELEASE'
}

ext {
    springKafkaVersion = '2.4.8.RELEASE'
    kafkaVersion = '2.4.1'
}

dependencies {
    implementation "org.springframework.kafka:spring-kafka:$springKafkaVersion"
    implementation "org.apache.kafka:kafka-clients:$kafkaVersion"
    implementation "org.apache.kafka:kafka-streams:$kafkaVersion"
    testImplementation "org.springframework.kafka:spring-kafka-test:$springKafkaVersion"
    testImplementation "org.apache.kafka:kafka-clients:$kafkaVersion:test"
}
So the suggestion is to look at your library versions and try to upgrade/downgrade accordingly.
I have the same error after updating the Spring Boot version from 2.2.9 to 2.2.10; the current workaround is to stay on the old version. Check your Spring Boot version and give that a try.
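In case it helps, here is a minimal sketch of pinning the Boot plugin back to the last known-good patch release in build.gradle; the plugins-DSL style is an assumption, so adapt it to however your build declares Spring Boot:

plugins {
    // pin Spring Boot to the last patch release that did not trigger the binder error
    id 'org.springframework.boot' version '2.2.9.RELEASE'
}

You can confirm which version actually ends up resolved with Gradle's dependencyInsight report before and after the change.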
Related
I'm building an Android chat app using RabbitMQ, and the project builds without any issue. However, I'm having an issue creating a ConnectionFactory object. It gives me the following error:
E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.zabu.kyimoecho.mosaic, PID: 9002
java.lang.NoClassDefFoundError: com.rabbitmq.client.impl.nio.-$$Lambda$NioParams$NrSUEb8m8wLfH2ztzTBNKyBN8fA
at com.rabbitmq.client.impl.nio.NioParams.<clinit>(NioParams.java:37)
at com.rabbitmq.client.ConnectionFactory.<init>(ConnectionFactory.java:153)
at com.zabu.kyimoecho.mosaic.GenericIdentity.<init>(GenericIdentity.kt:11)
at com.zabu.kyimoecho.mosaic.Admin.<init>(Admin.kt:9)
at com.zabu.kyimoecho.mosaic.MainActivity.<init>(MainActivity.kt:10)
at java.lang.Class.newInstance(Native Method)
at android.app.Instrumentation.newActivity(Instrumentation.java:1067)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2317)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2476)
at android.app.ActivityThread.-wrap11(ActivityThread.java)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1344)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:148)
at android.app.ActivityThread.main(ActivityThread.java:5417)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
I've looked online and the suggestions I came across didn't resolve this issue.
I'm using Android Studio 3.3.2, JDK 8, and the RabbitMQ Java client 5.6.0.
build.gradle:
....
compileOptions {
    sourceCompatibility JavaVersion.VERSION_1_8
    targetCompatibility JavaVersion.VERSION_1_8
}
...
dependencies {
    implementation 'com.rabbitmq:amqp-client:5.6.0'
    .....
}
Just for the sake of reference (if anybody runs into a similar issue): I resolved this by changing targetSdkVersion from 24 to 27 in build.gradle.
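For illustration, a minimal sketch of that change in the module-level build.gradle; everything except targetSdkVersion (the compile and min SDK values) is a placeholder assumption rather than something taken from the question:

android {
    compileSdkVersion 27          // placeholder; keep whatever you already compile against
    defaultConfig {
        minSdkVersion 21          // placeholder
        targetSdkVersion 27       // raised from 24, which is the change that fixed it for me
    }
}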
I've faced this same issue. The RabbitMQ documentation says that the amqp-client 5.x series is meant for Android 7 (Nougat) and above. If you are targeting any Android version below 7, you can use the amqp-client 4.x series instead.
I wanted to support both, so I took the source of one of the amqp-client 4.x releases, recompiled it under a slightly different package name, and used that jar alongside the 5.x library. That way I can use the updated 5.x series on newer Android versions and the 4.x series on older ones.
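If you only need to support devices below Android 7 and don't need the repackaging trick, a minimal sketch of depending on the 4.x client instead (4.12.0 is just an example 4.x version, not something mandated by the documentation):

dependencies {
    // the amqp-client 4.x series is the one documented to work below Android 7 (Nougat)
    implementation 'com.rabbitmq:amqp-client:4.12.0'
}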
I am getting the below exception while running a Spring Boot application:
Caused by: java.lang.NoSuchMethodError: 'javax.validation.BootstrapConfiguration.getClockProviderClassName()Ljava/lang/String;'
at org.hibernate.validator.internal.xml.ValidationBootstrapParameters.<init>(ValidationBootstrapParameters.java:63) ~[hibernate-validator-6.0.7.Final.jar!/:6.0.7.Final]
I included validation-api as a dependency with the latest version and also made sure no other version is being pulled in (I am not using hibernate-validator either), but it is still failing. Please suggest a solution.
Check the lib folder for differing versions of validation-api, for example:
jakarta.validation-api-2.0.2.jar
validation-api-1.1.0.Final.jar
and see whether you can remove validation-api-1.1.0.Final.jar.
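If the old jar is being dragged in transitively and you build with Gradle (an assumption; Maven has an equivalent exclusions mechanism), a minimal sketch of keeping it off the classpath:

configurations.all {
    // drop the old Bean Validation 1.1 API so only jakarta.validation-api 2.x remains
    exclude group: 'javax.validation', module: 'validation-api'
}

The dependencies report will show you which library is pulling the old validation-api in transitively.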
My SonarQube version is 5.6.4, and it throws a java.lang.NoSuchMethodError at my code:
org.sonar.plugins.javascript.api.tree.expression.CallExpressionTree.argumentClause()Lorg/sonar/plugins/javascript/api/tree/expression/ArgumentListTree;
public void visitCallExpression(CallExpressionTree tree) {
    if (tree.callee() instanceof DotMemberExpressionTree) {
        DotMemberExpressionTree dmTree = (DotMemberExpressionTree) tree.callee();
        System.out.println(tree);
        if (isLionGetProperty(dmTree) && tree.argumentClause().arguments().size() < 2) {
            addIssue(tree.callee(), MESSAGE);
        }
        super.visitCallExpression(tree);
    }
}
My local plugin, which is built against SonarQube 6.2, works fine. Where can I find some documentation for an old version of SonarJS?
The method argumentClause was added in SonarJS 3.0; you need to have at least that version of SonarJS installed to be able to use it. There is no dedicated documentation; you can find information in the source repository or in JIRA (e.g. argumentClause was added in this ticket).
I just upgraded my app from Grails 3.2.0 to 3.2.1 due to some problems, and user authentication started failing. I'm using the Grails Spring Security Core plugin, version 3.1.1.
I'm getting the following exception:
org.springframework.security.authentication.InternalAuthenticationServiceException:
Cannot cast object 'User(email:user@example.com)' with class 'com.test.User' to class 'com.test.User'
at org.springframework.security.authentication.dao.DaoAuthenticationProvider.retrieveUser(DaoAuthenticationProvider.java:126)
at org.springframework.security.authentication.dao.AbstractUserDetailsAuthenticationProvider.authenticate(AbstractUserDetailsAuthenticationProvider.java:144)
at org.springframework.security.authentication.ProviderManager.authenticate(ProviderManager.java:174)
at org.springframework.security.web.authentication.UsernamePasswordAuthenticationFilter.attemptAuthentication(UsernamePasswordAuthenticationFilter.java:94)
at grails.plugin.springsecurity.web.authentication.GrailsUsernamePasswordAuthenticationFilter.attemptAuthentication(GrailsUsernamePasswordAuthenticationFilter.groovy:53)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:212)
Caused by: org.codehaus.groovy.runtime.typehandling.GroovyCastException:
Cannot cast object 'User(email:user@example.com)' with class 'com.test.User' to class 'com.test.User'
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.continueCastOnSAM(DefaultTypeTransformation.java:405)
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.continueCastOnNumber(DefaultTypeTransformation.java:319)
at org.codehaus.groovy.runtime.typehandling.DefaultTypeTransformation.castToType(DefaultTypeTransformation.java:232)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.castToType(ScriptBytecodeAdapter.java:603)
at com.test.User.findWhere(User.groovy)
at com.test.User$findWhere.call(Unknown Source)
at grails.plugin.springsecurity.userdetails.GormUserDetailsService.$tt__loadUserByUsername(GormUserDetailsService.groovy:60)
at grails.plugin.springsecurity.userdetails.GormUserDetailsService$_loadUserByUsername_closure1.doCall(GormUserDetailsService.groovy)
This only fails when we deploy it on Apache Tomcat (8.5.6); it works fine in development with grails run-app.
Here are the modified dependencies in build.gradle (the rest of the configuration is the same as generated by create-app):
// "compile" changed to "provided"
provided "org.springframework.boot:spring-boot-starter-tomcat"
compile "org.grails.plugins:spring-security-core:3.1.1"
compile "org.grails.plugins:asynchronous-mail:2.0.0.RC4"
compile "org.mongodb:bson:3.3.0"
compile "org.codehaus.groovy.modules.http-builder:http-builder:0.7.1"
runtime "mysql:mysql-connector-java:5.1.39"
// https://github.com/spring-projects/spring-boot/issues/6761
runtime "com.google.code.gson:gson:2.5"
// Commented this to avoid issue (https://github.com/grails/grails-core/issues/10196)
//provided "org.codehaus.groovy:groovy-ant"
The same setup was working in Grails 3.2.0.
Any idea about this exception?
This seems to be an issue with Grails 3.2.1 itself, tracked as grails/grails-core#10244.
The workaround is to override limitScanningToApplication in your grails-app/init/PACKAGE/Application.groovy:
import grails.boot.GrailsApp
import grails.boot.config.GrailsAutoConfiguration

class Application extends GrailsAutoConfiguration {

    static void main(String[] args) {
        GrailsApp.run(Application, args)
    }

    @Override
    boolean limitScanningToApplication() {
        return false
    }
}
I am trying to compile and run the storm-kafka-starter project at
https://github.com/TheHydroImpulse/storm-kafka-starter
The main function for KafkaTopology looks like:
public class KafkaTopology {

    public static void main(String[] args) throws Exception {
        List<String> hosts = new ArrayList<String>();
        hosts.add("localhost");

        // spout configuration: topic "test-topic", ZK root "/kafkastorm", consumer id "discovery"
        SpoutConfig kafkaConf = new SpoutConfig(StaticHosts.fromHostString(hosts, 1),
                "test-topic", "/kafkastorm", "discovery");
        kafkaConf.scheme = new SchemeAsMultiScheme(new StringScheme());
        kafkaConf.forceStartOffsetTime(-2);

        KafkaSpout kafkaSpout = new KafkaSpout(kafkaConf);

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("spout", kafkaSpout, 2);
        builder.setBolt("printer", new PrinterBolt()).shuffleGrouping("spout");

        Config config = new Config();
        config.setDebug(true);

        if (args != null && args.length > 0) {
            config.setNumWorkers(3);
            StormSubmitter.submitTopology(args[0], config, builder.createTopology());
        } else {
            config.setMaxTaskParallelism(3);
            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology("kafka", config, builder.createTopology());
            Thread.sleep(10000);
            cluster.shutdown();
        }
    }
}
The jar compiles using Maven, but on running the topology I get the error:
Exception in thread "main" java.lang.NoClassDefFoundError:
storm/kafka/KafkaConfig$BrokerHosts
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2451)
at java.lang.Class.getMethod0(Class.java:2694)
at java.lang.Class.getMethod(Class.java:1622)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: storm.kafka.KafkaConfig$BrokerHosts
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 6 more
The local repository does have BrokerHosts in the storm-kafka jar, and I have imported the KafkaConfig class in my Java file. I cannot figure out the cause of the error. Any suggestions would be appreciated.
I had similar issues using the 0.9.2_incubating version of Apache Storm.
The issue is caused by the Storm distribution not shipping the Kafka libraries in its /lib folder. I was able to resolve the error by copying the following libraries (the same ones I used to compile and build the topology) into the /lib folder of the Storm installation I ran from:
storm-kafka-0.9.2-incubating.jar
kafka_2.10-0.8.1.1.jar
scala-library-2.10.1.jar
Remember that the actual versions in your case might vary. Take the ones you use to build your Storm topology (i.e. from your .m2 or .gradle dependency cache).
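You can of course copy them by hand; if you happen to build the topology with Gradle, a small copy task along these lines can do it for you. Treat it as a sketch: the stormLibDir path and the jar-name prefixes are assumptions about your environment, and older Gradle versions use configurations.runtime instead of runtimeClasspath.

// one-off helper: copy the Kafka-related jars from the build's runtime classpath into Storm's lib folder
task copyKafkaDeps(type: Copy) {
    def stormLibDir = '/opt/storm/lib'        // assumption: point this at your own Storm installation
    from configurations.runtimeClasspath.filter { f ->
        f.name.startsWith('storm-kafka') ||
        f.name.startsWith('kafka_') ||
        f.name.startsWith('scala-library')
    }
    into stormLibDir
}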
Note: I am not using the exact same starter project mentioned above but the fix will be similar.
I went through the storm-user groups and the issues on the storm-kafka-starter GitHub page. It turns out that the error is due to two reasons:
Version incompatibility among the Storm, Kafka and storm-kafka versions
Missing jars on the classpath
My initial setup did not work even after including all the necessary dependency jars in the /storm/lib folder. It turns out that the storm-kafka-starter project mentioned above only works with Storm 0.9.x versions.
Also see this post on which setup works best: https://groups.google.com/d/msg/storm-user/V_j_JZmFsb4/E4_II9ork3UJ
I went through the same integration woes and finally got a working example together.
You are welcome to check it out here:
https://github.com/buildlackey/cep
(click on the storm+kafka directory for a sample program that should get you up and running).
I had failed to use the correct Maven build command. Using
mvn clean install assembly:assembly
and using the jar-with-dependencies fixed it for me.
I changed pom.xml from
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-kafka</artifactId>
    <version>1.0.2</version>
    <scope>provided</scope>
</dependency>
to
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-kafka</artifactId>
    <version>1.0.2</version>
</dependency>
Now it works for me!