StringIndexOutOfBoundsException from EmbeddedHeadersMessageConverter

I'm in the process of moving a simple Kafka consumer application out of an existing framework and feel like spring-cloud-stream is an easy way to do that. I used Initializr to bootstrap the app, which is now using Spring-Boot v1.3.3 and Spring-Cloud-Stream v1.0.0-RC1. The application is extremely simple, all it has to do is pick a message from Kafka, deserialize the JSON encoded object and pass it on to our existing library. To get started I just used the LogSink example, since eventually I won't do much else (just deserialize and pass object to a different method).
It all works great: It connects to Kafka, receives the message and passes it (as byte[]) to my sink. However, EmbeddedHeadersMessageConverter logs a StringIndexOutOfBoundsException:
2016-04-11 10:06:50.287 ERROR 11464 --- [pool-1-thread-1] fkaMessageChannelBinder$ReceivingHandler : Could not convert message: 7B2267656E65726174696F6E223A3 [...]
java.lang.StringIndexOutOfBoundsException: String index out of range: 2009
at java.lang.String.checkBounds(String.java:373) ~[na:1.8.0_25]
at java.lang.String.<init>(String.java:413) ~[na:1.8.0_25]
at org.springframework.cloud.stream.binder.EmbeddedHeadersMessageConverter.oldExtractHeaders(EmbeddedHeadersMessageConverter.java:131) ~[spring-cloud-stream-1.0.0.RC1.jar:1.0.0.RC1]
at org.springframework.cloud.stream.binder.EmbeddedHeadersMessageConverter.extractHeaders(EmbeddedHeadersMessageConverter.java:104) ~[spring-cloud-stream-1.0.0.RC1.jar:1.0.0.RC1]
at org.springframework.cloud.stream.binder.kafka.KafkaMessageChannelBinder$ReceivingHandler.handleRequestMessage(KafkaMessageChannelBinder.java:583) ~[spring-cloud-stream-binder-kafka-1.0.0.RC1.jar:1.0.0.RC1]
at org.springframework.integration.handler.AbstractReplyProducingMessageHandler.handleMessageInternal(AbstractReplyProducingMessageHandler.java:99) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:127) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.channel.FixedSubscriberChannel.send(FixedSubscriberChannel.java:69) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.channel.FixedSubscriberChannel.send(FixedSubscriberChannel.java:63) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:115) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]
at org.springframework.messaging.core.GenericMessagingTemplate.doSend(GenericMessagingTemplate.java:45) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]
at org.springframework.messaging.core.AbstractMessageSendingTemplate.send(AbstractMessageSendingTemplate.java:105) [spring-messaging-4.2.5.RELEASE.jar:4.2.5.RELEASE]
at org.springframework.integration.endpoint.MessageProducerSupport.sendMessage(MessageProducerSupport.java:105) [spring-integration-core-4.2.5.RELEASE.jar:na]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter.access$300(KafkaMessageDrivenChannelAdapter.java:43) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter$AutoAcknowledgingChannelForwardingMessageListener.doOnMessage(KafkaMessageDrivenChannelAdapter.java:171) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.listener.AbstractDecodingMessageListener.onMessage(AbstractDecodingMessageListener.java:50) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.listener.QueueingMessageListenerInvoker$KafkaMessageDispatchingSubscriber.onNext(QueueingMessageListenerInvoker.java:221) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at org.springframework.integration.kafka.listener.QueueingMessageListenerInvoker$KafkaMessageDispatchingSubscriber.onNext(QueueingMessageListenerInvoker.java:209) [spring-integration-kafka-1.3.0.RELEASE.jar:na]
at reactor.core.processor.util.RingBufferSubscriberUtils.route(RingBufferSubscriberUtils.java:67) [reactor-core-2.0.7.RELEASE.jar:na]
at reactor.core.processor.RingBufferProcessor$BatchSignalProcessor.run(RingBufferProcessor.java:789) [reactor-core-2.0.7.RELEASE.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_25]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_25]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_25]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_25]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_25]
https://github.com/spring-cloud/spring-cloud-stream/issues/209 seems to indicate the problem is missing Kafka headers, which is true: there aren't any. But the solution mentioned there is to add
spring.cloud.stream.binder.kafka.mode=raw
to my application configuration. Unfortunately that did not work for me. Also, STS actually has auto-completion for the respective properties and it suggested
spring.cloud.stream.kafka.binder.mode=raw
Neither of the two (separately or combined) made any difference; the exception is still being logged.
I have used Spring for years, but this would be my first Spring-Boot/Spring-Cloud application.
Here's the application code:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.integration.annotation.ServiceActivator;
@SpringBootApplication
public class UpdateApplication {
private static Logger logger = LoggerFactory.getLogger(UpdateApplication.class);
public static void main(String[] args) {
SpringApplication.run(UpdateApplication.class, args);
}
@EnableBinding(Sink.class)
public static class UpdateHandler {
@StreamListener(Sink.INPUT)
//@ServiceActivator(inputChannel=Sink.INPUT)
public void loggerSink(Object payload) {
logger.info("Received: " + payload);
}
}
}
I tried both the @ServiceActivator and the @StreamListener annotation, which in this case does not seem to make a difference.
My application.properties looks like this:
spring.cloud.stream.bindings.input.binder=kafka
spring.cloud.stream.bindings.input.destination=updates
spring.cloud.stream.bindings.input.group=update-client
spring.cloud.stream.kafka.binder.brokers=brokerName
spring.cloud.stream.kafka.binder.zkNodes=zookeeperName
spring.cloud.stream.kafka.binder.mode=raw
Any help to get rid of this error would be appreciated.
As a side note: Since I just started experimenting with spring-cloud-stream I added
spring.cloud.stream.bindings.updates.consumer.resetOffsets=true
spring.cloud.stream.bindings.updates.consumer.startOffset=earlist
to the configuration to avoid having to send new messages every time I restart, but that didn't work.

Since the RC, that option has been moved under the .consumer. configuration section.
So right now you have to do it like this:
spring.cloud.stream.bindings.input.consumer.mode=raw
See more info in the Reference Manual.

spring.cloud.stream.bindings.input.consumer.headerMode=raw
is working for version 1.1.0.RELEASE.
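For reference, a complete application.properties combining the bindings from the question with the header-mode setting from these answers might look like this (the broker and ZooKeeper host names are the placeholders from the question):
spring.cloud.stream.bindings.input.binder=kafka
spring.cloud.stream.bindings.input.destination=updates
spring.cloud.stream.bindings.input.group=update-client
spring.cloud.stream.bindings.input.consumer.headerMode=raw
spring.cloud.stream.kafka.binder.brokers=brokerName
spring.cloud.stream.kafka.binder.zkNodes=zookeeperName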

Related

Grails Spring Batch plugin not finding jobs

I'm working on a project with Grails 3.3.7 and I'm trying to get Spring Batch to work using the grails-spring-batch plugin. Just like in the documentation example, I created a MySimpleJobBatchConfig.groovy file in the grails-app/batch directory with the following content:
import myapp.PrintMessageTasklet;
beans {
xmlns batch:"http://www.springframework.org/schema/batch"
batch.job(id: 'mySimpleJob') {
batch.step(id: 'logStart') {
batch.tasklet(ref: 'printMessage')
}
}
printMessage(PrintMessageTasklet) { bean ->
bean.autowire = "byName"
}
}
PrintMessageTasklet is defined as follows in src/main/groovy/myapp/PrintMessageTasklet.groovy:
package myapp
import org.springframework.batch.core.StepContribution
import org.springframework.batch.core.scope.context.ChunkContext
import org.springframework.batch.core.step.tasklet.Tasklet
import org.springframework.batch.repeat.RepeatStatus
class PrintMessageTasklet implements Tasklet {
RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) {
println "Test"
return RepeatStatus.FINISHED
}
}
And here's the service that's trying to launch the job in grails-app/services/myapp/SimpleJobService:
package myapp
import grails.gorm.transactions.Transactional
import org.springframework.batch.core.JobParameters
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing
@Transactional
class SimpleJobService {
def jobLauncher;
def mySimpleJob;
def launchSimpleJob() {
jobLauncher.run(mySimpleJob, new JobParameters())
}
}
However, when I run launchSimpleJob, I get the following exception telling me that mySimpleJob is null, despite me having defined it in the MySimpleJobBatchConfig.groovy file.
java.lang.reflect.InvocationTargetException: null
at org.grails.core.DefaultGrailsControllerClass$ReflectionInvoker.invoke(DefaultGrailsControllerClass.java:211)
at org.grails.core.DefaultGrailsControllerClass.invoke(DefaultGrailsControllerClass.java:188)
at org.grails.web.mapping.mvc.UrlMappingsInfoHandlerAdapter.handle(UrlMappingsInfoHandlerAdapter.groovy:90)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at org.springframework.boot.web.filter.ApplicationContextHeaderFilter.doFilterInternal(ApplicationContextHeaderFilter.java:55)
at org.grails.web.servlet.mvc.GrailsWebRequestFilter.doFilterInternal(GrailsWebRequestFilter.java:77)
at org.grails.web.filters.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:67)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.IllegalArgumentException: The Job must not be null.
at org.springframework.util.Assert.notNull(Assert.java:134)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:94)
at myapp.SimpleJobService.$tt__launchSimpleJob(SimpleJobService.groovy:14)
at grails.gorm.transactions.GrailsTransactionTemplate$2.doInTransaction(GrailsTransactionTemplate.groovy:94)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at grails.gorm.transactions.GrailsTransactionTemplate.execute(GrailsTransactionTemplate.groovy:91)
at myapp.SimpleJobController.launch(SimpleJobController.groovy:9)
... 14 common frames omitted
If I try to start my job using springBatchService.launch('mySimpleJob') instead, the message in the returned map also tells me that it couldn't find the job named "mySimpleJob".
Since I'm quite new to Groovy and Spring Batch, and I followed the plugin's documentation carefully, I can't find what's wrong. Does anyone know how to fix this?
Fixed it: the plugin documentation was outdated. It turns out my BatchConfig needed to be in the src/main/resources/batch folder instead of the grails-app/batch folder.
I also had to change @Transactional to @Transactional(propagation = Propagation.NOT_SUPPORTED) in my service.

Dropwizard testing - ResourceTestRule throwing NoClassDefFoundError: ch/qos/logback/core/filter/Filter

I am using Dropwizard 1.2.4 with log4j 1.2.17. I have followed the instructions as mentioned below:
https://github.com/arteam/dropwizard-nologback/
It is throwing an exception like the one below during unit testing.
java.lang.NoClassDefFoundError: ch/qos/logback/core/filter/Filter
at io.dropwizard.testing.junit.ResourceTestRule.<clinit>(ResourceTestRule.java:34)
at com.vnera.restapilayer.api.resources.ApiInfoControllerTest.<clinit>(ApiInfoControllerTest.java:25)
at sun.misc.Unsafe.ensureClassInitialized(Native Method)
at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:156)
at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
at java.lang.reflect.Field.get(Field.java:393)
at org.junit.runners.model.FrameworkField.get(FrameworkField.java:73)
at org.junit.runners.model.TestClass.getAnnotatedFieldValues(TestClass.java:230)
at org.junit.runners.ParentRunner.classRules(ParentRunner.java:255)
at org.junit.runners.ParentRunner.withClassRules(ParentRunner.java:244)
at org.junit.runners.ParentRunner.classBlock(ParentRunner.java:194)
at org.junit.runners.ParentRunner.run(ParentRunner.java:362)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: java.lang.ClassNotFoundException: ch.qos.logback.core.filter.Filter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 19 more
My test code looks like this:
import io.dropwizard.testing.junit.ResourceTestRule;
import org.junit.Assert;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import javax.ws.rs.core.Response;
import static org.mockito.Mockito.mock;
@Category(value = UnitTest.class)
public class ApiInfoControllerTest {
private static ApiNonFunctionalHandler nonFunctionalHandler = mock(ApiNonFunctionalHandler.class);
private static ApiFilter apiFilter = new ApiFilter(nonFunctionalHandler);
private static final String authToken = "NetworkInsight xTyAGJmZ8nU8yJDP7LnA8Q==";
@ClassRule
public static final ResourceTestRule resources = ResourceTestRule.builder()
.addResource(new ApiInfoController())
.addProvider(apiFilter).build();
@Test
public void testApiVersion() throws Exception {
Response response = resources.client()
.target(ApiConstants.INFO_BASE_URL + "/version")
.request()
.header("Authorization", authToken)
.buildGet().invoke();
Assert.assertNotNull(response);
Assert.assertEquals(response.toString(), Response.Status.OK.getStatusCode(), response.getStatus());
final VersionResponse actualError = response.readEntity(VersionResponse.class);
Assert.assertEquals(actualError.getApiVersion(), ApiConstants.API_VERSION);
}
}
My main application is working fine. The configuration.yaml for the main application looks like this:
# Change default server ports
server:
applicationConnectors:
- type: http
port: 8123
adminConnectors:
- type: http
port: 8124
requestLog:
type: external
logging:
type: external
Can someone let me know what could be going wrong and how I can get around this?
EDIT
Output of mvn dependency:tree is placed here as I am hitting the character limit here.
This is a bug in Dropwizard 1.2.4, as discussed in:
https://github.com/dropwizard/dropwizard/pull/2338
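Until a release containing that fix is available, one possible workaround (my assumption, not taken from the linked PR) is to keep logback on the test classpath only, so that the class ResourceTestRule references can be loaded while the main application continues to log through log4j. With Maven that would look roughly like this:
<!-- Hypothetical workaround: test-scoped logback so ch.qos.logback.core.filter.Filter resolves -->
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version>
    <scope>test</scope>
</dependency>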

Wicket - Getting broken pipe exception when accessing DB

In my wicket application I have this service class:
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
@Component
@Transactional
public class DatabaseService {
@Autowired
SessionFactory sessionFactory;
public void save(Message m) {}
}
This service class is "injected" into a wicket panel:
public class MyPanel extends Panel {
@SpringBean()
private DatabaseService service;
}
It works fine. But if I open the application hours later (the server is still running), I receive the following error:
java.net.SocketException: Datenübergabe unterbrochen (broken pipe)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(Unknown Source)
at java.net.SocketOutputStream.write(Unknown Source)
[...]
at java.io.BufferedOutputStream.flushBuffer(Unknown Source)
at java.io.BufferedOutputStream.flush(Unknown Source)
at com.mysql.jdbc.MysqlIO.send(MysqlIO.java:3634)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2460)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2625)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2547)
at com.mysql.jdbc.ConnectionImpl.setAutoCommit(ConnectionImpl.java:4874)
at org.apache.commons.dbcp.DelegatingConnection.setAutoCommit(DelegatingConnection.java:371)
at org.apache.commons.dbcp.PoolingDataSource$PoolGuardConnectionWrapper.setAutoCommit(PoolingDataSource.java:328)
[...]
(JdbcResourceLocalTransactionCoordinatorImpl.java:214)
at org.hibernate.engine.transaction.internal.TransactionImpl.begin(TransactionImpl.java:52)
at org.hibernate.internal.SessionImpl.beginTransaction(SessionImpl.java:1525)
at org.springframework.orm.hibernate5.HibernateTransactionManager.doBegin(HibernateTransactionManager.java:500)
[...]
at de.project.database.DatabaseService$$EnhancerBySpringCGLIB$$8fa0ab80.getMessages(<generated>)
at WICKET_de.project.database.DatabaseService$$FastClassByCGLIB$$68e55e7c.invoke(<generated>)
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.apache.wicket.proxy.LazyInitProxyFactory$AbstractCGLibInterceptor.intercept(LazyInitProxyFactory.java:350)
at WICKET_de.project.database.DatabaseService$$EnhancerByCGLIB$$a9cbdf2b.getMessages(<generated>)
at de.project.pms.MyPanel.<init>(MyPanel.java:26)
at de.project.home.projectHome.<init>(projectHome.java:17)
Is it connected with the (un)detach mechanism of Wicket?
MySQL connections usually time out after a period of time. This causes exceptions if you are using a datasource/connection pool and do not use connection validation. From the stack trace you've pasted I see that you are using Apache Commons DBCP as the datasource, so I think you should set the following parameters on it:
validationQuery, testOnCreate, testOnBorrow, testOnReturn, testWhileIdle
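A minimal sketch of what that can look like, assuming commons-dbcp's BasicDataSource is the pool in use (URL and credentials are placeholders):
import org.apache.commons.dbcp.BasicDataSource;

public class DataSourceFactory {
    public static BasicDataSource mysqlDataSource() {
        BasicDataSource ds = new BasicDataSource();
        ds.setDriverClassName("com.mysql.jdbc.Driver");
        ds.setUrl("jdbc:mysql://localhost:3306/mydb");
        ds.setUsername("user");
        ds.setPassword("secret");
        // Validate connections so ones closed by MySQL's wait_timeout are discarded
        ds.setValidationQuery("SELECT 1");
        ds.setTestOnBorrow(true);   // check each connection before handing it to the caller
        ds.setTestWhileIdle(true);  // also check idle connections in the evictor run
        ds.setTimeBetweenEvictionRunsMillis(30000); // required for testWhileIdle to run
        ds.setTestOnReturn(false);
        return ds;
    }
}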

spring boot and camel throws direct.DirectConsumerNotAvailableException

I'm trying to get a simple example of Spring Boot and Camel working but have come undone. Not sure what I'm doing wrong. In the Gradle build I've included so far:
dependencies {
compile 'org.apache.camel:camel-spring-boot-starter:2.18.4'
compile 'org.apache.camel:camel-groovy:2.18.4'
compile 'org.apache.camel:camel-stream:2.18.4'
compile 'org.codehaus.groovy:groovy-all:2.4.11'
testCompile group: 'junit', name: 'junit', version: '4.11'
testCompile group: 'junit', name: 'junit', version: '4.12'
}
I've created a DirectRoute component like this:
@Component
class DirectRoute extends RouteBuilder{
@Override
void configure () throws Exception {
from ("direct:in") //tried stream:in also
.to ("stream:out")
}
}
I then have a driver bean that tries to invoke the route:
@Component
public class HelloImpl implements Hello {
@Produce(uri = "direct:in")
private ProducerTemplate template;
@Override
public String say(String value) throws ExecutionException, InterruptedException {
assert template
println "def endpoint is : " + template.getDefaultEndpoint()
return template.sendBody (template.getDefaultEndpoint(), value)
}
}
Lastly, in the Spring Boot application class I added a command line runner like this, which gets my bean from the Spring context and invokes the say method. I'm using Groovy, so I just passed a closure to the command line runner.
@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
//return closure to run on startup - just list the beans enabled
{args ->
println("Let's inspect the beans provided by Spring Boot:")
String[] beanNames = ctx.getBeanDefinitionNames()
Arrays.sort(beanNames)
for (String beanName : beanNames) {
println(beanName)
}
println("call the direct:start route via the service")
Hello service = ctx.getBean("helloService")
def result = service.say("William")
println "service returned : $result "
}
}
When I run my application I get all the bean names printed out (that's OK); however, when I invoke direct:in via the producer template I get this error (org.apache.camel.component.direct.DirectConsumerNotAvailableException), see below.
I was expecting the route to be triggered and the name I sent to arrive in the output stream, but this is what I get:
Caused by: org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[ID-MONSTER-PC2-58911-1496920205300-0-2]
at org.apache.camel.util.ObjectHelper.wrapCamelExecutionException(ObjectHelper.java:1795) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.util.ExchangeHelper.extractResultBody(ExchangeHelper.java:677) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.extractResultBody(DefaultProducerTemplate.java:515) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.extractResultBody(DefaultProducerTemplate.java:511) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.sendBody(DefaultProducerTemplate.java:163) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.ProducerTemplate$sendBody$0.call(Unknown Source) ~[na:na]
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133) [groovy-all-2.4.11.jar:2.4.11]
at services.HelloImpl.say(HelloImpl.groovy:29) ~[main/:na]
at services.Hello$say.call(Unknown Source) ~[na:na]
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125) [groovy-all-2.4.11.jar:2.4.11]
at application.Application$_commandLineRunner_closure1.doCall(Application.groovy:47) ~[main/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_121]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_121]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93) ~[groovy-all-2.4.11.jar:2.4.11]
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325) ~[groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294) ~[groovy-all-2.4.11.jar:2.4.11]
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022) ~[groovy-all-2.4.11.jar:2.4.11]
at groovy.lang.Closure.call(Closure.java:414) ~[groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.ConvertedClosure.invokeCustom(ConvertedClosure.java:54) ~[groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.ConversionHandler.invoke(ConversionHandler.java:124) ~[groovy-all-2.4.11.jar:2.4.11]
at com.sun.proxy.$Proxy44.run(Unknown Source) ~[na:na]
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:776) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE]
... 10 common frames omitted
Caused by: org.apache.camel.component.direct.DirectConsumerNotAvailableException: No consumers available on endpoint: direct://in. Exchange[ID-MONSTER-PC2-58911-1496920205300-0-2]
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:55) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:97) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache$1.doInProducer(ProducerCache.java:529) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache$1.doInProducer(ProducerCache.java:497) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache.doInProducer(ProducerCache.java:365) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache.sendExchange(ProducerCache.java:497) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache.send(ProducerCache.java:225) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.send(DefaultProducerTemplate.java:144) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.sendBody(DefaultProducerTemplate.java:161) ~[camel-core-2.18.4.jar:2.18.4]
What have I done wrong, and why does the producer template invocation on 'direct:in' (I also tried stream:in, with the same problem) not work? I thought that .to("stream:out") would be a consumer.
Any pointers or advice gratefully received at this point.
I have an update on my problems:
I had a subpackage with the application class annotated with @SpringBootApplication. So yes, unadorned it only scans subpackages.
You can add the scanBasePackages= or scanBaseClasses= parameter; however, when I tried doing a scan for a single class, it seemed to scan the whole directory anyway and grabbed the others as well.
I refactored the app to have a single root package with subpackages and elected to set scanBasePackages to the new root package, but left the Application class in its own subpackage (personal preference only; the documentation suggests leaving the Application in the root package).
You can now add other classes annotated with @Configuration to generate beans, or use the basic @Component.
If you create Camel routes annotated with @Component they will be auto-configured in the camelContext for you.
It appears that by default Spring is not starting the camelContext for you. When I checked the status of the context it showed as starting and not started, so in my commandLineRunner I had to get the Spring-injected camelContext, start it myself, and stop it when I finished. I was slightly surprised, as I thought the Spring Boot starter would auto-start the camelContext, but it appears not.
Once you have Spring component scanning etc. working and you start the camelContext, the problems with the org.apache.camel.component.direct.DirectConsumerNotAvailableException went away and things started to work, at least for the baby examples I'm trying.
So revised structure now looks like this:
The revised Application class now looks like this, with some simple println output to see the state of the context and the beans in the Spring ctx. The helloService bean is still the proxy I use to set up the producer template to call the DirectRoute.
package com.softwood.application
import groovy.util.logging.Slf4j
import org.apache.camel.CamelContext
import org.springframework.beans.factory.annotation.Autowired
import com.softwood.services.Hello
/**
* Created by willw on 07/06/2017.
*/
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean
@Slf4j //inject logger
@SpringBootApplication (scanBasePackages = ["com.softwood"]) //forces scan at parent
// same as @Configuration @EnableAutoConfiguration @ComponentScan with 'defaults' e.g. sub packages
public class Application {
@Autowired
ApplicationContext ctx
@Autowired
CamelContext camelContext
public static void main(String[] args) {
SpringApplication.run(Application.class, args)
}
@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
//return closure to run on startup - just list the beans enabled
{args ->
println("Let's inspect the beans provided by Spring Boot:")
String[] beanNames = ctx.getBeanDefinitionNames()
Arrays.sort(beanNames)
for (String beanName : beanNames) {
println(beanName)
}
/* when component scan is working - bean routes are added
automatically to camel context via springBoot, however you do have to start
the camel context, yourself
*/
println "camelCtx has following components : " + camelContext.componentNames
println "camelCtx state is : " + camelContext.status
println "starting camel context"
camelContext.start()
println "camelCtx state now is : " + camelContext.status
//log.debug "wills logging call "
println("call the direct:start route via the service")
Hello service = ctx.getBean("helloService")
def result = service.say("William")
println "service returned : $result "
println "sleep 5 seconds "
sleep (5000)
println "stop camel context"
camelContext.stop()
println "camelCtx state now is : " + camelContext.status
}
}
}
That proxy is just registered as a simple bean like this in the Spring context:
package com.softwood.services
/**
* Created by willw on 07/06/2017.
*/
import org.apache.camel.Produce;
import org.apache.camel.ProducerTemplate
import org.springframework.stereotype.Component;
import java.util.concurrent.ExecutionException
@Component
public class HelloImpl implements Hello {
@Produce(uri = "direct:in") /* ?block=true */
private ProducerTemplate template
@Override
public String say(String value) throws ExecutionException, InterruptedException {
assert template
println "def endpoint is : " + template.getDefaultEndpoint()
//Future future = template.asyncSendBody(template.getDefaultEndpoint(), value)
//return future.get()
return template.sendBody (template.getDefaultEndpoint(), value)
}
}
The TimedRoute just sorts itself out, with no template required to invoke it:
package com.softwood.camelRoutes
/**
* Created by willw on 07/06/2017.
*/
import org.apache.camel.builder.RouteBuilder
import org.springframework.stereotype.Component
@Component
class TimedRoute extends RouteBuilder {
@Override
void configure () throws Exception {
from ("timer:foo")
.to ("log:com.softwood.application.Application?level=WARN")
}
}
My simple no-op file route isn't working (yet) and I'm not sure why. I suspect I've not got the file config right somehow; some playing is required.
package com.softwood.camelRoutes
import org.apache.camel.builder.RouteBuilder
import org.springframework.stereotype.Component
/**
* Created by willw on 08/06/2017.
*/
@Component
class FileNoOpRoute extends RouteBuilder{
@Override
void configure () throws Exception {
from ("file:../com.softwood.file-inbox?recursive=true&noop=true&idempotent=true")
.to ("file:../com.softwood.file-outbox")
}
}
However, the basics are now working and at least Camel is doing something, whereas before I just had the exception and nothing.
I have found another question on Spring config highlighting some of the above also.

Tomcat 7.0.50 fails to initialize due to WebSocket deployment exception

I am trying to deploy WebSockets on Tomcat 7.0.50. Following is my code:
@ServerEndpoint(value="/ws/fileuploadtracker/")
public class FileUploadTrackerEndPoint{
@OnOpen
public void onOpen(Session session) {
.....
}
@OnMessage
public void onMessage(Session session, String msg) {
try {
session.getBasicRemote().sendText(msg);
} catch (IOException e) {
logger.error(e.getMessage());
}
}
}
I picked the above code from Oracle's notes on the Java EE WebSocket example, from this link:
http://docs.oracle.com/javaee/7/tutorial/doc/websocket004.htm
My Tomcat fails to start with the following exception:
SEVERE: Error during ServletContainerInitializer processing
javax.servlet.ServletException: javax.websocket.DeploymentException: A parameter of type [interface javax.websocket.Session] was found on method[onOpen] of class [java.lang.reflect.Method] that did not have a @PathParam annotation
at org.apache.tomcat.websocket.server.WsSci.onStartup(WsSci.java:146)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5444)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: javax.websocket.DeploymentException: A parameter of type [interface javax.websocket.Session] was found on method[onOpen] of class [java.lang.reflect.Method] that did not have a @PathParam annotation
at org.apache.tomcat.websocket.pojo.PojoMethodMapping.getPathParams(PojoMethodMapping.java:233)
at org.apache.tomcat.websocket.pojo.PojoMethodMapping.<init>(PojoMethodMapping.java:122)
at org.apache.tomcat.websocket.server.WsServerContainer.addEndpoint(WsServerContainer.java:239)
at org.apache.tomcat.websocket.server.WsSci.onStartup(WsSci.java:143)
... 8 more
1) How am I supposed to configure server endpoints as annotated classes on Tomcat 7?
2) Is there any Tomcat-specific way to write annotated server endpoints?
3) If Tomcat implements JSR 356, why does it not support the above config?
I tried hard to find a suitable example but couldn't. I also tried putting a @PathParam annotation on it, but it only accepts strings and throws a ClassCastException.
It's also possible you used the wrong @PathParam; make sure you use:
import javax.websocket.server.PathParam;
and not:
import javax.ws.rs.PathParam;
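For illustration, a sketch of an annotated endpoint that actually uses a path parameter (the {id} template and class name here are hypothetical; the point is the javax.websocket.server.PathParam import):
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.PathParam;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/ws/fileuploadtracker/{id}")
public class FileUploadTrackerByIdEndPoint {
    @OnOpen
    public void onOpen(Session session, @PathParam("id") String id) {
        // "id" is bound from the URI template; the Session parameter itself needs no annotation
        System.out.println("Opened session " + session.getId() + " for upload " + id);
    }
}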
I was using the 1.b09 API; I changed it to 1.0.
Lesson learnt: always go for stable versions and not beta ones.

Resources