Grails Spring Batch plugin not finding jobs - spring

I'm working on a project with Grails 3.3.7 and I'm trying to get Spring Batch working using the grails-spring-batch plugin. Just like in the documentation example, I created a MySimpleJobBatchConfig.groovy file in the grails-app/batch directory with the following content:
import myapp.PrintMessageTasklet

beans {
    xmlns batch: "http://www.springframework.org/schema/batch"

    batch.job(id: 'mySimpleJob') {
        batch.step(id: 'logStart') {
            batch.tasklet(ref: 'printMessage')
        }
    }

    printMessage(PrintMessageTasklet) { bean ->
        bean.autowire = "byName"
    }
}
PrintMessageTasklet is defined as follows in src/main/groovy/myapp/PrintMessageTasklet.groovy:
package myapp

import org.springframework.batch.core.StepContribution
import org.springframework.batch.core.scope.context.ChunkContext
import org.springframework.batch.core.step.tasklet.Tasklet
import org.springframework.batch.repeat.RepeatStatus

class PrintMessageTasklet implements Tasklet {
    RepeatStatus execute(StepContribution stepContribution, ChunkContext chunkContext) {
        println "Test"
        return RepeatStatus.FINISHED
    }
}
And here's the service that tries to launch the job, in grails-app/services/myapp/SimpleJobService.groovy:
package myapp

import grails.gorm.transactions.Transactional
import org.springframework.batch.core.JobParameters
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing

@Transactional
class SimpleJobService {

    def jobLauncher
    def mySimpleJob

    def launchSimpleJob() {
        jobLauncher.run(mySimpleJob, new JobParameters())
    }
}
However, when I run launchSimpleJob, I get the following exception telling me that mySimpleJob is null, even though I defined it in the MySimpleJobBatchConfig.groovy file.
java.lang.reflect.InvocationTargetException: null
at org.grails.core.DefaultGrailsControllerClass$ReflectionInvoker.invoke(DefaultGrailsControllerClass.java:211)
at org.grails.core.DefaultGrailsControllerClass.invoke(DefaultGrailsControllerClass.java:188)
at org.grails.web.mapping.mvc.UrlMappingsInfoHandlerAdapter.handle(UrlMappingsInfoHandlerAdapter.groovy:90)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:967)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:901)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at org.springframework.boot.web.filter.ApplicationContextHeaderFilter.doFilterInternal(ApplicationContextHeaderFilter.java:55)
at org.grails.web.servlet.mvc.GrailsWebRequestFilter.doFilterInternal(GrailsWebRequestFilter.java:77)
at org.grails.web.filters.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:67)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.lang.IllegalArgumentException: The Job must not be null.
at org.springframework.util.Assert.notNull(Assert.java:134)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:94)
at myapp.SimpleJobService.$tt__launchSimpleJob(SimpleJobService.groovy:14)
at grails.gorm.transactions.GrailsTransactionTemplate$2.doInTransaction(GrailsTransactionTemplate.groovy:94)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at grails.gorm.transactions.GrailsTransactionTemplate.execute(GrailsTransactionTemplate.groovy:91)
at myapp.SimpleJobController.launch(SimpleJobController.groovy:9)
... 14 common frames omitted
If I try to start my job using springBatchService.launch('mySimpleJob') instead, the message in the returned map also tells me that it couldn't find the job named "mySimpleJob".
Since I'm quite new to Groovy and Spring Batch and I carefully followed the plugin's documentation, I can't find what's wrong. Does anyone know how to fix this?

Fixed it: the plugin documentation was outdated. It turns out my BatchConfig needed to be in the src/main/resources/batch folder instead of the grails-app/batch folder.
I also had to change @Transactional to @Transactional(propagation = Propagation.NOT_SUPPORTED) in my service.
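For reference, the service ends up looking roughly like this (same bean names as above; the Propagation enum comes from org.springframework.transaction.annotation):

package myapp

import grails.gorm.transactions.Transactional
import org.springframework.batch.core.JobParameters
import org.springframework.transaction.annotation.Propagation

@Transactional(propagation = Propagation.NOT_SUPPORTED)
class SimpleJobService {

    // wired by name from the Spring context: jobLauncher from the plugin,
    // mySimpleJob from the batch config under src/main/resources/batch
    def jobLauncher
    def mySimpleJob

    def launchSimpleJob() {
        jobLauncher.run(mySimpleJob, new JobParameters())
    }
}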

Related

Spring Cloud Stream Kafka Streams Binder KafkaException: Could not start stream: 'listener' cannot be null

I am new to Kafka Streams and Spring Cloud Stream but have read good things about them in terms of moving the integration-related code into a properties file so devs can focus mostly on the business logic side of things.
Here is my simple application class.
package com.some.events.consumer

import com.some.events.SomeEvent
import org.apache.kafka.streams.kstream.KStream
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.boot.runApplication
import org.springframework.context.annotation.Bean
import java.util.function.Consumer

@SpringBootApplication
class ConsumerApplication {

    @Bean
    fun consume(): Consumer<KStream<String, SomeEvent>> {
        return Consumer { input -> input.foreach { key, value -> println("Key: $key, value: $value") } }
    }
}

fun main(args: Array<String>) {
    runApplication<ConsumerApplication>(*args)
}
My application.yml file is as follows.
spring:
  cloud:
    function:
      definition: consume
    stream:
      bindings:
        consume-in-0:
          destination: "some-event"
          group: "some-event"
My dependencies in build.gradle.kts are defined as follows (just included the relevant ones here).
extra["springCloudVersion"] = "2020.0.2"
dependencies {
implementation("org.jetbrains.kotlin:kotlin-reflect")
implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
implementation("org.springframework.cloud:spring-cloud-stream")
implementation("org.springframework.cloud:spring-cloud-stream-binder-kafka-streams")
testImplementation("org.springframework.boot:spring-boot-starter-test")
}
dependencyManagement {
imports {
mavenBom("org.springframework.cloud:spring-cloud-dependencies:${property("springCloudVersion")}")
}
}
When I run the application I get the following exception.
org.springframework.context.ApplicationContextException: Failed to start bean 'streamsBuilderFactoryManager'; nested exception is org.springframework.kafka.KafkaException: Could not start stream: ; nested exception is java.lang.IllegalArgumentException: 'listener' cannot be null
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:181) ~[spring-context-5.3.5.jar:5.3.5]
at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:54) ~[spring-context-5.3.5.jar:5.3.5]
at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:356) ~[spring-context-5.3.5.jar:5.3.5]
at java.base/java.lang.Iterable.forEach(Iterable.java:75) ~[na:na]
at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:155) ~[spring-context-5.3.5.jar:5.3.5]
at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:123) ~[spring-context-5.3.5.jar:5.3.5]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:935) ~[spring-context-5.3.5.jar:5.3.5]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:586) ~[spring-context-5.3.5.jar:5.3.5]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:769) ~[spring-boot-2.4.4.jar:2.4.4]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:761) ~[spring-boot-2.4.4.jar:2.4.4]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:426) ~[spring-boot-2.4.4.jar:2.4.4]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:326) ~[spring-boot-2.4.4.jar:2.4.4]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1313) ~[spring-boot-2.4.4.jar:2.4.4]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1302) ~[spring-boot-2.4.4.jar:2.4.4]
at com.some.events.consumer.ConsumerApplicationKt.main(ConsumerApplication.kt:22) ~[main/:na]
Caused by: org.springframework.kafka.KafkaException: Could not start stream: ; nested exception is java.lang.IllegalArgumentException: 'listener' cannot be null
at org.springframework.cloud.stream.binder.kafka.streams.StreamsBuilderFactoryManager.start(StreamsBuilderFactoryManager.java:94) ~[spring-cloud-stream-binder-kafka-streams-3.1.2.jar:3.1.2]
at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:178) ~[spring-context-5.3.5.jar:5.3.5]
... 14 common frames omitted
Caused by: java.lang.IllegalArgumentException: 'listener' cannot be null
at org.springframework.util.Assert.notNull(Assert.java:201) ~[spring-core-5.3.5.jar:5.3.5]
at org.springframework.kafka.config.StreamsBuilderFactoryBean.addListener(StreamsBuilderFactoryBean.java:268) ~[spring-kafka-2.6.7.jar:2.6.7]
at org.springframework.cloud.stream.binder.kafka.streams.StreamsBuilderFactoryManager.start(StreamsBuilderFactoryManager.java:84) ~[spring-cloud-stream-binder-kafka-streams-3.1.2.jar:3.1.2]
... 15 common frames omitted
Process finished with exit code 1
Note that I am aware that I need to configure the Serde and Avro related things (I am using Avro for event schema), but the thing is, the stream won't even run.
Can someone point me in the right direction? I tried googling this but no one has posted an issue where it's caused by 'listener' cannot be null. Thanks!
This is a bug; it is fixed in the 3.1.3-SNAPSHOT
https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/commit/f25dbff2b7fc0d0c742dd674a9e392057a34c86d
https://github.com/spring-cloud/spring-cloud-stream-binder-kafka/issues/1030#issuecomment-804039087
I am not sure about the comment there; adding micrometer to the class path should resolve it.
The destination: "some-event" should point to a kafka topic. Like destination: "some-event-topic".
Then you have to create an interface for the listener consume-in-0. Using the spring annotations will make the project load this listener and it will not be null anymore.
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.Input;

public interface KafkaListenerBinding {

    @Input("consume-in-0")
    KStream<String, String> inputStream();
}
Then you create a @Service that processes messages from the listener with @StreamListener("consume-in-0").
import lombok.extern.log4j.Log4j2;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.stereotype.Service;

@Log4j2
@Service
@EnableBinding(KafkaListenerBinding.class)
public class KafkaListenerService {

    @StreamListener("consume-in-0")
    public void process(KStream<String, String> input) {
        input.foreach((k, v) -> log.info(String.format("Key: %s, Value: %s", k, v)));
    }
}
NOTE: Despite the bug mentioned by @Gary Russell, I am going to complete my answer with the functional way to implement the Spring service.
The functional style can be achieved by defining the function in the application.yml file. There is an internal convention of using the name of the function with the suffixes in-0 and out-0 for the bindings. You have to use this when defining the binding. More details here.
spring:
  cloud:
    stream:
      function:
        definition: transformToUpperCase
      bindings:
        transformToUpperCase-in-0:
          destination: input-func-topic
        transformToUpperCase-out-0:
          destination: output-func-topic
Then you annotate your class with @Configuration and @EnableAutoConfiguration and make sure that the bean method name matches what you defined in application.yml for function.definition.
import java.util.function.Function;
import lombok.extern.log4j.Log4j2;
import org.apache.kafka.streams.kstream.KStream;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Log4j2
@Configuration
@EnableAutoConfiguration
public class KafkaListenerFunctionalService {

    @Bean
    public Function<KStream<String, String>, KStream<String, String>> transformToUpperCase() {
        return input -> input
                .peek((k, v) -> log.info("Functional received Input: {}", v))
                .mapValues(i -> i.toUpperCase());
    }
}
I had the same problem. First I added the io.micrometer dependency (install the latest version from Maven), and second I created a bean for SimpleMeterRegistry, which solved the problem:
@Bean
SimpleMeterRegistry simpleMeterRegistry() {
    return new SimpleMeterRegistry();
}

Dropwizard testing - ResourceTestRule throwing NoClassDefFoundError: ch/qos/logback/core/filter/Filter

I am using dropwizard 1.2.4 with log4j 1.2.17. I have followed the instructions as mentioned below
https://github.com/arteam/dropwizard-nologback/
It is throwing the exception below during unit testing.
java.lang.NoClassDefFoundError: ch/qos/logback/core/filter/Filter
at io.dropwizard.testing.junit.ResourceTestRule.<clinit>(ResourceTestRule.java:34)
at com.vnera.restapilayer.api.resources.ApiInfoControllerTest.<clinit>(ApiInfoControllerTest.java:25)
at sun.misc.Unsafe.ensureClassInitialized(Native Method)
at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:156)
at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
at java.lang.reflect.Field.get(Field.java:393)
at org.junit.runners.model.FrameworkField.get(FrameworkField.java:73)
at org.junit.runners.model.TestClass.getAnnotatedFieldValues(TestClass.java:230)
at org.junit.runners.ParentRunner.classRules(ParentRunner.java:255)
at org.junit.runners.ParentRunner.withClassRules(ParentRunner.java:244)
at org.junit.runners.ParentRunner.classBlock(ParentRunner.java:194)
at org.junit.runners.ParentRunner.run(ParentRunner.java:362)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: java.lang.ClassNotFoundException: ch.qos.logback.core.filter.Filter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 19 more
My test code looks like the following:
import io.dropwizard.testing.junit.ResourceTestRule;
import org.junit.Assert;
import org.junit.ClassRule;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import javax.ws.rs.core.Response;
import static org.mockito.Mockito.mock;

@Category(value = UnitTest.class)
public class ApiInfoControllerTest {

    private static ApiNonFunctionalHandler nonFunctionalHandler = mock(ApiNonFunctionalHandler.class);
    private static ApiFilter apiFilter = new ApiFilter(nonFunctionalHandler);
    private static final String authToken = "NetworkInsight xTyAGJmZ8nU8yJDP7LnA8Q==";

    @ClassRule
    public static final ResourceTestRule resources = ResourceTestRule.builder()
            .addResource(new ApiInfoController())
            .addProvider(apiFilter).build();

    @Test
    public void testApiVersion() throws Exception {
        Response response = resources.client()
                .target(ApiConstants.INFO_BASE_URL + "/version")
                .request()
                .header("Authorization", authToken)
                .buildGet().invoke();

        Assert.assertNotNull(response);
        Assert.assertEquals(response.toString(), Response.Status.OK.getStatusCode(), response.getStatus());

        final VersionResponse actualError = response.readEntity(VersionResponse.class);
        Assert.assertEquals(actualError.getApiVersion(), ApiConstants.API_VERSION);
    }
}
My main application is working fine. The configuration.yaml for the main application looks like this:
# Change default server ports
server:
  applicationConnectors:
    - type: http
      port: 8123
  adminConnectors:
    - type: http
      port: 8124
  requestLog:
    type: external
logging:
  type: external
Can someone let me know what could be going wrong and how I can get around this?
EDIT
Output of mvn dependency:tree is placed here as I am hitting the character limit here.
This is a bug in dropwizard 1.2.4 as discussed below
https://github.com/dropwizard/dropwizard/pull/2338
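Until a release containing that fix is available, one possible workaround (an assumption based on the missing class, not something taken from the linked PR) is to put logback back on the test classpath so ResourceTestRule can load ch.qos.logback.core.filter.Filter, for example:

<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.3</version> <!-- illustrative version -->
    <scope>test</scope>
</dependency>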

spring boot and camel throws direct.DirectConsumerNotAvailableException

I'm trying to get a simple example of Spring Boot and Camel working but have come undone. I'm not sure what I'm doing wrong. In the Gradle build I've included the following so far:
dependencies {
    compile 'org.apache.camel:camel-spring-boot-starter:2.18.4'
    compile 'org.apache.camel:camel-groovy:2.18.4'
    compile 'org.apache.camel:camel-stream:2.18.4'
    compile 'org.codehaus.groovy:groovy-all:2.4.11'
    testCompile group: 'junit', name: 'junit', version: '4.11'
    testCompile group: 'junit', name: 'junit', version: '4.12'
}
I've created a DirectRoute component like this:
@Component
class DirectRoute extends RouteBuilder {

    @Override
    void configure() throws Exception {
        from("direct:in")   //tried stream:in also
            .to("stream:out")
    }
}
I then have a driver bean that tries to invoke the route:
@Component
public class HelloImpl implements Hello {

    @Produce(uri = "direct:in")
    private ProducerTemplate template;

    @Override
    public String say(String value) throws ExecutionException, InterruptedException {
        assert template
        println "def endpoint is : " + template.getDefaultEndpoint()
        return template.sendBody(template.getDefaultEndpoint(), value)
    }
}
Lastly, in the Spring Boot application class I added a command line runner like this, which gets my bean from the Spring context and invokes the say method. I'm using Groovy, so I just passed a closure to the command line runner:
@Bean
public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
    //return closure to run on startup - just list the beans enabled
    { args ->
        println("Let's inspect the beans provided by Spring Boot:")
        String[] beanNames = ctx.getBeanDefinitionNames()
        Arrays.sort(beanNames)
        for (String beanName : beanNames) {
            println(beanName)
        }

        println("call the direct:start route via the service")
        Hello service = ctx.getBean("helloService")
        def result = service.say("William")
        println "service returned : $result "
    }
}
When I run my application I get all the bean names printed out (that's OK); however, when I invoke direct:in via the producer template I get this error (org.apache.camel.component.direct.DirectConsumerNotAvailableException), see below.
I was expecting the route to be triggered and the name I sent to arrive in the output stream, but this is what I get instead.
Caused by: org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[ID-MONSTER-PC2-58911-1496920205300-0-2]
at org.apache.camel.util.ObjectHelper.wrapCamelExecutionException(ObjectHelper.java:1795) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.util.ExchangeHelper.extractResultBody(ExchangeHelper.java:677) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.extractResultBody(DefaultProducerTemplate.java:515) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.extractResultBody(DefaultProducerTemplate.java:511) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.sendBody(DefaultProducerTemplate.java:163) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.ProducerTemplate$sendBody$0.call(Unknown Source) ~[na:na]
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133) [groovy-all-2.4.11.jar:2.4.11]
at services.HelloImpl.say(HelloImpl.groovy:29) ~[main/:na]
at services.Hello$say.call(Unknown Source) ~[na:na]
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113) [groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125) [groovy-all-2.4.11.jar:2.4.11]
at application.Application$_commandLineRunner_closure1.doCall(Application.groovy:47) ~[main/:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_121]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_121]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93) ~[groovy-all-2.4.11.jar:2.4.11]
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325) ~[groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:294) ~[groovy-all-2.4.11.jar:2.4.11]
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022) ~[groovy-all-2.4.11.jar:2.4.11]
at groovy.lang.Closure.call(Closure.java:414) ~[groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.ConvertedClosure.invokeCustom(ConvertedClosure.java:54) ~[groovy-all-2.4.11.jar:2.4.11]
at org.codehaus.groovy.runtime.ConversionHandler.invoke(ConversionHandler.java:124) ~[groovy-all-2.4.11.jar:2.4.11]
at com.sun.proxy.$Proxy44.run(Unknown Source) ~[na:na]
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:776) [spring-boot-1.5.2.RELEASE.jar:1.5.2.RELEASE]
... 10 common frames omitted
Caused by: org.apache.camel.component.direct.DirectConsumerNotAvailableException: No consumers available on endpoint: direct://in. Exchange[ID-MONSTER-PC2-58911-1496920205300-0-2]
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:55) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:197) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:97) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache$1.doInProducer(ProducerCache.java:529) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache$1.doInProducer(ProducerCache.java:497) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache.doInProducer(ProducerCache.java:365) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache.sendExchange(ProducerCache.java:497) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.ProducerCache.send(ProducerCache.java:225) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.send(DefaultProducerTemplate.java:144) ~[camel-core-2.18.4.jar:2.18.4]
at org.apache.camel.impl.DefaultProducerTemplate.sendBody(DefaultProducerTemplate.java:161) ~[camel-core-2.18.4.jar:2.18.4]
What have I done wrong, and why does the producer template invocation on 'direct:in' (I also tried stream:in with the same problem) not work? I thought that .to("stream:out") would be a consumer.
Any pointers or advice gratefully received at this point.
I have an update on my problems:
I had a subpackage with the application class annotated with @SpringBootApplication. So yes, unadorned it only scans subpackages.
You can add the scanBasePackages= or scanBaseClasses= parameter; however, when I tried doing a scan for a single class, it seemed to scan the whole directory anyway and grabbed the others as well.
I refactored the app to have a single root package with subpackages and elected to set scanBasePackages to the new root package, but left the Application class in its own subpackage (personal preference only; the documentation suggests leaving the Application class in the root package).
You can now add other classes annotated with @Configuration to generate beans, or use the basic @Component.
If you create Camel routes annotated with @Component they will be auto-configured in the camelContext for you.
It appears that by default Spring is not starting the camelContext for you. When I checked the status of the context it showed as starting and not started, so in my commandLineRunner I had to get the Spring-injected camelContext, start it myself, and stop it when I finished. I was slightly surprised, as I thought the Spring Boot starter would auto-start the camelContext, but it appears not.
Once you have Spring component scanning etc. working and you start the camelContext, the problems with the org.apache.camel.component.direct.DirectConsumerNotAvailableException exception went away and things started to work, at least for the baby examples I'm trying.
So the revised structure now looks like this:
The revised Application class now looks like this, with some simple println output to see the state of the context and the beans in the Spring ctx. The helloService bean is still the proxy I use to set up the producer template to call the DirectRoute.
package com.softwood.application

import groovy.util.logging.Slf4j
import org.apache.camel.CamelContext
import org.springframework.beans.factory.annotation.Autowired
import com.softwood.services.Hello

/**
 * Created by willw on 07/06/2017.
 */
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean

@Slf4j //inject logger
@SpringBootApplication(scanBasePackages = ["com.softwood"]) //forces scan at parent
// same as @Configuration @EnableAutoConfiguration @ComponentScan with 'defaults' e.g. sub packages
public class Application {

    @Autowired
    ApplicationContext ctx

    @Autowired
    CamelContext camelContext

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args)
    }

    @Bean
    public CommandLineRunner commandLineRunner(ApplicationContext ctx) {
        //return closure to run on startup - just list the beans enabled
        { args ->
            println("Let's inspect the beans provided by Spring Boot:")
            String[] beanNames = ctx.getBeanDefinitionNames()
            Arrays.sort(beanNames)
            for (String beanName : beanNames) {
                println(beanName)
            }

            /* when component scan is working - bean routes are added
               automatically to camel context via springBoot, however you do have to start
               the camel context, yourself
             */
            println "camelCtx has following components : " + camelContext.componentNames
            println "camelCtx state is : " + camelContext.status
            println "starting camel context"
            camelContext.start()
            println "camelCtx state now is : " + camelContext.status

            //log.debug "wills logging call "
            println("call the direct:start route via the service")
            Hello service = ctx.getBean("helloService")
            def result = service.say("William")
            println "service returned : $result "

            println "sleep 5 seconds "
            sleep(5000)

            println "stop camel context"
            camelContext.stop()
            println "camelCtx state now is : " + camelContext.status
        }
    }
}
That proxy is just registered as a simple bean like this in the Spring context:
package com.softwood.services

/**
 * Created by willw on 07/06/2017.
 */
import org.apache.camel.Produce;
import org.apache.camel.ProducerTemplate
import org.springframework.stereotype.Component;

import java.util.concurrent.ExecutionException

@Component
public class HelloImpl implements Hello {

    @Produce(uri = "direct:in") /* ?block=true */
    private ProducerTemplate template

    @Override
    public String say(String value) throws ExecutionException, InterruptedException {
        assert template
        println "def endpoint is : " + template.getDefaultEndpoint()

        //Future future = template.asyncSendBody(template.getDefaultEndpoint(), value)
        //return future.get()
        return template.sendBody(template.getDefaultEndpoint(), value)
    }
}
The TimedRoute just sorts itself out, with no template required to invoke it:
package com.softwood.camelRoutes

/**
 * Created by willw on 07/06/2017.
 */
import org.apache.camel.builder.RouteBuilder
import org.springframework.stereotype.Component

@Component
class TimedRoute extends RouteBuilder {

    @Override
    void configure() throws Exception {
        from("timer:foo")
            .to("log:com.softwood.application.Application?level=WARN")
    }
}
My simple no-op file route isn't working (yet) and I'm not sure why. I suspect I've not got the file config right somehow; some playing is required.
package com.softwood.camelRoutes

import org.apache.camel.builder.RouteBuilder
import org.springframework.stereotype.Component

/**
 * Created by willw on 08/06/2017.
 */
@Component
class FileNoOpRoute extends RouteBuilder {

    @Override
    void configure() throws Exception {
        from("file:../com.softwood.file-inbox?recursive=true&noop=true&idempotent=true")
            .to("file:../com.softwood.file-outbox")
    }
}
However, the basics are now working and at least Camel is doing something, whereas before I just had the exception and nothing.
I have found another question on Spring config highlighting some of the above also.

How to make Weld lookup class on generated-sources

I have a Maven project in which I'm using MapStruct to generate mappers that help with translating entities into DTOs and vice versa.
These mappers are generated during the generate-sources phase of Maven and stored in the target/generated-sources and target/AppName/WEB-INF/classes folders.
For example, I have this mapper:
@Mapper
public interface RuleMapper {

    RuleDto ruletoDto(Rule rule);

    //other cool stuff
}
I configured MapStruct to use CDI, so it will generate the following:
@Generated(
    value = "org.mapstruct.ap.MappingProcessor",
    date = "2016-12-19T23:19:36-0200",
    comments = "version: 1.1.0.CR1, compiler: javac, environment: Java 1.8.0_112"
)
@Singleton
@Named
public class RuleMapperImpl implements RuleMapper {

    @Override
    public RuleDto ruletoDto(Rule rule) {
        RuleDto ruleDto = new RuleDto();
        if ( rule != null ) {
            ruleDto.setIdRule( rule.getIdRule() );
        }
        return ruleDto;
    }
}
It works perfectly when running on a WildFly server. The problem is that I'm trying to JUnit-test this class, and for this I implemented a custom runner, as shown below:
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

public class WeldJUnit4Runner extends BlockJUnit4ClassRunner {

    public WeldJUnit4Runner(Class<Object> clazz) throws InitializationError {
        super(clazz);
    }

    @Override
    protected Object createTest() throws Exception {
        final Class<?> test = getTestClass().getJavaClass();
        return WeldContext.INSTANCE.getBean(test);
    }
}
And:
import org.jboss.weld.environment.se.Weld;
import org.jboss.weld.environment.se.WeldContainer;

public class WeldContext {

    public static final WeldContext INSTANCE = new WeldContext();

    private final Weld weld;
    private final WeldContainer container;

    private WeldContext() {
        this.weld = new Weld();
        this.container = weld.initialize();
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                weld.shutdown();
            }
        });
    }

    public <T> T getBean(Class<T> type) {
        return container.instance().select(type).get();
    }
}
These implementations were taken from here.
Finally, the test:
@RunWith(WeldJUnit4Runner.class)
public class RuleMapperTest {

    @Inject
    private RuleMapper ruleMapper;

    @Test
    public void coolTestName() {
        Assert.assertTrue(Boolean.TRUE);
    }
}
When I try to run, this is the console output:
log4j:WARN No appenders could be found for logger (org.jboss.logging).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
a warning about logging, and then the following exception:
java.lang.ExceptionInInitializerError
at br.com.treinoos.common.cdi.WeldJUnit4Runner.createTest(WeldJUnit4Runner.java:15)
at org.junit.runners.BlockJUnit4ClassRunner$1.runReflectiveCall(BlockJUnit4ClassRunner.java:266)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.BlockJUnit4ClassRunner.methodBlock(BlockJUnit4ClassRunner.java:263)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:675)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Caused by: org.jboss.weld.exceptions.DeploymentException: WELD-001408: Unsatisfied dependencies for type RuleMapper with qualifiers @Default
at injection point [BackedAnnotatedField] @Inject private br.com.treinoos.model.core.business.treinoos.mappers.RuleMapperTest.ruleMapper
at br.com.treinoos.model.core.business.treinoos.mappers.RuleMapperTest.ruleMapper(RuleMapperTest.java:0)
at org.jboss.weld.bootstrap.Validator.validateInjectionPointForDeploymentProblems(Validator.java:359)
at org.jboss.weld.bootstrap.Validator.validateInjectionPoint(Validator.java:281)
at org.jboss.weld.bootstrap.Validator.validateGeneralBean(Validator.java:134)
at org.jboss.weld.bootstrap.Validator.validateRIBean(Validator.java:155)
at org.jboss.weld.bootstrap.Validator.validateBean(Validator.java:518)
at org.jboss.weld.bootstrap.ConcurrentValidator$1.doWork(ConcurrentValidator.java:68)
at org.jboss.weld.bootstrap.ConcurrentValidator$1.doWork(ConcurrentValidator.java:66)
at org.jboss.weld.executor.IterativeWorkerTaskFactory$1.call(IterativeWorkerTaskFactory.java:63)
at org.jboss.weld.executor.IterativeWorkerTaskFactory$1.call(IterativeWorkerTaskFactory.java:56)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
It's as if Weld wasn't able to look up the generated class.
The beans.xml is already created under src/test/resources/META-INF/beans.xml:
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
version="1.1" bean-discovery-mode="all">
</beans>
Can anybody point me to a solution to this problem? I've already searched for something similar, but with no success.
Here's a full explanation of your problem and why what I wrote fixes it.
In Maven, you have at least two classloaders. Your test classpath and your main classpath each have their own classloader. You can have others depending on your dependency structure. CDI identifies each classloader as a separate bean archive when running this way. src/main/webapp is explicitly for your WAR file; the beans.xml there does not give you a bean archive. Adding one to src/main/resources does. This problem is specific to how you're instantiating Weld.
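In other words, a beans.xml under src/main/resources/META-INF (same content as the test one already shown in the question) gives the main classpath its own bean archive:

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
                           http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       version="1.1" bean-discovery-mode="all">
</beans>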
There are other projects that do this correctly - CDI-unit and Arquillian, specifically the Weld Embedded container. If you were to use one of these, this would not be an issue.

Tomcat 7.0.50 fails to initialize due to websocket deployment exception

I am trying to deploy WebSockets on Tomcat 7.0.50. The following is my code:
@ServerEndpoint(value = "/ws/fileuploadtracker/")
public class FileUploadTrackerEndPoint {

    @OnOpen
    public void onOpen(Session session) {
        .....
    }

    @OnMessage
    public void onMessage(Session session, String msg) {
        try {
            session.getBasicRemote().sendText(msg);
        } catch (IOException e) {
            logger.error(e.getMessage());
        }
    }
}
I picked the above code from Oracle's Java EE tutorial notes on WebSockets, from this link:
http://docs.oracle.com/javaee/7/tutorial/doc/websocket004.htm
My Tomcat fails to start with the following exception:
SEVERE: Error during ServletContainerInitializer processing
javax.servlet.ServletException: javax.websocket.DeploymentException: A parameter of type [interface javax.websocket.Session] was found on method[onOpen] of class [java.lang.reflect.Method] that did not have a @PathParam annotation
at org.apache.tomcat.websocket.server.WsSci.onStartup(WsSci.java:146)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5444)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: javax.websocket.DeploymentException: A parameter of type [interface javax.websocket.Session] was found on method[onOpen] of class [java.lang.reflect.Method] that did not have a @PathParam annotation
at org.apache.tomcat.websocket.pojo.PojoMethodMapping.getPathParams(PojoMethodMapping.java:233)
at org.apache.tomcat.websocket.pojo.PojoMethodMapping.<init>(PojoMethodMapping.java:122)
at org.apache.tomcat.websocket.server.WsServerContainer.addEndpoint(WsServerContainer.java:239)
at org.apache.tomcat.websocket.server.WsSci.onStartup(WsSci.java:143)
... 8 more
1) How am I supposed to configure server endpoints as annotated classes on Tomcat 7?
2) Is there any Tomcat-specific way to write annotated server endpoints?
3) If Tomcat implements JSR 356, why doesn't it support the above config?
I tried hard to find a suitable example but couldn't. I also tried adding the @PathParam annotation, but it only accepts strings and throws a ClassCastException.
It's also possible you used the wrong @PathParam; make sure you use:
import javax.websocket.server.PathParam;
and not:
import javax.ws.rs.PathParam;
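As an illustration only (the class and parameter names here are made up), an endpoint that actually takes a path parameter would use the javax.websocket.server import like this:

import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.PathParam;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/ws/fileuploadtracker/{uploadId}")
public class FileUploadTrackerPathEndpoint {

    @OnOpen
    public void onOpen(Session session, @PathParam("uploadId") String uploadId) {
        // the Session parameter needs no annotation; only URI template
        // parameters such as uploadId are annotated with @PathParam
    }
}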
I was using the 1.b09 API; I changed it to 1.0.
Lesson learnt: always go for stable versions and not beta ones.
