How to stop polling after a message is received? (Spring Integration)

I want to poll for a file in a directory and stop polling once the file is found. I am very new to the Spring framework and a lot of it is still very confusing. After doing some research, I found a couple of ways of doing this but haven't had any luck with any of them.
One way is to use a control bus, as shown here. However, the polling just seems to stop after 2 seconds, and I am not sure how to add the condition to stop only when a file is received.
Another way is to use "Smart Polling", as answered here. The link in the answer is old, but it points to the official Spring docs here: Smart Polling. Through that article, I learned about AbstractMessageSourceAdvice and SimpleActiveIdleMessageSourceAdvice. The latter seems to suit my goal and would be the simplest to implement, so I decided to give it a go. My code is below:
IntegrationConfig.java
package com.example.springexample;
import java.io.File;
import org.aopalliance.aop.Advice;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.aop.SimpleActiveIdleMessageSourceAdvice;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.filters.SimplePatternFileListFilter;
import org.springframework.integration.util.DynamicPeriodicTrigger;
import org.springframework.messaging.MessageChannel;
@Configuration
@EnableIntegration
public class IntegrationConfig {
@Bean
public IntegrationFlow advised() {
return IntegrationFlows.from("fileInputChannel")
.handle("runBatchScript", "run", c -> c.advice(stopPollingAdvice()))
.get();
}
@Bean
public MessageChannel fileInputChannel() {
return new DirectChannel();
}
@Bean
@InboundChannelAdapter(value = "fileInputChannel", poller = @Poller(fixedDelay = "1000"))
public MessageSource<File> fileReadingMessageSource() {
FileReadingMessageSource source = new FileReadingMessageSource();
source.setDirectory(new File("."));
source.setFilter(new SimplePatternFileListFilter("*.bat"));
return source;
}
@Bean
public RunBatchScript runBatchScript() {
return new RunBatchScript();
}
@Bean
public Advice stopPollingAdvice() {
DynamicPeriodicTrigger trigger = new DynamicPeriodicTrigger(10000);
SimpleActiveIdleMessageSourceAdvice advice = new SimpleActiveIdleMessageSourceAdvice(trigger);
advice.setActivePollPeriod(60000);
return advice;
}
}
RunBatchScript.java
package com.example.springexample;
import java.io.IOException;
import java.util.Date;
import java.util.logging.Logger;
public class RunBatchScript {
Logger logger = Logger.getLogger(RunBatchScript.class.getName());
public void run() throws IOException {
logger.info("Running the batch script at " + new Date());
Runtime.getRuntime().exec("cmd.exe /c simplebatchscript.bat");
logger.info("Finished running the batch script at " + new Date());
}
}
SpringExampleApplication.java
package com.example.springexample;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
@SpringBootApplication
public class SpringExampleApplication {
public static void main(String[] args) {
SpringApplication.run(SpringExampleApplication.class, args);
}
}
I used this and this as the basis for my code. However, it doesn't seem to be working: the poller still polls every 1 second instead of the new 10 seconds or 60 seconds. Moreover, I am not sure how to actually stop the poller. I tried passing null to the constructor of SimpleActiveIdleMessageSourceAdvice, but it just throws a NullPointerException.
The output when I run the application:
2020-03-15 13:57:46.081 INFO 37504 --- [ask-scheduler-1] c.example.springexample.RunBatchScript : Running the batch script at Sun Mar 15 13:57:46 SRET 2020
2020-03-15 13:57:46.084 INFO 37504 --- [ask-scheduler-1] c.example.springexample.RunBatchScript : Finished running the batch script at Sun Mar 15 13:57:46 SRET 2020
2020-03-15 13:57:47.085 INFO 37504 --- [ask-scheduler-2] c.example.springexample.RunBatchScript : Running the batch script at Sun Mar 15 13:57:47 SRET 2020
2020-03-15 13:57:47.087 INFO 37504 --- [ask-scheduler-2] c.example.springexample.RunBatchScript : Finished running the batch script at Sun Mar 15 13:57:47 SRET 2020
2020-03-15 13:57:48.089 INFO 37504 --- [ask-scheduler-1] c.example.springexample.RunBatchScript : Running the batch script at Sun Mar 15 13:57:48 SRET 2020
2020-03-15 13:57:48.092 INFO 37504 --- [ask-scheduler-1] c.example.springexample.RunBatchScript : Finished running the batch script at Sun Mar 15 13:57:48 SRET 2020
2020-03-15 13:57:49.093 INFO 37504 --- [ask-scheduler-3] c.example.springexample.RunBatchScript : Running the batch script at Sun Mar 15 13:57:49 SRET 2020
2020-03-15 13:57:49.096 INFO 37504 --- [ask-scheduler-3] c.example.springexample.RunBatchScript : Finished running the batch script at Sun Mar 15 13:57:49 SRET 2020
Any help with some code is greatly appreciated.

You should apply SimpleActiveIdleMessageSourceAdvice to the @InboundChannelAdapter. Also, the trigger given to SimpleActiveIdleMessageSourceAdvice must be the same trigger instance that is used to poll the files:
@Bean
@EndpointId("fileInboundChannelAdapter")
@InboundChannelAdapter(value = "fileInputChannel", poller = @Poller("fileReadingMessageSourcePollerMetadata"))
public MessageSource<File> fileReadingMessageSource() {
FileReadingMessageSource source = new FileReadingMessageSource();
source.setDirectory(new File("."));
source.setFilter(new SimplePatternFileListFilter("*.bat"));
return source;
}
@Bean
public PollerMetadata fileReadingMessageSourcePollerMetadata() {
PollerMetadata meta = new PollerMetadata();
DynamicPeriodicTrigger trigger = new DynamicPeriodicTrigger(1000);
SimpleActiveIdleMessageSourceAdvice advice = new SimpleActiveIdleMessageSourceAdvice(trigger);
advice.setActivePollPeriod(60000);
meta.setTrigger(trigger);
meta.setAdviceChain(List.of(advice));
meta.setMaxMessagesPerPoll(1);
return meta;
}
Please note that SimpleActiveIdleMessageSourceAdvice only changes the next time the files will be polled. You can push that next poll far into the future, say several thousand years from now, which more or less achieves your intention of never polling the files again in your lifetime, but the scheduler thread that polls the files remains active.
If you really want to shut down that scheduler thread too, you can send a shutdown signal to the control bus.
First, define a control bus:
@Bean
public IntegrationFlow controlBusFlow() {
return IntegrationFlows.from("controlBus")
.controlBus()
.get();
}
Then implement an AbstractMessageSourceAdvice that sends a shutdown signal to the control bus after a file is polled:
@Service
public class StopPollingAdvice extends AbstractMessageSourceAdvice {
@Lazy
@Qualifier("controlBus")
@Autowired
private MessageChannel controlBusChannel;
@Override
public boolean beforeReceive(MessageSource<?> source) {
return super.beforeReceive(source);
}
@Override
public Message<?> afterReceive(Message<?> result, MessageSource<?> source) {
Message operation = MessageBuilder.withPayload("#fileInboundChannelAdapter.stop()").build();
controlBusChannel.send(operation);
return result;
}
}
and change the PollerMetadata that polls files to:
@Bean
public PollerMetadata fileReadingMessageSourcePollerMetadata(StopPollingAdvice stopPollingAdvice) {
PollerMetadata meta = new PollerMetadata();
meta.setTrigger(new PeriodicTrigger(1000));
meta.setAdviceChain(List.of(stopPollingAdvice));
meta.setMaxMessagesPerPoll(1);
return meta;
}
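As an aside (a sketch, not from the original answer): if you only need to stop the adapter itself, the endpoint registered under @EndpointId("fileInboundChannelAdapter") implements Spring's Lifecycle interface, so it can also be stopped directly instead of through a control-bus expression:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.Lifecycle;
import org.springframework.stereotype.Service;

@Service
public class AdapterStopper {

    // The inbound channel adapter endpoint implements Lifecycle,
    // so it can be injected by its endpoint id and stopped in code.
    @Autowired
    @Qualifier("fileInboundChannelAdapter")
    private Lifecycle fileInboundChannelAdapter;

    public void stopPolling() {
        if (fileInboundChannelAdapter.isRunning()) {
            fileInboundChannelAdapter.stop();
        }
    }
}
```

This stops the adapter's polling, though whether the underlying task-scheduler thread is released still depends on the scheduler configuration, so the control-bus approach remains the more thorough shutdown.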

Related

How to configure Spring Boot App to connect to Axon Server and register event handler

First, I have a Spring Boot application using Axon 4.5.5 libraries connecting to an Axon Server 4.2.4. Is this supported?
This Spring Boot app is supposed to listen to several events coming from the main app (emitted to the Axon Server); my Spring Boot Axon client configuration from application.yml is below.
Second, this application connects to the Axon Server but fails to handle any events, causing all of said events to be blacklisted. I have traced it to the event handler registration, which is probably causing the issue. We are using the EventStore as the StreamableMessageSource when calling registerTrackingEventProcessor().
Do you have any ideas why the registered event handlers are not firing? I can see on the dashboard that the Spring Boot app is connected to the Axon Server, as is the main app that fires the events. I can also see the fired events when searching in the dashboard, and their blacklisting in the Axon Server log. So my guess is that the configuration is causing the issue.
Here are my library versions (from pom.xml):
Version 4.5.5
axon-test
axon-spring-boot-autoconfigure
axon-spring-boot-starter
axon-modelling
axon-metrics
axon-messaging
axon-eventsourcing
Version 2.6.0
spring-boot-starter-web
Here is my application.yml axon fragment:
axon:
  axonserver:
    client-id: reporting-etl-client
    component-name: reporting-etl
    query-threads: ${AXON_QUERY_THREADS_MAX:50}
    servers: ${AXON_AXONSERVER_SERVERS:axonserver}
  serializer:
    events: jackson
and AxonConfig.java:
package com.fedeee.reporting.config;
import com.fedeee.reporting.axon.events.EventHandlerProjector;
import com.thoughtworks.xstream.XStream;
import org.axonframework.axonserver.connector.query.AxonServerQueryBus;
import org.axonframework.config.EventProcessingConfigurer;
import org.axonframework.config.ProcessingGroup;
import org.axonframework.eventhandling.EventBus;
import org.axonframework.eventhandling.TrackingEventProcessorConfiguration;
import org.axonframework.eventhandling.gateway.DefaultEventGateway;
import org.axonframework.eventhandling.gateway.EventGateway;
import org.axonframework.queryhandling.DefaultQueryGateway;
import org.axonframework.queryhandling.QueryBus;
import org.axonframework.queryhandling.QueryGateway;
import org.axonframework.serialization.AbstractXStreamSerializer;
import org.axonframework.serialization.Serializer;
import org.axonframework.serialization.xml.XStreamSerializer;
import org.reflections.Reflections;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.Assert;
import java.util.Set;
@Configuration
public class AxonConfig {
private static final Logger LOG = LoggerFactory.getLogger(AxonConfig.class);
/**
* Correctly configuring the XStream serializer to avoid security warnings.
*/
@Autowired
public void configureXStream(Serializer serializer) {
if (serializer instanceof AbstractXStreamSerializer) {
XStream xStream = ((XStreamSerializer) serializer).getXStream();
XStream.setupDefaultSecurity(xStream);
xStream.allowTypesByWildcard(new String[] {"com.fedeee.pkg.api.events.**", "org.axonframework.**"});
}
}
/**
*
@param configurer
@param context
*/
@Autowired
public void configure(EventProcessingConfigurer configurer, ApplicationContext context) {
LOG.info("Setting up TrackingEventProcessors for threads, batch size and other configurations..."
+ " annotated with @ProcessingGroup...");
// find classes in the com.fedeee.* package that have methods annotated with @ProcessingGroup to configure
Reflections reflections = new Reflections("com.fedeee.reporting.axon.events");
Set<Class<?>> annotatedClasses = reflections.getTypesAnnotatedWith(ProcessingGroup.class);
// Configure each identified class
annotatedClasses.stream().forEach(annotatedClass -> {
// Locate the appropriate spring bean to get appropriate values from each one.
String beanName = annotatedClass.getName().substring(annotatedClass.getName().lastIndexOf(".") + 1);
beanName = beanName.substring(0,1).toLowerCase() + beanName.substring(1);
Object projObj = context.getBean(beanName);
if (projObj instanceof EventHandlerProjector) {
EventHandlerProjector projector = (EventHandlerProjector) projObj;
LOG.info("Configuring EventHandlerProjector Bean '{}' with maxThreads: {} and batchSize: {}.",
beanName, projector.getMaxThreads(), projector.getBatchSize());
ProcessingGroup pgAnnotation = annotatedClass.getAnnotation(ProcessingGroup.class);
String processingGroup = pgAnnotation.value();
configurer.registerTrackingEventProcessor(
processingGroup,
org.axonframework.config.Configuration::eventStore,
conf -> TrackingEventProcessorConfiguration.forParallelProcessing(projector.getMaxThreads())
.andBatchSize(projector.getBatchSize())
).registerHandlerInterceptor(processingGroup, configuration -> new EventHandlerLoggingInterceptor());
// Enable logging for EventHandlers
LOG.info(".. '{}' successfully configured with processing group '{}'.", beanName, processingGroup);
} else {
LOG.info(".. '{}' failed to configure with any processing group.", beanName);
}
});
// TODO: handle tracking event processor initialization. See the axon mailing list thread:
// *****************************************************************************************************************
// https://groups.google.com/forum/#!topic/axonframework/eyw0rRiSzUw
// In that thread there is a discussion about properly initializing the token store to avoid recreating query models.
// I still need to understand more about this...
// *****************************************************************************************************************
}
// @Autowired
// public void configureErrorHandling(
// EventProcessingConfigurer configurer, ErrorHandler errorHandler
// ) {
// configurer.registerDefaultListenerInvocationErrorHandler(c -> errorHandler);
// }
@Autowired
public void registerInterceptors(QueryBus queryBus) {
Assert.notNull(queryBus, "Invalid configuration, queryBus is null!");
if (AxonServerQueryBus.class.isAssignableFrom(queryBus.getClass())) {
queryBus.registerHandlerInterceptor(InterceptorSupport.authorizationHandlerInterceptor());
}
}
@Bean
public QueryGateway queryGateway(QueryBus queryBus) {
return DefaultQueryGateway.builder().queryBus(queryBus).build();
}
@Bean
public EventGateway eventGateway(EventBus eventBus) {
return DefaultEventGateway.builder().eventBus(eventBus).build();
}
}
Here is the EventHandlerProjector.java:
package com.fedeee.reporting.axon.events;
/**
* Defines a contract to ensure we specify the number of threads and batch size to be used by the event handler's tracking processor.
*/
public interface EventHandlerProjector {
/**
* Specifies the number of records per batch to be handled by the tracking processor.
* @return
*/
public default Integer getBatchSize() {
return 1;
}
/**
* Specifies the maximum number of threads the tracking processor can be configured to use.
* @return
*/
public default Integer getMaxThreads() {
return 1;
}
}
And finally the event handler class:
package com.fedeee.reporting.axon.events.pkg;
import com.fedeee.api.UserInfo;
import com.fedeee.pkg.api.events.*;
import com.fedeee.reporting._shared.utils.MdcAutoClosable;
import com.fedeee.reporting.axon.events.EventHandlerProjector;
import com.fedeee.reporting.recover.packageevents.PackageCreatedDto;
import com.fedeee.reporting.recover.packageevents.RecoverPackageCreatedRequest;
import com.fedeee.reporting.recover.packageevents.RecoverPackageEditedDto;
import com.fedeee.reporting.recover.packageevents.RecoverPackageEventsService;
import com.fedeee.reporting.recover.translators.PackageEventTranslator;
import com.fedeee.reporting.recover.translators.UserInfoTranslator;
import com.fedeee.reporting.recover.user.RecoverUserDto;
import org.axonframework.config.ProcessingGroup;
import org.axonframework.eventhandling.EventHandler;
import org.axonframework.eventhandling.Timestamp;
import org.axonframework.messaging.annotation.MetaDataValue;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import java.time.Instant;
import static com.fedeee.api.AxonMessageMetadataKeys.USER_INFO;
/**
* Event TrackingProcessor for PackageRecord-based Events handled by Axon.
*
* Configurations can be provided with the given batchSize and maxThreads options via .env or docker-compose.
*
* IMPORTANT! <code>AxonConfig</code> looks for the <em>@ProcessingGroup</em> annotation to set everything up properly.
*/
@ProcessingGroup(value = "Package-Record")
@Component
public class PackageRecordProjector implements EventHandlerProjector {
@Value("${reporting-etl.tp-batch-size.package-record:1}")
private Integer batchSize;
@Value("${reporting-etl.tp-max-threads.package-record:5}")
private Integer maxThreads;
private RecoverPackageEventsService recoverPackageEventsService;
@Autowired
public PackageRecordProjector(RecoverPackageEventsService recoverPackageEventsService) {
super();
this.recoverPackageEventsService = recoverPackageEventsService;
}
private final Logger LOG = LoggerFactory.getLogger(this.getClass());
/**
* Event handler to handle packages created in Recover.
*
* This replaces the REST endpoint exposed and used by Recover in RecoverPackageEventsController.created().
*
@param event
@param occurrenceInstant
@param userInfo
*/
@EventHandler
void on(PackageCreated event, @Timestamp Instant occurrenceInstant, @MetaDataValue(USER_INFO) UserInfo userInfo) {
try (MdcAutoClosable mdc = new MdcAutoClosable()) {
mdcInit(event, userInfo, mdc);
LOG.info("Handling PackageCreated event...");
PackageCreatedDto createdDto = PackageEventTranslator.from(event, occurrenceInstant, userInfo);
RecoverUserDto recoverUserDto = UserInfoTranslator.from(userInfo);
RecoverPackageCreatedRequest request = new RecoverPackageCreatedRequest(createdDto, recoverUserDto);
/* Once we are ready, comment this in and make appropriate changes to RecoverPackageEventsController to
disallow duplication via the REST endpoint. (There are comments in there already. :) )
*/
recoverPackageEventsService.save(request);
LOG.info("Finished handling PackageCreated event.");
} catch (Exception e) {
LOG.info("An Exception has been thrown : ", e);
throw e;
}
}
@EventHandler
void on(PackageTypeCorrected event, @Timestamp Instant occurrenceInstant, @MetaDataValue(USER_INFO) UserInfo userInfo) {
try (MdcAutoClosable mdc = new MdcAutoClosable()) {
mdcInit(event, userInfo, mdc);
// TODO: not implemented (Recover and Reporting-ETL)
LOG.info("Finished handling PackageTypeCorrected event.");
} catch (Exception e) {
LOG.info("An Exception has been thrown : ", e);
throw e;
}
}
@EventHandler
void on(PackageDeleted event, @Timestamp Instant occurrenceInstant, @MetaDataValue(USER_INFO) UserInfo userInfo) {
try (MdcAutoClosable mdc = new MdcAutoClosable()) {
mdcInit(event, userInfo, mdc);
// TODO: not implemented (Reporting-ETL)
LOG.info("Finished handling PackageDeleted event.");
} catch (Exception e) {
LOG.info("An Exception has been thrown : ", e);
throw e;
}
}
// TODO: not implemented (Recover and Reporting-ETL)
// @EventHandler
// void on(PackageIntegrated event, @Timestamp Instant occurrenceInstant, @MetaDataValue(USER_INFO) UserInfo userInfo) {
// try (MdcAutoClosable mdc = new MdcAutoClosable()) {
// mdcInit(event, userInfo, mdc);
// } catch (Exception e) {
// LOG.info("An Exception has been thrown : ", e);
// throw e;
// }
// }
@EventHandler
void on(DataCaptureStarted event, @Timestamp Instant occurrenceInstant, @MetaDataValue(USER_INFO) UserInfo userInfo) {
try (MdcAutoClosable mdc = new MdcAutoClosable()) {
mdcInit(event, userInfo, mdc);
RecoverPackageEditedDto editedDto = PackageEventTranslator.from(event, occurrenceInstant, userInfo);
/* Once we are ready, comment this in and make appropriate changes to RecoverPackageEventsController to
disallow duplication via the REST endpoint. (There are comments in there already. :) )
*/
recoverPackageEventsService.save(editedDto);
LOG.info("Finished handling DataCaptureStarted event.");
} catch (Exception e) {
LOG.info("An Exception has been thrown : ", e);
throw e;
}
}
@EventHandler
void on(DataCaptureEnded event, @Timestamp Instant occurrenceInstant, @MetaDataValue(USER_INFO) UserInfo userInfo) {
try (MdcAutoClosable mdc = new MdcAutoClosable()) {
mdcInit(event, userInfo, mdc);
RecoverPackageEditedDto editedDto = PackageEventTranslator.from(event, occurrenceInstant, userInfo);
/* Once we are ready, comment this in and make appropriate changes to RecoverPackageEventsController to
disallow duplication via the REST endpoint. (There are comments in there already. :) )
*/
recoverPackageEventsService.update(event.getPackageId(), editedDto);
LOG.info("Finished handling DataCaptureEnded event.");
} catch (Exception e) {
LOG.info("An Exception has been thrown : ", e);
throw e;
}
}
@Override public Integer getBatchSize() {
return batchSize;
}
@Override public Integer getMaxThreads() {
return maxThreads;
}
private void mdcInit(PackageEvent event, UserInfo userInfo, MdcAutoClosable mdc) {
mdc.put("PackageId", event.getPackageId());
mdc.put("UserId", userInfo.getUserId());
LOG.info("Handling package record event: {}", event);
}
}
Here are the logs from today, 2023-01-27...
.
.
.
2023-01-27 17:19:32.924 DEBUG 8 --- [MessageBroker-4] i.a.a.message.command.CommandCache : Checking timed out commands
2023-01-27 17:19:37.924 DEBUG 8 --- [MessageBroker-4] i.a.axonserver.message.query.QueryCache : Checking timed out queries
2023-01-27 17:19:37.924 DEBUG 8 --- [MessageBroker-3] i.a.a.message.command.CommandCache : Checking timed out commands
2023-01-27 17:19:40.299 DEBUG 8 --- [ool-5-thread-16] i.a.axonserver.grpc.PlatformService : Registered client : ClientComponent{client='reporting-etl-client', component='reporting-etl', context='default'}
2023-01-27 17:19:40.299 INFO 8 --- [ool-5-thread-16] i.a.a.logging.TopologyEventsLogger : Application connected: reporting-etl, clientId = reporting-etl-client, context = default
2023-01-27 17:19:40.299 DEBUG 8 --- [ool-5-thread-16] i.a.a.c.version.ClientVersionsCache : Version update received from client reporting-etl-client.default to version 4.4.
2023-01-27 17:19:40.332 INFO 8 --- [ool-5-thread-15] i.a.a.message.event.EventDispatcher : Starting tracking event processor for : - 209301
2023-01-27 17:19:42.925 DEBUG 8 --- [MessageBroker-7] i.a.axonserver.message.query.QueryCache : Checking timed out queries
2023-01-27 17:19:42.925 DEBUG 8 --- [MessageBroker-3] i.a.a.message.command.CommandCache : Checking timed out commands
.
.
.
2023-01-27 18:56:08.163 DEBUG 8 --- [MessageBroker-1] i.a.a.message.command.CommandCache : Checking timed out commands
2023-01-27 18:56:08.163 DEBUG 8 --- [MessageBroker-7] i.a.axonserver.message.query.QueryCache : Checking timed out queries
2023-01-27 18:56:09.242 DEBUG 8 --- [pool-5-thread-9] i.a.a.message.command.CommandDispatcher : Dispatch com.fedeee.pkg.api.commands.StartDataCapture to: recover-api-client.default
2023-01-27 18:56:09.257 DEBUG 8 --- [ool-5-thread-10] i.a.a.message.command.CommandDispatcher : Sending response to: io.axoniq.axonserver.message.command.CommandInformation#17b317ff
2023-01-27 18:56:09.294 DEBUG 8 --- [st-dispatcher-2] i.a.a.grpc.GrpcQueryDispatcherListener : Send request io.axoniq.axonserver.grpc.SerializedQuery#158b1b9a, with priority: 0
2023-01-27 18:56:09.294 DEBUG 8 --- [st-dispatcher-2] i.a.a.grpc.GrpcQueryDispatcherListener : Remaining time for message: 72152808-ca89-4565-82dd-2675e52686e2 - 3600000ms
2023-01-27 18:56:09.300 DEBUG 8 --- [pool-5-thread-5] i.a.axonserver.message.query.QueryCache : Remove messageId 72152808-ca89-4565-82dd-2675e52686e2
2023-01-27 18:56:09.300 DEBUG 8 --- [pool-5-thread-5] i.a.a.message.query.QueryDispatcher : No (more) information for 72152808-ca89-4565-82dd-2675e52686e2 on completed
2023-01-27 18:56:09.300 DEBUG 8 --- [st-dispatcher-2] i.a.a.grpc.GrpcQueryDispatcherListener : Send request io.axoniq.axonserver.grpc.SerializedQuery#6e93a34b, with priority: 0
2023-01-27 18:56:09.301 DEBUG 8 --- [st-dispatcher-2] i.a.a.grpc.GrpcQueryDispatcherListener : Remaining time for message: 53aa1974-4012-452a-b451-2957003e4b9f - 3599999ms
2023-01-27 18:56:09.306 DEBUG 8 --- [pool-5-thread-4] i.a.axonserver.message.query.QueryCache : Remove messageId 53aa1974-4012-452a-b451-2957003e4b9f
2023-01-27 18:56:09.306 DEBUG 8 --- [pool-5-thread-4] i.a.a.message.query.QueryDispatcher : No (more) information for 53aa1974-4012-452a-b451-2957003e4b9f on completed
2023-01-27 18:56:09.319 DEBUG 8 --- [ool-5-thread-13] i.a.a.m.q.s.handler.DirectUpdateHandler : SubscriptionQueryResponse for subscription Id 55748e24-e26f-4863-a5d0-a4eeff43da69 send to client.
2023-01-27 18:56:10.509 DEBUG 8 --- [st-dispatcher-2] i.a.a.grpc.GrpcQueryDispatcherListener : Send request io.axoniq.axonserver.grpc.SerializedQuery#fd8d154, with priority: 0
2023-01-27 18:56:10.510 DEBUG 8 --- [st-dispatcher-2] i.a.a.grpc.GrpcQueryDispatcherListener : Remaining time for message: d2d91224-557f-4735-933b-e4195e7e42f9 - 3599999ms
2023-01-27 18:56:10.514 DEBUG 8 --- [ool-5-thread-15] i.a.axonserver.message.query.QueryCache : Remove messageId d2d91224-557f-4735-933b-e4195e7e42f9
2023-01-27 18:56:10.514 DEBUG 8 --- [ool-5-thread-15] i.a.a.message.query.QueryDispatcher : No (more) information for d2d91224-557f-4735-933b-e4195e7e42f9 on completed
2023-01-27 18:56:13.163 DEBUG 8 --- [MessageBroker-2] i.a.a.message.command.CommandCache : Checking timed out commands
2023-01-27 18:56:13.163 DEBUG 8 --- [MessageBroker-5] i.a.axonserver.message.query.QueryCache : Checking timed out queries
.
.
.
As you might be running into known bugs, it's better to use 4.6 of both the server and the framework; that said, it should likely work, since the server API is pretty stable.
Could you also share the com.fedeee.reporting.axon.events.EventHandlerProjector class? Does that class have @EventHandler annotated methods? Are there more classes with @EventHandler annotated methods?
I assume you are using Spring Boot and the Axon Framework starter; is that indeed the case?
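For reference, a hedged sketch (the class name and processing group are hypothetical, not from the original post): with axon-spring-boot-starter on the classpath, a plain @Component with @EventHandler methods is normally picked up and registered automatically, so a minimal catch-all projector like the one below can help tell connectivity problems apart from problems in the custom registration code:

```java
import org.axonframework.config.ProcessingGroup;
import org.axonframework.eventhandling.EventHandler;
import org.springframework.stereotype.Component;

// Hypothetical sanity-check projector: an @EventHandler on Object acts as a
// catch-all in Axon. If this logs events while the real projectors stay
// silent, the issue is in the manual registerTrackingEventProcessor() setup
// rather than in the connection to the Axon Server.
@Component
@ProcessingGroup("sanity-check")
public class SanityCheckProjector {

    @EventHandler
    public void on(Object anyEvent) {
        System.out.println("Received event: " + anyEvent.getClass().getName());
    }
}
```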

Thymeleaf [# th:each] is not getting parsed

I have a thymeleaf (3.0.11.RELEASE) TEXT template with iteration as follows:
[# th:each="sei : ${specificInfoElements}"]
[(${sei?.elementLabel})] : [(${sei?.elementValues})]
[/]
The above is not getting evaluated by the template engine and comes out as follows in the output:
[# th:each="sei : ${specificInfoElements}"]
:
[/]
Can anybody help me understand what I am doing wrong?
Note: I am using Spring Boot.
@Autowired
private SpringTemplateEngine thymeleafTemplateEngine;
Context thymeleafContext = new Context();
thymeleafContext.setVariables(templateModel);
String outputText = thymeleafTemplateEngine.process(emailTemplateString,
thymeleafContext);
I just tested this with Spring Boot and it works fine. What I did exactly:
Create a new project via start.spring.io using Spring Boot 2.5.1 with Java 11
Update application.properties with:
spring.thymeleaf.mode=TEXT
spring.thymeleaf.suffix=.txt
Create a file src/main/resources/templates/test.txt containing the template:
[# th:each="sei : ${specificInfoElements}"]
[(${sei?.elementLabel})] : [(${sei?.elementValues})]
[/]
Create a test class that implements CommandLineRunner so I can just start the app and see some output:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import org.thymeleaf.context.Context;
import org.thymeleaf.spring5.SpringTemplateEngine;
import java.util.List;
@Component
public class Test implements CommandLineRunner {
@Autowired
private SpringTemplateEngine templateEngine;
@Override
public void run(String... args) throws Exception {
Context context = new Context();
context.setVariable("specificInfoElements", List.of(new SpecificInfoElement("first label", "first value"),
new SpecificInfoElement("second label", "second element")));
String result = templateEngine.process("test", context);
System.out.println("result = " + result);
}
private static class SpecificInfoElement {
private String elementLabel;
private String elementValues;
public SpecificInfoElement(String elementLabel, String elementValues) {
this.elementLabel = elementLabel;
this.elementValues = elementValues;
}
public String getElementLabel() {
return elementLabel;
}
public String getElementValues() {
return elementValues;
}
}
}
Running this outputs:
2021-06-22 08:22:39.229 INFO 13464 --- [ main] com.example.demo.DemoApplication : Started DemoApplication in 1.119 seconds (JVM running for 2.828)
result =
first label : first value
second label : second element
I hope this can help you figure out what you are doing differently.
Since I am storing templates in the DB, Thymeleaf uses StringTemplateResolver by default. The StringTemplateResolver is added to the SpringTemplateEngine upon calling the process method for the first time, as shown in this snippet taken from org.thymeleaf.TemplateEngine:
if (this.templateResolvers.isEmpty()) {
this.templateResolvers.add(new StringTemplateResolver());
}
I changed the template mode of StringTemplateResolver before calling the process method as follows:
if (CollectionUtils.isEmpty(springTemplateEngine.getTemplateResolvers())) {
// calling process method will initialize SpringTemplateEngine with StringTemplateResolver.
springTemplateEngine.process("template contents", context);
Set<ITemplateResolver> templateResolvers =
springTemplateEngine.getTemplateResolvers();
StringTemplateResolver stringTemplateResolver = (StringTemplateResolver) templateResolvers.iterator().next();
stringTemplateResolver.setTemplateMode(TemplateMode.TEXT);
}
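Alternatively, a sketch (assuming you construct the engine yourself rather than mutating the Boot-managed one): register a StringTemplateResolver in TEXT mode up front, which avoids the dummy first process call entirely:

```java
import org.thymeleaf.context.Context;
import org.thymeleaf.spring5.SpringTemplateEngine;
import org.thymeleaf.templatemode.TemplateMode;
import org.thymeleaf.templateresolver.StringTemplateResolver;

public class TextEngineExample {
    public static void main(String[] args) {
        // Resolver that treats the template string itself as a TEXT template,
        // so [# th:each] blocks are parsed instead of echoed verbatim.
        StringTemplateResolver resolver = new StringTemplateResolver();
        resolver.setTemplateMode(TemplateMode.TEXT);

        SpringTemplateEngine engine = new SpringTemplateEngine();
        engine.setTemplateResolver(resolver);

        Context context = new Context();
        context.setVariable("name", "world");
        System.out.println(engine.process("Hello [(${name})]!", context));
    }
}
```

The trade-off is that this engine is separate from Boot's auto-configured one, so any Boot-level Thymeleaf properties no longer apply to it.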

Need a way to prevent unwanted job param from propagating to next execution of spring boot batch job

I am running a batch app using Spring Boot 2.1.2 and Spring Batch 4.1.1. The app uses a MySQL database for the Spring Batch metadata data source.
First, I run the job with this command:
java -jar target/batchdemo-0.0.1-SNAPSHOT.jar -Dspring.batch.job.names=echo com.paypal.batch.batchdemo.BatchdemoApplication myparam1=value1 myparam2=value2
Notice I am passing two params:
myparam1=value1
myparam2=value2
Since the job uses RunIdIncrementer, the actual params used by the app are logged as:
Job: [SimpleJob: [name=echo]] completed with the following parameters: [{myparam2=value2, run.id=1, myparam1=value1}]
Next I run the job again, this time dropping myparam2:
java -jar target/batchdemo-0.0.1-SNAPSHOT.jar -Dspring.batch.job.names=echo com.paypal.batch.batchdemo.BatchdemoApplication myparam1=value1
This time the job again runs with myparam2 still included:
Job: [SimpleJob: [name=echo]] completed with the following parameters: [{myparam2=value2, run.id=2, myparam1=value1}]
This causes business logic to be invoked as if I had again passed myparam2 to the app.
Is there a way to drop the job parameter and have it not be passed to the next instance?
App code:
package com.paypal.batch.batchdemo;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
@SpringBootApplication
@EnableBatchProcessing
public class BatchdemoApplication {
public static void main(String[] args) {
SpringApplication.run(BatchdemoApplication.class, args);
}
#Autowired
JobBuilderFactory jobBuilder;
#Autowired
StepBuilderFactory stepBuilder;
#Autowired
ParamEchoTasklet paramEchoTasklet;
#Bean
public RunIdIncrementer incrementer() {
return new RunIdIncrementer();
}
#Bean
public Job job() {
return jobBuilder.get("echo").incrementer(incrementer()).start(echoParamsStep()).build();
}
#Bean
public Step echoParamsStep() {
return stepBuilder.get("echoParams").tasklet(paramEchoTasklet).build();
}
}
package com.paypal.batch.batchdemo;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.stereotype.Component;
@Component
public class ParamEchoTasklet implements Tasklet {

    private Logger LOGGER = LoggerFactory.getLogger(ParamEchoTasklet.class);

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        LOGGER.info("ParamEchoTasklet BEGIN");
        chunkContext.getStepContext().getJobParameters().entrySet().stream().forEachOrdered((entry) -> {
            String key = entry.getKey();
            Object value = entry.getValue();
            LOGGER.info("Param {} = {}", key, value);
        });
        LOGGER.info("ParamEchoTasklet END");
        return RepeatStatus.FINISHED;
    }
}
I debugged the Spring Batch and Spring Boot code, and here is what is happening. JobParametersBuilder line 273 adds the params from the most recent prior job instance to the nextParameters map along with any params added by the JobParametersIncrementer:
List<JobExecution> previousExecutions = this.jobExplorer.getJobExecutions(lastInstances.get(0));
if (previousExecutions.isEmpty()) {
    // Normally this will not happen - an instance exists with no executions
    nextParameters = incrementer.getNext(new JobParameters());
}
else {
    JobExecution previousExecution = previousExecutions.get(0);
    nextParameters = incrementer.getNext(previousExecution.getJobParameters());
}
Then, since I am using Spring Boot, JobLauncherCommandLineRunner line 213 merges the prior params with the new params passed for the new execution, which results in the old param being passed to the new execution:
return merge(nextParameters, jobParameters);
It appears to be impossible to ever run the job again without the param, unless I am missing something. Could it be a bug in Spring Batch?
The normal behavior for RunIdIncrementer appears to increment the run id for the JobExecution and pass along the remaining prior JobParameters. I would not call this a bug.
Keep in mind that the idea behind the RunIdIncrementer is simply to change one identifying parameter to allow a job to be run again, even if a prior run with the same (other) parameters completed successfully and restart has not been configured.
You could always create a customized incrementer by implementing JobParametersIncrementer.
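A minimal sketch of such a customized incrementer, assuming Spring Batch 4.x on the classpath: it carries over only run.id from the previous execution and deliberately drops every other prior parameter, so the new command-line parameters are the only business parameters merged in. The class name is hypothetical.

```java
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.JobParametersIncrementer;

public class RunIdOnlyIncrementer implements JobParametersIncrementer {

    private static final String RUN_ID_KEY = "run.id";

    @Override
    public JobParameters getNext(JobParameters parameters) {
        long nextRunId = (parameters == null ? 0L
                : parameters.getLong(RUN_ID_KEY, 0L)) + 1;
        // Start from an empty builder so prior business params do not propagate;
        // only the incremented run.id survives into the next execution.
        return new JobParametersBuilder()
                .addLong(RUN_ID_KEY, nextRunId)
                .toJobParameters();
    }
}
```

Wire it in place of the RunIdIncrementer bean in the config above; the Boot runner will then merge only run.id with whatever params you pass on the command line.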
Another alternative is to use the JobParametersBuilder to build a JobParameters object and then use the JobLauncher to run your job with those parameters. I often use the current system time in milliseconds to create uniqueness if I'm running jobs that will otherwise have the same JobParameters. You will obviously have to figure out the logic for pulling your specific parameters from the command line (or wherever else) and iterating over them to populate the JobParameters object.
Example:
public JobExecution executeJob(Job job) {
    JobExecution jobExecution = null;
    try {
        JobParameters jobParameters =
                new JobParametersBuilder()
                        .addLong("time.millis", System.currentTimeMillis(), true)
                        .addString("param1", "value1", true)
                        .toJobParameters();
        jobExecution = jobLauncher.run(job, jobParameters);
    } catch (JobInstanceAlreadyCompleteException | JobRestartException | JobParametersInvalidException | JobExecutionAlreadyRunningException e) {
        e.printStackTrace();
    }
    return jobExecution;
}

Spring Integration not moving file after processing

My requirement is to move files from an input to an output directory. Currently, I receive an XML file, parse it, process it, and would like to move it to a new folder. I am using Spring Boot 2.0 and Spring Integration 5. Attached is the code. This integration flow processes the file, but after processing it does not move the file to the new directory.
Could you please let me know what is missing and how to fix this?
Logs are
2018-04-06 15:55:16.473 DEBUG 6364 --- [ask-scheduler-1] o.s.i.handler.ServiceActivatingHandler : handler 'ServiceActivator for [org.springframework.integration.handler.BeanNameMessageProcessor#33a55bd8] (org.springframework.integration.handler.ServiceActivatingHandler#0)' produced no reply for request Message: GenericMessage [payload=Producers {id: -2147483648, parent-id: 0}, headers={file_originalFile=C:\slim\OBDF\Entire_IMO_hierarchy.xml, id=3ee00fca-1f2b-be84-742a-b5c6edfaf42a, file_name=Entire_IMO_hierarchy.xml, file_relativePath=Entire_IMO_hierarchy.xml, timestamp=1523055316426}]
2018-04-06 15:55:16.475 DEBUG 6364 --- [ask-scheduler-1] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'slimflow.channel#1', message: GenericMessage [payload=Producers {id: -2147483648, parent-id: 0}, headers={file_originalFile=C:\slim\OBDF\Entire_IMO_hierarchy.xml, id=3ee00fca-1f2b-be84-742a-b5c6edfaf42a, file_name=Entire_IMO_hierarchy.xml, file_relativePath=Entire_IMO_hierarchy.xml, timestamp=1523055316426}]
2018-04-06 15:55:16.480 DEBUG 6364 --- [ask-scheduler-1] o.s.integration.channel.DirectChannel : postSend (sent=true) on channel 'slimflow.channel#0', message: GenericMessage [payload=C:\slim\OBDF\Entire_IMO_hierarchy.xml, headers={file_originalFile=C:\slim\OBDF\Entire_IMO_hierarchy.xml, id=0f673954-bceb-6e64-0d47-639522002569, file_name=Entire_IMO_hierarchy.xml, file_relativePath=Entire_IMO_hierarchy.xml, timestamp=1523055316320}]
Integration flow config
import java.io.File;
import java.util.function.Function;
import javax.xml.bind.JAXBException;
import javax.xml.stream.XMLStreamException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.FileReadingMessageSource;
import org.springframework.integration.file.FileWritingMessageHandler;
import org.springframework.integration.file.filters.AcceptOnceFileListFilter;
import org.springframework.integration.file.filters.ChainFileListFilter;
import org.springframework.integration.file.filters.RegexPatternFileListFilter;
import org.springframework.integration.transformer.PayloadTypeConvertingTransformer;
import org.springframework.messaging.MessageHandler;
@Configuration
@EnableIntegration
public class SlimIntegrationConfig {

    @Value("${input.directory}")
    private String inputDir;

    @Value("${outputDir.directory}")
    private String outputDir;

    @Value("${input.scan.frequency: 100000}")
    private long scanFrequency;

    @Autowired
    private XmlBeanExtractor<Producers> xmlBeanExtractor;

    @Bean
    public MessageSource<File> inputFileSource() {
        FileReadingMessageSource src = new FileReadingMessageSource(
                (f1, f2) -> Long.valueOf(f1.lastModified()).compareTo(f2.lastModified()));
        src.setDirectory(new File(inputDir));
        src.setAutoCreateDirectory(true);
        ChainFileListFilter<File> chainFileListFilter = new ChainFileListFilter<>();
        chainFileListFilter.addFilter(new AcceptOnceFileListFilter<>());
        chainFileListFilter.addFilter(new RegexPatternFileListFilter("(?i)^.+\\.xml$"));
        src.setFilter(chainFileListFilter);
        return src;
    }

    @Bean
    public DirectChannel outputChannel() {
        return new DirectChannel();
    }

    @Bean
    public MessageHandler fileOutboundChannelAdapter() {
        FileWritingMessageHandler adapter = new FileWritingMessageHandler(new File(outputDir));
        adapter.setDeleteSourceFiles(true);
        adapter.setAutoCreateDirectory(true);
        adapter.setExpectReply(false);
        return adapter;
    }

    @Bean
    PayloadTypeConvertingTransformer<File, Producers> xmlBeanTranformer() {
        PayloadTypeConvertingTransformer<File, Producers> tranformer = new PayloadTypeConvertingTransformer<>();
        tranformer.setConverter(file -> {
            Producers p = null;
            try {
                p = xmlBeanExtractor.extract(file.getAbsolutePath(), Producers.class);
            } catch (JAXBException | XMLStreamException e) {
                e.printStackTrace();
            }
            return p;
        });
        return tranformer;
    }

    @Bean
    public IntegrationFlow slimflow() {
        return IntegrationFlows
                .from(inputFileSource(), spec -> spec.poller(Pollers.fixedDelay(scanFrequency)))
                .transform(xmlBeanTranformer())
                .handle("slimFileProcessor", "processfile")
                .channel(outputChannel())
                .handle(fileOutboundChannelAdapter())
                .get();
    }
}
We need to know what your slimFileProcessor.processfile() does. In any case, it does not receive a File: the xmlBeanTranformer converts the File payload to a Producers object, and exactly that object is sent to the slimFileProcessor.
So, first of all: there is no File in the payload for the FileWritingMessageHandler. But we can fix that a bit later.
Now, you have a log entry like:
ServiceActivatingHandler#0)' produced no reply for request
So your slimFileProcessor doesn't return anything to be sent to the outputChannel() for the subsequent file move from one directory to another.
If returning something isn't possible by the logic at all, you can consider using a .publishSubscribeChannel(). Make that xmlBeanTranformer() one subscriber and the fileOutboundChannelAdapter() another. This way the same File object is sent to both branches. Just note that the second branch won't be called until the first one finishes its work, as long as everything runs on the same thread.
You could still live with a simple linear flow, because you have the FileHeaders.ORIGINAL_FILE header, which is consulted by the FileWritingMessageHandler. But keep in mind that the latter supports only these request message payload types: File, InputStream, byte[] or String. For your move-after-process use case it is better to deal with the File type, which is why I suggest considering the publish-subscribe variant.
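A sketch of what the publish-subscribe variant of the question's slimflow() bean might look like, assuming the Spring Integration 5 Java DSL and reusing the bean names from the question as-is:

```java
@Bean
public IntegrationFlow slimflow() {
    return IntegrationFlows
            .from(inputFileSource(), spec -> spec.poller(Pollers.fixedDelay(scanFrequency)))
            .publishSubscribeChannel(pubsub -> pubsub
                    // First subscriber: parse the XML and process the Producers payload.
                    .subscribe(f -> f
                            .transform(xmlBeanTranformer())
                            .handle("slimFileProcessor", "processfile"))
                    // Second subscriber: still receives the original File payload,
                    // so FileWritingMessageHandler can move and delete the source.
                    .subscribe(f -> f
                            .handle(fileOutboundChannelAdapter())))
            .get();
}
```

With a DirectChannel-backed pub-sub channel and no executor configured, both subscribers run on the polling thread in order, so the file is only moved after processing completes.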

ejb remote client giving EJBCLIENT000025: No EJB receiver available for handling error

I am getting an error while running a remote client for EJB3 and wildfly.
My EJB looks something like below.
package com.tst.ejb;
import javax.ejb.LocalBean;
import javax.ejb.Stateless;
/**
* Session Bean implementation class TestWLDFly
*/
@Stateless
@LocalBean
public class TestWLDFly implements TestWLDFlyRemote {

    /**
     * Default constructor.
     */
    public TestWLDFly() {
        // TODO Auto-generated constructor stub
    }

    @Override
    public void TestPrint() {
        // TODO Auto-generated method stub
    }
}
Remote interface is as follows
package com.tst.ejb;
import javax.ejb.Remote;
@Remote
public interface TestWLDFlyRemote {
public void TestPrint();
}
The remote client is as below.
package com.tst.ejb;
import java.util.Hashtable;
import java.util.Properties;
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import org.jboss.ejb.client.ContextSelector;
import org.jboss.ejb.client.EJBClientConfiguration;
import org.jboss.ejb.client.EJBClientContext;
import org.jboss.ejb.client.PropertiesBasedEJBClientConfiguration;
import org.jboss.ejb.client.remoting.ConfigBasedEJBClientContextSelector;
public class EJBClient {

    public static void main(String[] args) throws NamingException {
        final Hashtable<String, Object> jndiProperties = new Hashtable<>();
        Properties p = new Properties();
        p.put("remote.connectionprovider.create.options.org.xnio.Options.SSL_ENABLED", "false");
        p.put("remote.connections", "one");
        p.put("remote.connection.one.port", "8080");
        p.put("remote.connection.one.host", "localhost");
        EJBClientConfiguration cc = new PropertiesBasedEJBClientConfiguration(p);
        ContextSelector<EJBClientContext> selector = new ConfigBasedEJBClientContextSelector(cc);
        EJBClientContext.setSelector(selector);
        Properties props = new Properties();
        props.put(Context.URL_PKG_PREFIXES, "org.jboss.ejb.client.naming");
        InitialContext context = new InitialContext(props);
        final String rcal = "ejb:/TSTWLDFLY/TestWLDFly!com.tst.ejb.TestWLDFlyRemote";
        final TestWLDFlyRemote remote = (TestWLDFlyRemote) context.lookup(rcal);
        remote.TestPrint();
    }
}
The WildFly log shows the JNDI names as below:
java:global/TSTWLDFLY/TestWLDFly!com.tst.ejb.TestWLDFlyRemote
java:app/TSTWLDFLY/TestWLDFly!com.tst.ejb.TestWLDFlyRemote
java:module/TestWLDFly!com.tst.ejb.TestWLDFlyRemote
java:jboss/exported/TSTWLDFLY/TestWLDFly!com.tst.ejb.TestWLDFlyRemote
java:global/TSTWLDFLY/TestWLDFly!com.tst.ejb.TestWLDFly
java:app/TSTWLDFLY/TestWLDFly!com.tst.ejb.TestWLDFly
java:module/TestWLDFly!com.tst.ejb.TestWLDFly
I am getting an exception when I run the client.
Nov 19, 2015 4:46:36 PM org.jboss.ejb.client.EJBClient <clinit>
INFO: JBoss EJB Client version 2.1.1.Final
Nov 19, 2015 4:46:36 PM org.xnio.Xnio <clinit>
INFO: XNIO version 3.3.1.Final
Nov 19, 2015 4:46:36 PM org.xnio.nio.NioXnio <clinit>
INFO: XNIO NIO Implementation Version 3.3.1.Final
Nov 19, 2015 4:46:36 PM org.jboss.remoting3.EndpointImpl <clinit>
INFO: JBoss Remoting version 4.0.9.Final
Nov 19, 2015 4:46:41 PM org.jboss.ejb.client.remoting.ConfigBasedEJBClientContextSelector setupEJBReceivers
WARN: Could not register a EJB receiver for connection to localhost:8080
java.lang.RuntimeException: Operation failed with status WAITING
at org.jboss.ejb.client.remoting.IoFutureHelper.get(IoFutureHelper.java:94)
at org.jboss.ejb.client.remoting.ConnectionPool.getConnection(ConnectionPool.java:80)
at org.jboss.ejb.client.remoting.RemotingConnectionManager.getConnection(RemotingConnectionManager.java:51)
at org.jboss.ejb.client.remoting.ConfigBasedEJBClientContextSelector.setupEJBReceivers(ConfigBasedEJBClientContextSelector.java:158)
at org.jboss.ejb.client.remoting.ConfigBasedEJBClientContextSelector.getCurrent(ConfigBasedEJBClientContextSelector.java:115)
at org.jboss.ejb.client.remoting.ConfigBasedEJBClientContextSelector.getCurrent(ConfigBasedEJBClientContextSelector.java:47)
at org.jboss.ejb.client.EJBClientContext.getCurrent(EJBClientContext.java:279)
at org.jboss.ejb.client.EJBClientContext.requireCurrent(EJBClientContext.java:289)
at org.jboss.ejb.client.EJBInvocationHandler.doInvoke(EJBInvocationHandler.java:178)
at org.jboss.ejb.client.EJBInvocationHandler.invoke(EJBInvocationHandler.java:146)
at com.sun.proxy.$Proxy0.TestPrint(Unknown Source)
at com.tst.ejb.EJBClient.main(EJBClient.java:40)
Exception in thread "main" java.lang.IllegalStateException: EJBCLIENT000025: No EJB receiver available for handling [appName:, moduleName:TSTWLDFLY, distinctName:] combination for invocation context org.jboss.ejb.client.EJBClientInvocationContext#5c5a1b69
at org.jboss.ejb.client.EJBClientContext.requireEJBReceiver(EJBClientContext.java:774)
at org.jboss.ejb.client.ReceiverInterceptor.handleInvocation(ReceiverInterceptor.java:116)
at org.jboss.ejb.client.EJBClientInvocationContext.sendRequest(EJBClientInvocationContext.java:186)
at org.jboss.ejb.client.EJBInvocationHandler.sendRequestWithPossibleRetries(EJBInvocationHandler.java:255)
at org.jboss.ejb.client.EJBInvocationHandler.doInvoke(EJBInvocationHandler.java:200)
at org.jboss.ejb.client.EJBInvocationHandler.doInvoke(EJBInvocationHandler.java:183)
at org.jboss.ejb.client.EJBInvocationHandler.invoke(EJBInvocationHandler.java:146)
at com.sun.proxy.$Proxy0.TestPrint(Unknown Source)
at com.tst.ejb.EJBClient.main(EJBClient.java:40)
I tried all the JNDI names. Maybe I am doing something wrong. Any help is appreciated.
You should not use the @LocalBean annotation. It makes remote access to this EJB impossible. Just remove it.
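For reference, a sketch of the bean from the question with @LocalBean removed; remote clients then look it up through the @Remote interface alone:

```java
import javax.ejb.Stateless;

// No @LocalBean here: the bean exposes only the @Remote TestWLDFlyRemote view.
@Stateless
public class TestWLDFly implements TestWLDFlyRemote {

    @Override
    public void TestPrint() {
        // Business logic goes here; the remote proxy invokes this method.
    }
}
```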
