Using Spring EL to add optional postfix from properties to consumerGroup in @KafkaListener - spring-boot

I have a simple Spring Boot application with a Kafka consumer that looks like
@KafkaListener(topics="topic", groupId="SOME_CONSTANT") {
....
}
What I am required to do is add an optional Spring Boot property (from env variables, but that is not important), let's say:
myapp.env: TEST
And when that variable is present I should automatically update the consumer group to be
SOME_CONSTANT-TEST
I am playing with SpEL:
@KafkaListener(topics="topic", groupId="#{ '${myApp.env}' == null ? 'SOME_CONSTANT' : 'SOME_CONSTANT' + '-' + '${myApp.env}}'") {
....
}
But that does not seem to work :/ Any ideas?

You can use the T operator to read the constant's value, and use the colon ':' for the case when there's no env variable:
@KafkaListener(topics="topic", groupId="#{ '${my.app.env:}' == '' ? T(com.mypackage.MyListener).SOME_CONSTANT : T(com.mypackage.MyListener).SOME_CONSTANT + '-' + '${my.app.env:}'}")
Here's a sample application with this solution:
package org.spring.kafka.playground;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaOperations;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.stereotype.Component;
@SpringBootApplication
public class SO71291726 {
public static void main(String[] args) {
ConfigurableApplicationContext context = SpringApplication.run(SO71291726.class, args);
try {
Thread.sleep(10000);
}
catch (InterruptedException e) {
Thread.interrupted();
throw new RuntimeException("Interrupted");
}
KafkaOperations kafkaTemplate = context.getBean("kafkaTemplate", KafkaOperations.class);
kafkaTemplate.send("topic", "My message");
}
Logger log = LoggerFactory.getLogger(this.getClass());
public static final String SOME_CONSTANT = "my-group-id-constant";
@Component
class MyListener {
@KafkaListener(topics="topic", groupId="#{ '${71291726.my.app.env:}' == '' ? T(org.spring.kafka.playground.SO71291726).SOME_CONSTANT : T(org.spring.kafka.playground.SO71291726).SOME_CONSTANT + '-' + '${71291726.my.app.env:}'}")
void listen(String message, @Header(KafkaHeaders.GROUP_ID) String groupId) {
log.info("Received message {} from group id {} ", message, groupId);
}
}
}
Output:
2022-02-28 14:26:14.733 INFO 18841 --- [ntainer#0-0-C-1] 1291726$$EnhancerBySpringCGLIB$$cf264156 : Received message My message from group id my-group-id-constant
If I add 71291726.my.app.env = TEST to the application.properties file:
2022-02-28 14:34:03.900 INFO 18870 --- [ntainer#0-0-C-1] 1291726$$EnhancerBySpringCGLIB$$e1a5933e : Received message My message from group id my-group-id-constant-TEST
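As a side note, an untested variant of the same idea: because '${71291726.my.app.env:}' resolves to an empty string when the property is absent, the suffix can also be appended without repeating the constant twice:
@KafkaListener(topics = "topic", groupId = "#{ T(org.spring.kafka.playground.SO71291726).SOME_CONSTANT + ('${71291726.my.app.env:}'.isEmpty() ? '' : '-' + '${71291726.my.app.env:}') }")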

Related

Thymeleaf [# th:each] is not getting parsed

I have a Thymeleaf (3.0.11.RELEASE) TEXT template with iteration as follows:
[# th:each="sei : ${specificInfoElements}"]
[(${sei?.elementLabel})] : [(${sei?.elementValues})]
[/]
The above is not getting evaluated by the template engine, and it comes out as follows in the output:
[# th:each="sei : ${specificInfoElements}"]
:
[/]
Can anybody help me understand what I am doing wrong?
Note: I am using Spring Boot.
@Autowired
private SpringTemplateEngine thymeleafTemplateEngine;
Context thymeleafContext = new Context();
thymeleafContext.setVariables(templateModel);
String outputText = thymeleafTemplateEngine.process(emailTemplateString,
thymeleafContext);
I just tested this with Spring Boot and it works fine. What I did exactly:
Create a new project via start.spring.io using Spring Boot 2.5.1 with Java 11
Update application.properties with:
spring.thymeleaf.mode=TEXT
spring.thymeleaf.suffix=.txt
Create a file src/main/resources/templates/test.txt containing the template:
[# th:each="sei : ${specificInfoElements}"]
[(${sei?.elementLabel})] : [(${sei?.elementValues})]
[/]
Create a test class that implements CommandLineRunner so I can just start the app and see some output:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import org.thymeleaf.context.Context;
import org.thymeleaf.spring5.SpringTemplateEngine;
import java.util.List;
@Component
public class Test implements CommandLineRunner {
@Autowired
private SpringTemplateEngine templateEngine;
@Override
public void run(String... args) throws Exception {
Context context = new Context();
context.setVariable("specificInfoElements", List.of(new SpecificInfoElement("first label", "first value"),
new SpecificInfoElement("second label", "second element")));
String result = templateEngine.process("test", context);
System.out.println("result = " + result);
}
private static class SpecificInfoElement {
private String elementLabel;
private String elementValues;
public SpecificInfoElement(String elementLabel, String elementValues) {
this.elementLabel = elementLabel;
this.elementValues = elementValues;
}
public String getElementLabel() {
return elementLabel;
}
public String getElementValues() {
return elementValues;
}
}
}
Running this outputs:
2021-06-22 08:22:39.229 INFO 13464 --- [ main] com.example.demo.DemoApplication : Started DemoApplication in 1.119 seconds (JVM running for 2.828)
result =
first label : first value
second label : second element
I hope this can help you figure out what you are doing differently.
Since I am storing templates in the DB, Thymeleaf uses a StringTemplateResolver by default. The StringTemplateResolver is added to the SpringTemplateEngine upon calling the process method for the first time, as shown in this code snippet taken from org.thymeleaf.TemplateEngine:
if (this.templateResolvers.isEmpty()) {
this.templateResolvers.add(new StringTemplateResolver());
}
I changed the template mode of the StringTemplateResolver before processing my actual template, as follows:
if (CollectionUtils.isEmpty(springTemplateEngine.getTemplateResolvers())) {
// calling process method will initialize SpringTemplateEngine with StringTemplateResolver.
springTemplateEngine.process("template contents", context);
Set<ITemplateResolver> templateResolvers =
springTemplateEngine.getTemplateResolvers();
StringTemplateResolver stringTemplateResolver = (StringTemplateResolver) templateResolvers.iterator().next();
stringTemplateResolver.setTemplateMode(TemplateMode.TEXT);
}
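An alternative to patching the lazily-added resolver is to set the TEXT mode up front. A minimal sketch, assuming a dedicated engine bean can be used for these DB-backed string templates (the bean name and the wiring are assumptions):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.thymeleaf.spring5.SpringTemplateEngine;
import org.thymeleaf.templatemode.TemplateMode;
import org.thymeleaf.templateresolver.StringTemplateResolver;
@Configuration
public class TextTemplateEngineConfig {
    // Dedicated engine for DB-backed TEXT templates, e.g. injected with @Qualifier("textTemplateEngine")
    @Bean("textTemplateEngine")
    public SpringTemplateEngine textTemplateEngine() {
        StringTemplateResolver resolver = new StringTemplateResolver();
        resolver.setTemplateMode(TemplateMode.TEXT); // parse the [# th:each] text syntax
        SpringTemplateEngine engine = new SpringTemplateEngine();
        engine.setTemplateResolver(resolver);
        return engine;
    }
}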

Is putting sqs-consumer to detect receiveMessage events in SQS scalable?

I am using AWS SQS as a message queue. After sqs.sendMessage sends the data, I want to detect sqs.receiveMessage via either an infinite loop or event triggering, in a scalable way. Then I came across sqs-consumer
to handle sqs.receiveMessage events the moment it receives the messages. But I was wondering: is it the most suitable way to handle message passing between microservices, or is there a better way to handle this?
I had written the code in Java for fetching the data from the SQS queue with SQSBufferedAsyncClient; the advantage of this API is that it buffers the messages in async mode.
/**
*
*/
package com.sxm.aota.tsc.config;
import java.net.UnknownHostException;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonWebServiceRequest;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.auth.InstanceProfileCredentialsProvider;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.retry.RetryPolicy;
import com.amazonaws.retry.RetryPolicy.BackoffStrategy;
import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.amazonaws.services.sqs.AmazonSQSAsyncClient;
import com.amazonaws.services.sqs.buffered.AmazonSQSBufferedAsyncClient;
import com.amazonaws.services.sqs.buffered.QueueBufferConfig;
@Configuration
public class SQSConfiguration {
/** The properties cache config. */
@Autowired
private PropertiesCacheConfig propertiesCacheConfig;
@Bean
public AmazonSQSAsync amazonSQSClient() {
// Create Client Configuration
ClientConfiguration clientConfig = new ClientConfiguration()
.withMaxErrorRetry(5)
.withConnectionTTL(10_000L)
.withTcpKeepAlive(true)
.withRetryPolicy(new RetryPolicy(
null,
new BackoffStrategy() {
@Override
public long delayBeforeNextRetry(AmazonWebServiceRequest req,
AmazonClientException exception, int retries) {
// Delay between retries is 10s unless it is UnknownHostException
// for which retry is 60s
return exception.getCause() instanceof UnknownHostException ? 60_000L : 10_000L;
}
}, 10, true));
// Create Amazon client
AmazonSQSAsync asyncSqsClient = null;
if (propertiesCacheConfig.isIamRole()) {
asyncSqsClient = new AmazonSQSAsyncClient(new InstanceProfileCredentialsProvider(true), clientConfig);
} else {
asyncSqsClient = new AmazonSQSAsyncClient(
new BasicAWSCredentials("accessKey", "secretKey")); // constructor order is (accessKey, secretKey)
}
final Regions regions = Regions.fromName(propertiesCacheConfig.getRegionName());
asyncSqsClient.setRegion(Region.getRegion(regions));
asyncSqsClient.setEndpoint(propertiesCacheConfig.getEndPoint());
// Buffer for request batching
final QueueBufferConfig bufferConfig = new QueueBufferConfig();
// Ensure visibility timeout is maintained
bufferConfig.setVisibilityTimeoutSeconds(20);
// Enable long polling
bufferConfig.setLongPoll(true);
// Set batch parameters
// bufferConfig.setMaxBatchOpenMs(500);
// Set to receive messages only on demand
// bufferConfig.setMaxDoneReceiveBatches(0);
// bufferConfig.setMaxInflightReceiveBatches(0);
return new AmazonSQSBufferedAsyncClient(asyncSqsClient, bufferConfig);
}
}
Then I wrote the scheduler, which executes every 2 seconds, fetches the data from the queue, processes it, and deletes it from the queue before the visibility timeout; otherwise it will be ready for processing again when the visibility timeout expires.
package com.sxm.aota.tsc.sqs;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import javax.annotation.PostConstruct;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.DependsOn;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.amazonaws.services.sqs.model.DeleteMessageRequest;
import com.amazonaws.services.sqs.model.GetQueueUrlRequest;
import com.amazonaws.services.sqs.model.GetQueueUrlResult;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import com.amazonaws.services.sqs.model.ReceiveMessageResult;
import com.fasterxml.jackson.databind.ObjectMapper;
/**
* The Class TSCDataSenderScheduledTask.
*
* Sends the aggregated Vehicle data to TSC in batches
*/
@EnableScheduling
@Component("sqsScheduledTask")
@DependsOn({ "propertiesCacheConfig", "amazonSQSClient" })
public class SQSScheduledTask {
private static final Logger LOGGER = LoggerFactory.getLogger(SQSScheduledTask.class);
@Autowired
private PropertiesCacheConfig propertiesCacheConfig;
@Autowired
public AmazonSQSAsync amazonSQSClient;
/**
* Timer task that will run after a specific interval of time. Mainly
* responsible for sending the data in batches to TSC.
*/
private String queueUrl;
private final ObjectMapper mapper = new ObjectMapper();
@PostConstruct
public void initialize() throws Exception {
LOGGER.info("SQS-Publisher", "Publisher initializing for queue " + propertiesCacheConfig.getSQSQueueName(),
"Publisher initializing for queue " + propertiesCacheConfig.getSQSQueueName());
// Get queue URL
final GetQueueUrlRequest request = new GetQueueUrlRequest().withQueueName(propertiesCacheConfig.getSQSQueueName());
final GetQueueUrlResult response = amazonSQSClient.getQueueUrl(request);
queueUrl = response.getQueueUrl();
LOGGER.info("SQS-Publisher", "Publisher initialized for queue " + propertiesCacheConfig.getSQSQueueName(),
"Publisher initialized for queue " + propertiesCacheConfig.getSQSQueueName() + ", URL = " + queueUrl);
}
@Scheduled(fixedDelayString = "${sqs.consumer.delay}")
public void timerTask() {
final ReceiveMessageResult receiveResult = getMessagesFromSQS();
String messageBody = null;
if (receiveResult != null && receiveResult.getMessages() != null && !receiveResult.getMessages().isEmpty()) {
try {
messageBody = receiveResult.getMessages().get(0).getBody();
String messageReceiptHandle = receiveResult.getMessages().get(0).getReceiptHandle();
Vehicles vehicles = mapper.readValue(messageBody, Vehicles.class);
processMessage(vehicles.getVehicles(),messageReceiptHandle);
} catch (Exception e) {
LOGGER.error("Exception while processing SQS message : {}", messageBody);
// Message is not deleted on SQS and will be processed again after visibility timeout
}
}
}
public void processMessage(List<Vehicle> vehicles,String messageReceiptHandle) throws InterruptedException {
//processing code
//delete the sqs message as the processing is completed
//Need to create an atomic counter that will be incremented by all TS.. Once it is 0 then we will delete the messages
amazonSQSClient.deleteMessage(new DeleteMessageRequest(queueUrl, messageReceiptHandle));
}
private ReceiveMessageResult getMessagesFromSQS() {
try {
// Create new request and fetch data from Amazon SQS queue
final ReceiveMessageResult receiveResult = amazonSQSClient
.receiveMessage(new ReceiveMessageRequest().withMaxNumberOfMessages(1).withQueueUrl(queueUrl));
return receiveResult;
} catch (Exception e) {
LOGGER.error("Error while fetching data from SQS", e);
}
return null;
}
}
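For reference, the ${sqs.consumer.delay} placeholder used on @Scheduled above is backed by a property; with the 2-second interval mentioned earlier it would look something like this in application.properties (value in milliseconds, the exact property source is an assumption):
sqs.consumer.delay=2000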

How to get protobuf extension field in ProtobufAnnotationSerializer

I am new to protocol-buffers and am trying to figure out how to extend a message type in the Stanford CoreNLP library as described here: https://nlp.stanford.edu/nlp/javadoc/javanlp/edu/stanford/nlp/pipeline/ProtobufAnnotationSerializer.html
The problem: I can set the extension field but I can't get it. I boiled the problem down to the code below. In the original message the field name is [edu.stanford.nlp.pipeline.myNewField], but it is replaced by the field number 101 in the deserialized message.
How can I get the value of myNewField?
PS: This post https://stackoverflow.com/questions/28815214/how-to-set-get-protobufs-extension-field-in-go suggests that it should be as easy as calling getExtension(MyAppProtos.myNewField)
custom.proto
syntax = "proto2";
package edu.stanford.nlp.pipeline;
option java_package = "com.example.my.awesome.nlp.app";
option java_outer_classname = "MyAppProtos";
import "CoreNLP.proto";
extend Sentence {
optional uint32 myNewField = 101;
}
ProtoTest.java
import com.example.my.awesome.nlp.app.MyAppProtos;
import com.google.protobuf.ExtensionRegistry;
import com.google.protobuf.InvalidProtocolBufferException;
import edu.stanford.nlp.pipeline.CoreNLPProtos;
import edu.stanford.nlp.pipeline.CoreNLPProtos.Sentence;
public class ProtoTest {
static {
ExtensionRegistry registry = ExtensionRegistry.newInstance();
registry.add(MyAppProtos.myNewField);
CoreNLPProtos.registerAllExtensions(registry);
}
public static void main(String[] args) throws InvalidProtocolBufferException {
Sentence originalSentence = Sentence.newBuilder()
.setText("Hello world!")
.setTokenOffsetBegin(0)
.setTokenOffsetEnd(12)
.setExtension(MyAppProtos.myNewField, 13)
.build();
System.out.println("Original:\n" + originalSentence);
byte[] serialized = originalSentence.toByteArray();
Sentence deserializedSentence = Sentence.parseFrom(serialized);
System.out.println("Deserialized:\n" + deserializedSentence);
Integer myNewField = deserializedSentence.getExtension(MyAppProtos.myNewField);
System.out.println("MyNewField: " + myNewField);
}
}
Output:
Original:
tokenOffsetBegin: 0
tokenOffsetEnd: 12
text: "Hello world!"
[edu.stanford.nlp.pipeline.myNewField]: 13
Deserialized:
tokenOffsetBegin: 0
tokenOffsetEnd: 12
text: "Hello world!"
101: 13
MyNewField: 0
Update
Because this question was about extending CoreNLP message types and using them with the ProtobufAnnotationSerializer, here is what my extended serializer looks like:
import java.io.IOException;
import java.io.InputStream;
import java.util.Set;
import com.example.my.awesome.nlp.app.MyAppProtos;
import com.google.protobuf.ExtensionRegistry;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.CoreNLPProtos;
import edu.stanford.nlp.pipeline.CoreNLPProtos.Sentence;
import edu.stanford.nlp.pipeline.CoreNLPProtos.Sentence.Builder;
import edu.stanford.nlp.pipeline.ProtobufAnnotationSerializer;
import edu.stanford.nlp.util.CoreMap;
import edu.stanford.nlp.util.Pair;
public class MySerializer extends ProtobufAnnotationSerializer {
private static ExtensionRegistry registry;
static {
registry = ExtensionRegistry.newInstance();
registry.add(MyAppProtos.myNewField);
CoreNLPProtos.registerAllExtensions(registry);
}
@Override
protected Builder toProtoBuilder(CoreMap sentence, Set<Class<?>> keysToSerialize) {
keysToSerialize.remove(MyAnnotation.class);
Builder builder = super.toProtoBuilder(sentence, keysToSerialize);
builder.setExtension(MyAppProtos.myNewField, 13);
return builder;
}
@Override
public Pair<Annotation, InputStream> read(InputStream is)
throws IOException, ClassNotFoundException, ClassCastException {
CoreNLPProtos.Document doc = CoreNLPProtos.Document.parseDelimitedFrom(is, registry);
return Pair.makePair(fromProto(doc), is);
}
@Override
protected CoreMap fromProtoNoTokens(Sentence proto) {
CoreMap result = super.fromProtoNoTokens(proto);
result.set(MyAnnotation.class, proto.getExtension(MyAppProtos.myNewField));
return result;
}
}
The mistake was that I didn't provide the parseFrom call with the extension registry.
Changing Sentence deserializedSentence = Sentence.parseFrom(serialized); to Sentence deserializedSentence = Sentence.parseFrom(serialized, registry); did the job!
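Putting the fix into the test class (assuming the ExtensionRegistry is kept in a static field instead of a local variable inside the static block, so that main can see it):
// Keep the registry in a static field so it is visible to main
private static final ExtensionRegistry registry = ExtensionRegistry.newInstance();
static {
    registry.add(MyAppProtos.myNewField);
    CoreNLPProtos.registerAllExtensions(registry);
}
// ... later, in main:
// Passing the registry lets the parser map field 101 back to [edu.stanford.nlp.pipeline.myNewField]
Sentence deserializedSentence = Sentence.parseFrom(serialized, registry);
Integer myNewField = deserializedSentence.getExtension(MyAppProtos.myNewField); // 13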

How can I send an SMS after successful user registration using Spring Boot?

After successful account creation, I have to send an SMS to the associated customer.
For this purpose I have exposed the SMS service as an advice, as below.
package com.naresh.advice;
import javax.annotation.PostConstruct;
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterReturning;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
import com.naresh.dto.AccountDTO;
import com.naresh.dto.CustomerDTO;
import com.twilio.Twilio;
import com.twilio.rest.api.v2010.account.Message;
import com.twilio.type.PhoneNumber;
@Component
@Aspect
public class SMSService {
@Value("${twilio.sms.authentication_Id:80b7c5a8b73a26a9b588a906d54269c3}")
private String authenticationId;
@Value("${twilio.sms.account_sid:AC038d9532222b3d39fce4b43a5dce9ce1}")
private String accountId;
@Value("${twilio.sms.from_number:+12566662741}")
private String fromNumber;
@PostConstruct
public void init() {
Twilio.init(accountId, authenticationId);
}
@AfterReturning(pointcut = "execution(* com.naresh.service.impl.CustomerServiceImpl.save(..)) && args(customerDTO,..)", returning = "custId")
public void sendSMS(JoinPoint joinPt, CustomerDTO customerDTO, Long custId) {
Message.creator(new PhoneNumber(customerDTO.getMobile()), new PhoneNumber(fromNumber),
"Customer " + custId + " registered successfully...").create();
}
@AfterReturning(pointcut = "execution(* com.naresh.service.impl.AccountServiceImpl.createAccount(..))", returning = "accDTO")
public void sendSMSAcc(JoinPoint joinPt, AccountDTO accDTO) {
CustomerDTO customerDTO = accDTO.getCustomer();
Message.creator(new PhoneNumber(customerDTO.getMobile()), new PhoneNumber(fromNumber),
"Hi " + customerDTO.getName() + ", Your " + accDTO.getAccountType() + " account " + accDTO.getAccNo()
+ " has been registered with us successfully.Your balance is " + accDTO.getBalance())
.create();
}
}
The above works fine if the account creation task is successful. But if we get an error, the customer still receives the success SMS.
Please help me.
Thanks in advance
@AfterReturning advice, according to the docs:
is invoked only on normal method return, not if an exception is thrown.
That means that your methods com.naresh.service.impl.CustomerServiceImpl.save and com.naresh.service.impl.AccountServiceImpl.createAccount return some value but don't throw any exception. What is the error you are getting? Does this error affect the returned value? The only way is to inspect the returned value to find out whether something went wrong, as sketched below.
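A rough sketch of that suggestion (the hasError() flag on AccountDTO is a made-up example here; substitute whatever your service actually returns when creation fails):
@AfterReturning(pointcut = "execution(* com.naresh.service.impl.AccountServiceImpl.createAccount(..))", returning = "accDTO")
public void sendSMSAcc(JoinPoint joinPt, AccountDTO accDTO) {
    // Assumption: a failed creation is signalled by a null DTO or an error flag on it
    if (accDTO == null || accDTO.hasError()) {
        return; // skip the success SMS when the returned value indicates a failure
    }
    // ... build and send the SMS as in the original advice
}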

CXF InInterceptor not firing

I have created a web service. It works fine. Now I'm trying to add authentication to it. I'm using CXF interceptors for that purpose. For some reason the interceptors won't fire. What am I missing? This is my first web service.
import javax.annotation.Resource;
import javax.inject.Inject;
import javax.jws.WebMethod;
import javax.jws.WebParam;
import javax.jws.WebService;
import javax.xml.ws.WebServiceContext;
import org.apache.cxf.interceptor.InInterceptors;
@WebService
@InInterceptors(interceptors = "ws.BasicAuthAuthorizationInterceptor")
public class Service {
@WebMethod
public void test(@WebParam(name = "value") Integer value) throws Exception {
System.out.println("Value = " + value);
}
}
-
package ws;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.cxf.binding.soap.interceptor.SoapHeaderInterceptor;
import org.apache.cxf.configuration.security.AuthorizationPolicy;
import org.apache.cxf.endpoint.Endpoint;
import org.apache.cxf.interceptor.Fault;
import org.apache.cxf.message.Exchange;
import org.apache.cxf.message.Message;
import org.apache.cxf.transport.Conduit;
import org.apache.cxf.ws.addressing.EndpointReferenceType;
public class BasicAuthAuthorizationInterceptor extends SoapHeaderInterceptor {
@Override
public void handleMessage(Message message) throws Fault {
System.out.println("**** GET THIS LINE TO CONSOLE TO SEE IF INTERCEPTOR IS FIRING!!!");
AuthorizationPolicy policy = message.get(AuthorizationPolicy.class);
// If the policy is not set, the user did not specify credentials.
// 401 is sent to the client to indicate that authentication is required.
if (policy == null) {
sendErrorResponse(message, HttpURLConnection.HTTP_UNAUTHORIZED);
return;
}
String username = policy.getUserName();
String password = policy.getPassword();
// CHECK USERNAME AND PASSWORD
if (!checkLogin(username, password)) {
System.out.println("handleMessage: Invalid username or password for user: "
+ policy.getUserName());
sendErrorResponse(message, HttpURLConnection.HTTP_FORBIDDEN);
}
}
private boolean checkLogin(String username, String password) {
if (username.equals("admin") && password.equals("admin")) {
return true;
}
return false;
}
private void sendErrorResponse(Message message, int responseCode) {
Message outMessage = getOutMessage(message);
outMessage.put(Message.RESPONSE_CODE, responseCode);
// Set the response headers
@SuppressWarnings("unchecked")
Map<String, List<String>> responseHeaders = (Map<String, List<String>>) message
.get(Message.PROTOCOL_HEADERS);
if (responseHeaders != null) {
responseHeaders.put("WWW-Authenticate", Arrays.asList(new String[] { "Basic realm=realm" }));
responseHeaders.put("Content-Length", Arrays.asList(new String[] { "0" }));
}
message.getInterceptorChain().abort();
try {
getConduit(message).prepare(outMessage);
close(outMessage);
} catch (IOException e) {
e.printStackTrace();
}
}
private Message getOutMessage(Message inMessage) {
Exchange exchange = inMessage.getExchange();
Message outMessage = exchange.getOutMessage();
if (outMessage == null) {
Endpoint endpoint = exchange.get(Endpoint.class);
outMessage = endpoint.getBinding().createMessage();
exchange.setOutMessage(outMessage);
}
outMessage.putAll(inMessage);
return outMessage;
}
private Conduit getConduit(Message inMessage) throws IOException {
Exchange exchange = inMessage.getExchange();
EndpointReferenceType target = exchange.get(EndpointReferenceType.class);
Conduit conduit = exchange.getDestination().getBackChannel(inMessage, null, target);
exchange.setConduit(conduit);
return conduit;
}
private void close(Message outMessage) throws IOException {
OutputStream os = outMessage.getContent(OutputStream.class);
os.flush();
os.close();
}
}
I've been fighting with this for a few days now. I don't know what to google any more. Help is appreciated.
I've found the solution. I was missing the following line in the MANIFEST.MF file in the war project:
Dependencies: org.apache.cxf
Maven wasn't including this line by itself, so I had to find a workaround. I found out about that here. It says: When using annotations on your endpoints / handlers such as the Apache CXF ones (@InInterceptor, @GZIP, ...) remember to add the proper module dependency in your manifest. Otherwise your annotations are not picked up and added to the annotation index by JBoss Application Server 7, resulting in them being completely and silently ignored.
This is where I found out how to change the MANIFEST.MF file.
In short, I added custom manifest file to my project and referenced it in pom.xml. Hope this helps someone.
The answer provided by Felix is accurate. I managed to solve the problem using his instructions. Just for completeness, here is the Maven config that lets you use your own MANIFEST.MF file placed in the META-INF folder.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<archive>
<manifestFile>src/main/resources/META-INF/MANIFEST.MF</manifestFile>
</archive>
</configuration>
</plugin>
and here is the relevant content of the MANIFEST.MF file I was using.
Manifest-Version: 1.0
Description: yourdescription
Dependencies: org.apache.ws.security,org.apache.cxf
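If you would rather not maintain a separate MANIFEST.MF file, the same Dependencies entry can, as far as I know, also be generated through manifestEntries in the maven-war-plugin configuration:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<configuration>
<archive>
<manifestEntries>
<Dependencies>org.apache.cxf</Dependencies>
</manifestEntries>
</archive>
</configuration>
</plugin>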
