Throw a not found exception if a Pub/Sub topic is not available - Spring

I am using Spring Boot to interact with a Pub/Sub topic.
My config class for this connection looks like this:
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.cloud.gcp.pubsub.core.PubSubTemplate;
import org.springframework.cloud.gcp.pubsub.core.publisher.PubSubPublisherTemplate;
import org.springframework.cloud.gcp.pubsub.support.PublisherFactory;
import org.springframework.cloud.gcp.pubsub.support.converter.SimplePubSubMessageConverter;
import org.springframework.util.Assert;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.SettableListenableFuture;
import com.google.api.core.ApiFuture;
import com.google.api.core.ApiFutureCallback;
import com.google.api.core.ApiFutures;
import com.google.pubsub.v1.PubsubMessage;
public abstract class PubSubPublisher {

    private static final Logger LOGGER = LoggerFactory.getLogger(PubSubPublisher.class);

    private final PubSubTemplate pubSubTemplate;

    protected PubSubPublisher(PubSubTemplate pubSubTemplate) {
        this.pubSubTemplate = pubSubTemplate;
    }

    protected abstract String topic(String topicName);

    public ListenableFuture<String> publish(String topicName, String message) {
        LOGGER.info("Publishing to topic [{}]. Message: [{}]", topicName, message);
        return pubSubTemplate.publish(topicName, message);
    }
}
And I am calling this from my service, like this:
publisher.publish(topic-name, payload);
This publish method is asynchronous: it returns immediately and does not wait for an acknowledgement. I can add a get() after publish to block until Pub/Sub responds.
But I want to know: if the topic is not already present and I try to push a message, can it throw an error like "resource not found", while still using the default async method?
Implementing a callback would probably help, but I have not been able to do that in my code. The current overridden publish method that uses a callback only logs a WARN, not an exception; I want that to be an exception, which is why I want to implement the callback.

You can check whether the topic is already present:
from google.cloud import pubsub_v1

project_id = "projectname"
topic_name = "unknowTopic"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_name)

try:
    response = publisher.get_topic(topic_path)
except Exception as e:
    print(e)
This returns the error as
404 Resource not found (resource=unknowTopic).
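The snippet above uses the Python client. For the Spring setup in the question, a minimal Java sketch of the same idea is below, assuming Spring Cloud GCP's PubSubAdmin is available as a bean next to PubSubTemplate (as far as I know, getTopic returns null when the topic does not exist); it also attaches a callback to the returned future so a failed publish is surfaced rather than only logged:
import org.springframework.cloud.gcp.pubsub.PubSubAdmin;
import org.springframework.cloud.gcp.pubsub.core.PubSubTemplate;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.ListenableFutureCallback;

public class VerifyingPubSubPublisher {

    private final PubSubTemplate pubSubTemplate;
    private final PubSubAdmin pubSubAdmin; // assumed to be configured as a bean

    public VerifyingPubSubPublisher(PubSubTemplate pubSubTemplate, PubSubAdmin pubSubAdmin) {
        this.pubSubTemplate = pubSubTemplate;
        this.pubSubAdmin = pubSubAdmin;
    }

    public ListenableFuture<String> publish(String topicName, String message) {
        // Fail fast if the topic does not exist.
        if (pubSubAdmin.getTopic(topicName) == null) {
            throw new IllegalStateException("Pub/Sub topic not found: " + topicName);
        }
        ListenableFuture<String> future = pubSubTemplate.publish(topicName, message);
        future.addCallback(new ListenableFutureCallback<String>() {
            @Override
            public void onSuccess(String messageId) {
                // acknowledged by Pub/Sub, nothing to do
            }

            @Override
            public void onFailure(Throwable ex) {
                // Runs on the publisher's callback thread, not in the calling service.
                throw new RuntimeException("Publish to topic [" + topicName + "] failed", ex);
            }
        });
        return future;
    }
}
Note that an exception thrown in onFailure happens on the callback thread and will not reach the code that called publish; if the caller has to see the error, keeping the get() after publish (as you already tried) turns the NOT_FOUND into an ExecutionException there.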

Related

Invalid Address JMSException when using temporary credentials with SQSSession

I am getting an error trying to connect to an SQS queue in another AWS account using JMS. I have tried to follow the approach taken in this answer, but I am receiving the following error:
com.amazonaws.services.sqs.model.AmazonSQSException: The address https://sqs.us-east-1.amazonaws.com/ is not valid for this endpoint. (Service: AmazonSQS; Status Code: 404; Error Code: InvalidAddress; Request ID: d7f72bd3-6240-5f63-b313-70c2d8978c14; Proxy: null)
Unlike in the post mentioned above (which I believe has the account credentials in the default provider chain?) I am trying to assume a role that has access to this SQS queue. Is this not possible through JMS or am I doing something incorrectly?
import com.amazon.sqs.javamessaging.ProviderConfiguration;
import com.amazon.sqs.javamessaging.SQSConnectionFactory;
import com.amazon.sqs.javamessaging.SQSSession;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient;
import com.amazonaws.services.sqs.AmazonSQSClientBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jms.config.DefaultJmsListenerContainerFactory;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.jms.support.destination.DynamicDestinationResolver;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.Queue;
import javax.jms.Session;
/**
 * A configuration class for JMS to poll an SQS queue
 * in another AWS account
 */
@Configuration
public class TranslationJmsConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(TranslationJmsConfig.class);

    @Value("${iam.connection.arn}")
    private String connectionRoleArn;

    @Value("${account.id}")
    private String brokerAccountId;

    /**
     * JmsListenerContainerFactory bean for translation processing response queue
     *
     * @param concurrentConsumers number of concurrent consumers
     * @param maxConcurrentConsumers max number of concurrent consumers
     * @return An instance of JmsListenerContainerFactory
     */
    @Bean("translationJmsListenerContainerFactory")
    public DefaultJmsListenerContainerFactory translationJmsListenerContainerFactory(
            @Value("#{new Integer('${listener.concurrency}')}") int concurrentConsumers,
            @Value("#{new Integer('${listener.max.concurrency}')}") int maxConcurrentConsumers) {

        DefaultJmsListenerContainerFactory factory =
                new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(getConnectionFactory(connectionRoleArn));
        factory.setDestinationResolver(new SqsDynamicDestinationResolver(brokerAccountId));
        factory.setSessionTransacted(false); //SQS does not support transaction.
        factory.setSessionAcknowledgeMode(Session.CLIENT_ACKNOWLEDGE); // Automatic message acknowledgment after successful listener execution; best-effort redelivery in case of a user exception thrown as well as in case of other listener execution interruptions (such as the JVM dying).
        factory.setConcurrency(String.format("%d-%d", concurrentConsumers, maxConcurrentConsumers));
        return factory;
    }

    /**
     * create custom JMS Template
     * @return JmsTemplate
     */
    @Bean
    public JmsTemplate customJmsTemplate() {
        JmsTemplate jmsTemplate = new JmsTemplate(getConnectionFactory(connectionRoleArn));
        jmsTemplate.setDestinationResolver(new SqsDynamicDestinationResolver(brokerAccountId));
        return jmsTemplate;
    }

    /**
     * A dynamic destination resolver for sqs queue
     */
    public class SqsDynamicDestinationResolver extends DynamicDestinationResolver {

        private final String brokerAccountId;

        /**
         * Constructor
         * @param brokerAccountId broker Account Id
         */
        public SqsDynamicDestinationResolver(String brokerAccountId) {
            this.brokerAccountId = brokerAccountId;
        }

        @Override
        protected Queue resolveQueue(Session session, String queueName) throws JMSException {
            if (session instanceof SQSSession) {
                SQSSession sqsSession = (SQSSession) session;
                return sqsSession.createQueue(queueName, brokerAccountId); // 404 invalid address -- Something wrong with creds?
            }
            return super.resolveQueue(session, queueName);
        }
    }

    private ConnectionFactory getConnectionFactory(String connectionRoleArn) {
        AWSSecurityTokenService stsClient = AWSSecurityTokenServiceClient.builder()
                .build();
        // assume the connector account credentials -> so we can assume customer account using chaining
        AWSCredentialsProvider dummyCredentialProviders = IdentityHelpers.assumeInternalRole(stsClient, connectionRoleArn); // A helper that assumes temporary creds
        return new SQSConnectionFactory(
                new ProviderConfiguration(),
                AmazonSQSClientBuilder.standard()
                        .withRegion(Regions.US_EAST_1)
                        .withCredentials(dummyCredentialProviders)
        );
    }
}
I realized that when using the temporary credentials, I didn't need the second parameter (the account id) of the sqsSession.createQueue call. So once I changed
sqsSession.createQueue(queueName, brokerAccountId);
to:
return sqsSession.createQueue(queueName);
it worked fine. I guess I misunderstood the need for the account id. I assume the parameter is used when you have multiple accounts in your provider chain and you want it to search a specific account? Any light on this would still be appreciated!
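For what it is worth, my understanding (an assumption, not something confirmed in this thread) is that the overload is not about choosing credentials from a provider chain; the extra argument names the account that owns the queue, which only matters when that account differs from the one your credentials resolve to. Roughly:
import javax.jms.JMSException;
import javax.jms.Queue;

import com.amazon.sqs.javamessaging.SQSSession;

// Sketch only; "otherAccountId" is a hypothetical id of a different AWS account.
public final class QueueResolutionSketch {

    // Queue owned by the same account your (assumed-role) credentials resolve to:
    static Queue sameAccountQueue(SQSSession session, String queueName) throws JMSException {
        return session.createQueue(queueName);
    }

    // Queue owned by another account that granted your principal access via its queue policy:
    static Queue crossAccountQueue(SQSSession session, String queueName, String otherAccountId)
            throws JMSException {
        return session.createQueue(queueName, otherAccountId);
    }
}
That would explain why dropping the second argument worked once the assumed role could already resolve the queue on its own.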

Is using sqs-consumer to detect receiveMessage events in SQS scalable?

I am using AWS SQS as a message queue. After sqs.sendMessage sends the data, I want to detect sqs.receiveMessage via either an infinite loop or event triggering in a scalable way. Then I came across sqs-consumer
to handle sqs.receiveMessage events the moment messages arrive. But I was wondering: is it the most suitable way to handle message passing between microservices, or is there a better way to handle this?
I had written the code in Java for fetching the data from the SQS queue with SQSBufferedAsyncClient; the advantage of using this API is that it buffers the messages in async mode.
/**
 *
 */
package com.sxm.aota.tsc.config;

import java.net.UnknownHostException;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import com.amazonaws.AmazonClientException;
import com.amazonaws.AmazonWebServiceRequest;
import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.auth.InstanceProfileCredentialsProvider;
import com.amazonaws.regions.Region;
import com.amazonaws.regions.Regions;
import com.amazonaws.retry.RetryPolicy;
import com.amazonaws.retry.RetryPolicy.BackoffStrategy;
import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.amazonaws.services.sqs.AmazonSQSAsyncClient;
import com.amazonaws.services.sqs.buffered.AmazonSQSBufferedAsyncClient;
import com.amazonaws.services.sqs.buffered.QueueBufferConfig;

@Configuration
public class SQSConfiguration {

    /** The properties cache config. */
    @Autowired
    private PropertiesCacheConfig propertiesCacheConfig;

    @Bean
    public AmazonSQSAsync amazonSQSClient() {
        // Create client configuration
        ClientConfiguration clientConfig = new ClientConfiguration()
                .withMaxErrorRetry(5)
                .withConnectionTTL(10_000L)
                .withTcpKeepAlive(true)
                .withRetryPolicy(new RetryPolicy(
                        null,
                        new BackoffStrategy() {
                            @Override
                            public long delayBeforeNextRetry(AmazonWebServiceRequest req,
                                    AmazonClientException exception, int retries) {
                                // Delay between retries is 10s unless it is UnknownHostException
                                // for which retry is 60s
                                return exception.getCause() instanceof UnknownHostException ? 60_000L : 10_000L;
                            }
                        }, 10, true));

        // Create Amazon client
        AmazonSQSAsync asyncSqsClient = null;
        if (propertiesCacheConfig.isIamRole()) {
            asyncSqsClient = new AmazonSQSAsyncClient(new InstanceProfileCredentialsProvider(true), clientConfig);
        } else {
            // BasicAWSCredentials takes the access key first, then the secret key
            asyncSqsClient = new AmazonSQSAsyncClient(
                    new BasicAWSCredentials("accesskey", "secretkey"));
        }
        final Regions regions = Regions.fromName(propertiesCacheConfig.getRegionName());
        asyncSqsClient.setRegion(Region.getRegion(regions));
        asyncSqsClient.setEndpoint(propertiesCacheConfig.getEndPoint());

        // Buffer for request batching
        final QueueBufferConfig bufferConfig = new QueueBufferConfig();
        // Ensure visibility timeout is maintained
        bufferConfig.setVisibilityTimeoutSeconds(20);
        // Enable long polling
        bufferConfig.setLongPoll(true);
        // Set batch parameters
        // bufferConfig.setMaxBatchOpenMs(500);
        // Set to receive messages only on demand
        // bufferConfig.setMaxDoneReceiveBatches(0);
        // bufferConfig.setMaxInflightReceiveBatches(0);
        return new AmazonSQSBufferedAsyncClient(asyncSqsClient, bufferConfig);
    }
}
Then I wrote the scheduler, which executes every 2 seconds, fetches the data from the queue, processes it, and deletes it from the queue before the visibility timeout; otherwise the message becomes available for processing again once the visibility timeout expires.
package com.sxm.aota.tsc.sqs;

import java.util.List;
import java.util.concurrent.CountDownLatch;

import javax.annotation.PostConstruct;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.DependsOn;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import com.amazonaws.services.sqs.AmazonSQSAsync;
import com.amazonaws.services.sqs.model.DeleteMessageRequest;
import com.amazonaws.services.sqs.model.GetQueueUrlRequest;
import com.amazonaws.services.sqs.model.GetQueueUrlResult;
import com.amazonaws.services.sqs.model.ReceiveMessageRequest;
import com.amazonaws.services.sqs.model.ReceiveMessageResult;
import com.fasterxml.jackson.databind.ObjectMapper;

/**
 * The Class TSCDataSenderScheduledTask.
 *
 * Sends the aggregated Vehicle data to TSC in batches
 */
@EnableScheduling
@Component("sqsScheduledTask")
@DependsOn({ "propertiesCacheConfig", "amazonSQSClient" })
public class SQSScheduledTask {

    private static final Logger LOGGER = LoggerFactory.getLogger(SQSScheduledTask.class);

    @Autowired
    private PropertiesCacheConfig propertiesCacheConfig;

    @Autowired
    public AmazonSQSAsync amazonSQSClient;

    /**
     * Timer Task that will run after specific interval of time Majorly
     * responsible for sending the data in batches to TSC.
     */
    private String queueUrl;

    private final ObjectMapper mapper = new ObjectMapper();

    @PostConstruct
    public void initialize() throws Exception {
        LOGGER.info("SQS-Publisher", "Publisher initializing for queue " + propertiesCacheConfig.getSQSQueueName(),
                "Publisher initializing for queue " + propertiesCacheConfig.getSQSQueueName());
        // Get queue URL
        final GetQueueUrlRequest request = new GetQueueUrlRequest().withQueueName(propertiesCacheConfig.getSQSQueueName());
        final GetQueueUrlResult response = amazonSQSClient.getQueueUrl(request);
        queueUrl = response.getQueueUrl();
        LOGGER.info("SQS-Publisher", "Publisher initialized for queue " + propertiesCacheConfig.getSQSQueueName(),
                "Publisher initialized for queue " + propertiesCacheConfig.getSQSQueueName() + ", URL = " + queueUrl);
    }

    @Scheduled(fixedDelayString = "${sqs.consumer.delay}")
    public void timerTask() {
        final ReceiveMessageResult receiveResult = getMessagesFromSQS();
        String messageBody = null;
        if (receiveResult != null && receiveResult.getMessages() != null && !receiveResult.getMessages().isEmpty()) {
            try {
                messageBody = receiveResult.getMessages().get(0).getBody();
                String messageReceiptHandle = receiveResult.getMessages().get(0).getReceiptHandle();
                Vehicles vehicles = mapper.readValue(messageBody, Vehicles.class);
                processMessage(vehicles.getVehicles(), messageReceiptHandle);
            } catch (Exception e) {
                LOGGER.error("Exception while processing SQS message : {}", messageBody);
                // Message is not deleted on SQS and will be processed again after visibility timeout
            }
        }
    }

    public void processMessage(List<Vehicle> vehicles, String messageReceiptHandle) throws InterruptedException {
        // processing code
        // delete the sqs message as the processing is completed
        // Need to create atomic counter that will be incremented by all TS.. Once it will be 0 then we will be deleting the messages
        amazonSQSClient.deleteMessage(new DeleteMessageRequest(queueUrl, messageReceiptHandle));
    }

    private ReceiveMessageResult getMessagesFromSQS() {
        try {
            // Create new request and fetch data from Amazon SQS queue
            final ReceiveMessageResult receiveResult = amazonSQSClient
                    .receiveMessage(new ReceiveMessageRequest().withMaxNumberOfMessages(1).withQueueUrl(queueUrl));
            return receiveResult;
        } catch (Exception e) {
            LOGGER.error("Error while fetching data from SQS", e);
        }
        return null;
    }
}

How to pass object from controller to step in Spring Batch

I want to pass reqData from my Controller class to a Step of my job. Is there any way to achieve this? Any help will be appreciated. I have an object of HttpRequestData which I have received in the controller. Thanks.
HttpRequestController.java
package com.npst.imps.controller;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.npst.imps.utils.HttpRequestData;
import com.npst.imps.utils.TransactionResponseData;

import javax.servlet.http.HttpSession;

@RestController
public class HttpRequestController {

    TransactionResponseData transactionResponseData;

    @Autowired
    HttpSession session;

    JobExecution jobExecution;

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    Job fundtrans;

    String test;

    @RequestMapping("/impsft")
    public String handleHttpRequest(@RequestBody HttpRequestData reqData) throws Exception {
        Logger logger = LoggerFactory.getLogger(this.getClass());
        try {
            JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis()).toJobParameters();
            jobExecution = jobLauncher.run(fundtrans, jobParameters);
            ExecutionContext context = jobExecution.getExecutionContext();
            //context.put("reqData", reqData);
            transactionResponseData = (TransactionResponseData) context.get("transactionData");
            //System.out.println(context.get("transactionResponseData"));
        } catch (Exception e) {
            logger.info(e.getMessage());
            e.printStackTrace();
        }
        return reqData + " " + transactionResponseData.getMsg() + ",Tid=" + transactionResponseData.getTid();
    }
}
Below is my step class.
I want to get the same reqData in my step class; from there onwards I will put it inside the StepExecution object in the afterStep method.
PrepareTransactionId.java
package com.npst.imps.action;

import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.Date;

import javax.servlet.http.HttpSession;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import com.npst.imps.service.TransactionService;
import com.npst.imps.utils.GenericTicketKey;
import com.npst.imps.utils.HttpRequestData;
import com.npst.imps.utils.TicketGenerator;
import com.npst.imps.utils.TransactionResponseData;

@Service
public class PrepareTransactionId implements Tasklet, StepExecutionListener {

    static Logger logger = LoggerFactory.getLogger(PrepareTransactionId.class);

    String appId;
    private static TicketGenerator ticketGenerator = null;
    private static GenericTicketKey genericTicketKey = null;

    @Autowired
    HttpSession session;

    @Autowired
    TransactionService transactionService;

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        try {
            DateFormat dateFormat = new SimpleDateFormat("yyyyMMddHHmmss");
            Date date = new Date();
            String ticket;
            System.out.println("transactionService:: PrepareTransactionId" + transactionService);
            TransactionResponseData transactionData = new TransactionResponseData();
            // reqData is not defined in this class -- this is the object I want to receive from the controller
            System.out.println("reqData::" + reqData);
            long value = transactionService.getMaxTid(appId);
            logger.info("Max id From db::" + value);
            if (value == 0) {
                value = System.currentTimeMillis() / 10000;
                long l = value;
                ticket = l + "";
            }
            long l = value + 1;
            ticketGenerator = TicketGenerator.getInstance(9999999999L, 0, l);
            genericTicketKey = new GenericTicketKey(0, false, 10);
            ticket = ticketGenerator.getNextEdgeTicketFor(genericTicketKey);
            stepExecution.getJobExecution().getExecutionContext().put("ticket", ticket);
            ticket = appId + ticket;
            System.out.println("tid::" + ticket);
            stepExecution.getJobExecution().getExecutionContext().put("tid", ticket);
            stepExecution.getJobExecution().getExecutionContext().put("reqData", reqData);
            transactionData.setMsg("Request Recived...");
            transactionData.setTid(ticket + "");
            transactionData.setNodeId(appId);
            transactionData.setReqtime(dateFormat.format(date));
            stepExecution.getJobExecution().getExecutionContext().put("transactionData", transactionData);
            logger.info("Request Recived with tid::" + ticket);
            ExitStatus exist = new ExitStatus("SUCCESS", "success");
            return exist.replaceExitCode("SUCCESS");
        } catch (Exception e) {
            e.printStackTrace();
            return ExitStatus.FAILED;
        }
    }

    public String getAppId() {
        return appId;
    }

    public void setAppId(String appId) {
        this.appId = appId;
    }

    @Override
    public void beforeStep(StepExecution arg0) {
        // TODO Auto-generated method stub
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        return null;
    }
}
TL;DR -> You can't.
JobParameters instances can only hold values of types:
String
Long
Date
Double.
The reason behind it is primarily persistence. Remember that all spring batch metadata (including job parameters) goes to a datasource.
To use custom objects, you would need to make sure that your object is immutable and thread-safe.
JobParameters documentation states:
Value object representing runtime parameters to a batch job. Because
the parameters have no individual meaning outside of the JobParameters
they are contained within, it is a value object rather than an entity.
It is also extremely important that a parameters object can be
reliably compared to another for equality, in order to determine if
one JobParameters object equals another. Furthermore, because these
parameters will need to be persisted, it is vital that the types added
are restricted. This class is immutable and therefore thread-safe.
JobParametersBuilder documentation states as well:
Helper class for creating JobParameters. Useful because all
JobParameter objects are immutable, and must be instantiated
separately to ensure typesafety. Once created, it can be used in the
same way as a java.lang.StringBuilder (except, order is irrelevant), by
adding various parameter types and creating a valid JobParameters once
finished.
But I promise my objects are OK. Can I use them?
You could, but the Spring developers decided not to support this feature a long time ago.
This was discussed in the Spring forums and even a JIRA ticket was created - status: Won't fix.
Related Links
Spring - JobParameters JavaDocs
Spring - JobParametersBuilder JavaDocs
Spring - JIRA Ticket
Spring - Forums Discussion
I would not suggest passing the complete HttpRequestData. Rather, pass only the required information to the batch job. You can pass this information using JobParameters.
Sample code:
JobParameters parameters = new JobParametersBuilder().addString("key1", HttpRequestData.gteData)
        .addString("key2", HttpRequestData.gteData)
        .addString("key3", HttpRequestData.gteData)
        .toJobParameters();
Now in the step you can get the JobParameters from the StepExecution.
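A minimal sketch of that retrieval, assuming the hypothetical keys from the snippet above and the same listener style as PrepareTransactionId:
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;

public class ReadParamsStepListener implements StepExecutionListener {

    private String key1;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        JobParameters jobParameters = stepExecution.getJobParameters();
        key1 = jobParameters.getString("key1"); // value added via addString("key1", ...)
        // Inside a Tasklet's execute(...), the same object is reachable through
        // chunkContext.getStepContext().getStepExecution().getJobParameters().
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return ExitStatus.COMPLETED;
    }
}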
Putting a custom object in JobParameters:
Map<String, JobParameter> map = new HashMap<>();
JobParameter myParameter = new JobParameter(yourCustomObject);
map.put("myobject", myParameter);
JobParameters jobParameters = new JobParameters(map);
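If the step really needs the whole request object, one workaround (my suggestion, not something from the answers above) is to stay within the String restriction by serializing it, for example with Jackson; note that the default Batch metadata schema stores string parameters in a limited-length column, so this only suits small payloads:
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;

import com.npst.imps.utils.HttpRequestData;

// Assumes HttpRequestData can be serialized and deserialized by Jackson.
public final class ReqDataJobParameters {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    static JobParameters toJobParameters(HttpRequestData reqData) throws Exception {
        return new JobParametersBuilder()
                .addString("reqData", MAPPER.writeValueAsString(reqData))
                .addLong("time", System.currentTimeMillis()) // keeps each run's parameters unique
                .toJobParameters();
    }

    // In the step: stepExecution.getJobParameters().getString("reqData")
    static HttpRequestData fromJobParameters(JobParameters params) throws Exception {
        return MAPPER.readValue(params.getString("reqData"), HttpRequestData.class);
    }
}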

How can I send an SMS after successful user registration using Spring Boot?

After successful account creation, I have to send an SMS to the associated customer.
For this purpose I have exposed the SMS service as an advice, as below.
package com.naresh.advice;

import javax.annotation.PostConstruct;

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterReturning;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

import com.naresh.dto.AccountDTO;
import com.naresh.dto.CustomerDTO;
import com.twilio.Twilio;
import com.twilio.rest.api.v2010.account.Message;
import com.twilio.type.PhoneNumber;

@Component
@Aspect
public class SMSService {

    @Value("${twilio.sms.authentication_Id:80b7c5a8b73a26a9b588a906d54269c3}")
    private String authenticationId;

    @Value("${twilio.sms.account_sid:AC038d9532222b3d39fce4b43a5dce9ce1}")
    private String accountId;

    @Value("${twilio.sms.from_number:+12566662741}")
    private String fromNumber;

    @PostConstruct
    public void init() {
        Twilio.init(accountId, authenticationId);
    }

    @AfterReturning(pointcut = "execution(* com.naresh.service.impl.CustomerServiceImpl.save(..)) && args(customerDTO,..)", returning = "custId")
    public void sendSMS(JoinPoint joinPt, CustomerDTO customerDTO, Long custId) {
        Message.creator(new PhoneNumber(customerDTO.getMobile()), new PhoneNumber(fromNumber),
                "Customer " + custId + " registered successfully...").create();
    }

    @AfterReturning(pointcut = "execution(* com.naresh.service.impl.AccountServiceImpl.createAccount(..))", returning = "accDTO")
    public void sendSMSAcc(JoinPoint joinPt, AccountDTO accDTO) {
        CustomerDTO customerDTO = accDTO.getCustomer();
        Message.creator(new PhoneNumber(customerDTO.getMobile()), new PhoneNumber(fromNumber),
                "Hi " + customerDTO.getName() + ", Your " + accDTO.getAccountType() + " account " + accDTO.getAccNo()
                        + " has been registered with us successfully.Your balance is " + accDTO.getBalance())
                .create();
    }
}
The above works fine if the account creation task is successful. But if we get any error, the customer still receives the success SMS.
Please help me.
Thanks in advance.
@AfterReturning advice, according to the docs:
is invoked only on normal method return, not if an exception is thrown.
That means that your methods com.naresh.service.impl.CustomerServiceImpl.save and com.naresh.service.impl.AccountServiceImpl.createAccount return some value but don't throw any exception. What is the error you are getting? Does this error affect the returned value? The only way is to parse the returned value to find out whether something went wrong.
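Following that suggestion, a minimal sketch of the account advice with such a check. It assumes AccountDTO only carries a non-null account number (getAccNo(), already used in your code) when creation actually succeeded, so adjust the condition to whatever your DTO really reports on failure; the from-number below is a placeholder:
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.AfterReturning;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

import com.naresh.dto.AccountDTO;
import com.naresh.dto.CustomerDTO;
import com.twilio.rest.api.v2010.account.Message;
import com.twilio.type.PhoneNumber;

@Component
@Aspect
public class GuardedSmsAspect {

    @AfterReturning(pointcut = "execution(* com.naresh.service.impl.AccountServiceImpl.createAccount(..))",
            returning = "accDTO")
    public void sendSMSAcc(JoinPoint joinPt, AccountDTO accDTO) {
        // Inspect the returned value before notifying the customer.
        if (accDTO == null || accDTO.getAccNo() == null) {
            return; // creation did not really succeed, so send nothing (or send a failure SMS instead)
        }
        CustomerDTO customerDTO = accDTO.getCustomer();
        Message.creator(new PhoneNumber(customerDTO.getMobile()),
                new PhoneNumber("+10000000000"), // placeholder from-number
                "Hi " + customerDTO.getName() + ", your " + accDTO.getAccountType() + " account "
                        + accDTO.getAccNo() + " has been registered successfully.").create();
    }
}
If the service instead signals failure by throwing, AspectJ's @AfterThrowing advice is the matching hook for a failure notification.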

How to register my custom MessageBodyReader in my CLIENT?

Maybe somebody can help me find out how to solve this.
I am using jersey-apache-client 1.17.
I tried to use the Jersey client to build a standalone application (no Servlet container or whatever, just the Java classes) which communicates with a RESTful API, and everything worked fine until I tried to handle the media type "text/csv; charset=utf-8", which is a CSV stream sent by the server.
The thing is that I can read this stream with the following code:
InputStreamReader reader = new InputStreamReader(itemExportBuilder
        .get(ClientResponse.class).getEntityInputStream());
Csv csv = new Csv();
Input input = csv.createInput(reader);
try {
    String[] readLine;
    while ((readLine = input.readLine()) != null) {
        LOG.debug("Reading CSV: {}", readLine);
    }
} catch (IOException e) {
    e.printStackTrace();
}
try {
    input.close();
} catch (IOException e) {
    e.printStackTrace();
}
But I'd like to encapsulate this and put it into a MessageBodyReader. However, after writing the class below, I just can't make the client use it:
package client.response;

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.lang.annotation.Annotation;
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;

import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.MultivaluedMap;
import javax.ws.rs.ext.MessageBodyReader;
import javax.ws.rs.ext.Provider;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@Provider
public class ItemExportMessageBodyReader implements MessageBodyReader<ItemExportResponse> {

    private static final Logger LOG = LoggerFactory.getLogger(ItemExportMessageBodyReader.class);

    private static final Integer SKU = 0;
    private static final Integer BASE_SKU = 1;

    public boolean isReadable(Class<?> paramClass, Type type, Annotation[] annotations,
            MediaType mediaType) {
        LOG.info("Cheking if content is readable or not");
        return paramClass == ItemExportResponse.class && !mediaType.isWildcardType()
                && !mediaType.isWildcardSubtype()
                && mediaType.isCompatible(MediaType.valueOf("text/csv; charset=utf-8"));
    }

    public ItemExportResponse readFrom(Class<ItemExportResponse> paramClass, Type paramType,
            Annotation[] paramArrayOfAnnotation, MediaType paramMediaType,
            MultivaluedMap<String, String> paramMultivaluedMap, InputStream entityStream)
            throws IOException, WebApplicationException {
        InputStreamReader reader = new InputStreamReader(entityStream);
        Csv csv = new Csv();
        Input input = csv.createInput(reader);
        List<Item> items = new ArrayList<Item>();
        try {
            String[] readLine;
            while ((readLine = input.readLine()) != null) {
                LOG.trace("Reading CSV: {}", readLine);
                Item item = new Item();
                item.setBaseSku(readLine[BASE_SKU]);
                items.add(item);
            }
        } catch (IOException e) {
            LOG.warn("Item export HTTP response handling failed", e);
        } finally {
            try {
                input.close();
            } catch (IOException e) {
                LOG.warn("Could not close the HTTP response stream", e);
            }
        }
        ItemExportResponse response = new ItemExportResponse();
        response.setItems(items);
        return response;
    }
}
The following documentation says that the preferred way of making this work in a JAX-RS client is to register the message body reader with the code below:
Using Entity Providers with JAX-RS Client API
Client client = ClientBuilder.newBuilder().register(MyBeanMessageBodyReader.class).build();
Response response = client.target("http://example/comm/resource").request(MediaType.APPLICATION_XML).get();
System.out.println(response.getStatus());
MyBean myBean = response.readEntity(MyBean.class);
System.out.println(myBean);
Now the thing is that I can't use the ClientBuilder. I have to extend from a specific class which constructs the client another way, and I have no access to change the construction.
So when I receive the response from the server, the client fails with the following Exception:
com.sun.jersey.api.client.ClientHandlerException: A message body reader for Java class client.response.ItemExportResponse, and Java type class client.response.ItemExportResponse, and MIME media type text/csv; charset=utf-8 was not found
Any other way to register my MessageBodyReader?
OK, if anybody bumps into my question: I solved this mystery by upgrading from Jersey 1.17 to version 2.9. The documentation I linked above covers this version, not the old one; this is where the confusion stems from.
Jersey introduced backward-incompatible changes starting from version 2, so I have no clue how to configure this in version 1.17.
In version 2 the proposed solution worked fine.
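For completeness, a minimal sketch of the Jersey 2.x wiring that ended up working, using the classes from the question; the target URL is a placeholder:
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

import client.response.ItemExportMessageBodyReader;
import client.response.ItemExportResponse;

public class ItemExportClientExample {

    public static void main(String[] args) {
        Client client = ClientBuilder.newBuilder()
                .register(ItemExportMessageBodyReader.class) // make the CSV reader known to the client
                .build();

        Response response = client.target("http://example.com/api/items/export")
                .request("text/csv")
                .get();

        // readEntity dispatches to ItemExportMessageBodyReader because isReadable(...)
        // matches ItemExportResponse and the text/csv media type.
        ItemExportResponse items = response.readEntity(ItemExportResponse.class);
        System.out.println(items);

        client.close();
    }
}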
