Need to send JSON to JMS using Apache Camel Spring Boot

I am using Spring Boot with Apache Camel and I am able to send messages from one queue to another.
Below is the code:
import com.google.gson.Gson;
import org.apache.camel.Exchange;
import org.apache.camel.LoggingLevel;
import org.apache.camel.Processor;
import org.apache.camel.builder.RouteBuilder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.stereotype.Component;

@Component
public class JmsRoute extends RouteBuilder {

    static final Logger log = LoggerFactory.getLogger(JmsRoute.class);

    @Override
    public void configure() throws Exception {
        from("{{inbound.endpoint}}")
            .transacted()
            .log(LoggingLevel.INFO, log, "Received Message")
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    Student student = new Student();
                    Gson gson = new Gson();
                    String json = gson.toJson(student);
                    log.info("Exchange: {}", exchange.getMessage().getBody());
                    log.info("**********: {}", exchange.getMessage());
                }
            })
            .loop()
                .simple("{{outbound.loop.count}}")
                .to("{{outbound.endpoint}}")
                .log(LoggingLevel.INFO, log, "Message Sent")
            .end();
    }
}
I need to convert an object to JSON (which I can do using Gson) and then send it over the queue.
I am new to Camel and tried to find a solution for this online, but couldn't get any help.
Can anyone please help here?

You are not setting the JSON on the exchange body:
public void process(Exchange exchange) throws Exception {
    Student student = new Student();
    Gson gson = new Gson();
    String json = gson.toJson(student);
    exchange.getIn().setBody(json); // the processor does not do this automatically
    log.info("Exchange: {}", exchange.getMessage().getBody());
    log.info("**********: {}", exchange.getMessage());
}
I recommend checking out the new documentation pages for Apache Camel. They are great, especially if you are just starting to use the framework. See https://camel.apache.org/manual/latest/getting-started.html
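As an alternative to converting inside a Processor, Camel can marshal the body to JSON for you with its Gson data format. A minimal sketch, assuming the camel-gson dependency is on the classpath (the route class name here is this example's own; the endpoints mirror the question's):

import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.JsonLibrary;
import org.springframework.stereotype.Component;

@Component
public class JmsJsonRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("{{inbound.endpoint}}")
            .transacted()
            // marshal() converts the current body (e.g. a Student) to a JSON string via Gson
            .marshal().json(JsonLibrary.Gson)
            .log(LoggingLevel.INFO, "Sending JSON: ${body}")
            .to("{{outbound.endpoint}}");
    }
}

With this in place, the body arriving at the outbound queue is already a JSON string, so no manual Gson call is needed.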

Related

Read messages from different AWS account using @SqsListener

I have an SQS standard queue that is provided by a third-party vendor who has given our IAM user access to read messages from it. So the AWS account ID for the queue is different from that of my user.
I'm trying to use Spring's @SqsListener annotation to consume these messages, but I'm having trouble specifying the account ID that should be consumed from.
My bean configuration for the client looks like this:
@Bean
fun amazonSQSAsyncClient(): AmazonSQSAsync = AmazonSQSAsyncClientBuilder.standard()
    .withCredentials(AWSStaticCredentialsProvider(BasicAWSCredentials(awsProperties.accessKey, awsProperties.secretKey)))
    .withEndpointConfiguration(AwsClientBuilder.EndpointConfiguration(awsProperties.url, awsProperties.region))
    .build()
I see no way of specifying the account Id in the credentials, and I also could not find any properties that can be used to define an accountId.
I tried setting the awsProperties.url shown above to something like https://sqs.us-east-1.amazonaws.com/<accountId> but this does not seem to be working. It is still trying to look for the queue in my own account Id and throwing a queue not found error.
Any ideas how to fix this and force the Spring AWS bean to consume from a specific AWS account?
You have a user that can access the queue in another account. That means you can run code with that user in your account, and that code can access the queue in the other account.
Initializing an SQS client will always use the account it is running on, so you don't have to adjust this:
@Bean
fun amazonSQSAsyncClient(): AmazonSQSAsync = AmazonSQSAsyncClientBuilder.standard()
    .withCredentials(AWSStaticCredentialsProvider(BasicAWSCredentials(awsProperties.accessKey, awsProperties.secretKey)))
    .build()
You need to make sure the code can access the queue.
In the code you should set your queue URL like this:
https://sqs.<region>.amazonaws.com/<account>/<queuename>
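For illustration, a hedged sketch of a listener that resolves the queue by its full URL (region, account ID, and queue name are placeholders):

import org.springframework.cloud.aws.messaging.listener.annotation.SqsListener;
import org.springframework.stereotype.Component;

@Component
public class CrossAccountListener {

    // A full queue URL skips name resolution, which would otherwise look in the caller's own account
    @SqsListener("https://sqs.us-east-1.amazonaws.com/123456789012/queue-name")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}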
I quickly tried to access a queue from another account. If the permissions on the queue are set correctly, you have two possibilities. The first is using the queue URL instead of the name (I checked, it works). The second is creating your own DestinationResolver and providing it to the SimpleMessageListenerContainer. I created a small app with Spring Boot and it worked well. I have pasted the code below.
In a next feature release I'll figure out a better way to support this use case.
package demo;

import com.amazonaws.services.sqs.AmazonSQS;
import com.amazonaws.services.sqs.model.GetQueueUrlRequest;
import com.amazonaws.services.sqs.model.GetQueueUrlResult;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.aws.core.env.ResourceIdResolver;
import org.springframework.cloud.aws.messaging.config.SimpleMessageListenerContainerFactory;
import org.springframework.cloud.aws.messaging.support.destination.DynamicQueueUrlDestinationResolver;
import org.springframework.context.annotation.Bean;
import org.springframework.messaging.core.DestinationResolutionException;
import org.springframework.messaging.core.DestinationResolver;
import org.springframework.messaging.handler.annotation.MessageMapping;
import org.springframework.util.Assert;

@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Bean
    public MessageListener messageListener() {
        return new MessageListener();
    }

    @Bean
    public SimpleMessageListenerContainerFactory simpleMessageListenerFactory(AmazonSQS amazonSqs, ResourceIdResolver resourceIdResolver) {
        SimpleMessageListenerContainerFactory factory = new SimpleMessageListenerContainerFactory();
        factory.setDestinationResolver(new DynamicAccountAwareQueueUrlDestinationResolver(amazonSqs, resourceIdResolver));
        return factory;
    }

    public static class DynamicAccountAwareQueueUrlDestinationResolver implements DestinationResolver<String> {

        public static final String ACCOUNT_QUEUE_SEPARATOR = ":";

        private final AmazonSQS amazonSqs;
        private final DynamicQueueUrlDestinationResolver dynamicQueueUrlDestinationResolverDelegate;

        public DynamicAccountAwareQueueUrlDestinationResolver(AmazonSQS amazonSqs, ResourceIdResolver resourceIdResolver) {
            Assert.notNull(amazonSqs, "amazonSqs must not be null");
            this.amazonSqs = amazonSqs;
            this.dynamicQueueUrlDestinationResolverDelegate = new DynamicQueueUrlDestinationResolver(amazonSqs, resourceIdResolver);
        }

        @Override
        public String resolveDestination(String queue) throws DestinationResolutionException {
            if (queue.contains(ACCOUNT_QUEUE_SEPARATOR)) {
                String account = queue.substring(0, queue.indexOf(ACCOUNT_QUEUE_SEPARATOR));
                String queueName = queue.substring(queue.indexOf(ACCOUNT_QUEUE_SEPARATOR) + 1);
                GetQueueUrlResult queueUrlResult = this.amazonSqs.getQueueUrl(new GetQueueUrlRequest()
                        .withQueueName(queueName)
                        .withQueueOwnerAWSAccountId(account));
                return queueUrlResult.getQueueUrl();
            } else {
                return this.dynamicQueueUrlDestinationResolverDelegate.resolveDestination(queue);
            }
        }
    }

    public static class MessageListener {

        private static Logger LOG = LoggerFactory.getLogger(MessageListener.class);

        @MessageMapping("633332177961:queue-name")
        public void listen(String message) {
            LOG.info("Received message: {}", message);
        }
    }
}

Adding custom header using Spring Kafka

I am planning to use the Spring Kafka client to consume and produce messages from a kafka setup in a Spring Boot application. I see support for custom headers in Kafka 0.11 as detailed here. While it is available for native Kafka producers and consumers, I don't see support for adding/reading custom headers in Spring Kafka.
I am trying to implement a DLQ for messages based on a retry count that I was hoping to store in the message header without having to parse the payload.
I was looking for an answer when I stumbled upon this question. However I'm using the ProducerRecord<?, ?> class instead of Message<?>, so the header mapper does not seem to be relevant.
Here is my approach to add a custom header:
var record = new ProducerRecord<String, String>(topicName, "Hello World");
record.headers().add("foo", "bar".getBytes());
kafkaTemplate.send(record);
Now to read the headers (before consuming), I've added a custom interceptor.
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;
import lombok.extern.slf4j.Slf4j;
import org.apache.kafka.clients.consumer.ConsumerInterceptor;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.header.Header;

@Slf4j
public class MyConsumerInterceptor implements ConsumerInterceptor<Object, Object> {

    @Override
    public ConsumerRecords<Object, Object> onConsume(ConsumerRecords<Object, Object> records) {
        Set<TopicPartition> partitions = records.partitions();
        partitions.forEach(partition -> interceptRecordsFromPartition(records.records(partition)));
        return records;
    }

    private void interceptRecordsFromPartition(List<ConsumerRecord<Object, Object>> records) {
        records.forEach(record -> {
            var myHeaders = new ArrayList<Header>();
            record.headers().headers("MyHeader").forEach(myHeaders::add);
            log.info("My Headers: {}", myHeaders);
            // Do with the header as you see fit
        });
    }

    @Override public void onCommit(Map<TopicPartition, OffsetAndMetadata> offsets) {}
    @Override public void close() {}
    @Override public void configure(Map<String, ?> configs) {}
}
The final bit is to register this interceptor with the Kafka Consumer Container with the following (Spring Boot) configuration:
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class MessagingConfiguration {

    @Bean
    public ConsumerFactory<?, ?> kafkaConsumerFactory(KafkaProperties properties) {
        Map<String, Object> consumerProperties = properties.buildConsumerProperties();
        consumerProperties.put(ConsumerConfig.INTERCEPTOR_CLASSES_CONFIG, MyConsumerInterceptor.class.getName());
        return new DefaultKafkaConsumerFactory<>(consumerProperties);
    }
}
Well, Spring Kafka has provided header support since version 2.0: https://docs.spring.io/spring-kafka/docs/2.1.2.RELEASE/reference/html/_reference.html#headers
You can obtain a KafkaHeaderMapper instance and use it to populate headers on the Message before sending it via KafkaTemplate.send(Message<?> message). Or you can use the plain KafkaTemplate.send(ProducerRecord<K, V> record).
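For instance, a minimal sketch of the Message<?> route, assuming a KafkaTemplate bean is in scope (the topic name and header values are placeholders):

import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Headers set on the Spring Message are mapped onto the outgoing Kafka record
Message<String> message = MessageBuilder.withPayload("Hello World")
        .setHeader(KafkaHeaders.TOPIC, "my-topic")
        .setHeader("foo", "bar")
        .build();
kafkaTemplate.send(message);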
When you receive records using KafkaMessageListenerContainer, the KafkaHeaderMapper can be supplied there via a MessagingMessageConverter injected into the RecordMessagingMessageListenerAdapter.
So, custom headers can be transferred either way.
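On the consuming side, a listener method can also bind a custom header directly. A sketch (the header name matches the producer example above; depending on the configured header mapper, the raw value may arrive as byte[]):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.messaging.handler.annotation.Payload;

@KafkaListener(topics = "my-topic")
public void listen(@Payload String body,
                   @Header(name = "foo", required = false) byte[] foo) {
    // Decode the raw header bytes; foo is null if the record carried no such header
    System.out.println(body + " / " + (foo == null ? "no header" : new String(foo)));
}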

Retry mechanism for producer's client of ActiveMQ with JMS and spring

Is there a mechanism or example implementation of a retry mechanism/solution for a producer using ActiveMQ with JMS (more precisely, with JmsTemplate) and the Spring framework?
The use case I want to handle is broker unavailability: for example, I want to make a number of retries, maximum 6, if possible with exponential delays between each. I therefore also need to track the number of retries for a message between attempts.
I am aware of the redelivery policy for the consumer, but I want to implement a reliable client side for the producer as well.
Thanks,
Simeon
I think the easiest way is to use what already exists for this: an embedded broker with persistence enabled, which the producer sends its messages to, combined with either a Camel route that reads from the local queue and forwards to the remote one, or a JmsBridgeConnector or NetworkConnector. I think the JmsBridgeConnector is easier.
Here is a Spring code example; the producer has to use jmsConnectionFactory() to create a ConnectionFactory:
package com.example.amq;

import java.io.File;
import javax.jms.ConnectionFactory;
import javax.jms.QueueConnectionFactory;
import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.activemq.broker.BrokerService;
import org.apache.activemq.network.jms.JmsConnector;
import org.apache.activemq.network.jms.OutboundQueueBridge;
import org.apache.activemq.network.jms.ReconnectionPolicy;
import org.apache.activemq.network.jms.SimpleJmsQueueConnector;
import org.apache.activemq.store.PersistenceAdapter;
import org.apache.activemq.store.kahadb.KahaDBPersistenceAdapter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ActiveMQConfiguration {

    public static final String DESTINATION_NAME = "localQ";

    @Bean // (initMethod = "start", destroyMethod = "stop")
    public BrokerService broker() throws Exception {
        final BrokerService broker = new BrokerService();
        broker.addConnector("vm://localhost");
        SimpleJmsQueueConnector simpleJmsQueueConnector = new SimpleJmsQueueConnector();
        OutboundQueueBridge bridge = new OutboundQueueBridge();
        bridge.setLocalQueueName(DESTINATION_NAME);
        bridge.setOutboundQueueName("remoteQ");
        OutboundQueueBridge[] outboundQueueBridges = new OutboundQueueBridge[] { bridge };
        simpleJmsQueueConnector.getReconnectionPolicy().setMaxSendRetries(ReconnectionPolicy.INFINITE);
        simpleJmsQueueConnector.setOutboundQueueBridges(outboundQueueBridges);
        simpleJmsQueueConnector.setLocalQueueConnectionFactory((QueueConnectionFactory) jmsConnectionFactory());
        simpleJmsQueueConnector.setOutboundQueueConnectionFactory(outboundQueueConnectionFactory());
        JmsConnector[] jmsConnectors = new JmsConnector[] { simpleJmsQueueConnector };
        broker.setJmsBridgeConnectors(jmsConnectors);
        PersistenceAdapter persistenceAdapter = new KahaDBPersistenceAdapter();
        File dir = new File(System.getProperty("user.home") + File.separator + "kaha");
        if (!dir.exists()) {
            dir.mkdirs();
        }
        persistenceAdapter.setDirectory(dir);
        broker.setPersistenceAdapter(persistenceAdapter);
        broker.setPersistent(true);
        broker.setUseShutdownHook(false);
        broker.setUseJmx(true);
        return broker;
    }

    @Bean
    public QueueConnectionFactory outboundQueueConnectionFactory() {
        ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory(
                "auto://localhost:5671");
        connectionFactory.setUserName("admin");
        connectionFactory.setPassword("admin");
        return connectionFactory;
    }

    @Bean
    public ConnectionFactory jmsConnectionFactory() {
        ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory("vm://localhost");
        connectionFactory.setObjectMessageSerializationDefered(true);
        connectionFactory.setCopyMessageOnSend(false);
        return connectionFactory;
    }
}
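For completeness, a sketch of the producer-side wiring (my own addition, not part of the answer's code): the JmsTemplate simply points at the embedded broker's connection factory, and the bridge takes care of forwarding.

import org.springframework.context.annotation.Bean;
import org.springframework.jms.core.JmsTemplate;

@Bean
public JmsTemplate jmsTemplate() {
    // Sends land on the local embedded broker; the JMS bridge forwards them
    // to the remote broker whenever it is reachable.
    return new JmsTemplate(jmsConnectionFactory());
}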
By using Camel:
import org.apache.activemq.camel.component.ActiveMQComponent;
import org.apache.activemq.camel.component.ActiveMQConfiguration;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class ActiveMQCamelBridge {

    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addComponent("inboundQueue", ActiveMQComponent.activeMQComponent("tcp://localhost:61616"));
        ActiveMQComponent answer = ActiveMQComponent.activeMQComponent("tcp://localhost:5671");
        if (answer.getConfiguration() instanceof ActiveMQConfiguration) {
            ((ActiveMQConfiguration) answer.getConfiguration()).setUserName("admin");
            ((ActiveMQConfiguration) answer.getConfiguration()).setPassword("admin");
        }
        context.addComponent("outboundQueue", answer);
        context.addRoutes(new RouteBuilder() {
            public void configure() {
                from("inboundQueue:queue:localQ").to("outboundQueue:queue:remoteQ");
            }
        });
        context.start();
        Thread.sleep(60 * 5 * 1000);
        context.stop();
    }
}
The producer does not provide any retry mechanism the way the consumer does. You need to make sure in your own code that a message sent by the producer is acknowledged by the broker.
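If you want retries purely on the client side, one common option is to wrap the send in Spring Retry. A sketch, assuming spring-retry is on the classpath; it matches the use case above with a maximum of 6 attempts and exponential delays:

import org.springframework.jms.core.JmsTemplate;
import org.springframework.retry.backoff.ExponentialBackOffPolicy;
import org.springframework.retry.policy.SimpleRetryPolicy;
import org.springframework.retry.support.RetryTemplate;

public class RetryingProducer {

    private final RetryTemplate retryTemplate = new RetryTemplate();
    private final JmsTemplate jmsTemplate;

    public RetryingProducer(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
        // At most 6 attempts, doubling the delay between each (1s, 2s, 4s, ...)
        retryTemplate.setRetryPolicy(new SimpleRetryPolicy(6));
        ExponentialBackOffPolicy backOff = new ExponentialBackOffPolicy();
        backOff.setInitialInterval(1000);
        backOff.setMultiplier(2.0);
        retryTemplate.setBackOffPolicy(backOff);
    }

    public void send(String destination, String payload) {
        // The RetryContext tracks the attempt count between retries
        retryTemplate.execute(context -> {
            jmsTemplate.convertAndSend(destination, payload);
            return null;
        });
    }
}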

Spring Boot with Apache Kafka: Messages not being read

I am currently setting up a Spring Boot application with Kafka listener.
I am trying to code only the consumer. For the producer, I am manually sending messages from the Kafka console for now.
I followed the example:
http://www.source4code.info/2016/09/spring-kafka-consumer-producer-example.html
I tried running this as a Spring Boot application but am not able to see any messages being received. There are already some messages in my local Kafka topic.
C:\software\kafka_2.11-0.10.1.0\kafka_2.11-0.10.1.0\kafka_2.11-0.10.1.0\bin\windows>kafka-console-producer.bat --broker-list localhost:9092 --topic test
this is a message
testing again
My Spring Boot application is:
@EnableDiscoveryClient
@SpringBootApplication
public class KafkaApplication {

    /**
     * Run the application using Spring Boot and an embedded servlet engine.
     *
     * @param args
     *            Program arguments - ignored.
     */
    public static void main(String[] args) {
        // Tell server to look for registration.properties or registration.yml
        System.setProperty("spring.config.name", "kafka-server");
        SpringApplication.run(KafkaApplication.class, args);
    }
}
And Kafka configuration is:
package kafka;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.IntegerDeserializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;
import java.util.HashMap;
import java.util.Map;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    @Bean
    KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        //factory.setConcurrency(1);
        //factory.getContainerProperties().setPollTimeout(3000);
        return factory;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs());
    }

    @Bean
    public Map<String, Object> consumerConfigs() {
        Map<String, Object> propsMap = new HashMap<>();
        propsMap.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        //propsMap.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        //propsMap.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
        //propsMap.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
        propsMap.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
        propsMap.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        //propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
        //propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        return propsMap;
    }

    @Bean
    public Listener listener() {
        return new Listener();
    }
}
And Kafka listener is:
package kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.logging.Logger;

public class Listener {

    protected Logger logger = Logger.getLogger(Listener.class.getName());

    public CountDownLatch getCountDownLatch1() {
        return countDownLatch1;
    }

    private CountDownLatch countDownLatch1 = new CountDownLatch(1);

    @KafkaListener(topics = "test")
    public void listen(ConsumerRecord<?, ?> record) {
        logger.info("Received message: " + record);
        System.out.println("Received message: " + record);
        countDownLatch1.countDown();
    }
}
I am trying this for the first time. Please let me know if I am doing anything wrong. Any help will be greatly appreciated.
You did not set ConsumerConfig.AUTO_OFFSET_RESET_CONFIG so the default is "latest". Set it to "earliest" so the consumer will receive messages already in the topic.
ConsumerConfig.AUTO_OFFSET_RESET_CONFIG takes effect only if the consumer group does not already have an offset for a topic partition. If you already ran the consumer with the "latest" setting, then running the consumer again with a different setting does not change the offset. The consumer must use a different group so Kafka will assign offsets for that group.
I noticed that you commented out the consumer group.id property:
//propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
Here is how it is described in the official Kafka documentation:
A unique string that identifies the consumer group this consumer belongs to. This property is required if the consumer uses either the group management functionality by using subscribe(topic) or the Kafka-based offset management strategy.
I uncommented that line and the consumer worked.
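In other words, inside the question's consumerConfigs() method, these two commented lines need to be active (the group name is the example's own):

propsMap.put(ConsumerConfig.GROUP_ID_CONFIG, "group1");
propsMap.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");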
You will need to annotate your Listener class with either @Service or @Component so that Spring Boot can load the Kafka listener.
package kafka;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.util.concurrent.CountDownLatch;
import java.util.logging.Logger;

@Component
public class Listener {

    protected Logger logger = Logger.getLogger(Listener.class.getName());

    public CountDownLatch getCountDownLatch1() {
        return countDownLatch1;
    }

    private CountDownLatch countDownLatch1 = new CountDownLatch(1);

    @KafkaListener(topics = "test")
    public void listen(ConsumerRecord<?, ?> record) {
        logger.info("Received message: " + record);
        System.out.println("Received message: " + record);
        countDownLatch1.countDown();
    }
}
The above suggestions are good. If you have followed all of them and it still does not work, check whether lazy initialization is enabled for your application.
Lazy initialization is false by default. However, if your application has an explicit setting like the one below,
spring.main.lazy-initialization=true
comment it out or set it to false.

How to set up a Spring Integration 4.3 Email Sending flow with a java class instead of XML?

I am trying to add Spring Integration to a REST MVC Spring app I have been writing. I am using the latest Spring 4.2.x for core, integration and MVC. The idea is to create separate application contexts, as in the Dynamic FTP example. The reason is that I send emails from two separate accounts as well as listen on two separate accounts, so having separate application contexts, plus environment variables to aid bean creation for each context, helps a bunch.
I apologize for the newbie questions, but I am having a hard time with the manual and with figuring out how to set up an SMTP email configuration class without XML.
I want to have both receive and send integration channels. All email settings will be configured from environment variables, so I have injected the environment: @Autowired Environment env;
I can define:
A MailSender bean
A MailSendingMessageHandler bean
A MessageChannel for the SMTP (outbound)
Now, in XML configuration you have an outbound-channel-adapter where you wire the mail-sender bean as well as the MessageChannel.
My goal is to have configurations for:
Send emails.
Listen to IMAP emails and process them.
For sending emails, the idea is that a REST endpoint calls a service, and that service puts a message on the Integration SMTP outbound channel to send an email. It looks like MailSendingMessageHandler takes the Integration Message and converts it to a Mail Message for the MailSender. I have no idea how to wire the MailSendingMessageHandler to the outbound channel so that an email can be sent. I also do not know how, from my @Service class called by the REST endpoint, to create the messages and send them through the outbound SMTP channel. In one REST call I send all the email recipients I want to reach. Beforehand, each email message body is properly formatted so that I can create each Integration Message (as an email) to be handled and converted by MailSendingMessageHandler. I have tried to find examples online without success.
Any examples you could redirect me? Thanks in advance!
So far I have this for the configuration:
import java.util.Properties;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.integration.annotation.InboundChannelAdapter;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.annotation.Poller;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.channel.QueueChannel;
import org.springframework.integration.core.MessageSource;
import org.springframework.integration.mail.MailReceiver;
import org.springframework.integration.mail.MailReceivingMessageSource;
import org.springframework.integration.mail.MailSendingMessageHandler;
import org.springframework.mail.MailMessage;
import org.springframework.mail.MailSender;
import org.springframework.mail.javamail.JavaMailSenderImpl;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.MessagingException;
import org.springframework.core.env.Environment;

@Configuration
@EnableIntegration
public class IntegrationEmailConfig {

    @Autowired
    Environment env;

    @Bean
    public static PropertySourcesPlaceholderConfigurer pspc() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    @Bean
    @InboundChannelAdapter(value = "emailInboundChannel", poller = @Poller(fixedDelay = "5000"))
    public MailReceivingMessageSource mailMessageSource(MailReceiver imapMailReceiver) {
        return new MailReceivingMessageSource(imapMailReceiver);
    }

    private Properties additionalMailProperties() {
        Properties properties = new Properties();
        if (env.containsProperty("mail.smtp.auth")) {
            properties.setProperty("mail.smtp.auth", env.getProperty("mail.smtp.auth"));
        }
        if (env.containsProperty("mail.smtp.starttls.enable")) {
            properties.setProperty("mail.smtp.starttls.enable", env.getProperty("mail.smtp.starttls.enable"));
        }
        return properties;
    }

    @Bean
    public MailSender mailSender() throws Exception {
        JavaMailSenderImpl mailSender = new JavaMailSenderImpl();
        if (env.containsProperty("mail.server.host")) {
            mailSender.setHost(env.getProperty("mail.server.host"));
        } else {
            throw new Exception("Missing mail.server.host property");
        }
        if (env.containsProperty("mail.server.port")) {
            mailSender.setPort(Integer.parseInt(env.getProperty("mail.server.port")));
        } else {
            throw new Exception("Missing mail.server.port property");
        }
        if (env.containsProperty("mail.server.username")) {
            mailSender.setUsername(env.getProperty("mail.server.username"));
        } else {
            throw new Exception("Missing mail.server.username property");
        }
        if (env.containsProperty("mail.server.password")) {
            mailSender.setPassword(env.getProperty("mail.server.password"));
        } else {
            throw new Exception("Missing mail.server.password property");
        }
        mailSender.setJavaMailProperties(additionalMailProperties());
        return mailSender;
    }

    @Bean
    public MailSendingMessageHandler mailSendingMessageHandler() throws Exception {
        MailSendingMessageHandler mailSendingMessageHandler = new MailSendingMessageHandler(mailSender());
        //mailSendingMessageHandler.setChannelResolver(channelResolver);
        return mailSendingMessageHandler;
    }

    /* @Bean
    public DirectChannel outboundMail() {
        DirectChannel outboundChannel = new DirectChannel();
        return outboundChannel;
    }
    */

    @Bean
    public MessageChannel smtpChannel() {
        return new DirectChannel();
    }

    /* @Bean
    @Value("${imap.url}")
    public MailReceiver imapMailReceiver(String imapUrl) {
        // ImapMailReceiver imapMailReceiver = new ImapMailReceiver(imapUrl);
        // imapMailReceiver.setShouldMarkMessagesAsRead(true);
        // imapMailReceiver.setShouldDeleteMessages(false);
        // // other setters here
        // return imapMailReceiver;
        MailReceiver receiver = mock(MailReceiver.class);
        MailMessage message = mock(Message.class);
        when(message.toString()).thenReturn("Message from " + imapUrl);
        Message[] messages = new Message[] { message };
        try {
            when(receiver.receive()).thenReturn(messages);
        }
        catch (MessagingException e) {
            e.printStackTrace();
        }
        return receiver;
    }*/
}
Simply annotate the MailSendingMessageHandler bean with @ServiceActivator; the framework will register a ConsumerEndpointFactoryBean to wrap the handler. See the documentation about "Annotations on @Beans".
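A minimal sketch of that wiring, reusing the smtpChannel bean from the configuration above:

import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.mail.MailSendingMessageHandler;

@Bean
@ServiceActivator(inputChannel = "smtpChannel")
public MailSendingMessageHandler mailSendingMessageHandler() throws Exception {
    // Every Message<?> sent to smtpChannel is converted to a MailMessage and sent
    return new MailSendingMessageHandler(mailSender());
}

The @Service called from the REST endpoint can then inject the smtpChannel MessageChannel and send messages built with the mail headers (a hedged sketch; the recipient and subject are placeholders):

import org.springframework.integration.mail.MailHeaders;
import org.springframework.integration.support.MessageBuilder;
import org.springframework.messaging.Message;

Message<String> mail = MessageBuilder.withPayload("formatted email body")
        .setHeader(MailHeaders.TO, "user@example.com")
        .setHeader(MailHeaders.SUBJECT, "Hello")
        .build();
smtpChannel.send(mail);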
