Service activator not working for InboundChannelAdapter - spring-boot

I need to read new emails from my Gmail account, and I am using Spring Integration for this. Below is my code:
@Component
public class EmailAdapter {

    private Logger logger = LoggerFactory.getLogger(getClass());

    @Autowired
    private EmailReceiverService emailReceiverService;

    @Bean
    @InboundChannelAdapter(value = "emailChannel", poller = @Poller(fixedDelay = "10000"))
    public MailReceivingMessageSource mailMessageSource(MailReceiver imapMailReceiver) throws MessagingException {
        emailReceiverService.messageSource();
        return new MailReceivingMessageSource(imapMailReceiver);
    }

    @Bean
    @Value("imaps://<username>:<password>@imap.gmail.com:993/inbox")
    public MailReceiver imapMailReceiver(String imapUrl) {
        ImapMailReceiver imapMailReceiver = new ImapMailReceiver(imapUrl);
        imapMailReceiver.setShouldMarkMessagesAsRead(true);
        imapMailReceiver.setShouldDeleteMessages(false);
        return imapMailReceiver;
    }

    @ServiceActivator(inputChannel = "emailChannel")
    public void emailMessageSource(javax.mail.Message message) {
        logger.info("Message received");
    }
}
My application boots up properly and the scheduler tries to receive email from the inbox, but when I send a new email to the account, the service activator is not called at all. I say so because I do not see any logs.
Here is the output once the application boots up:
com.test.email.EmailApplication : Started EmailApplication in 4.807 seconds (JVM running for 5.295)
[ask-scheduler-1] o.s.integration.mail.ImapMailReceiver : attempting to receive mail from folder [INBOX]
What configuration am I missing here? Can anyone help?
UPDATE
I created a fresh Gmail account and tried with that; I was able to read new emails sent to that account. But when I try to read my organisation's email account (the one I use at work), it does not work. I think there is some configuration I am missing.

Please follow this example. Even though it uses XML for its configuration, you can easily convert it to an annotation-based approach.
In my experience with Gmail, there are settings you need to enable in your Gmail account, and additional JavaMail properties you need to provide, which you can also see in the example above.
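To illustrate (this is only a sketch, not part of the linked example, and which properties your organisation's account actually needs is an assumption), the extra JavaMail properties can be set directly on the ImapMailReceiver bean:

@Bean
@Value("imaps://<username>:<password>@imap.gmail.com:993/inbox")
public MailReceiver imapMailReceiver(String imapUrl) {
    ImapMailReceiver imapMailReceiver = new ImapMailReceiver(imapUrl);
    imapMailReceiver.setShouldMarkMessagesAsRead(true);
    imapMailReceiver.setShouldDeleteMessages(false);
    // Standard JavaMail keys; the values below are illustrative assumptions.
    Properties javaMailProperties = new Properties();
    javaMailProperties.setProperty("mail.store.protocol", "imaps");
    javaMailProperties.setProperty("mail.imaps.ssl.trust", "*");
    javaMailProperties.setProperty("mail.debug", "true"); // verbose session logging
    imapMailReceiver.setJavaMailProperties(javaMailProperties);
    return imapMailReceiver;
}

Turning on mail.debug in particular will show in the logs whether the IMAP login to the organisation account succeeds at all.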

Related

Dynamic to() in Apache Camel Route

I am writing a demo program using Apache Camel. Our Camel route is called from a Spring Boot scheduler and it transfers a file from the source directory C:\CamelDemo\inputFolder to the destination directory C:\CamelDemo\outputFolder.
The Spring Boot scheduler is as follows:
@Component
public class Scheduler {

    @Autowired
    private ProducerTemplate producerTemplate;

    @Scheduled(cron = "#{@getCronValue}")
    public void scheduleJob() {
        System.out.println("Scheduler executing");
        String inputEndpoint = "file:C:\\CamelDemo\\inputFolder?noop=true&sendEmptyMessageWhenIdle=true";
        String outputEndpoint = "file:C:\\CamelDemo\\outputFolder?autoCreate=false";
        Map<String, Object> headerMap = new HashMap<String, Object>();
        headerMap.put("inputEndpoint", inputEndpoint);
        headerMap.put("outputEndpoint", outputEndpoint);
        producerTemplate.sendBodyAndHeaders("direct:transferFile", null, headerMap);
        System.out.println("Scheduler complete");
    }
}
The Apache Camel route is as follows:
@Component
public class FileTransferRoute extends RouteBuilder {

    @Override
    public void configure() {
        errorHandler(defaultErrorHandler()
                .maximumRedeliveries(3)
                .redeliverDelay(1000)
                .retryAttemptedLogLevel(LoggingLevel.WARN));

        from("direct:transferFile")
            .log("Route reached")
            .log("Input Endpoint: ${in.headers.inputEndpoint}")
            .log("Output Endpoint: ${in.headers.outputEndpoint}")
            .pollEnrich().simple("${in.headers.inputEndpoint}")
            .recipientList(header("outputEndpoint"));
            //.to("file:C:\\CamelDemo\\outputFolder?autoCreate=false")
    }
}
When I comment out the recipientList() line and uncomment to(), i.e. give a static endpoint in to(), the flow works. But when I comment out to() and uncomment recipientList(), it does not work. How can I route the message to the dynamic endpoint (outputEndpoint)?
You are using pollEnrich without specifying an AggregationStrategy. In this case, Camel creates a new OUT message from the retrieved resource without combining it with the original IN message, which means you lose the headers previously set on the IN message.
See the documentation: https://camel.apache.org/manual/latest/enrich-eip.html#_a_little_enrich_example_using_java
strategyRef: Refers to an AggregationStrategy to be used to merge the reply from the external service into a single outgoing message. By default Camel will use the reply from the external service as the outgoing message.
A simple solution is to define a small AggregationStrategy on your pollEnrich which simply copies the headers from the IN message onto the new OUT message (note that the body will then be the retrieved resource rather than the original IN body, but in your case that is not a problem I guess):
from("direct:transferFile")
.log("Route reached")
.log("Input Endpoint: ${in.headers.inputEndpoint}")
.log("Output Endpoint: ${in.headers.outputEndpoint}")
.pollEnrich().simple("${in.headers.inputEndpoint}")
.aggregationStrategy((oldExchange, newExchange) -> {
// Copy all headers from IN message to the new OUT Message
newExchange.getIn().getHeaders().putAll(oldExchange.getIn().getHeaders());
return newExchange;
})
.log("Output Endpoint (after pollEnrich): ${in.headers.outputEndpoint}")
.recipientList(header("outputEndpoint"));
//.to("file:C:\\var\\CamelDemo\\outputFolder?autoCreate=false");

Spring Boot auto-configured metrics not arriving to Librato

I am using Spring Boot with auto-configuration enabled (@EnableAutoConfiguration) and trying to send my Spring MVC metrics to Librato. Right now only the metrics I create myself arrive in Librato; the auto-configured metrics (CPU, file descriptors, etc.) are not sent to my reporter.
If I access a metrics endpoint I can see the info generated there, for instance http://localhost:8081/actuator/metrics/system.cpu.count
I based my code on this post about ConsoleReporter, so I have this:
public static MeterRegistry libratoRegistry() {
    MetricRegistry dropwizardRegistry = new MetricRegistry();
    String libratoApiAccount = "xx";
    String libratoApiKey = "yy";
    String libratoPrefix = "zz";
    LibratoReporter reporter = Librato
            .reporter(dropwizardRegistry, libratoApiAccount, libratoApiKey)
            .setPrefix(libratoPrefix)
            .build();
    reporter.start(60, TimeUnit.SECONDS);
    DropwizardConfig dropwizardConfig = new DropwizardConfig() {
        @Override
        public String prefix() {
            return "myprefix";
        }
        @Override
        public String get(String key) {
            return null;
        }
    };
    return new DropwizardMeterRegistry(dropwizardConfig, dropwizardRegistry, HierarchicalNameMapper.DEFAULT, Clock.SYSTEM) {
        @Override
        protected Double nullGaugeValue() {
            return null;
        }
    };
}
and in my main function I added Metrics.addRegistry(SpringReporter.libratoRegistry());
For the Librato library I am using compile("com.librato.metrics:metrics-librato:5.1.2") in my build.gradle. Documentation here. I have used this library before without any problem.
If I use the ConsoleReporter as in that post, the same thing happens: only the metrics I create myself are printed to the console.
Any thoughts on what I am doing wrong, or what I am missing?
Also, I enabled debug mode to see the "CONDITIONS EVALUATION REPORT" printed in the console, but I am not sure what to look for in there.
Try making your MeterRegistry for the Librato reporter a Spring @Bean and let me know whether it works.
UPDATED:
I tested with the ConsoleReporter you mentioned and confirmed it works with a sample. Note that the sample is on the console-reporter branch, not the master branch. See the sample for details.
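To illustrate the suggestion (a minimal sketch; the class name and the reuse of the libratoRegistry() method from the question are assumptions), expose the registry as a bean instead of adding it manually in main:

@Configuration
public class LibratoMetricsConfiguration {

    // Exposing the registry as a @Bean lets Spring Boot's metrics
    // auto-configuration discover it and bind the built-in meters
    // (JVM, CPU, file descriptors, ...) to it, not only the meters
    // you register yourself.
    @Bean
    public MeterRegistry libratoMeterRegistry() {
        return SpringReporter.libratoRegistry();
    }
}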

Send and receive files from FTP in Spring Boot

I'm new to the Spring Framework and, indeed, I'm learning and using Spring Boot. Recently, in the app I'm developing, I got the Quartz Scheduler working, and now I want to make Spring Integration work there: an FTP connection to a server to write and read files.
What I want is really simple (I've been able to do it in a previous Java application). I've got two Quartz jobs scheduled to fire at different times daily: one of them reads a file from an FTP server and the other one writes a file to an FTP server.
I'll detail what I've developed so far.
@SpringBootApplication
@ImportResource("classpath:ws-config.xml")
@EnableIntegration
@EnableScheduling
public class MyApp extends SpringBootServletInitializer {

    @Autowired
    private Configuration configuration;

    //...

    @Bean
    public DefaultFtpsSessionFactory myFtpsSessionFactory() {
        DefaultFtpsSessionFactory sess = new DefaultFtpsSessionFactory();
        Ftp ftp = configuration.getFtp();
        sess.setHost(ftp.getServer());
        sess.setPort(ftp.getPort());
        sess.setUsername(ftp.getUsername());
        sess.setPassword(ftp.getPassword());
        return sess;
    }
}
I've named the following class FtpGateway:
@Component
public class FtpGateway {

    @Autowired
    private DefaultFtpsSessionFactory sess;

    public void sendFile() {
        // todo
    }

    public void readFile() {
        // todo
    }
}
I'm reading this documentation to learn how to do it. Spring Integration's FTP support seems to be event-driven, so I don't know how I can execute either sendFile() or readFile() from my jobs when the trigger fires at an exact time.
The documentation tells me something about using an Inbound Channel Adapter (to read files from an FTP server?), an Outbound Channel Adapter (to write files to an FTP server?) and an Outbound Gateway (to do what?):
Spring Integration supports sending and receiving files over FTP/FTPS by providing three client side endpoints: Inbound Channel Adapter, Outbound Channel Adapter, and Outbound Gateway. It also provides convenient namespace-based configuration options for defining these client components.
So it isn't clear to me how to proceed.
Please, could anybody give me a hint?
Thank you!
EDIT:
Thank you @M. Deinum. First, I'll try a simple task: reading a file from the FTP server, with the poller running every 5 seconds. This is what I've added:
@Bean
public FtpInboundFileSynchronizer ftpInboundFileSynchronizer() {
    FtpInboundFileSynchronizer fileSynchronizer = new FtpInboundFileSynchronizer(myFtpsSessionFactory());
    fileSynchronizer.setDeleteRemoteFiles(false);
    fileSynchronizer.setPreserveTimestamp(true);
    fileSynchronizer.setRemoteDirectory("/Entrada");
    fileSynchronizer.setFilter(new FtpSimplePatternFileListFilter("*.csv"));
    return fileSynchronizer;
}

@Bean
@InboundChannelAdapter(channel = "ftpChannel", poller = @Poller(fixedDelay = "5000"))
public MessageSource<File> ftpMessageSource() {
    FtpInboundFileSynchronizingMessageSource source = new FtpInboundFileSynchronizingMessageSource(inbound);
    source.setLocalDirectory(new File(configuracion.getDirFicherosDescargados()));
    source.setAutoCreateLocalDirectory(true);
    source.setLocalFilter(new AcceptOnceFileListFilter<File>());
    return source;
}

@Bean
@ServiceActivator(inputChannel = "ftpChannel")
public MessageHandler handler() {
    return new MessageHandler() {
        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            Object payload = message.getPayload();
            if (payload instanceof File) {
                File f = (File) payload;
                System.out.println(f.getName());
            } else {
                System.out.println(message.getPayload());
            }
        }
    };
}
Then, while the app is running, I put a new CSV file into the "Entrada" remote folder, but the handler() method isn't run after 5 seconds... Am I doing something wrong?
Please add @Scheduled(fixedDelay = 5000) over your poller method.
You could use Spring Batch with a tasklet. It is far easier to configure the beans, cron times, and input sources with the existing interfaces provided by Spring.
https://www.baeldung.com/introduction-to-spring-batch
The example above covers both annotation-based and XML-based configuration; you can use either.
Another benefit is that you can make use of listeners and parallel steps. The framework can also be used in a Reader - Processor - Writer manner.
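For illustration only, a minimal tasklet sketch (assuming spring-boot-starter-batch is on the classpath; the job and step names are made up, and the actual FTP work would go inside the tasklet):

@Configuration
@EnableBatchProcessing
public class FtpBatchConfiguration {

    @Bean
    public Job ftpTransferJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
        Step step = steps.get("ftpTransferStep")
                .tasklet((contribution, chunkContext) -> {
                    // call the FtpGateway here, e.g. readFile() or sendFile()
                    return RepeatStatus.FINISHED;
                })
                .build();
        return jobs.get("ftpTransferJob")
                .start(step)
                .build();
    }
}

The job can then be launched from the existing Quartz trigger (or from @Scheduled) via a JobLauncher.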

How to set a Message Handler programmatically in Spring Cloud AWS SQS?

Maybe someone has an idea about the following problem:
I am currently on a project where I want to use AWS SQS with the Spring Cloud integration. For the receiver part I want to provide an API where a user can register a "message handler" on a queue; the handler is an interface that contains the user's business logic, e.g.
MyAwsSqsReceiver receiver = new MyAwsSqsReceiver();
receiver.register("a-queue-name", new MessageHandler() {
    @Override
    public void handle(String message) {
        //... business logic for the received message
    }
});
I found examples, e.g.
https://codemason.me/2016/03/12/amazon-aws-sqs-with-spring-cloud/
and read the documentation:
http://cloud.spring.io/spring-cloud-aws/spring-cloud-aws.html#_sqs_support
But the only thing I found there to "connect" functionality for processing an incoming message is an annotation on a method, e.g. @SqsListener or @MessageMapping.
These annotations are fixed to a certain queue name, though. So now I am at a loss as to how to dynamically "connect" my provided "MessageHandler" (from my API) to the incoming messages for the specified queue name.
In the example's configuration there is a SimpleMessageListenerContainer which gets a QueueMessageHandler set, but this QueueMessageHandler does not seem to be the right place to set my handler, nor to override its methods and provide my own subclass of QueueMessageHandler.
I already did something like this with the Spring AMQP integration and RabbitMQ and thought it would be similar here with AWS SQS.
Does anyone have an idea how to accomplish this?
thx + bye,
Ximon
EDIT:
I found that Spring JMS could actually do that, e.g. www.javacodegeeks.com/2016/02/aws-sqs-spring-jms-integration.html. Does anybody know what consequences, good or bad, using the JMS protocol has here?
I am facing the same issue.
I am going an unusual way: I set up an AWS client bean in my configuration and then, instead of using the @SqsListener annotation to consume from a specific queue, I use the @Scheduled annotation so that I can programmatically poll (every 10 seconds in my case) whichever queues I want to consume from.
In the example below I iterate over the queues defined in properties and consume from each one.
Client Bean:
@Bean
@Primary
public AmazonSQSAsync awsSqsClient() {
    return AmazonSQSAsyncClientBuilder
            .standard()
            .withRegion(Regions.EU_WEST_1.getName())
            .build();
}
Consumer:
// injected in the constructor
private final AmazonSQSAsync awsSqsClient;

@Scheduled(fixedDelay = 10000)
public void pool() {
    properties.getSqsQueues()
            .forEach(queue -> {
                val receiveMessageRequest = new ReceiveMessageRequest(queue)
                        .withWaitTimeSeconds(10)
                        .withMaxNumberOfMessages(10);
                // reading the messages
                val result = awsSqsClient.receiveMessage(receiveMessageRequest);
                val sqsMessages = result.getMessages();
                log.info("Received Message on queue {}: message = {}", queue, sqsMessages.toString());
                // deleting the messages
                sqsMessages.forEach(message -> {
                    val deleteMessageRequest = new DeleteMessageRequest(queue, message.getReceiptHandle());
                    awsSqsClient.deleteMessage(deleteMessageRequest);
                });
            });
}
Just to clarify: in my case I need multiple queues, one for each tenant, with the queue URL for each one passed in a property file. Of course, in your case, you could get the queue names from another source, maybe a ThreadLocal holding the queues you have created at runtime.
If you wish, you can also try the JMS approach, where you create message consumers and add a listener to each one you need (see the AWS JMS documentation).
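For completeness, a rough sketch of what that JMS approach could look like with the amazon-sqs-java-messaging-lib (this is an assumption, not code from the answer; awsSqsClient is the bean defined above and handler stands in for the MessageHandler from the question):

SQSConnectionFactory connectionFactory =
        new SQSConnectionFactory(new ProviderConfiguration(), awsSqsClient);
SQSConnection connection = connectionFactory.createConnection();
Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
MessageConsumer consumer = session.createConsumer(session.createQueue("a-queue-name"));
// Bridge the incoming JMS message to the user-provided handler
consumer.setMessageListener(jmsMessage -> {
    try {
        handler.handle(((TextMessage) jmsMessage).getText());
    } catch (JMSException e) {
        // log / handle the failure
    }
});
connection.start();

Since the queue name is just a String here, consumers can be registered dynamically at runtime, which is exactly what the annotation-based @SqsListener approach does not allow.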
When we do Spring and SQS, we use spring-cloud-starter-aws-messaging.
Then just create a listener class:
@Component
public class MyListener {

    @SqsListener(value = "myqueue")
    public void listen(MyMessageType message) {
        // process the message
    }
}

Test sending email in Spring

I want to test my Spring services which should send emails.
I am trying to use org.subethamail:subethasmtp.
To achieve my goal I created a service, MySender, where I send the email:
@Autowired
private MailSender mailSender;

//...

SimpleMailMessage message = new SimpleMailMessage();
message.setTo("example@example.com");
message.setSubject("Subject");
message.setText("Text");
mailSender.send(message);
// ...
To test this piece of code I created a test application.properties (in test scope):
spring.mail.host=127.0.0.1
spring.mail.port=${random.int[4000,6000]}
And a test configuration class which should start the Wiser SMTP server and make it reusable in tests:
@Configuration
public class TestConfiguration {

    @Autowired
    private Wiser wiser;

    @Value("${spring.mail.host}")
    String smtpHost;

    @Value("${spring.mail.port}")
    int smtpPort;

    @Bean
    public Wiser provideWiser() {
        // provide wiser for verification in tests
        Wiser wiser = new Wiser();
        return wiser;
    }

    @PostConstruct
    public void initializeMailServer() {
        // start server
        wiser.setHostname(smtpHost);
        wiser.setPort(smtpPort);
        wiser.start();
    }

    @PreDestroy
    public void shutdownMailServer() {
        // stop server
        wiser.stop();
    }
}
The expected result is that the application sends the email via the Wiser SMTP server and the test verifies the number of sent messages.
But when I run the service, the application throws MailSendException (Couldn't connect to host, port: 127.0.0.1, 4688; timeout -1).
But when I add a breakpoint and try to connect using telnet, the SMTP server accepts the connection and does not throw Connection refused.
Do you have any idea why I can't test sending mails?
Full code preview is available on github:
https://github.com/karolrynio/demo-mail
I faced the same problem. Using a constant port number for spring.mail.port in the test Spring configuration, combined with Maven test forking, resulted in tests randomly failing with a port conflict when starting Wiser.
As noted here in the comments, using random.int doesn't help - it returns a different value each time it's referenced, and that's expected behavior (see this issue).
Hence, we need a different way to initialize spring.mail.port with a random value so that it stays constant within the test execution. Here's a way to do it (thanks for the advice here):
First, we don't set spring.mail.port in the test properties file at all; we'll initialize it in a TestPropertySource instead. We'll need a class like this:
public class RandomPortInitailizer implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext applicationContext) {
        int randomPort = SocketUtils.findAvailableTcpPort();
        TestPropertySourceUtils.addInlinedPropertiesToEnvironment(applicationContext,
                "spring.mail.port=" + randomPort);
    }
}
Now we can run our tests this way (not too different from what's in the OP):
@RunWith(SpringRunner.class)
@ContextConfiguration(initializers = RandomPortInitailizer.class)
public class WhenEmailingSomeStuff {

    @Value("${spring.mail.host}")
    String smtpHost;

    @Value("${spring.mail.port}")
    int smtpPort;

    private Wiser wiser;

    @Before
    public void startEmailServer() {
        wiser = new Wiser();
        wiser.setPort(smtpPort);
        wiser.setHostname(smtpHost);
        wiser.start();
    }

    @After
    public void stopEmailServer() {
        wiser.stop();
    }

    @Test
    public void testYourJavaMailSenderHere() {
        //
    }
}
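As a hint for the test body itself (a sketch; the MySender bean and its send() method are assumptions based on the question), Wiser keeps every message it receives, so the count can be asserted directly:

@Autowired
private MySender mySender;

@Test
public void testYourJavaMailSenderHere() {
    mySender.send(); // however your service triggers the email
    // Wiser stores received messages in memory for verification
    assertEquals(1, wiser.getMessages().size());
}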
In the application properties, can you also add:
mail.smtp.auth=false
mail.smtp.starttls.enable=false
Then change your code to have these two extra values:
#Value("${mail.smtp.auth}")
private boolean auth;
#Value("${mail.smtp.starttls.enable}")
private boolean starttls;
and put these options in your initializeMailServer:
Properties mailProperties = new Properties();
mailProperties.put("mail.smtp.auth", auth);
mailProperties.put("mail.smtp.starttls.enable", starttls);
wiser.setJavaMailProperties(mailProperties);
wiser.setHostname(smtpHost);
wiser.setPort(smtpPort);
wiser.start();
Let me know if this worked for you.
